Wednesday, March 31, 2010

Adding trust assurance to SSL through web of trust

See the previous post for background if you're not familiar with why SSL can't solidly be trusted.

UPDATE: I created an additional post for existing solutions.

One method which has been proposed for establishing trust in SSL certification paths is DNSSEC (DNS Security Extensions).  If DNS is "trustworthy", then why not add an extension to DNSSEC to leverage that hierarchy for SSL certs?  RSnake has a good post on this method.

Personally, I don't like it.  DNS is still a distributed hierarchy, and can be subverted at the individual server level.  If my DNS server is compromised, then I can't trust the addresses it gives me, and I can't trust the certs it vouches for either.  "Lawful Intercept" is also a problem here: governments can assert authority over companies, and could likely compromise DNSSEC in the same way they're apparently compromising SSL.

In my opinion, the better solution is the old "Web of Trust".  Suppose I publish a blog entry saying "mail.google.com is using an SSL cert from Thawte".  You read that post.  Then, if you visit mail.google.com and find that it's using a different cert, it's an indication that something weird is happening.  Conceptually, this application is similar to Crowdsourcing.
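To make that concrete, here's a minimal client-side sketch of the comparison.  It assumes a hypothetical Trust server that can be asked for the SHA-256 fingerprint it last saw for a hostname; fetch_published_fingerprint and its interface are invented for illustration, not part of any existing system:

import hashlib
import socket
import ssl

def observed_fingerprint(host, port=443):
    """Connect to the site and return the SHA-256 fingerprint of the cert it presents."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der_cert = tls.getpeercert(binary_form=True)
    return hashlib.sha256(der_cert).hexdigest()

def check_against_trust_server(host, fetch_published_fingerprint):
    """fetch_published_fingerprint(host) is a placeholder for querying a Trust server."""
    published = fetch_published_fingerprint(host)   # e.g. what my blog post claims
    observed = observed_fingerprint(host)
    if published != observed:
        print("WARNING: %s presented a cert that doesn't match the published one" % host)
    return published == observed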

Here are the requirements I see:

Global diverse distribution: Publish SSL cert paths for large numbers of websites on a large number of different Trust servers, so my browser can randomly select a few for its check (see the sketch after this list). Large numbers of Trust servers will make take-downs difficult to orchestrate, and global diversity will make legal countermeasures difficult to implement.
Client convenience: Build it into the browser so the action happens automatically on page load. Reduce the barriers to client adoption. Browser plug-ins are a good start.
Server convenience: Make it very easy for random internet users to set up Trust servers - WordPress-plugin easy - to increase the number of servers available.
Self-Policing: Trust servers should monitor the data from each other, to detect changes and "outside interference".
Cross-linking: It must be easy to find Trust servers. The naive answer is for servers to recommend other servers - but then a compromised Trust server could direct the browser to other compromised Trust servers.
Dynamic: Trust servers should track client queries as updated data points. This will build the Trust tables more quickly, and larger samples make anomaly detection easier.
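Here's a rough sketch of the "randomly select a few" idea from the browser's side.  The list of Trust server query functions is hypothetical, and a real design would need to think harder about how much agreement is enough:

import random

def quorum_check(host, observed_fp, trust_servers, sample_size=3):
    """Ask a random sample of Trust servers what fingerprint they have on file for host.

    trust_servers is a (hypothetical) list of callables: server(host) -> fingerprint.
    Returns True only if every sampled server agrees with what the browser observed.
    """
    sample = random.sample(trust_servers, min(sample_size, len(trust_servers)))
    votes = [server(host) for server in sample]
    return all(vote == observed_fp for vote in votes)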

Now come the hard problems to solve: the design conflicts.
  • Anonymity vs. Data Integrity:  Anonymous usage is a strong design goal for a Trust server, but anonymity makes it easier to poison the data through malicious updates.
  • Anonymity vs. Anomaly Detection:  Trending information will be incredibly useful for finding out whether certain jurisdictions are applying SSL trust spoofing - but gathering that trend data works against client anonymity.
  • Trust Coverage vs. Server Resources:  How to make each Trust server able to store a large number of website cert paths without requiring multiple GB of data.  This is probably the easiest "hard" problem.
  • Legitimate Updates vs. Illegitimate Actions:  If a website legitimately changes its SSL cert, how does that information propagate?

Other questions to consider:
  • Should Trust leverage p2p techniques like BitTorrent?  Clients could cross-communicate, cache other clients' IP addresses...
  • What kind of statistical history should be kept, and what kind of analysis performed?  Should anomalies be tracked?
  • Can anti-spam techniques such as reputation be applied?
  • Should there be different levels of trust between servers?
  • Is it worth the effort to implement?
  • What's a good name?  How about p2pki?
Comments welcome as always.

The problem with SSL certs.

SSL as a method of creating trust is broken.  It should be no news by now that SSL can be subverted by using a cert which appears completely valid because it uses an alternate certification path.

Quick SSL certification primer
SSL uses "certificates" to identify https websites.  When the webmaster sets up the site, s/he purchases a certificate from a Certificate Authority (CA).  The CA should make the webmaster "prove" that the site is "real" - it would be bad for a CA to sell me a cert alleging that I'm google.com if I'm not actually Google.

The reason that it's important for a CA to verify the website is that web browsers "trust" the CA.  When a web browser loads the cert for an https website, the cert contains the site's name and the certification path - basically, the list of which CA issued the cert.  Web browsers come pre-loaded with a list of CAs, and a method to verify that the cert really was issued by that CA.

Bottom line: if my web browser trusts a CA, then any cert issued by that CA looks like a valid ID for the website.  If a different CA issues a different cert for the same site, it looks equally valid to the web browser - which means that someone doing something sneaky can fool my browser, and fool me.
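For the curious, here's a quick way to peek at which CA vouched for a site's cert.  It uses Python's standard ssl module and just prints the subject and issuer fields from the cert the server presents - no claim that this is how browsers implement their checks:

import socket
import ssl

def show_cert_issuer(host, port=443):
    """Print who the cert says it identifies, and which CA issued it."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()   # parsed form, available after validation
    print("subject:", cert["subject"])
    print("issuer: ", cert["issuer"])

show_cert_issuer("mail.google.com")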

How to fake a certificate
Wildcard: Use wildcard characters to create SSL certs which are valid for multiple sites. Announced at DefCon in 2009, the vulnerability was quickly patched by many browsers.
"Lawful Intercept": There are appliances targeted at law enforcement which can present alternate SSL certs for sites. Governments presumably have the legal means to order Certificate Authorities to generate these look-alike certs - and hopefully will only use them in the course of legal criminal investigations.


Many Internet users assume that the system is designed in a way that makes everything work.  For the most part, they're right.  DNS, for example, generally allows web browsers to find a website based on a hierarchy: I don't know where www.google.com is, so my browser asks my DNS server, which doesn't know either.  Using a technique known as recursion, my DNS server asks a root server, which doesn't specifically know but points my server at the servers for the .com name space; those in turn tell my server where to find the authoritative server for the google.com domain.  My server asks the google.com server where to find www.google.com, gets the response, and passes it to my web browser.  I can trust the answer because I can trust the servers at the top of the hierarchy.
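A rough illustration of those steps, assuming the third-party dnspython package; the root server IP is real (a.root-servers.net), but treat the rest as a sketch rather than a full resolver:

import dns.message
import dns.query

def ask(server_ip, name, rdtype="A"):
    """Send one non-recursive query directly to a specific DNS server."""
    return dns.query.udp(dns.message.make_query(name, rdtype), server_ip, timeout=5)

# Step 1: a root server doesn't know www.google.com, but refers us to the .com servers.
reply = ask("198.41.0.4", "www.google.com")   # 198.41.0.4 = a.root-servers.net
for rrset in reply.authority:
    print(rrset.to_text())                    # NS records for the .com name space

# Steps 2 and 3 repeat the same query against the .com servers and then the
# google.com servers; a recursive resolver automates this walk and caches results.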

Here's the problem:  There is no SSL root CA.  Given the current political and commercial state of the Internet, there's not likely to be any agreement about establishing one.  There is currently no convenient or reliable method to check that the certification path for an SSL cert is the "real" path.

The next blog post will cover potential solutions.

Monday, March 29, 2010

One-Time Passwords via cell phone?

Recap for new viewers: a replacement method for passwords as authenticators must be Convenient, Secure, and Ephemeral.

If the primary security weakness of passwords is due to their static nature, then use of a dynamic string of characters, aka a One-Time Password (OTP), may address this problem.  An OTP cannot, by definition, be used more than once - which neatly addresses most problems of password interception.  If, however, the OTP is generated based on entering a static password on an untrusted machine, then I argue that there is no additional security, merely security theater.  If I'm at an internet kiosk, or in any other case in which I don't control the machine I'm using, then I shouldn't enter my password on that machine.  If the machine might store the static data I use for authentication, whether password or biometric or RFID etc., then the owner/pwner of that machine can pretend to be me.  The OTP must be generated externally.

Comments from the previous post seem to imply that an OTP may be a decent password replacement as long as it is externally generated.  From the "something I [ know | have | am ]" set, only "something I have" seems able to provide dynamic and potentially secure authentication.  The complaint about "something I have" is that it's inconvenient.  However, a potential external generator is a cell phone.  In our current society, a cell phone has become almost a necessity - which implies that it is sufficiently convenient to carry.  (More specifically, forgetting a cell phone generally causes more pain than forgetting a password, so we tend to remember to bring them.)

I'm familiar with two methods of using cell phones as OTP generators.  One method takes a PIN as input and returns the OTP.  ("PIN" is fundamentally just another term for "password".)  The "advantage" of this type of app is generally that it does not require any connection from the phone to the network, whether voice, SMS, or data - saving on cost.  If the phone is not "sync'd" in some way to the service, or if the app returns the same string each time, then it's just a password manager - which doesn't solve the problem examined by this post.  In some way, the phone should be set up so that it's "unique", and can reliably be an authenticator of me and only me.
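One common way to make the phone "unique" and keep it in sync with the service is a shared secret plus a counter, along the lines of RFC 4226 (HOTP).  A minimal sketch - not any particular vendor's app - looks like this:

import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Counter-based one-time password in the style of RFC 4226.

    The phone and the service share `secret` and both track `counter`;
    each login uses the next counter value, so no code is ever reused.
    """
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Example: both sides compute the same 6-digit code for the same counter value.
print(hotp(b"shared-seed-provisioned-at-setup", counter=1))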

The other method of using a phone for OTP is simply to present the server-generated OTP to me, via phone call or SMS.  I like the simplicity of this solution, as it doesn't rely on software loaded on the phone, but only on the uniqueness of my phone number.  (My plan gives me unlimited SMS messages, so I personally don't have any cost concerns.)

In either method, if I lose the phone, I want to be able to revoke its status as an authentication token - its representation of "me" must be ephemeral.  Even if my first thought isn't "oops, better de-register my phone as an OTP generator", it's still more likely that I'll take action than if someone else starts using my password.

Since Google is edging into the VOIP market through GOOG411, Google Voice, etc., it seems like an obvious step for them to take to add SMS OTP delivery for their services.

Here's what I see for a method to register for phone-based OTP delivery, just in case Google hasn't learned from their Buzz experience:
  1. User sets preference for "Strong authentication".
  2. Google lists a phone number and registration code, with instructions to SMS.
  3. Google sends back a reply code (SMS).
  4. User enters the reply code in the webUI.
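Here's a rough server-side sketch of steps 3 and 4 (the reply code and its verification).  send_sms() stands in for whatever SMS gateway would actually be used, and none of this is claimed to be how Google would build it:

import secrets
import time

PENDING = {}   # user -> (reply_code, expiry timestamp)

def handle_incoming_sms(user, phone_number, send_sms):
    """Step 3: after the user texts in their registration code, send back a reply code."""
    reply_code = "".join(secrets.choice("0123456789") for _ in range(6))
    PENDING[user] = (reply_code, time.time() + 300)          # valid for 5 minutes
    send_sms(phone_number, "Your confirmation code is " + reply_code)

def verify_reply_code(user, code_from_webui):
    """Step 4: the user types the reply code into the webUI; check it and expire it."""
    reply_code, expiry = PENDING.pop(user, (None, 0))
    if reply_code is None or time.time() > expiry:
        return False
    return secrets.compare_digest(code_from_webui, reply_code)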
Problems:
  • DoS: an attacker constantly tries to log on as the user, generating a large number of SMS messages / calls.
  • Need an alternate login method for the "lost my phone" situation, which automatically de-registers the phone.
  • SMS isn't always "timely" in its delivery.
Question: is it sufficient to have a fallback to password if the SMS method isn't available?  The "password" login method should arguably trigger an SMS message if it's used.  Of course, the other question is whether, by removing the need to enter the password every time, the user will forget the password.

Good comments so far... I'd reply to them if I could figure out how to make Blogger let me.  Until then, more posts!

Sunday, March 28, 2010

A better mousetrap password

Thoughts on what a "good" password replacement should include:

Convenience: For a replacement to be as convenient as a password, I should be able to login to a service even if I forget my keys, wallet, badge, or PC. If I'm using a friend's laptop while at a coffee shop, logging in should not depend on a cert stored on any media, or installation of any proprietary software.
Security: This is the classic weakness of passwords.  My password shouldn't be exposed even if someone shoulder-surfs me, intercepts my packets, steals my cookie, or has pwn'd the machine I'm using to login.
Ephemeral: The nature of a password is that I can replace it easily, but I might forget it. A great password replacement shouldn't be as fragile as a password I can forget, but should still let me revoke or replace it easily if I believe my account has been compromised.

In the realm of authentication, it's common to discuss multi-factor authentication, wherein proof of identity involves more than one of the following:
  • Something I know
  • Something I have
  • Something I am
The classic example is the ATM card: withdrawing money from an ATM involves the card (something I have) and the PIN (something I know).

From a convenience point of view, "Something I have" seems to be disqualified.  If I have to have something, but I don't have it with me, then I can't login.  There's the edge case of an implant - but getting one doesn't necessarily qualify as convenient.  Furthermore, an implant arguably blurs the line between "something I have" into "something I am".

Classic passwords are "Something I know" - which leads to both the benefits and problems.  If it's something that can be guessed by someone else, it's not secure - it becomes something that other people know.  However, if the password requirements are stringent, it's likely that I'll forget it - in which case, it's not really something I know.

The third classic factor for authentication, "something I am", is generally viewed as synonymous with biometrics.  The current cost/benefit of biometrics is generally regarded as insufficient - either it's not good enough, or too expensive.  Furthermore, once the biometric information is read, it becomes a pattern of bits, which is effectively a static authentication token. (Malicious software on the PC I use can store my fingerprint / face / retina / etc and pretend to be me.)

My view right now is that the best currently-available solution is the common physical token which provides a dynamic authentication each time.  Costs are coming down: Blizzard sells these for World of Warcraft for $6.

The second best solution would likely be a system which generated an auth token in response to a login-time challenge.  However, the method of generation is still questionable: if one of the inputs to the generator is a password (e.g. S/KEY or OPIE), then the security of this solution is still reduced to that of passwords.
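For reference, the core of the S/KEY / OPIE idea is a hash chain: the server stores the Nth hash of a seed, and each login reveals the previous link.  A toy sketch, using SHA-256 rather than the original MD4/MD5 and skipping the passphrase-derived seed:

import hashlib
import os

def hash_chain(seed: bytes, n: int) -> bytes:
    """Apply the hash n times: hash(hash(...hash(seed)...))."""
    value = seed
    for _ in range(n):
        value = hashlib.sha256(value).digest()
    return value

seed = os.urandom(16)                    # real S/KEY derives this from a passphrase
server_state = hash_chain(seed, 100)     # the server stores only the 100th hash

# Login: the client reveals the 99th hash; the server hashes it once and compares,
# then keeps it as the new state, so each one-time password works exactly once.
otp = hash_chain(seed, 99)
assert hashlib.sha256(otp).digest() == server_state
server_state = otp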

However, both of these methods seem to require a physical device - which conflicts with the requirement for convenience.

Any suggestions from the field?

Sunday, March 14, 2010

Doing passwords wrong

Following up on my previous post regarding the password alternatives discussion, there are clearly advantages to using passwords, mostly in terms of convenience for the user and the site or service operator.  Passwords are easy to set up, and act as a persistent shared secret that correlates to an identity.

However, as a single datum for identity correlation, there are several failure cases for passwords. 

First, 2 quick definitions:
a) losing a password: my password is no longer accessible to me, e.g. I forgot it, or it was changed.

b) password is compromised: someone else can use my password.  (The amazing Jeremiah Grossman has a recent blog post on how password compromises can happen, as well as some potential work-arounds.)


Now, the failure cases:

1) Password protects something important.  What will happen if my password is lost or compromised?  Will I lose money from someone compromising my online banking password?  Will one of my friends post as me on Twitter if I leave my computer unlocked?  Or did the only person who knew the code to the armory door just get shot during an attack?  If there is harm that can be avoided by using other security measures, is it worth the probable inconvenience to overcome the potential for that harm?

2) Password can be intercepted.  Can someone else compromise my password by watching it in its path between me and the site/service?  The classic response to this question is "We use SSL/TLS" - which encrypts the network traffic so it can't be read if it's intercepted.  However, an equally effective form of this attack is shoulder-surfing, where I simply watch you as you type your password.  The classic solution for shoulder-surfing is user education.  However, attendees at a recent security conference seemed oblivious to being shoulder-surfed, even by someone taking their picture.

3) Password can be stolen.  There are several well-established techniques for verifying passwords that don't require the site/service to keep the password around.  Storing the password - even in an encrypted form - isn't necessary.  Furthermore, if the password isn't stored in any form from which it can be recovered, then it can't be stolen, even if someone copies all of the files from the server.
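One of those well-established techniques is a salted, deliberately slow hash; the server keeps only the salt and the hash, so even a full copy of its files doesn't reveal the password.  A minimal sketch using PBKDF2 (bcrypt or scrypt are other common choices):

import hashlib
import hmac
import os

def make_record(password: str):
    """Return the only things the server needs to store: a random salt and a slow hash."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def check_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash from the supplied password and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)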

4) Password is guessable.  If the password is based on PII, it's a lot easier to guess.  If your dog's name is Fluffy and your password is "fluffy", then I can probably guess your password, especially if it's the password you use for the blog about that dog.  There are tools specifically made to gather information about people - and computers are very patient at trying lots of passwords.

5) Password is too complex.  One answer that has been proposed to several of these failure cases is to enforce complex passwords - must be at least 8 characters, mixed case, at least 1 number, etc.  This is not the way most people are used to thinking.  As a result, it becomes difficult for most people to create a password that satisfies these criteria which they can actually remember.

6) Password reminders / reset.  The demands of complex passwords tax the capacity of the average user to create a strong password, and exceed their ability to remember that password.  Mindful that these passwords should not be written down (even in pencil), users naturally forget.  System designers must therefore add in password reminder or recovery methods - which of course become alternate means for attack.  Quick show of hands - whose first car was a Honda?  (Keep your hands up while I write this down... j/k).

No wonder there is a desire to avoid passwords - they're the worst solution, except for all of the others.

Coming soon:
  • Doing passwords right, or at least not failing
  • Parameters for non-password authentication

Thursday, March 11, 2010

Passwords: Pro and Con

As a follow-up to the Password Alternatives talk at B-Sides San Francisco by some smart people (watch video), it's clear that password-based authentication has both use cases and failure cases.  The first step in the larger conversation, IMHO, is defining the advantages and disadvantages of this time-tested method.

Pro:
  1. Portable:  Passwords, being stored in a person's memory, can be taken anywhere.
  2. Durable:  Passwords, being non-physical, can't be "lost".
  3. Convenient:  Quick to set up, requires no special client software.
  4. Ephemeral:  If I change my password, the old one is gone!
  5. "Secure":  Password strength grows with length and alphabet size (rough numbers below).  A password can potentially be unguessable.
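A quick back-of-the-envelope illustration of that last point: a truly random password's strength in bits is roughly length × log2(alphabet size).

import math

def entropy_bits(length: int, alphabet_size: int) -> float:
    """Rough strength estimate for a truly random password."""
    return length * math.log2(alphabet_size)

print(entropy_bits(8, 26))    # 8 lowercase letters    -> ~37.6 bits
print(entropy_bits(8, 94))    # 8 printable characters -> ~52.4 bits
print(entropy_bits(16, 26))   # 16 lowercase letters   -> ~75.2 bits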
Con:
  1. Static:  The password is the same every time.
  2. Anonymous:  A password is not a unique identifier tied to a person.  If I know someone else's password, I can authenticate myself as them.
  3. Forgettable:  I forgot 4got foreg0t feurgot 4Gott where did I write that darn thing down...
  4. Potentially insecure:  Memorable correlates with guessable, especially if it's based on PII, or uses "standard" sub5t1tu7i0ns 4 l3tters.
So... when are passwords appropriate?

Tuesday, March 9, 2010

Asocial Media

Slowly my resistance to social media is evaporating.

After bsidesSF, I've started Twittering... then submitted a talk for bsidesLV... and now a blog.

Let's see who finds this before I tell anyone.