Tuesday, March 13, 2007
Encrypting hard drives from Seagate
This week, Seagate Technology made headlines with the announcement of a new encrypting hard drive. The idea is that the hard drive will automatically encrypt and decrypt data so that it is always stored encrypted. That way, if a laptop with this hard disk is lost or stolen, the data will not be accessible to an attacker. I searched for this story on Google News today and came up with over 250 articles covering the announcement.
I think that the drive is an appropriate choice for where to encrypt data, but the limitations of this approach should be addressed, and none of the news stories that I read mentioned the shortcomings of drive-level encryption. On the positive side, data in this scheme is encrypted on the fly so that users and applications do not need to participate in the encryption - it is entirely transparent. A raw hard drive physically extracted from a laptop provides no data to an attacker, assuming a proper encryption key is used. This provides protection for the data at rest, when nobody is using the computer, and no user is logged in.
However, an encrypted drive does not guarantee that attackers can never access the data on the disk. To function properly, the system must allow access to legitimate users, and that access must be simple and transparent. My expectation is that the user's login password will be used to derive the encryption keys that protect the data on the drive. But, regardless of the scheme used to obtain the key, when a user is active on the machine, the keys must be available to the hard drive so that data can be encrypted and decrypted in the course of normal use. At that time, the data is just as available to malicious code in the form of spyware, Trojan horses and viruses as it is to the legitimate user. If the system is designed well, the keys will be erased whenever a user logs out. Another problem with using login passwords to derive drive-encryption keys is that password-derived keys are frequently susceptible to dictionary attacks.
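Seagate has not published how its keys are derived or managed, so the following is purely illustrative: a minimal Python sketch of how a drive key might be derived from a login password with PBKDF2, and why a guessable password invites a dictionary attack. The salt, iteration count, and password list here are all hypothetical.

```python
import hashlib

# Hypothetical parameters; Seagate's actual key-management scheme is not public.
SALT = b"example-drive-salt"
ITERATIONS = 100000
KEY_LEN = 32  # 256-bit key

def derive_drive_key(password):
    """Derive a drive-encryption key from a login password (PBKDF2-HMAC-SHA256)."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), SALT, ITERATIONS, KEY_LEN)

def dictionary_attack(target_key, candidate_passwords):
    """A guessable password falls to a simple guess-and-derive loop like this one."""
    for guess in candidate_passwords:
        if derive_drive_key(guess) == target_key:
            return guess
    return None

key = derive_drive_key("letmein")  # a weak, guessable password
print(dictionary_attack(key, ["password", "123456", "letmein"]))  # prints: letmein
```

The point of the loop is that, whatever key-derivation function is actually used, a key derived from a guessable password is only as strong as the password itself.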
I'm not certain, though, that user-level keying makes sense for a drive-level encryption scheme. Drives contain all kinds of data, including system data, and data from many different users. At the disk drive level, there is no notion of a user, just data blocks. So, it would be awkward to use login keys to encrypt the drive. How would system files be decrypted? In fact, all kinds of file system information, such as file permissions, are not supposed to be known at the disk drive level. So, my feeling is that there is not an intuitive key management scheme for the Seagate hard drives. I'd be curious to know what they are doing in that regard. Encryption is great, but without proper key management, its benefits are questionable.
I applaud Seagate for pushing the envelope and encrypting at the drive level. Such a move by the leading manufacturer of disks can only be good news for those concerned about security. But, I caution users not to blindly trust that their data is no longer susceptible to theft. As long as users can access their data, so can attackers, and the security of the data on a lost laptop is to a large extent dependent on what Seagate did for key management - a difficult problem that is often left unsolved.
Friday, March 09, 2007
The FSU report on the ES&S iVotronic used in Sarasota County
On February 23, a team of computer scientists based at Florida State University put out an exceptional report analyzing the ES&S iVotronic 8.0.1.2 voting machine firmware. This particular machine was of interest because it was used in the 13th Congressional District race in Sarasota County last November. As many of you know, this is the machine that was responsible for approximately 18,000 undervotes in that race. The research team was charged with determining whether a bug in the software on the voting machine could have caused the missing votes. Of course, they could only analyze the source code of the software that was supposed to be on the machine. They did not have an opportunity to examine whether the binaries actually running on those machines corresponded to that source code, nor is such a determination possible today.
When I first heard about this study (and I was even approached about joining it), my first thought was that it was a silly idea to try to figure out what went wrong in Sarasota County by analyzing the source code. So many factors that have nothing to do with the source code could have contributed to the problem, and source code analysis cannot find every problem that may have arisen in the software. All kinds of runtime conditions, such as race conditions and out-of-bounds memory errors, could cause problems without being detectable by source code analysis.
However, the team, which includes quite a few all-stars, demonstrated that even though a source code analysis is not likely to shed any light on what happened in this particular election, it is nonetheless an extremely valuable exercise. I wish more real voting systems were subjected to such careful scrutiny followed by a public report. I have not seen the confidential appendices of this report, but just from the table of contents, it is clear that some serious problems were found in this machine, and once again it boggles the mind that it was ever certified and used in elections. On page 37, section 7.1 begins as follows:
"We identified several buffer overflow vulnerabilities that in a worst case scenario may allow an attacker to take control of a voting machine by corrupting data on a PEB. These create the possibility of a virus that propagates by exploiting the buffer overflow vulenrability."
This is reminiscent of the vulnerability that the Princeton team exploited in the Diebold DRE. I would not suggest reading this report before bed, because it is quite scary. To me, the Princeton work, coupled with this FSU report, should serve as a wake-up call to the elections community that these sorts of studies need to take place before voting systems are deployed, not after an election has proven problematic. Studies such as the FSU one should be done as part of the certification process. This report clearly uncovered problems that would have been showstoppers, and yet relatively little attention has been paid to it.
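The report's technical appendices are confidential, so to be clear, the snippet below is not the iVotronic code. It is just a generic Python sketch of the kind of length check whose absence typically produces a buffer overflow when parsing attacker-controlled data from removable media such as a PEB; the record format and the 64-byte limit are made up.

```python
import struct

MAX_RECORD = 64  # size of the fixed buffer the firmware might reserve (hypothetical)

def parse_record(raw):
    """Parse one length-prefixed record read from removable media.

    An attacker controls `raw`, so the declared length must be validated against
    both the data actually present and the fixed buffer size before it is trusted.
    """
    if len(raw) < 2:
        raise ValueError("truncated record header")
    (declared_len,) = struct.unpack_from(">H", raw, 0)  # 2-byte big-endian length
    if declared_len > MAX_RECORD or len(raw) - 2 < declared_len:
        # In C, copying declared_len bytes into a 64-byte buffer without this check
        # is exactly the kind of overflow the FSU report warns about.
        raise ValueError("declared length exceeds buffer or available data")
    return raw[2:2 + declared_len]

print(parse_record(b"\x00\x05hello"))  # b'hello'
```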
American Idol - I demand a recount!
For this posting, I have to admit something that will probably lose me the respect of many, and yet I can't help it. Here goes... I am a closet American Idol fan. Every week, my wife and I go downstairs after the kids are in bed and we watch the most recently TiVo'ed episode of American Idol. We don't like the early rounds very much, which are mostly about watching the judges humiliate unfortunate people who don't realize they can't sing. But once the top 24 are chosen, we really enjoy the singing and the drama of who will be eliminated.
As someone who is consumed with voting and voting security, I have more than once wondered about the voting on the American Idol show. How easy would it be to rig the vote that is conducted over the phone? A friend of mine has some pretty convincing ideas for ways to tamper with the votes using computers, automated dialing tricks, and even some weaknesses in the phone system. I'm not sure whether the tricks he has suggested are legal, and I'm certain that most of the population wouldn't know how to do them. Still, it would only take one enterprising attacker to really mess with the votes. I'm convinced of that.
Last night, the unthinkable happened. Sabrina Sloan was eliminated and missed making the top 12. There are several reasons why I find it impossible to believe that the vote was fair. I had Sabrina pegged as #3 in the overall competition, after LaKisha Jones and Melinda Doolittle. Okay, you could argue that maybe Stephanie Edwards is up there with Sabrina. But American Idol is also about popularity and looks. Sabrina is by far the most attractive of the candidates, and in my opinion she has that star quality to her. She is also an absolutely incredible singer. I'm not alone in my thinking. All three judges were completely stunned by this result. Furthermore, Sundance Head (who I don't think was that spectacular) lost out and Sanjaya Malakar advanced. Now, Sanjaya seems like a nice kid, but he's totally out of his league on Idol, and Sundance can sing circles around him. Not only that, Sundance has real personality and charm, and is just the kind of person who goes far in this competition. He's better than at least three of the guys who advanced. Far better.
So, is it possible that the judges are wrong? They can be wrong, but I don't think they can be that wrong about these two singers that were cut. Considering that Haley Scarnato and Sanjaya Malakar made it to the elite 12 and Sabrina and Sundance did not, I have to figure there was some funny business with the vote. I don't know if it was because somebody hacked the phone lines, somebody read the results wrong, somebody was paid off, or any combination of the above. But there is no way on Earth that America voted this way this week.
Having a non-verifiable vote, like the one on American Idol, can result in people like me being upset that we won't get to watch Sabrina Sloan sing any more on Idol. We can be upset that Chris Daughtry did not win last year when he was by far the best singer, as his album sales are demonstrating this year. But that's about where it ends. Having non-verifiable voting in public elections, with the doubt that it casts on election outcomes, is much more serious.
Wednesday, March 07, 2007
Today's Congressional hearing
I testified today in a hearing of the US House Appropriations Subcommittee on Financial Services and General Government in Washington, DC. Here is my written testimony. The hearing was very interesting. I think we've come a long way from the days when members of Congress had no idea what was going on with respect to e-voting security. The questions, for the most part, were intelligent, well researched, and to the point. Many of the questions were directed at another witness, Donetta Davidson, who is Chairwoman of the Election Assistance Commission (EAC). The Members grilled her about the lack of accountability of the EAC after it provides money to the states. They also asked for some third-party research reports that the EAC has kept confidential.
It turns out that the ranking member of the subcommittee is from Diebold's home district. So, predictably, he tried to ask me challenging questions that sounded as though they were written by Diebold. "Voters love these machines, so why am I arguing against them?" I pointed out that none of my complaints against the DREs have to do with whether or not the voters like them. He also asked me why I would want to go back to an error-prone system such as op-scan when Diebold DREs in Maryland virtually eliminated voter error. I explained to him that modern optical scanners in precincts can provide the same level of overvote and undervote detection. He seemed to run out of steam after that.
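To make the point about precinct scanners concrete, here is a minimal sketch of the overvote and undervote check a precinct optical scanner can perform before accepting a ballot. This is my own illustration, not any vendor's code, and the contest names and limits are made up.

```python
# Hypothetical contest definitions: contest name -> maximum selections allowed.
CONTESTS = {"US House District 13": 1, "School Board": 2}

def check_ballot(marks):
    """Flag overvotes and undervotes so the ballot can be returned to the voter.

    `marks` maps each contest name to the number of marks the scanner detected.
    """
    problems = {}
    for contest, limit in CONTESTS.items():
        count = marks.get(contest, 0)
        if count > limit:
            problems[contest] = "overvote"   # too many marks: the vote would be lost
        elif count == 0:
            problems[contest] = "undervote"  # no mark: possibly intentional, possibly a mistake
    return problems

# A scanner in the precinct can reject a problem ballot on the spot and give the
# voter a chance to correct it, which is the error prevention DREs are credited with.
print(check_ballot({"US House District 13": 2, "School Board": 1}))
# {'US House District 13': 'overvote'}
```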
Another member of the committee gave me the best opening I think I've ever had. He asked me if I thought it was possible to have a trustworthy and secure election using paperless DREs. I replied "no". He then said, "Why?" It was a question I was hoping for. I explained that a software-only system, especially one as complex as a DRE where all of the voter input and vote tabulation takes place in a closed box, cannot possibly be audited. There is no way to know for sure that the totals produced by the machines at the end of the election correspond to the votes that were cast by the voters.
Finally, I was asked if I thought that a DRE with a paper trail was an adequate voting system. I replied that when I first studied the Diebold DRE in 2003, I felt that a Voter Verified Paper Audit Trail (VVPAT) provided enough assurance. But, I continued, after four years of studying the issue, I now believe that a DRE with a VVPAT is not a reasonable voting system. The only system I know of that achieves software independence as defined by NIST, and that is economically viable and readily available, is paper ballots with ballot marking machines for accessibility and precinct optical scanners for counting, coupled with random audits. That is how we should be conducting elections in the US, in my opinion.
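Random audits are simple enough to describe in code. The sketch below is my own illustration, not a procedure from NIST or any state statute: it randomly selects a fraction of precincts, hand-counts their paper ballots, and compares the results to the scanner totals. The 2% rate, data structures, and function names are all hypothetical.

```python
import random

def audit(precinct_ids, scanner_totals, hand_count, rate=0.02, seed=None):
    """Randomly hand-count a sample of precincts and compare to scanner totals.

    scanner_totals: dict mapping precinct id -> {candidate: votes} reported by machines
    hand_count:     function mapping precinct id -> {candidate: votes} from paper ballots
    """
    rng = random.Random(seed)  # in practice the seed would come from a public ceremony
    sample_size = max(1, int(len(precinct_ids) * rate))
    sample = rng.sample(list(precinct_ids), sample_size)
    discrepancies = {}
    for pid in sample:
        paper = hand_count(pid)
        if paper != scanner_totals[pid]:
            discrepancies[pid] = {"scanner": scanner_totals[pid], "paper": paper}
    # Any discrepancy would trigger escalation, e.g. a wider or full hand count.
    return sample, discrepancies
```

Because the paper ballots are the voter-verified record, a check like this can catch a machine whose software, for whatever reason, reported the wrong totals.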
Friday, March 02, 2007
Herald-Tribune infected by virus
Today, Southwest Florida's Herald-Tribune has the following disclaimer on its web site:
March 02. 2007 8:36AM
An apology to our newspaper readers
STAFF REPORT
A computer virus crippled parts of the Herald-Tribune's production equipment Thursday night, forcing the newspaper to print Friday's editions without several of its local news, sports and editorial pages. The technical problems also caused papers to be delivered late. We apologize to our readers and advertisers. Our technicians are working diligently to fix the problems that the virus attack created and to ensure that they are not repeated.
Reading this made me think of the times that I and other computer scientists have postulated that a virus, such as the one Ed Felten's team wrote at Princeton, could infect a voting system and copy itself through the memory cards and the voting terminals. The voting machine environment might be more difficult to infect than the Herald-Tribune's, but the possibility definitely exists. Every time I hear people argue that this could never happen, I wonder what these people would have said about the possibility of a virus corrupting a major newspaper's operations such that the paper was printed with several pages missing. What's more, having looked at the Diebold source code, I wonder whether the voting machine vendors' security procedures are better or worse than those of this newspaper.