Friday, August 24, 2007

IronKey -- Clipper Chip revisited?

Can there be any doubt what music these guys were listening to (with the amp turned up to 11) as they developed their core packaging and brand-image concepts? As products marketed to technology geeks go, this one practically exudes faux testosterone from its microscopic and epoxy-hardened metallic pores.

Plus, IronKey is more affordable and easier to park than a Hummer (not one of those newer girlie-man models, mind you, but the good old military-wannabe war-wagons that became the late-1990s ride of choice for cigar-chomping multi-millionaires, whose Austrian accents -- whether imitated or real -- seem so doggone corny, in retrospect).

Gizmodo describes the IronKey product as follows:

"Designed to be the world's most secure flash drive, the IronKey employs military-grade AES hardware-based encryption using its IronKey Cryptochip. The encryption keys are stored on the drive itself and your password is required in conjunction with the keys to access and decrypt files. If you forget your password, you may be in trouble; after ten consecutive failed password attempts, the IronKey self-destructs (internally) and erases everything on the drive using "flash-trash" technology that physically overwrites every byte, making the data completely unrecoverable."
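For readers who prefer mechanisms to marketing copy, the unlock-and-self-destruct behavior Gizmodo describes can be sketched in a few lines. This is purely an illustration of the general pattern (a password-derived verifier gating a stored key, plus a failure counter that wipes the key material); the class name, iteration count, and other parameters are my own invention, not IronKey's actual design.

```python
import hashlib
import secrets

MAX_ATTEMPTS = 10  # the "ten strikes" rule from the product description

class CryptochipSketch:
    """Toy model: the AES key never leaves the 'device'; a password
    unlocks it; ten consecutive failures destroy the key material."""

    def __init__(self, password: str):
        self.salt = secrets.token_bytes(16)
        # Store only a verifier derived from the password -- never the password.
        self.verifier = hashlib.pbkdf2_hmac(
            "sha256", password.encode(), self.salt, 100_000)
        self.aes_key = secrets.token_bytes(32)  # stays on the device
        self.failures = 0

    def unlock(self, password: str):
        if self.aes_key is None:
            raise RuntimeError("key material destroyed")
        candidate = hashlib.pbkdf2_hmac(
            "sha256", password.encode(), self.salt, 100_000)
        if secrets.compare_digest(candidate, self.verifier):
            self.failures = 0
            return self.aes_key
        self.failures += 1
        if self.failures >= MAX_ATTEMPTS:
            # "flash-trash": once the key is gone, the data is unrecoverable
            self.aes_key = None
        return None
```

Note that in this model there is no recovery path at all once the counter trips -- which, as discussed below, is precisely where IronKey's actual answer gets interesting.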

In other words, the tough-as-nails military-rugged guts of this gadget have been dressed by the marketing department in an elegant and sexy, shaken-not-stirred dinner jacket of espionage-chic, almost as seductive as Peter Graves's self-incinerating tape recorder -- a prop from back in the day when the "Mission: Impossible" brand (cue the original Lalo Schifrin theme song) had not been corrupted by (cue record scratch) Scientology.

Wired magazine certainly was on to something when it assigned the name "Fetish" to its monthly envy-column for gadget-enthusiasts. Gadget-philia always has tapped into instinct and the subconscious, down in the limbic system, bypassing rational, prefrontal-cortex concerns about mere pedestrian practicality. Many gadget people will confess that, much like the nearest housecat, they simply cannot help but respond to certain stimuli.

And I, too, confess. Military-grade corniness aside, there's still something almost irresistibly appealing about this product, as can be seen from how each shipment of the 4GB model seems to sell out even before the new cargo-container arrives.

After seeing the IronKey on Gizmodo and Slashdot, how could I possibly resist? Mine should arrive within the week.

So this is *not* a product review. Just some observations about a product of interest. Reviews are available on other sites, if that's what you are after.

Likewise, despite the title, those looking for an announcement that the sky is falling and the Apocalypse is upon us -- because some sinister government cabal is out to impose mandatory key-escrow on all our cryptographic applications, or because the IronKey is just the first step down the slippery slope to a Panopticon, Big Brother surveillance society -- will inevitably be disappointed. They'll be disappointed for two reasons.

The first reason is this: A mass-market flash drive (no matter how sophisticated) is hardly the best way to spearhead the relentless drive toward Total Information Awareness. Much better and more effective methods to establish universal surveillance as a norm already are in place, or well on their way to implementation, both in government and in private industry. So, getting all alarmed about IronKey is a little like a resident of New Orleans, during or shortly after Katrina, getting worked up about the urgent need to fix a leaky roof. That roof is not the main reason you're up to your armpits in water, snakes and toxic chemicals. Indeed, on balance, a customer using the IronKey is probably substantially better off than someone who uses a flash drive with no cryptographic features at all. (Interestingly, a similar argument could be made that -- even today -- two people using phones with Clipper Chips in them, to enable secure communications, on balance would be better off than comparable people using phones without any encryption capability at all.)

The second reason the conspiracy crowd will be disappointed is that the only "Clipper Chip" criticism I have to level against the IronKey relates to a technical issue, not a public policy issue. Simply put, for technical reasons (analogous in some ways to the technical -- not policy -- criticisms leveled against Clipper), the IronKey and the services that come bundled with it appear to be somewhat less secure than they could be. From the standpoint of "security by design," IronKey has (for whatever reason) elected to make its customers at least marginally more vulnerable to unauthorized disclosure of their information to third parties than need be the case. Since IronKey differentiates its product from potential competitors principally on the basis of security, it would seem to make sense that such technical issues might matter to some prospective customers. That said, I've decided to become a customer anyway. Caveat emptor.

Returning to the merits of the IronKey (apart from the way it tempts your inner Steve Austin, by stimulating that vicarious Bourne Redundancy, Übermensch daydream that gadget-geeks universally have known since childhood), the practical selling-point of this device, in reality, simply is not all the Desmond Llewelyn cryptochip techno-wizardry, but rather the built-in access to a souped-up TOR network -- which enables customers to access the Internet in a (relatively) secure manner, even when using a public terminal at a coffee shop or hotel.

Other products *are* available that can enable similar (but not identical) functionality using *any* flash drive -- such as XeroBank's XB Browser (free trials available on the XeroBank Website). Since I'm not presently in a position to compare IronKey's product with that of XeroBank, I'll refrain from offering any comparison of product functionality. Moreover, in the interest of full disclosure, I've done some trademark-related work for XeroBank, and therefore I will not purport to offer a completely objective opinion. One point of comparison that certainly will matter for some prospective customers, however, is that XeroBank's infrastructure is located mostly outside the United States. The location of IronKey's infrastructure, at least according to their Privacy Policy, appears to be primarily in California.

If you are concerned (even if it is just as a matter of principle) about the U.S. government (or even some other government) paying unwanted attention to your personal communications and data, then it makes sense to look at the "fine print" in IronKey's Privacy Policy (minor edits have been made to promote clarity, but not to change the intended meaning):

"Will IronKey share my information with other companies or people?

* * * *

"[W]e disclose personal information
[but] only in the good faith belief that we are required to do so by law, or that doing so is reasonably necessary to: (i) comply with legal process; (ii) respond to any spamming or Internet crimeware abuses; or (iii) to protect the rights, property or personal safety of IronKey, our customers, or the public.

"Note to United States customers: IronKey complies fully with all laws of the United States and the State of California. If required by law, through subpoena or other legal requirement, we will release information in our possession about members that are the subject of an investigation.

"Note to European customers:
The information you provide us will be transferred outside the European Economic Area for the purpose of processing by IronKey, Inc., its affiliates and agents. By submitting your information, you agree to that international transfer."

Not to put too fine a point on it, but XeroBank's Client Secrecy Guarantee (excerpted below) appears to embody an entirely different category of thinking about privacy issues than the perspective expressed by IronKey (i.e., engineer the thing in the first place to minimize the amount of data that can be given or lost to a third-party, even in the absolutely most cataclysmic conceivable worst-case scenario):

"Requests from Authorities

"XeroBank has built its privacy networks to have client account data separated, segregated, and encrypted on multiple servers in multiple countries so no single party can compromise a client and their data. Most internal account transaction details are not mathematically reversible due to one-way operations. Subsequently, XeroBank does not have specific client data to share with network providers, legal authorities, or law enforcement of any jurisdiction. In the case that such authorities can validate claims of violation of XeroBank's Terms of Service, we will attempt to terminate the client account the abuse originated from. If XeroBank is served with court orders of all appropriate jurisdictions for all specific servers, we may be forced to attempt to trace live data connections. A coordinated multijurisdictional effort is highly unlikely, even in the most improbable of circumstances. Violation of XeroBank's Terms of Service invalidates the Client Secrecy Guarantee. XeroBank will not aid or protect criminals. If fraud or hacking is detected within XeroBank's networks, we will proactively notify and cooperate with authorities to track and identify the criminals involved. XeroBank is not a service to mask abusive or threatening actions; thieves and criminals beware."

So, why -- exactly -- does the article title refer to the Clipper Chip? Principally because I had written most of it before I realized the ultimate point could be made much more easily by comparing IronKey's privacy policy with the approach taken by another company. But I also believe that the "Clipper Chip" hook offers a chance to tell an interesting story or two. At least, I find this stuff interesting. If you do, too, read on.

To begin, presumably not all readers were paying close attention to crypto policy in the first term of the Clinton administration (1993-96). Nor has everyone read Steven Levy's wonderful book, Crypto: How the Code Rebels Beat the Government -- And Saved Privacy In The Digital Age (Penguin 2001). So it probably makes sense to say a little bit about what the Clipper Chip is (or was) (that's clipper *chip* -- not a wind-propelled vessel with lots of sails, like the Cutty Sark).

As an aside, from today's perspective (August, 2007), Levy's book -- about "How the Code Rebels . . . Saved Privacy In the Digital Age" -- is roughly analogous to a history of George Lucas's Star Wars universe, published in late 1979. Not everyone, in 1979 -- having recently seen the Death Star explode, and Darth Vader's spaceship spin out of control, off into empty space -- had any idea that the Empire might soon strike back.

This year's Computers, Freedom and Privacy conference (where quite a few of those "Code Rebels" hang out together), in Montreal, was held several weeks before things got even worse (specifically, Judges Batchelder and Gibbons issued a decision about illegal wiretapping that would make even Franz Kafka marvel at the triumph of the will over reason). Yet, I think it is fair to report that -- compared with, say, the 2001 CFP in Boston -- CFP 2007 had a somewhat more somber vibe (something some Code Rebels might even describe as more of a hanging by the knees underneath the Cloud City of Bespin, with one hand cut off, atmosphere). That's not to say that the "Code Rebels" are completely demoralized. It is just fair to say that . . . uh, challenges remain.

At this year's CFP in Montreal, I had the personal honor (I suppose you might call it that) of accepting a Privacy International Big Brother Award, a fetching image of a boot stomping on a human face, forever -- in the category of Worst Public Official (which really ought to have gone to the recipient's former colleague Shannen Coffin, or perhaps to Coffin's new boss) -- on behalf of my good friend (and former boss) Stewart A. Baker. I first met Stewart, and did some work for him, after he returned to private practice, following his stint as general counsel for the NSA. In a subsequent post, I've got more stories to relate about Stewart, but for now I'll resist the impulse to digress. It was at the NSA (just before I met him) that Stewart's name became forever inseparable from the Clipper Chip fiasco.

In a C|Net news article about Stewart's recent appointment to a key policy role at the Homeland Security Department, Declan McCullagh summarizes:

"In a famous article published in the June 1994 issue of Wired Magazine, Baker warned against the ready availability of strong, secure encryption products without backdoors. 'One of the earliest users of (Pretty Good Privacy) was a high-tech pedophile in Santa Clara, California,' Baker wrote. 'He used PGP to encrypt files that, police suspect, include a diary of his contacts with susceptible young boys using computer bulletin boards all over the country.'"

To be fair, neither Baker personally, nor the Clinton Administration ever (to my knowledge) expressly advocated any government ban on private encryption technology; nor did they ever propose any government mandate of "back-doors" or the mandatory use of the Clipper Chip. That said, Declan is absolutely right that Stewart's hyperbole about pedophiles really is (and remains) a cheap shot, and that somebody as smart as Baker really ought to know better.

What the 1994 Wired article, titled "Don't Worry; Be Happy" (an allusion to the Bush 41 administration that undoubtedly endeared the article's author, a Bush 41 hold-over, to his new Clinton Administration bosses), really was about, was dismissing what Stewart characterized as certain "myths" then in circulation about the Clipper Chip.

At that time, the U.S. had imposed excessively stringent export controls, to prohibit U.S. individuals and companies from shipping strong encryption technology (hardware or software) overseas. There was no outright ban on the use of such technology by private individuals within the U.S. (and now, of course, SSL is built into just about any Web browser, and SSH is in widespread use, not to mention other useful ways to employ crypto -- such as PGP / GPG -- that have become widely available).

Although there was no domestic ban, the export control regime widely was viewed outside government (and even inside the government, to a substantial degree) as the metaphorical camel's nose; the first step down a slippery slope of probable widespread U.S. domestic surveillance, accompanied by the erection of government obstacles to domestic adoption of effective privacy-protection technology.

These disputes over privacy and crypto did not start suddenly in 1992; they had been going on for some time, already.

The Clinton Administration, roughly three months after assuming office, proceeded naïvely to march right into the political minefield by issuing this announcement: the National Security Agency had been developing, in secret, for quite some time (since before Ronald Reagan left office), a hardware-based crypto technology that was very inexpensive. The Clipper Chip was so inexpensive, according to the announcement, that it would enable telephone manufacturers (among others) to bring secure telephony to the masses. No longer would secure phone equipment (like the good old STU-III) be found only in Fort Meade, the Pentagon, the halls of government, and the offices of specialized contractors and think-tanks.

Just after Clinton took office, the Administration started to appreciate the benefit of export controls and Clipper, through alarmist briefings from the intel community about how "crypto" was very dangerous technology for ordinary people to have. Hence, the Clipper announcement probably did not seem so controversial in April, 1993, just before Clipper was introduced to the public (meaning, for the first time made known to people without security clearances).

As product launches go, the Clipper announcement does not compare very favorably with the way products ought to be revealed for the first time in public. No doubt, I'm hardly the first to make this observation. Presumably, more than a few people from the Reagan and Bush 41 administrations had committed this lesson to memory before returning (at one time or another, or, in some cases, over and over again) to government service after the 2000 election. Others, perhaps, missed the memo, or misunderstood the lesson.

Without getting into the question of whether the market for private telephone devices, in 1993, was actually experiencing any grass-roots demand for intel-grade hardware-based, anti-eavesdropping technology (I seem to remember that Disney and AT&T had rather different ideas about what customers wanted in their phones), it is fair to say that the reception the Clipper product received was -- well -- less than enthusiastic.

Honestly, the product was designed for government crypto engineers -- not for the consumer market. The NSA cares deeply about such matters as chip packaging, and making sure that a would-be attacker cannot compromise the security of a crypto chip by sanding down the chip packaging and probing the silicon guts. Frankly, this kind of stuff tends to make eyes glaze over in the consumer market. I won't get into the "user interface" of a Clipper phone, but suffice it to say that most end-users would have found it about as simple and intuitive to use (and appealing to learn) as a UNIX or DOS command line. But really, from the standpoint of a crypto engineer, the Clipper was a thing of beauty -- something that anyone ought to be proud of. No doubt, Baker, who is a really smart guy (and who, after all, came in at the end of the process -- after it had already been developed -- and then got saddled with the task of persuading both the market and the Clinton Administration to understand and appreciate the brilliance of the design) rather quickly was seduced by all the careful thought and engineering prowess that went into Clipper.

The most important "feature" of Clipper, of course, was "key escrow" -- in theory, a way to protect privacy, yet still to ensure that the government could wiretap phones when it needed to do so. The "key escrow" mechanism was, as one might expect, designed by engineers and lawyers with IQs at least 4 or 5 standard deviations above the mean. Accordingly, explaining to the average consumer how "key escrow" works and what it is for, requires rather more than a bumper-sticker. The bumper-sticker version (and all that the average 100 IQ person takes away from the explanation, no matter how hard the explainer tries to compress the message into an intuitive format) is this: "The government is going to tap your phones!" I won't bore the reader with the non-bumper-sticker version.

Returning to the less-than-enthusiastic public reception that Clipper received after introduction -- in particular, John Perry Barlow of the EFF, and Marc Rotenberg of EPIC (then with CPSR), recognized a lot of problems with Clipper (not the least of which is that it probably did represent the "camel's nose" of mandatory key escrow for all crypto in the U.S.), and became instant opponents of it. Whitfield Diffie, of Sun Microsystems, also helped explain -- fortunately, in laypersons' terms that don't require an engineering degree -- many of the problems with the way in which Clipper was developed and proposed for general adoption.

Without getting into a long explanation of all the technical problems with a "key escrow" scheme (those interested can find all the answers they need through Google), let's just skip ahead to a good stopping-point. The Clipper Chip episode (I think it is now called Episode IV: A New Hope) metaphorically ends with the Death Star exploding (or was that the Clipper Chip sinking?), and Darth Vader spiraling off into empty space, attempting to regain control of his spaceship. No Ewoks, yet.

Without spoiling the ending, I think it is fair to say that the bad guys eventually returned, in a brand-new (or, perhaps the word is "reconstituted") Death Star, to create a whole lot of new problems.

In retrospect, had Clipper been adopted, it may well have led to somewhat more widespread hardware-based secure telephony than we enjoy today (which would not necessarily be a bad thing). Granted, the technology was less than ideal, and the slippery-slope fear of setting the wrong "key escrow" precedent was genuine.

I think it is fair to say that many lessons were learned in the course of the Clipper episode, both by opponents and advocates, and that those lessons have not been forgotten by those who were directly involved.

So, with that background in mind, we can return to the IronKey. Some of the following is speculation, to be honest, but I think the big-picture overview is largely correct.

Upon first hearing of the IronKey, I noticed that certain aspects of it (each of which undoubtedly has an effect on price) were not exactly consistent with the kind of product that is initially designed for consumers or businesses, but that also (like, say, a photocopier or laser printer) just might happen to scratch a government itch, as an afterthought. Rather, IronKey is a lot more consistent with the kind of product (like Clipper) that originally is designed to meet (or to anticipate) a government specification, and then subsequently re-purposed for introduction to the general consumer/business market.

For instance, IronKey employs hardware encryption, which heavy government users of crypto demand, but the benefits of which would be challenging to explain to ordinary end-users, using any known form of customary advertising medium. Hardware encryption can increase costs. Does it really produce a corresponding increase in sales? Next, not only is the IronKey waterproof and housed in metal (which is probably enough to give regular end-users a fuzzy feeling of "security" and to close the deal), but the cryptochip also is encased in epoxy -- to fend off the kind of physical attacks on the cryptosystem that few adversaries other than enemy governments would have both the means and the resolve to mount. That kind of overkill costs extra money, and does not necessarily increase sales enough to justify it.

My initial instinct was to look for a connection to NSA or In-Q-Tel, but after a couple of Google searches relating to IronKey, it appeared that Homeland Security might be a more likely candidate for IronKey's government sugar-daddy (more on this, later).

Then, interestingly enough, while deciding whether to get an IronKey (or, perhaps, a bunch of them for several colleagues at work, who neither know, nor want to know, much about network security), I had the same question that Gizmodo raises -- namely, "If I forget my password, and enter the wrong one ten times, how do I ever get my data back?"

One plausible answer (which I might have accepted, and which Gizmodo appears even to have assumed) would be, "Tough for you. You never get it back, so always back the data up yourself -- someplace that you feel is secure." Turns out that IronKey has a different answer.

Specifically, I asked one of IronKey's channel reps about this issue on the phone. His answer left me with an unsettling good news / bad news reaction. The good news is that IronKey actually has thought about this issue and has developed an answer to it.

The bad news is that IronKey's answer to the "lost password" problem (and also, presumably, the "stolen key" problem) is that they'll keep your password for you -- as if in escrow. (Oops, did I say "escrow?").

Moreover, IronKey will even keep a complete copy of all your data on their server -- and if you lose or forget your password, you can still get access to all your backed-up data (and the password, too) by answering some security questions that are supposed to determine that you are really you.

Now, in the case of Clipper, the way the "escrow" worked was to divide up each person's secret key into two parts -- neither of which was good enough, alone, to open the "backdoor" and allow snooping. One of the part-keys was to be held by the Attorney General (at the time of Clipper, Janet Reno). (Thank God that plan never was implemented!) The other key-part was to be held by a completely separate government agency. That way, the only way the keys could be combined (and the only way the "backdoor" could be opened for snooping) would be to secure a court order (a warrant) authorizing the two key-parts, for the specific equipment in question, to be combined.
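The two-part split is worth a quick illustration. What follows is a toy sketch of the general idea (XOR secret-splitting), not Clipper's actual SKIPJACK/LEAF machinery, and the function names are mine. The key property is that each half, taken alone, is indistinguishable from random noise -- neither escrow agent learns anything about the key -- while the two halves together recover it exactly.

```python
import secrets

def split_key(key: bytes):
    """Split a key into two escrow shares. share1 is uniformly random;
    share2 is the key XOR share1, so each share alone reveals nothing."""
    share1 = secrets.token_bytes(len(key))
    share2 = bytes(a ^ b for a, b in zip(key, share1))
    return share1, share2

def recombine(share1: bytes, share2: bytes) -> bytes:
    """Only with BOTH shares -- in the Clipper design, only after a court
    order compels both agencies -- can the original key be reconstructed."""
    return bytes(a ^ b for a, b in zip(share1, share2))
```

The institutional point, of course, is that the cryptography enforces the "two separate agencies" rule: compromising one escrow agent gets an attacker exactly nothing.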

Suffice it to say that I have no reason to believe that IronKey has implemented (or plans to implement) any comparable institutional safeguard to protect a customer's data against improper access (not just by the government but, hypothetically, even by a "rogue employee"). The mere fact that a "security question" procedure has been implemented leads me strongly to believe that no such institutional safeguards (the computer equivalent of making sure that your books are not kept by the same person who handles all the cash) have even been considered.

Maybe the next question I asked would not occur to everyone, but I represent Internet service providers, who (from time to time) get served with warrants, subpoenas or national security letters -- sometimes the subpoenas come from law enforcement, sometimes from private litigants. Needless to say, sometimes the government really does go on completely bogus "snooping" expeditions.

"So," I asked the channel rep, "what happens to my data stored on your server, if somebody, without my authorization, gets interested in my data and serves IronKey with a subpoena or a national security letter?" (And remember, with "national security letters," often the customer is not even informed that his or her data has been accessed.) The channel rep's answer was essentially the same as what IronKey's Website says.

According to the Website, the following are included among IronKey's core values:

Respecting the law
  • IronKey complies fully with all laws of the United States and the State of California.
  • If required by law, through subpoena or other legal requirement, we will release information in our possession about members that are the subject of an investigation.
Their privacy policy contains the same warning.

The bottom line is simply this -- IronKey's FAQs strongly suggest that your data is accessible to you and only you. However, the reality with IronKey (especially if you back up everything to their server) is that your data is potentially accessible to anyone who can correctly answer your "security questions." Moreover, the "security question" protocol can be bypassed entirely, if certain legal procedures are followed.

The situation with IronKey, thus, turns out to have some rather remarkable similarities to the Clipper Chip "key escrow" scheme peddled in the early 1990s by the Clinton Administration.

One other thing worth noting, in relation to IronKey's "core values" disclosure, is this: They specifically use the word "subpoena," not "warrant." To a lawyer, the difference is significant. A warrant can only be obtained by law enforcement by requesting one from a judge -- who is supposed to require a showing of "probable cause" as a pre-condition for issuing the warrant. Subpoenas do not require the signature of a judge. Whose signature is required, for any particular subpoena, depends on the kind of subpoena.

Access to certain kinds of "stored electronic communications," according to federal law, requires a warrant -- in other words, judicial approval. See 18 U.S.C. secs. 2701-2712.

Presumably, IronKey has gone through the analysis, to determine when -- precisely -- a subpoena is good enough, and when it would be a crime for IronKey to hand over data with anything less than a warrant for cover. Id. All the ISPs I advise, after all, have had drilled into them, which kinds of information require warrants and must not be accessed or disclosed in any manner, based on any legal process short of a warrant.

I haven't gone through the full analysis of how the statute governing wiretaps and stored electronic communications might apply to the various information stored by IronKey on customers' behalf. But the issue is interesting enough that I just may do it sometime for grins (though I probably won't post the conclusions drawn from the exercise).

In the end, I think it would be a mistake to assume that IronKey is quite so bullet-proof as their marketing materials seem to suggest (except on a very careful reading).

Of course, my point is not to criticize IronKey for their commitment to obeying the law. Nor do I want to criticize them for cooperating with legitimate law enforcement investigations.

According to the IronKey Website, one of the company's Board members served as "National Cyber Security Division director of the United States Department of Homeland Security." And at least one member of the "management team" also touts his DHS ties on the IronKey Website. Certainly, if the Department of Homeland Security is a likely customer, one would hardly expect IronKey to bite the hand of an agency that might feed it business.

(Incidentally, while this 2006 report shows IronKey doing some other DHS-related work, I do not have good information one way or the other as to the extent (if any) that IronKey could be DHS-funded. Does anyone have good information on this subject?).

I'd like to close with some speculation about why the U.S. government might actually consider the widespread (consumer market) distribution of devices that enable access to TOR networks (including the IronKey) a good thing, and worth promoting. I have no idea, one way or the other, at present as to whether the IronKey originally was developed for a government application and then re-packaged for consumer applications, or whether it took a different path to market. Again, this is mostly speculation.

Somewhat like the original Hummer (which found itself, at least temporarily, enjoying a kind of popular mass-market demand), but unlike, say, the product line of Augmentix (conspicuously absent from their site is any way for the public to get their hands on one easily), the IronKey appears to have been developed not only for certain niche applications (healthcare, financial services, and government leap to mind), but also with enough potential mass-market appeal that the manufacturer apparently expects to bring per-unit costs down significantly for core customers by achieving mass-market economies of scale.

It is always worth remembering that the U.S. government (please click the link!) not only invests a lot in snooping on others, sometimes illegally, but also tasks certain agencies with keeping U.S. government information and communications more or less snoop-resistant. Indeed, the Clipper Chip itself came out of the U.S. government's rather extensive "codemaking" efforts to secure its own communications against potential eavesdroppers.

Presumably, it is not entirely accidental that TOR was made widely available (and also made available for commercial applications) by the Office of Naval Research. As explained here, "The variety of people who use Tor is actually part of what makes it so secure. Tor hides you among the other users on the network, so the more populous and diverse the user base for Tor is, the more your anonymity will be protected."

Simply put, the more widely TOR can be implemented for relatively innocuous applications (such as hobbyist applications or file-sharing, or just making sure that your visits to somebody's Website cannot be traced back to you), the less susceptible the TOR network and its nodes will be to attacks such as traffic analysis.

This expansion of the secure network's user base is good not only for people who use snoop-resistant means of communications for government and enterprise purposes, but also for folks like human rights activists (be sure to check out NGO in a Box, Security Edition), who would prefer not to attract suspicion to themselves whenever an SSH pipe opens from -- say -- Saudi Arabia, China, or Myanmar to a known TOR node.

Assuming that the government has an interest in these IronKey devices (no doubt, the IronKey will eventually find its way into at least a few government niches), it certainly seems reasonable that the government could well see the advantage of (1) bringing its own costs down by piggy-backing its own purchases on mass-market economies of scale (I think this is how the Clipper Chip was expected to be so cheap, too), and (2) at the same time, rendering its own use of TOR networks more secure and more difficult to detect, by multiplying the number of TOR users all over the world who are not affiliated with the U.S. government.

Is there a vast government conspiracy to propagate TOR and to multiply the number of users? I have no idea; again, this is all speculation. But I certainly hope there is. And if so, I'm all in favor of it. Happy (safe) surfing!


dave said...

You've done a really great write-up. I really enjoyed reading it.

Backup of your device password to the IronKey service is optional, and under user control. If you prefer not to have it backed up, you are free to not have it sent to our servers. Backed-up device passwords are stored encrypted on our servers.

I am interested in finding a design for device password backup/recovery that can realistically be shown to a user, but not be available to the service provider.

One idea is for users to back up their device password on a piece of paper, or in their gmail account??? Not necessarily secure or convenient.

Also of note: any user data backed up to IronKey servers is encrypted on the user's device before being sent up to the IronKey servers.
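[Editor's aside: dave's point -- that backed-up data is encrypted on the device before it ever reaches the server -- can be sketched as follows. This is a toy illustration only: a SHA-256 counter-mode keystream stands in for the device's hardware AES, and all function names are hypothetical. The point is simply that the server stores ciphertext it cannot read without the device-side key.]

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Illustrative stream cipher: hash (key || nonce || counter) blocks.
    A stand-in for real AES; do not use for actual secrets."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(
            key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_for_backup(device_key: bytes, plaintext: bytes):
    """Encrypt locally; only (nonce, ciphertext) would ever be uploaded."""
    nonce = secrets.token_bytes(16)
    ks = keystream(device_key, nonce, len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, ks))
    return nonce, ciphertext

def decrypt_backup(device_key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    """Recovery requires the device-side key, which the server never holds."""
    ks = keystream(device_key, nonce, len(ciphertext))
    return bytes(c ^ k for c, k in zip(ciphertext, ks))
```

Of course, the whole debate above is about what happens when the *password* (and hence, effectively, the key) is also escrowed with the provider -- at which point client-side encryption alone no longer settles the question.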

aw said...

Great review. I recently conducted a security exercise, and for fun set the bar at complete privacy. I have to admit that the only totally secure method of data storage is to not create any data, and not remember anything. I think we have to accept that there will always be a risk of determined people accessing data. All we can do as ordinary people is make an attempt at being secure and, if you want to be devious, create as many layers and distance as possible between the hacker and the crown jewels; at least this will buy you time.