Published version: IEEE Computer, June 2015



Hal Berghel

One of the most fundamentally misguided ways to protect a networked infrastructure is to introduce an air gap. The US has been diligently mastering the art of crossing them for over thirty years.



Eager to send détente into an early grave, Ronald Reagan seized the opportunity to exploit the intelligence provided by Colonel Vladimir Vetrov, a KGB officer spying for France, codenamed “Farewell” by the French domestic intelligence agency. In 1981, French President François Mitterrand offered Reagan the 4,000 KGB documents (the “Farewell Dossier”) that Vetrov had supplied, which by some accounts triggered a spectacularly kinetic CIA response.

The entire affair was documented by Gus Weiss, who served as a Special Assistant to the Secretary of Defense as well as Director of International Economics for the National Security Council. According to Weiss, the Soviets under Brezhnev viewed détente as a means to secure much-needed economic breathing room in which to improve their economy. To expedite the improvement, the Soviet Council of Ministers and the Central Committee established KGB Directorate T to find and retrieve Western targets-of-opportunity (read: stolen intellectual property). Its new operating arm, Line X, was charged with overseeing and managing the acquisitions produced by the KGB and GRU.

The adversaries of Line X and the forces behind it were the Western/NATO economic agreements and export policies. Needless to say, Nixon, Kissinger, and their Western cohorts weren't keen to hand the Soviets the keys to the kingdom, so nuclear secrets, certain sophisticated manufacturing techniques, semiconductor technology, and sensitive trade secrets were verboten, and the corresponding products (weapons, advanced computers, machinery, and the like) were embargoed. That didn't hold up Line X and the KGB, however. Line X populated visiting Soviet delegations with KGB agents to learn all they could about agriculture, manufacturing, defense, and whatever else they could get their hands on. What they didn't learn from direct inspection, they tried to buy. What they couldn't buy directly, they tried to purchase through third parties. Failing that, theft remained an option. This was Soviet-style technology transfer during the Cold War: a continuation of the atomic-bomb espionage effort, this time with economic objectives, and it persisted to some degree until Ronald Reagan found out about it from Mitterrand.

In his capacity with the NSC, Gus Weiss read the Farewell documentation and determined that the Soviets were acquiring trade secrets in technology at breakneck speed. “Our science was supporting their national defense,” he reported. The CIA extracted a Line X shopping list from the Farewell material and, with the approval of then-CIA Director William Casey in 1982 and the help of the FBI, decided to help the Soviets with their shopping by seeing that they got “improved” versions that “would appear genuine, but would later fail.” In this way, defective computer chips, flawed parts, and misleading and/or bogus technical information were “seeded” with vendors known to sell to the Soviet Union.

This much is known from Weiss' report (above). But it gets much more interesting. Enter Thomas Reed, Gerald Ford's Secretary of the Air Force, Director of the National Reconnaissance Office, and later a member of Reagan's National Security Council. Reed and Weiss were both with the National Security Council when the Reagan-Casey-CIA-FBI intrigue began in early 1982.

Reed picks up Weiss' story line in his 2005 book At the Abyss (Presidio Press). On this account, the Soviets needed software for their trans-Siberian pipeline stretching from Siberia to Eastern Europe, and they dispatched a KGB operative to a Canadian software supplier. Farewell notified U.S. intelligence, and the FBI worked with the Canadians to “enhance” the software, which was then sold to the KGB operative and delivered to the KGB. According to Reed, the software contained a Trojan horse that allowed the West to regulate pump speeds and settings, valve openings, and internal pipeline pressures – read: drive them far beyond safe limits. Needless to say, this flawed software produced the desired result of disrupting the operation of the pipeline. In fact, it is alleged to have produced what some claim was the largest non-nuclear explosion in the history of the world to that point (estimated at three kilotons of TNT). When the explosion was registered, Weiss apparently told the NSC not to worry about it but gave no explanation. Reed claims that Weiss told him the story and provided him with his notes shortly before Weiss' mysterious death on November 25, 2003. While some authors dispute whether the hack resulted in the actual explosion, it wasn't for want of effort on the part of the Reagan administration. So far as I can determine, no authoritative source disputes the rest of Reed's account.

But there's more. Weiss' obituary wasn't published by the Washington Post for twelve days. Perhaps others were looking for the notes. Incidentally, some of his notes appeared on the CIA Website under his name on April 14, 2007, over two years after the disclosures in Reed's book had removed any cover of secrecy. What is more, the trans-Siberian pipeline part of the story is not included in Weiss' published CIA notes. Add to this his strong opposition to the Iraq war and you have the stuff of which great conspiracies are made. I would be remiss if I failed to mention that Thomas Reed remains a strongly partisan loyalist to Reagan and his politics, so his account may be biased and somewhat sanitized – i.e., it is entirely possible that there is much more to the story than Reed reports.


To get us all on the same page: the officially unconfirmed US/Israeli cyberattack against the Iranian uranium enrichment facility at Natanz, reported in 2010, was never called Stuxnet internally. The codename used by the planners when they presented the idea of a cyberattack on Iran to the George W. Bush administration was “Operation Olympic Games.” Stuxnet was a name coined much later by investigators external to the project who juxtaposed fragments of filenames contained in the code.

Stuxnet archeology has produced enough digital artifacts that several conclusions may be drawn. First, Stuxnet actually shares some of its architecture and codebase with the remote-access Trojan and infostealer Duqu (Figure 2.1, pp. 4 ff.) and with the espionage hack Flame. In fact, the early versions of Stuxnet (circa v.0.5) are thought to be derivative of the Flame Platform, while the later versions (circa v.1.0) are derivative of the “Tilded Platform” – so called because investigators found it noteworthy that the contributors tended to start filenames with tildes. Other malware based on the Flame and Tilded platforms is certain to remain in circulation in some form. Kim Zetter (sidebar) convincingly documents that while Flame and Duqu are derived from the Flame and Tilded code bases, respectively, Stuxnet borrows from both (Zetter, ch. 15), and in different proportions over time. For a variety of reasons that Zetter documents, Kaspersky Lab's Costin Raiu believes that the cyber-aggressors who started Stuxnet borrowed heavily from the Flame Platform code and then shifted to the Tilded Platform a few years into development because of the simplicity and tightness of the latter code. David Sanger (sidebar) speculates that the complex Flame code was a U.S. artifact used while Stuxnet was in the experimental stage, while the Duqu/Tilded Platform code was primarily an Israeli product. Only after Operation Olympic Games was authorized by George W. Bush did the two teams begin sharing code. There is considerable evidence for this, not the least of which is that Flame and Duqu differ vastly in sophistication, suggestive of different teams with different skill levels. It should be noted that the Stuxnet payload (the part directed at specific industrial controllers) seems to have remained relatively constant over time, suggesting that it was developed first and held in reserve while awaiting an opportunity to launch.
Absent leakers or whistleblowers, we may never find out the full extent of the Flame and Tilded Platform cyber-mischief of which Flame, Duqu, and Stuxnet were a part, although it has been known since 2012 that the Flame arsenal of attacks expanded to include a hack of the Microsoft Windows updater – a feature that was not present in Stuxnet-era Flame and Tilded Platform exploits. So the Flame exploit continued to evolve after Stuxnet was exposed. Much of the current analysis must end in some speculation, of course, since state-sponsored developers don't use networked version-control repositories like GitHub!

The details of the Stuxnet worm have been well documented by scores of security analysts at this point. It is well known that Stuxnet targets a very narrow range of SCADA industrial control systems running Step 7 software by corrupting programmable logic controllers (PLCs) attached to two specific frequency converters, causing them to direct the connected systems to operate outside safe limits to the point of self-destruction (sound familiar?). The SCADA systems themselves are controlled by Windows computers, so the attack vector is indirect: it runs through the Windows machines that program the PLCs, which are likely isolated from the Internet by means of an air gap. More specifically, Stuxnet is a cyberweapon that targets the Siemens Step 7 PLC software running on Windows computers (more accurately, WinCC or Siemens PCS 7 software), which in turn controls the Siemens Simatic S7-315 and S7-417 industrial controllers driving the frequency converters used by the IR-1 family of uranium centrifuges. That last sentence is a mouthful! But it highlights the fact that the payload of Stuxnet was precisely targeted. The particular S7 controllers used with these centrifuges were the Siemens SCADA controllers about which so much concern has arisen in recent years. SCADA systems were specifically introduced as a cost-saving measure to replace the hardwired and telemetry control systems of the 1950s and 1960s. They effectively reduced the cost of control, if one discounts negative externalities from the total cost of use, for they were designed without robust security in mind.
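The precision of that targeting can be sketched in a few lines. The following is a conceptual illustration only, not actual Stuxnet code: it shows how a payload can stay dormant unless the environment matches a narrow hardware fingerprint. The function name is my own invention; the 807-1210 Hz window reflects the drive frequencies Symantec reported Stuxnet checking for, and the CPU model strings are the Simatic controllers named above.

```python
# Conceptual sketch (hypothetical code): activate a payload only when the
# environment matches a narrow hardware fingerprint, as Stuxnet reportedly did.

TARGET_CPU_MODELS = {"S7-315", "S7-417"}   # the targeted Simatic controllers
TARGET_FREQ_WINDOW = (807, 1210)           # drive frequency range (Hz) checked

def payload_should_activate(cpu_model: str, converter_freq_hz: int) -> bool:
    """Trigger only on the exact controller family and drive-frequency range
    associated with the intended centrifuge cascade; stay dormant elsewhere."""
    if cpu_model not in TARGET_CPU_MODELS:
        return False                        # wrong plant: do nothing, leave no trace
    low, high = TARGET_FREQ_WINDOW
    return low <= converter_freq_hz <= high

# A non-matching installation is left untouched; a matching one is attacked.
print(payload_should_activate("S7-300", 900))    # False: wrong CPU family
print(payload_should_activate("S7-315", 1064))   # True: matches the fingerprint
```

Such gating is why Stuxnet could spread promiscuously yet damage essentially one facility: every infected machine ran the check, and almost every machine failed it.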

That's what Stuxnet did. Of course, that doesn't explain how it accomplished the feat. To understand this, one needs to revisit the archeological record. The earlier Stuxnet version (~v.0.5, circa 2005-7) was not Internet-enabled and propagated only through shared Step 7 project files on Windows computers. Unlike its more effective descendant, v.0.5 manipulated the input and output valves on the centrifuges. While the excessive pressure would have damaged the centrifuges, it was less effective and dramatic than v.1.0, which actually manipulated centrifuge rotors to the point of spinning out of control. v.1.0's injection method was also more sophisticated, using existing Windows vulnerabilities to propagate the worm through a LAN, removable USB storage devices, network file shares, Windows remote procedure calls, printer spools, or the Internet itself. The initial injection of v.0.5 was accomplished by an Autorun exploit from the Flame Platform via infected USB drives carried to Natanz by four Iranian subcontractors – a relatively simple injection strategy compared to v.1.0. Interestingly, the payload of v.0.5 didn't change in subsequent versions, suggesting that the weaponized Simatic S7/SCADA attack code was likely mature by the time the presidential decision was made to proceed to the final stages. Symantec also assesses the payload code to be far superior to the injector and call-home code, which many have used to infer the nationalities of the teams. As an interesting aside, v.0.5 used four command-and-control servers to update the code, all of which claimed to belong to the Media Suffix advertising agency, whose tag line was “Believe What the Mind Can Dream” – the mantra for what I'll call post-modern, neo-conservative malware epistemology.

The detailed technical reports on Olympic Games/Stuxnet are consistent in their recognition of its aggressiveness and capabilities – measured in terms of breadth and depth, it certainly qualifies as revolutionary. Not only was it a hydra-headed, near zero-day exploit and the first known rootkit to target PLCs, Stuxnet also deployed a sophisticated injector into the S7 processes. It was also a self-replicator par excellence that worked through LANs, peer-to-peer communications, removable storage, network shares, and I/O streams. It used a sophisticated covert command-and-control interface, an advanced form of anti-malware sonar tuned to most of the popular security products available, strong encryption to hide its binaries, and an embedded playback mechanism to spoof normal operation.
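That last item, the playback mechanism, deserves a sketch of its own. The idea widely reported about Stuxnet is record-and-replay: capture sensor readings during normal operation, then feed the stale recording back to the monitoring layer while the attack runs, so operators see nothing amiss. The class and readings below are hypothetical illustrations of the tactic, not reconstructed Stuxnet logic.

```python
# Conceptual sketch (hypothetical code) of record-and-replay spoofing:
# record normal sensor values, then replay them during the attack so the
# monitoring layer never sees the real, out-of-range readings.
from collections import deque

class ReplaySpoofer:
    def __init__(self):
        self.recording = deque()   # baseline of "normal" readings
        self.attacking = False

    def observe(self, sensor_value: float) -> float:
        """Return what the operator console is shown for this reading."""
        if not self.attacking:
            self.recording.append(sensor_value)  # build the normal baseline
            return sensor_value                  # live data passes through
        # Mid-attack: cycle the recorded baseline back to the console,
        # hiding the real (dangerous) value.
        stale = self.recording.popleft()
        self.recording.append(stale)
        return stale

spoofer = ReplaySpoofer()
for v in [1064.0, 1063.8, 1064.1]:   # rotor frequencies during normal operation
    spoofer.observe(v)
spoofer.attacking = True
print(spoofer.observe(1410.0))        # real value is dangerous; console shows 1064.0
```

The consequence is that the humans in the loop are blinded precisely when they are most needed, which is what made the sabotage so hard to diagnose at Natanz.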

While not as complicated as Flame, Stuxnet was a serious contribution to the art of cyber warfare and in fact unique as a cyber-kinetic attack tool. Attacking the infrastructure of sovereign nations alone put it in a league of its own. This has not escaped the attention of political leaders and state-supported technologists of every stripe. Not only did it set a new standard for hacking into industrial control systems, it also raised the bar in the global cyber arms race. While the inevitable retaliatory attacks by anti-Western interests will be labeled by the mass media as naked cyber-aggression (as the bogus claims surrounding the Sony hack make clear – see my Cyber Chutzpah: The Sony Hack and the Celebration of Hyperbole, Computer, February, 2015), the cyber-aware globalists will always regard the U.S. and Israel as mentors. Future cyber-kinetic attacks will predictably involve failures of power grids, water supplies, and transportation. This cyber-genie is now out of the bottle.

The point we need to take away from the Siberian pipeline hack and Stuxnet is that air gaps have been ineffective for well over thirty years. The air gap joins security-through-obscurity as a classic example of what I've called Faith-Based Security: security based on faith alone (see my “Faith-Based Security,” Communications of the ACM, April, 2008).

Next month: we'll investigate what, if any, lessons we've learned.



The discovery of the worm now known as Stuxnet by VirusBlokAda, a small security company in Belarus, is itself an interesting story that is now part of cyber-lore. The Stuxnet authors went to considerable lengths to develop security-suite sonar to thwart discovery by the security software of the major vendors, but VBA was too small to be on their radar. An Iranian reseller for VBA reported events-of-interest from several customers who couldn't seem to rid themselves of a malware infection. He alerted VBA, who in turn uncovered a kernel-level rootkit operating on the customers' computers. Further analysis discovered an injector based on Windows .LNK files carried by USB flash drives. VBA had unknowingly uncovered the first few layers of Stuxnet, because the sonar was effective only against the more widespread security software.

.LNK files are simply file shortcuts, or symbolic links to local files. If the local file is an executable, activating the link causes the executable to load and run. The Stuxnet v.1.0 .LNK exploit actually began life as a Microsoft feature. Not content to distract users with an array of file extensions, Microsoft has shipped Windows with the extensions of known file types (e.g., .DOC, .PPT) suppressed. To see these common file extensions (recommended), the user has to disable this option in the folder options panel. What is more, even if you disable this suppression, Windows will still suppress system-reserved file extensions like - you guessed it - .LNK. These extensions are suppressed not through Windows Explorer's options but through the Windows Registry: by default, the Registry entry for .LNK contains the value “NeverShowExt.” The same applies to the .URL, .PIF, .SHS, and similar extensions. So this “feature” was already weaponized by Microsoft: if .LNK file extensions are suppressed, a malicious shortcut can appear to be an innocuous file (e.g., <file.doc.lnk> would appear as <file.doc>), concealing the fact that this “doc” file will execute something other than MS Word. The Stuxnet .LNK injector took advantage of a design flaw in the icon handler within the Windows Shell that incorrectly parses .LNK files: the handler executed the program linked to an icon on a USB device instead of merely displaying the icon (the “Icon Loading Vulnerability”). VBA analysts spotted this attack vector and hence gained credit for the Stuxnet discovery, although they did not reverse-engineer the code far enough to reveal the payload. That would be done by Symantec and others. While the injector was designed for Windows, the payload was specifically designed for the industrial controllers.
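The two layers of extension suppression described above can be sketched as follows. This is an illustrative model of the display logic, not Windows code: the suppression sets mirror the behavior in spirit rather than enumerating the actual Registry contents, and the function name is my own.

```python
# Sketch (hypothetical code) of Explorer-style extension suppression and how
# it makes a double-extension file like file.doc.lnk look innocuous.

USER_SUPPRESSED = {".doc", ".ppt", ".txt"}    # hidden when "hide extensions for
                                              # known file types" is enabled
ALWAYS_SUPPRESSED = {".lnk", ".url", ".pif", ".shs"}  # NeverShowExt-style:
                                              # hidden regardless of that option

def display_name(filename: str, hide_known_types: bool = True) -> str:
    """Return the name a user would see in the file browser."""
    base, dot, tail = filename.rpartition(".")
    ext = (dot + tail).lower()
    if ext in ALWAYS_SUPPRESSED:
        return base                 # .LNK and friends vanish unconditionally
    if hide_known_types and ext in USER_SUPPRESSED:
        return base                 # ordinary known types hidden only by option
    return filename

# A shortcut masquerading as a document:
print(display_name("file.doc.lnk"))                          # -> file.doc
print(display_name("file.doc.lnk", hide_known_types=False))  # -> file.doc (still!)
```

Note the second call: even with the user-facing option disabled, the .LNK extension stays hidden, which is exactly why the double-extension disguise works.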

The .LNK injector was novel – not a zero-day, but novel. Autorun was the default injector for removable storage and media before the mid-2000s. Since responsible security consultants had recommended disabling Autorun by 2000, it was becoming ineffective by the time v.0.5 was released, so the Stuxnet authors jettisoned it in favor of newer and more sophisticated exploits. Unlike v.0.5, v.1.0 used a multi-exploit injection strategy. In all, Symantec and Kaspersky identified five different infection hacks: the Autorun exploit in v.0.5, the .LNK hack for v.1.0 and later, and three elevation-of-privilege exploits due to vulnerabilities in the Windows keyboard layout file handler, print spooler, and task scheduler. In addition, Zetter documents eight different propagation tactics once the malware was installed on one computer: (1) the .LNK hack described above, which would continue to work on new USB targets; (2) infecting Step 7 project files; (3) exploiting a security-through-obscurity defect in the way Siemens handled user authentication; (4) injecting malware into shared Siemens databases; (5) a peer-to-peer exploit of LAN file sharing that worked much the same way as software updaters do; (6) installing a covert file-sharing server on each infected machine for redundancy; (7) spreading via network file shares; and (8) a Conficker-style Trojan horse vulnerability that had already been reported and patched, though the patch was not necessarily installed. Of these, (1) and (2) were the most critical. The newness of the exploits, together with the complexity of the code and the multitude of attack vectors, distinguishes Stuxnet from other malware and makes it the current gold standard of cyber warfare.

For technical information about Stuxnet, the most detailed reports I've found are the Symantec references already mentioned. However, the long-term importance of Olympic Games/Stuxnet isn't to be found in the malware itself, but in the political context surrounding it. The two best sources I have found are David E. Sanger's Confront and Conceal (Broadway Paperbacks, 2012, particularly ch. 8) and Kim Zetter's Countdown to Zero Day: Stuxnet and the Launch of the World's First Digital Weapon (Crown, 2014). Sanger's book puts Operation Olympic Games in the context of the political climate of the late George W. Bush and early Barack Obama administrations. It highlights the allure of kinetic attacks that do not involve the immediate risk of American lives. As such, Olympic Games may be thought of as a tactical sibling of the current U.S. drone war, whose ideological ancestry dates back to President Eisenhower's covert CIA operations in the Middle East (Operation Ajax) and Central America (Operation PBSUCCESS).

Zetter's book is also contextual, but the context is cyber warfare rather than politics. Her book is an excellent investigation of Stuxnet from the perspective of malware archeology and epistemology. Both books are important contributions and required reading for those who wish to understand state-sponsored perspectives on cyberspace. Ralph Langner's 2011 TED talk on Stuxnet and industrial controllers offers a convenient alternative.