
Thursday, March 17, 2011

Stuxnet - Cyberwar; Attack On Iranian Nuclear Facility

STUXNET - Excerpt from GRC Security Podcast


(...)
Leo: And now, let's talk about Stuxnet.

Steve: So what we have is, without argument, a true cyberweapon which was under development over the course of about nine months, from the time it was first seen to the last version that was seen. Symantec called it the most complex threat they had ever analyzed because of the number of different functions that it contained and also the fact that it was very cross-platform. It was -- or is, because it's still out there a little bit -- but it is a Windows-based worm, but it's designed to infect non-Windows-based systems. Many things are absolutely no longer in doubt. It cannot be doubted that this was directly targeted at the Iranian nuclear enrichment project. And I'll explain exactly why we know and how we know what we know. But it contained multiple zero-day exploits bundled in a Windows rootkit to hide itself from anyone. It contained the first-ever PLC, or Programmable Logic Controller, rootkit; that had never been done before. It incorporated antivirus evasion techniques that I'll detail in a minute, where it literally looked to see what AV tools were on the system and knew how to get around them by version number.

Leo: Wow. Oh, wow. Talk about targeted.

Steve: Oh. It had -- well, and what that means is, think about it, it means the people who developed it ran it in these different AV environments and watched the AV tools capture it. See, because one of the things it needed to do was it was trying to remain hidden. So, for example, after it replicates itself three times from a USB stick, it removes itself from the USB stick.

Leo: Oh, wow. Oh.

Steve: To minimize the chance of discovery, it figures, okay, I have spread onto three new systems, me, the USB stick. So I'm going to now - Stuxnet sees that, because it's logging and recording what it's doing, and then it deletes itself from the USB stick so that someone later wouldn't see it and wonder, whoa, wait a minute, what's this?
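The replication-limit behavior Steve describes could be sketched, very roughly, like this. The limit of three spreads comes from the discussion; everything else here -- the names, the dict-based log -- is purely illustrative, not Stuxnet's actual code:

```python
# Sketch of the "spread three times, then remove yourself" behavior:
# the worm tracks how many machines it has infected from a given drive
# and deletes its own files once the limit is reached, to stay hidden.

MAX_SPREADS = 3  # limit taken from the discussion above

def spread_from_drive(drive_log):
    """Record one successful infection from this drive.

    `drive_log` tracks how many machines this drive has infected.
    Returns True when the worm should now delete itself from the
    drive (limit reached), False otherwise.
    """
    drive_log["infections"] = drive_log.get("infections", 0) + 1
    return drive_log["infections"] >= MAX_SPREADS

log = {}
assert spread_from_drive(log) is False   # 1st infection: stay on the drive
assert spread_from_drive(log) is False   # 2nd: still spreading
assert spread_from_drive(log) is True    # 3rd: remove self from the drive
```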


So, I mean, it's all of this stuff. It's got process injection and hooking code that allows it to insert itself into other processes in the machine; an array of network infection techniques, including a peer-to-peer technology that allows it to spread within local area networks; and a command-and-control interface. It connects to a couple of domains that I'll describe in detail in a minute, in order to report on its existence and to give those domains the opportunity to update the code. So essentially a binary package comes back which is actually encrypted. It's decrypted and then executed in order for Stuxnet to evolve over time.

So it's -- functionally, it's able to self-replicate through removable drives, as I was saying. And that exploits a vulnerability which Microsoft knew about. And we've talked about it, it was that .LNK vulnerability, where you didn't even have to open the link, the shortcut. Just viewing the shortcut in Windows Explorer could cause that file to execute because of the way the link file was malformed. And the rootkit which is hiding this knows exactly how many bytes long the file is; and when Windows Explorer attempts to retrieve that from the directory, the rootkit says there's no file here. So you just don't see it, even though it's sitting there on the drive.

So it's also able to spread through the LAN using a vulnerability that was also known for some time in the Windows print spooler. Everyone has this service running in Windows by default. The LAN is a trusted environment. So unless those Windows machines were patched current, they would have this problem.


Oh, which is a perfect example for one of the questions we were asked last week. Remember the guy whose company had 15 machines behind a "Windows Server," and they were back on SP2, and no one was patching them. And he said, you know, is this a problem? Well, here's a perfect example of where machines on a LAN have visibility to each other, and the Windows firewall protects you from WAN-based things, but because Microsoft wants to make things easy, like filesharing, does not protect you from LAN-based threats to the same degree. So if you're not patched, if you've got this Windows print spooler service listening, then Stuxnet would have been able to infect all the machines on that network.

And there's an SMB exploit, another well-known problem in SMB, the Server Message Block, the so-called file and printer sharing service, which Stuxnet also knows. So machines that were kept really current would have been safe because these were known and patched vulnerabilities in several cases. But even today, Stuxnet is using some privilege escalation exploits which have never been made public, which it uses in order to get around these AV tools. So it copies and executes itself on remote computers through network shares.

And Siemens has a Windows-hosted software package called WinCC, which runs something called Step 7 -- it's all Windows hosted. And this is sort of the programming and code-writing and debugging tool to which you connect Siemens-based programmable logic controller devices in order to sort of download the code that you write.


PLCs are programmed in sort of a -- they have, like, an assembly language and also sort of a simple, step-based, basic language in order to tell them what they want to do. They're pretty simple-minded. But so you do all your authoring of this stuff on a Windows-based machine, then hook up the device and download it into the PLC. Stuxnet is able to update itself through this peer-to-peer mechanism.

So through using remote procedure calls, RPCs, Stuxnet sets up a server when it infects a machine, and then sends out a broadcast to any other machines to see if they are of a later version. And, if so, they share their updates with older versions of Stuxnet. So it's constantly keeping itself up to speed. And it exploits a total of four unpatched Microsoft vulnerabilities, two of which have never been disclosed publicly, as I mentioned before.
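The peer-to-peer update scheme Steve describes -- when two instances meet, the older one takes the newer one's payload -- could be sketched, conceptually, like this. All names and the dict structure are illustrative assumptions, not Stuxnet's wire format:

```python
# Sketch of the peer-to-peer version exchange: each infected machine
# exposes its version; after an exchange both peers run the newer payload.

def exchange(peer_a, peer_b):
    """Each peer is a dict with 'version' and 'payload'.
    After the exchange, both carry the newer of the two payloads."""
    if peer_a["version"] > peer_b["version"]:
        peer_b["version"], peer_b["payload"] = peer_a["version"], peer_a["payload"]
    elif peer_b["version"] > peer_a["version"]:
        peer_a["version"], peer_a["payload"] = peer_b["version"], peer_b["payload"]
    return peer_a, peer_b

old = {"version": 1, "payload": "v1-code"}
new = {"version": 2, "payload": "v2-code"}
exchange(old, new)
assert old == new == {"version": 2, "payload": "v2-code"}  # older peer upgraded
```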

Okay. So what's significant about this, when you look at how comprehensive it is, is that it could never have been designed blind. That is, this is just -- this is not something that script kiddies, no matter how much they want to, could create. In order to pull this off, you need, first of all, essentially schematics of the target. Somehow, someone got, through information leakage, a very detailed description of what it was that was going on in Iran's nuclear enrichment program. And of course that's not information they were letting go of. We know that because the targeting side of Stuxnet only fires when it sees a specific configuration of frequency converters tied onto this programmable logic controller which matches the fingerprint of what was going on in Iran.
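The "fires only on a fingerprint match" gating Steve describes might look conceptually like this. The vendor name and frequency range here are invented for illustration; the point is only that the payload stays inert unless the attached hardware matches a hard-coded target profile:

```python
# Sketch of fingerprint-gated activation: the payload does nothing unless
# the observed PLC configuration matches the hard-coded target profile.
# Profile values below are illustrative, not the real Stuxnet constants.

TARGET_PROFILE = {"converter_vendor": "VendorX", "min_hz": 800, "max_hz": 1200}

def should_activate(observed):
    """Return True only if the observed configuration matches the target."""
    return (observed.get("converter_vendor") == TARGET_PROFILE["converter_vendor"]
            and TARGET_PROFILE["min_hz"]
                <= observed.get("frequency_hz", 0)
                <= TARGET_PROFILE["max_hz"])

assert should_activate({"converter_vendor": "VendorX", "frequency_hz": 1000})
assert not should_activate({"converter_vendor": "VendorY", "frequency_hz": 1000})
assert not should_activate({"converter_vendor": "VendorX", "frequency_hz": 60})
```

This is why, as discussed later, the thousands of other infected Siemens systems were never harmed: the fingerprint simply never matched.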

The problem with Stuxnet is that it's a little bit blunt, in that it is a propagating virus. A hundred thousand copies of it infected Windows machines all over the place. So although it was dispersed in a targeted fashion that I'll talk about in a second, because of these abilities it has to propagate, it got loose from the targeted companies, the five companies with connections to Iran that were infected with this. And it got out into the wild.

Well, we wouldn't want this thing infecting our own nuclear power plants or opening the floodgates on the Hoover Dam or anything else. I mean, programmable logic controllers are used for all of these things. This is like the way process control systems are run. And so you don't want to let something loose that is this powerful that is going to misfire.

Leo: That's like weaponized anthrax. You've got to have some sort of protocol.
Steve: Yes. And so what that meant was that the designers of this thing had to know exactly what it was going to find if it could get into the enrichment plant. They had to know exactly what was there because, I mean, there were versions of it all over. I mean, it was found in thousands of other Siemens systems. So this thing, I mean, this was -- this upset a lot of people who...

Leo: Oh, yeah.

Steve: Because, I mean, this got into many of these Siemens PLC systems. But because the equipment that it found connected to the programmable logic controller didn't exactly match what it was designed to find, they didn't do anything malicious in those cases, thank goodness. But so you have to know exactly what your target is. And then, as I had mentioned in a prior podcast as information began to come out, I remember saying to you, Leo, a few months ago, somebody had to actually have this equipment. I mean...

Leo: You called it. You called it totally.

Steve: You had to set it up. You had to -- I mean, you don't just write code and say, well, hope this works. I mean, all of this had to be prototyped. So you had to have frequency converters and basically mock up what is in Iran in a lab somewhere in order to write the code to make this go. So basically, as Symantec put it, a mirrored environment had to be created in the lab. 


Also remember that this thing, in order to work, it needed to get into the kernel in order to set up a rootkit to protect itself. It needed to have digitally signed drivers. And we know where they came from, remember? They came from Realtek and JMicron, two companies in the same industrial park, same physical location. So it is believed that some agent broke into and physically compromised those facilities to steal their private keys for their credentials.

Leo: Wow. There's a novel here.

Steve: Oh, I know.

Leo: I mean, what a book.

Steve: It really is. I mean, this is real. You couldn't, I mean, this is -- it's incredible. So some agent, you know, covert, undercover, in the middle of the night, went into RealTek Semiconductor and JMicron and did whatever they had to do to get their private digital signing keys and made off with them so that the drivers could be signed for this to all work. So, I mean, there are so many facets to this.

Now, the problem with these PLC-based machines, these programmable logic controller authoring machines, is it is understood that security is a concern. So they are never directly connected to the Internet. So the designers of Stuxnet understood that they were not going to infect the machine. But think about it. As a consequence of not being connected to the Internet, you have to get data in and out of them. So it's thumb drives. Which is the infection vector. If you're going to have a standalone machine because you're worried about security, well, you're going to use thumb drives.

And so a lot of attention in Stuxnet is paid to infecting removable drives, protecting their contents, keeping the contents invisible. And it is believed that essentially that strategy is what worked, that machines that were connected to the Internet got infected with Stuxnet and then, in the normal course of transferring data, updating files, here you've got this machine running your nuclear enrichment facility, and you're all proud of yourself that it's not on the Internet, so nothing can get to it. Yet there's a new version of the PLC software. So you download it over on this machine...

Leo: Oh, boy. Oh, boy.

Steve: ...load it onto your thumb drive, and then bring it over to the not-on-the-Internet Iranian enrichment plant controlling computer, and bang. That gets it infected. So when Stuxnet arrives in a new machine - and I have it written down here somewhere the domains that it queried.

Leo: I should look. I have your notes.

Steve: Here it is. It's mypremierfutbol.com, so www.mypremierfutbol.com and www.todaysfutbol.com are two servers which originally pointed into Malaysia and Denmark. When the worm was able to get itself installed, it would look up the IPs of those DNS domains and send a package of sort of status, including its log of its entire history of infection. It had a timestamp, information about the OS version, and additional information, and that log. So over the time that Stuxnet was known about, Symantec was able to collect over 3,280 unique samples, individual instances of Stuxnet, each with a different log because each log tracked basically the lineage, all the ancestral versions. As it had infected one machine after another, it kept appending to this log.
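The lineage log Steve describes -- each new infection appends itself to the log it inherited, so every sample carries its full ancestry -- is what let Symantec trace samples back to the original targets. A rough sketch, with field names and beacon format invented for illustration:

```python
# Sketch of the ancestry log and command-and-control check-in: each hop
# appends one entry, and the beacon ships the whole inherited lineage.

import json

def infect(parent_log, hostname, os_version, timestamp):
    """Return the new machine's log: the parent's lineage plus one entry."""
    entry = {"host": hostname, "os": os_version, "ts": timestamp}
    return parent_log + [entry]

def beacon_payload(log):
    """The status package that would be sent to the C&C server."""
    return json.dumps({"lineage": log})

log0 = infect([], "patient-zero", "XP-SP2", 1245650000)
log1 = infect(log0, "second-hop", "Win7", 1245700000)
assert len(log1) == 2
# The earliest ancestor is preserved at the head of every descendant's log:
assert json.loads(beacon_payload(log1))["lineage"][0]["host"] == "patient-zero"
```

Because the log only ever grows, mining 3,280 collected samples is enough to reconstruct the whole infection tree, which is exactly the analysis described next.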

What they know as a consequence of being able to mine these logs, these 3,280 different instances of Stuxnet, is that there were three events targeting exactly five organizations, each having a presence within Iran. From those three events, targeting five organizations, 12,000 infections can be traced back to exactly those five organizations. So basically -- and we don't know how Stuxnet was planted in those organizations. Could have been a conspirator. Could have been emailed in. Somehow they got within those organizations.

The first organization, and they remained anonymous in this report, was targeted twice, in June 2009 and then again in April 2010. The second organization was targeted three times, in that June '09 attack, the second one in March of 2010, and then in May of 2010. The third one was targeted once, the third organization targeted once in July of 2009, as was the fourth organization. And the fifth one was targeted once, in May of 2009, but had three initial infections because the same initially infected USB drive was inserted into three different PCs. Oh, yeah. So they were like, it was targeted once, but it was, like, salted in three different locations within that organization. And so Symantec was able to track back all the way back to those very original three instances within that fifth organization.

The shortest span of time between the compilation of Stuxnet -- where, literally, its source code was compiled, which inherently binds some date information into the code -- and an initial infection was 12 hours. So this thing was built and, in at least one case, within 12 hours an infection was planted. The longest span between compilation time and infection was 28 days, and the average was 19. So this whole thing took place in the latter half of 2009 and the beginning of 2010. And so they know from, again, looking at these logs, that there were three attack waves: essentially June 22 in 2009, March 1 in 2010, and April 14 in 2010. And Stuxnet was getting better. The March 1st attack was a much more capable worm than the June one.

So if you can sort of put yourself in the mindset of the people who were doing this, who designed this, they had a goal. And they had a system which was providing them feedback. And so that was a mixed blessing because obviously Symantec is able to determine everything they have because of the feedback which the worm provided to its command and control servers every time it propagated. But you could see also that, while it was unknown, before it became known, this was vital information for the designers because it allowed them to profile the performance of this weapon they had written in the wild, and these were spear attacks. I mean, they were somehow sending agents into Iran or into affiliated companies and planting Stuxnet there. We know that because of the dispersion of the virus. Of all infections of Stuxnet globally, 58.31 percent were in Iran.

Leo: That's pretty effective.

Steve: So, yeah. Like...

Leo: Good job.

Steve: Nearly 60 percent were in Iran. But that's just machines infected. So that means it wasn't released in Santa Clara and all went there because all the machines between here and Santa Clara would be infected. I mean, so the point is that it started there somehow. Somehow it was planted in that location, like near to its goal, and then spread locally. And of course due to the fact that it was a worm, and used unpatched but known vulnerabilities of Windows, it did get loose. Yet as I said, the weaponized end, thank goodness, was so tightly targeted that it didn't do damage to all the other Siemens systems that it sought out and did infect, 18 percent in Indonesia and 10 percent in India. And then it fell off.

And also the Siemens Step 7 system that I mentioned, of the infections, 67.6 percent of the Iranian infections had Step 7 software installed. So it was, again, it was seeking out and looking for these process-control-based systems. 8.1 percent in South Korea of the infections had Step 7 installed, 5 percent in the USA, and 2.18 in the U.K. So it did, for example, in the U.S., 5 percent of the infections of Stuxnet were Siemens-based systems. So it was infecting U.S.-based process control systems. And the good news is the floodgates of Hoover Dam didn't get opened as a consequence. So...

Leo: Well, I think it's really clear that, well, we know -- in fact, I think we know who did this now because there have been some revelations. But I think it's pretty clear that they were heavily targeting.

Steve: Yeah, well, and this evidence, I mean...

Leo: And effectively.

Steve: I guess what I find so interesting is that, if you really take advantage of the information coming back to you, you can, as you said, Leo, this is a plot. I mean, we can work out what had to happen in order for this result. In order for the drivers to be signed with good driver certificates from two innocent companies, somebody had to go and break into them and get their private keys in order to sign the drivers. Stuxnet the virus is aware of Kaspersky KAV versions 6 through 9, the current McAfee products, AntiVir, BitDefender, eTrust, F-Secure, Symantec and Symantec Common Client, ESET's NOD32, and Trend's PC-cillin. It has code in it to specifically see that those products are in the system.

And remember, one of its priorities is stealth. It very much wanted to get its work done before it was found. So what it did was it would look in the system to see if these things were present. And, if so, it would look at the EXEs to determine the versions, and had version-specific behavior, so it was designed to go underneath the detection. And in several cases it did that by using one of two vulnerabilities -- because it ran on all versions of Windows, XP through Win7, not earlier than XP. It used vulnerabilities that had never been published for getting admin privileges, if it was not running with admin privilege, and it used those in order to play some kernel-level hooking games, to install itself into processes in a way that specifically would not be detected by these intrusion detection systems that were designed to detect exactly this behavior.
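The version-aware evasion logic just described amounts to a dispatch table keyed by product and version: detect what AV is installed, look up a matching bypass, and stay quiet if there is no match. The product names come from the discussion above; the table entries and technique labels below are purely illustrative:

```python
# Sketch of version-specific AV evasion: pick a bypass technique keyed
# by (product, version range), and abort rather than risk detection
# when the installed version is unknown.

EVASION = {
    ("KAV", range(6, 10)): "technique-A",      # Kaspersky 6 through 9, per the discussion
    ("McAfee", range(1, 99)): "technique-B",   # illustrative entry
}

def pick_technique(product, version):
    """Return the evasion technique for this AV product/version,
    or None, meaning: go silent instead of risking detection."""
    for (name, versions), technique in EVASION.items():
        if name == product and version in versions:
            return technique
    return None

assert pick_technique("KAV", 7) == "technique-A"
assert pick_technique("KAV", 10) is None    # unknown version: stay quiet
assert pick_technique("NOD32", 4) is None   # product not profiled in this sketch
```

The notable design choice is the fail-closed default: an unrecognized AV version means "do nothing," which is consistent with the worm's priority on remaining hidden.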

The rootkit that it installed -- even with these tools, with these AV systems in place, it was able to install a rootkit, robustly and reliably -- would filter the API calls that Windows was making to the kernel such that, if a file with a .LNK extension was going to be enumerated in a directory search, and the file was 4,171 bytes long, the rootkit would just remove that from the listing, because the malicious link files that Stuxnet used were obviously 4,171 bytes long. And if a file was named "~WTR[FOUR DIGITS].TMP", whose size was between 4KB and 8MB, and the sum of those four digits, modulo 10, was zero, then that file would also not appear.

Leo: And why would that -- I don't understand what the point of that is.

Steve: And so, well, so this was -- Stuxnet needed some flexibility in its payload. So the link files wouldn't be seen, but it needed other files from time to time that it might need to hide. And so what it did was it designed the rootkit filter with sort of a pattern match. So that if the pattern was "~WTR[FOUR DIGITS].TMP", and the sum of those digits was zero modulo 10 -- that is, added up to 10, 20, 30, 40, for example, or zero -- then that triggered the rootkit not to show that file.
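The two hiding rules just walked through can be written down directly. The constants (4,171 bytes; the `~WTR[four digits].TMP` pattern; 4KB to 8MB; digit sum divisible by 10) come straight from the discussion; the function itself is an illustrative reconstruction, not the actual kernel filter:

```python
# Sketch of the rootkit's directory-listing filter: a file is hidden if it
# matches either of the two patterns Stuxnet's rootkit screened for.

import re

def should_hide(filename, size_bytes):
    # Rule 1: malicious .LNK files were exactly 4,171 bytes long.
    if filename.lower().endswith(".lnk") and size_bytes == 4171:
        return True
    # Rule 2: ~WTR[four digits].TMP, 4KB-8MB, digits summing to 0 mod 10.
    m = re.fullmatch(r"~WTR(\d{4})\.TMP", filename, re.IGNORECASE)
    if m and 4 * 1024 <= size_bytes <= 8 * 1024 * 1024:
        return sum(int(d) for d in m.group(1)) % 10 == 0
    return False

assert should_hide("Copy of Shortcut.lnk", 4171)
assert not should_hide("ordinary.lnk", 2048)       # wrong size: shown normally
assert should_hide("~WTR4141.TMP", 500_000)        # 4+1+4+1 = 10: hidden
assert not should_hide("~WTR4142.TMP", 500_000)    # digits sum to 11: shown
assert not should_hide("~WTR4141.TMP", 1024)       # below 4KB: shown
```

This also explains the file names that come up a little later in the conversation: 4141 and 4132 both sum to 10, so both carrier files fall under rule 2 and vanish from listings.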

Leo: So they could hide in plain sight.

Steve: Yes.

Leo: It would be an obvious rootkit file.

Steve: Precisely. And it had to coexist. So this was on the thumb drive or on the system where Stuxnet was installed. It had to coexist with other things. And when it was going to jump onto the thumb drive, it would verify that the drive had not just been infected by comparing the file timestamps with the current time. It would verify that the infection source was less than 21 days old. Meaning that after three weeks...

Leo: Wow, it expires.

Steve: Yes. It would stop trying. It would just...

Leo: So cool.

Steve: It was. It was just brilliantly designed. So the point is, again, it was trying to not get discovered. So it gave itself three weeks, on a given system, it gave itself three weeks to infect all the drives it could. And after that point it would go silent and just not do it anymore because, again, it figured, hey, if I haven't done it within three weeks, then -- and who knows what the developers knew about the protocol being used in Iran's nuclear enrichment facility. They might have known, for example, that something happened every two weeks or every week or something. So if they were able to get onto the machine that was one step away from the machine doing the development and controlling the programmable logic controller process control stuff, if they could get to that machine, and they knew that, like, there would be some thumb-drive-based communication between those two within three weeks, and if not, then they're just not on a machine where that's going to happen.

Leo: Sounds like they really knew what they -- not only what they were doing, but where they were going to be. I mean, this was so clearly targeted.

Steve: Yes. And the drive, the thumb drive, had to have had at least three files and five meg of free space, because you wouldn't want to run into, I'm sorry, you don't have enough room on your drive to hold our rootkit and our Stuxnet virus. So one of the files, ~WTR4141.tmp -- and if you think about it, 4141, that adds up to 10, which is zero modulo 10 -- was sort of like the advance guard, a small bit of code that hid its companion file, ~WTR4132.tmp. And again, 4132, that sums to 10. So that's zero modulo 10. And that contained the entire Stuxnet payload that jumped over onto the thumb drive.
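Taken together, the preconditions from the last few exchanges -- at least three files already on the drive, at least five megabytes free, and an infection source less than 21 days old -- make a simple eligibility check. The thresholds are from the discussion; the function is an illustrative reconstruction:

```python
# Sketch of the checks Stuxnet ran before jumping onto a thumb drive.

TWENTY_ONE_DAYS = 21 * 24 * 3600  # the three-week infection window, in seconds

def drive_is_eligible(file_count, free_bytes, infection_age_seconds):
    """True only if the drive and the infection source meet all conditions."""
    return (file_count >= 3
            and free_bytes >= 5 * 1024 * 1024
            and infection_age_seconds < TWENTY_ONE_DAYS)

assert drive_is_eligible(10, 50 * 1024 * 1024, 3600)
assert not drive_is_eligible(2, 50 * 1024 * 1024, 3600)        # too few files
assert not drive_is_eligible(10, 1024 * 1024, 3600)            # not enough free space
assert not drive_is_eligible(10, 50 * 1024 * 1024,
                             30 * 24 * 3600)                   # source past 21 days: go silent
```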

When they finally got there, one file, which was a DLL on this Step 7 PLC programming computer -- the DLL was s7otbxdx.dll -- got renamed so that, instead of the last characters being dx.dll, it ended in sx.dll: s7otbxsx.dll. And a replacement s7otbxdx.dll, which was the PLC rootkit, was installed. So essentially this DLL is very comprehensive. It has, like, 140 different, what Microsoft calls "exports." Those are, like, functions that the DLL can offer. The replacement file didn't duplicate all of those. For almost all of them, it simply forwarded calls made to the fake DLL on to the real one because it knew what it had renamed the real one.

So when 135 of those different functions were called, it handed them off to the original DLL to work correctly. But the few that it needed to alter allowed it to intercept those functions on their way to the Siemens programmable logic controller and essentially add its own code to the code that was being downloaded and arrange for that code never to be visible, never to be seen. And so everything we talked about was for just the sake of getting a bit of code, custom-written code, appended to the front of the code controlling the PLC. And it also had to be that code that looked around at what it was connected to and knew whether to do anything or to stay inert. And so that's the history of the world's first, I mean, truly weaponized Internet worm.
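The export-forwarding trick described above has a close analogy in a proxy object: pass almost every call straight through to the renamed original, intercepting only the handful you want to tamper with. Real Stuxnet did this with native DLL exports; this Python sketch is only a conceptual illustration, with all names invented:

```python
# Sketch of a forwarding proxy: most calls pass through untouched to the
# real implementation; only hooked names are intercepted.

class ForwardingProxy:
    def __init__(self, real, hooked):
        self._real = real          # the renamed original (the "sx" DLL)
        self._hooked = hooked      # {name: replacement_function}

    def __getattr__(self, name):
        # Only reached for names not found on the proxy itself.
        if name in self._hooked:
            return self._hooked[name]      # intercepted call
        return getattr(self._real, name)   # forwarded to the original

class RealDriver:
    """Stand-in for the original PLC communication DLL."""
    def read_block(self):
        return "clean code"
    def get_status(self):
        return "ok"

def tampered_read_block():
    # Stand-in for the interception that appended malicious PLC code.
    return "clean code + injected code"

proxy = ForwardingProxy(RealDriver(), {"read_block": tampered_read_block})
assert proxy.get_status() == "ok"                          # one of the "135": forwarded
assert proxy.read_block() == "clean code + injected code"  # one of the few: intercepted
```

The elegance, from the attacker's standpoint, is that the tooling keeps working normally for every unhooked function, so nothing looks wrong to the engineer using Step 7.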

Leo: Do you think the people who wrote this were security researchers? Virus authors? Do you think they took existing code and modified it? I mean, it sounds fairly sophisticated.

Steve: Anyone who...

Leo: Maybe, like, they contracted out.  I mean, look. Israel did this; right? We know that.

Steve: Yes.

Leo: We've heard that in fact they had exactly the same setup intentionally. It was pretty clear.

Steve: Yes.

Leo: And they had, of course, they had means. They had the motive because they didn't want Iran to have a nuclear bomb.

Steve: Yeah, I would say they probably did it with help. I mean, I believe that there are resources in the U.S. And, I mean, we certainly would not be hostile to the intention of keeping Iran from getting a nuclear bomb. And the argument was that that's what they were using this nuclear enrichment for, despite their denials, saying that they just want it for electric power generation. 


So you have to think that within the NSA, within our own government, and sort of shady organizations -- I mean, we were talking about, what is it, HSGary is the company? Or HSB? I can't remember the name of the company [HBGary]. But it's a major sort of hacking contractor that...

Leo: Right. This is what they do.

Steve: ...organizations in Washington use for their own purposes. And it was those guys who created a FireWire device that allowed the guts to be sucked out of a computer by just plugging it into the FireWire port and using DMA to copy the contents out. And that came from that company. So there are even commercial organizations -- it's where our tax dollars are going -- that have this kind of competence and are able to participate in projects like this.


And you know, Leo, how many times in the early days of this podcast, like Episode 2 and 3 and 4, when we were looking at sort of interesting Windows viruses and worms and things and commenting, isn't it nice, I mean, and aren't we really lucky that they're not malicious?

Leo: Right, right.

Steve: So many of these things -- and I scratch my head. It's like, why -- okay. I'm glad that they're not doing bad things, but...

Leo: Why not?

Steve: ...who's going to all the trouble of creating them, just to sort of have them float around out there?

Leo: They're practicing.

Steve: That's what they did. They just floated around out there.

Leo: I think I remember the very first virus was written by that guy Morris...

Steve: The Morris Worm.

Leo: ...the Morris Worm, just to see what would happen. He wasn't malicious.

Steve: No.

Leo: It escaped.

Steve: And it really hurt his reputation a lot.

Leo: Yeah, his father was a very famous security guy.

Steve: Exactly.

Leo: And so I think that some of this in the early days was just people -- you know, hackers are curious. And I could see how tempting it would be to say, you know, I could create something that would spread itself. Wonder what would happen? And just do it. I can understand that.

Steve: And within the white-hat community, we still hear echoes of, well, boy, you know, why can't we write a disinfector worm?

Leo: Yeah. Yeah. Remember that? Yeah.

Steve: It's like, I know you want -- yeah, I know you want to. But sorry, even if you're altering someone's machine, and you think that's a good thing, you're doing it without their permission.

Leo: Well, and I think ultimately, while it sounds like Stuxnet was pretty carefully crafted not to do harm and was very specifically targeted...

Steve: Oh, danger.

Leo: ...it's still a bad, bad, bad, bad, bad idea.

Steve: Yeah, and Leo, imagine if it had misfired.

Leo: Right.

Steve: Imagine if there had been, literally, collateral damage from the thousands of Siemens computer systems. I mean, it went in and it replaced a DLL. I mean, we depend upon these process control systems to run big plants. And it was in there replacing a DLL in order to get access to the programmable logic controller, to then go add code to it, and hope that it didn't do the wrong thing. I mean, it was gutsy.

Leo: Look, we know there is no such thing as perfect code. And we also know that programmers have a little bit of hubris. And there's probably not a programmer alive who thinks he can't write perfect code.

Steve: Yeah, and not one alive that ever has.

Leo: Yeah. If it were that easy, everybody, anybody would do it. What a great subject. Show notes, as always, are on Steve's site, GRC.com. And I put the show notes in our TWiT wiki every week. It's the one show I actually do that because you have such good notes. I always make sure they're in the wiki at wiki.twit.tv. You can get 16Kbps versions of the show at GRC.com. Steve has transcripts, too, which is really great. And this is the kind of show that I could imagine a college class or somebody who's teaching security might very well want to get people to listen to or read because it's so interesting. John is asking, was there an Easter egg in Stuxnet?

Steve: You know, actually I skipped over that, but...

Leo: There was?

Steve: Yeah, well, there were some odd things. Like there were some codes which, if you took it to represent a date, was the birthday of somebody famous in Iran. I mean, it was those sorts of things, really subtle. And in Symantec's report they made a point of saying, look, this is what Wikipedia says about this, I mean, about this particular collection of characters. But remember, the people doing this would have strong reasons to be pointing fingers to someone else. So we absolutely couldn't take that as ego out of control, but rather just additional subterfuge.

Leo: Red herring. It could be a red herring.

Steve: Precisely.

Leo: Wow. Oh, I want somebody to do the -- some intrepid journalist to do the research on this and write a book. What a fascinating story that must have been. I don't think we'll ever know, because...

Steve: No. Because, I mean, it really, oh, thank goodness it didn't misfire, Leo. As I really came to understand what this thing was, I was thinking, oh, goodness. I mean, this was really -- this was potent.

Leo: Well, I can guarantee you in future Security Nows we'll be talking about worse. Absolutely. Sad to say.

Steve: Well, we'll be here.

Leo: Yeah. 291 episodes in, and no sign of stopping. Steve Gibson, he's a machine. 


Visit GRC.com for your copy of SpinRite. You've got to have it. If you've got a hard drive, you've got to have SpinRite. And of course if you've got questions, we do a feedback episode every other episode. And now is the time to go to GRC.com/feedback, ask those questions. Maybe you'll get included in next week's episode. And tune in every Wednesday at 11:00 a.m. Pacific, Apple permitting. Thank you for moving last week. 11:00 a.m. Pacific, 2:00 p.m. Eastern time at live.twit.tv to watch. Steve, thanks so much.

Steve: Thanks, Leo.

Leo: We'll see you next time on Security Now!.
