Computer Crime Research Center


A Ukrainian Cyberattack Shows How Dangerous Software Backdoors Can Be

Date: July 13, 2017
Source: Computer Crime Research Center


For tens of thousands of people, June 27, 2017 started with a black-and-red computer screen and a jolt of panic. That day, malware called Petya, named for a ransomware strain discovered in 2016, spread from Ukraine out across the world. Petya was a form of ransomware, an attack that encrypts a user’s data with a secret key that must be purchased from the hackers. What made Petya special were the files it encrypted. Rather than encrypting the entirety of a hard drive, which takes a significant amount of time, Petya encrypted the master file table, the index that tells the operating system where to find every other file. It blinds the operating system to its own knowledge. A separate, even more viral strain of ransomware, WannaCry, had been released in May 2017, causing significant damage to the U.K.’s National Health Service, FedEx, and Deutsche Bahn.

The June 27 cyberattack started at a Ukrainian firm called M.E. Doc that makes accounting software used widely in Ukraine: it is installed on 1 million computers and used by about 80 percent of all Ukrainian companies. Using credentials stolen from an M.E. Doc employee, the hackers ensured that when the accounting program attempted to update itself, it would download the malicious software instead. Each computer infected by the virus displayed the same message: “If you see this text, your files are no longer accessible…Nobody can recover your files without our decryption service…All you need to do is submit your payment…”

Except, it turns out, the June 27 Petya attack wasn’t a Petya ransomware attack at all. It was a much more insidious cyberweapon disguised as ransomware. Instead of ransoming data, the weapon encrypted files with no intention of ever providing the secret key. It effectively “wiped” files from Ukrainian banks, airlines, the metro, government servers, and other crucial parts of the country’s infrastructure. After researchers at Kaspersky Lab discovered the weapon’s real purpose that same day, it was renamed NotPetya to distinguish it from the original ransomware.

NotPetya may have targeted Ukraine, but every country suffers from the vulnerabilities it exploited. The malicious updates were installed on vulnerable computers through the creation of “backdoors”—a nebulous term referring to methods that provide access to the secure parts of a computer system while bypassing usual security measures, like passwords or encryption.

In rare cases, companies have placed backdoors for legitimate, but lazy, reasons. One of the more high-profile cases was the backdoor Microsoft built into Secure Boot, a feature that Windows 8 machines use to prevent unauthorized software, including malware, from loading when the computer starts. Microsoft engineers included a backdoor so that they could test software in a safe environment. But it backfired: the backdoor made Secure Boot vulnerable to the exact malware it was intended to stop.

Most of the time, though, backdoors aren’t installed with individual users’ best interests at heart. After the 2015 San Bernardino shooting, the FBI sought to compel Apple to build a backdoor into an iPhone 5c used by Syed Farook, the primary gunman. The legal battle was still working its way through the courts when the FBI dropped the case, stating it had contracted an outside company to exploit an unnamed vulnerability in the phone. Had the FBI truly been interested in the public’s well-being, it would have reported the exploit to Apple immediately so that Apple could have patched the issue, securing the iPhone 5c and the older models that shared the vulnerability. The FBI did not do that. Neither do most of the security agencies that discover exploits that become accidental backdoors. Instead, they stockpile exploits in the interest of reusing them later.

That is especially true of the U.S. National Security Agency, which contracts developers to build backdoors so it can gain covert access to information. Perhaps the most important was its backdoor in Dual_EC_DRBG, formerly one of the four main random-number-generating algorithms standardized by the National Institute of Standards and Technology. The NSA used this backdoor to snoop on torrents of internet traffic.

There are clearly ethical issues here and serious breaches of the social contract, but there is a more practical issue. Backdoors are like golden keys: they give unlimited access to whoever possesses them. However, as Jeremy Gillula of the Electronic Frontier Foundation has put it, “even a golden key can be stolen by thieves.” And once the backdoor is open, anyone can walk through. In 2010, Chinese hackers penetrated Google’s services by taking advantage of a backdoor the NSA and Google had put into place. The Washington Post reported that these hackers gained access to a database of Chinese intelligence operatives who were under U.S. surveillance.

Google is far from the only company with NSA backdoors, however. The Snowden revelations disclosed, among many, many other things, the existence of backdoors in the biggest edge providers around. It’s hard to say whether those backdoors were abused by anyone other than the U.S. government. Many of the companies involved in NSA backdoor programs were, indeed, hacked, but the evidence isn’t as clear as it is in the Google case.

The NotPetya backdoors differ in origin from the ones in the Google case and the Snowden revelations. In those cases, backdoors were installed knowingly and willingly by the technology providers. M.E. Doc did no such thing. The backdoors in its case were installed by cybercriminals as part of a cyberattack seeking to sow discord and paralyze a country. In practical terms, however, the backdoors in all of these cases are nearly identical. To an attacker, a vulnerability is a vulnerability, whether or not it was put there intentionally.

NotPetya shows the danger of scattering golden keys throughout the kingdom: As long as there are keys, there are thieves who will find them.



Copyright © 2001-2013 Computer Crime Research Center