Remember the Apple iPhone / San Bernardino case from early 2016? Here’s a recap:

The F.B.I. has been unable to get into the phone used by Syed Rizwan Farook, who was killed by the police along with his wife after they attacked Mr. Farook’s co-workers at a holiday gathering. Reynaldo Tariche, an F.B.I. agent on Long Island, said, “The worst-case scenario has come true.”

But Apple couldn’t simply unlock the iPhone, because of the passcode protections in place on the device. So a legal dispute ensued in which the FBI demanded that Apple build a backdoor to this “single” device.
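
Why couldn’t the FBI just brute-force a four-digit passcode on its own? Because iOS entangles the passcode with a key baked into the device’s hardware, so every guess costs roughly 80 ms of key derivation, and the software adds escalating delays after repeated failures and can erase the encryption keys entirely after ten misses. The court order asked Apple to ship firmware that disabled exactly those software protections and allowed passcodes to be submitted electronically. Here’s a rough back-of-the-envelope sketch of what the protections buy; the 80 ms figure comes from Apple’s iOS security documentation, and the lockout schedule below is approximate, not official:

```python
# Rough math on why the passcode protections matter. Assumptions (not exact):
# ~80 ms of hardware-entangled key derivation per guess, and an approximate
# iOS lockout schedule of escalating delays after repeated failures.

GUESS_COST_S = 0.08  # ~80 ms per passcode attempt

def brute_force_seconds(digits: int, with_delays: bool) -> float:
    """Worst-case seconds to try every numeric passcode of the given length."""
    attempts = 10 ** digits
    total = attempts * GUESS_COST_S
    if with_delays:
        # Approximate schedule: 1 min after the 5th failure, 5 min after the 6th,
        # 15 min after the 7th and 8th, then 1 hour after every failure from the
        # 9th on. (With "Erase Data" enabled, the 10th failure wipes the keys,
        # ending the attack outright -- which is what the FBI wanted disabled.)
        delays = [60, 5 * 60, 15 * 60, 15 * 60] + [3600] * (attempts - 8)
        total += sum(delays)
    return total

for digits in (4, 6):
    raw = brute_force_seconds(digits, with_delays=False)
    locked = brute_force_seconds(digits, with_delays=True)
    print(f"{digits}-digit passcode: ~{raw / 60:.0f} min raw, "
          f"~{locked / 86400:.0f} days with the software lockouts")
```

Without those protections, a four-digit passcode falls in minutes; with them, brute force is either impossibly slow or self-destructs after ten tries. That is why the dispute was about Apple writing software, not about breaking the encryption itself.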

Behind the scenes, relations were tense, as lawyers for the Obama administration and Apple held closely guarded discussions for over two months about one particularly urgent case: The F.B.I. wanted Apple to help “unlock” an iPhone used by one of the two attackers who killed 14 people in San Bernardino, Calif., in December, but Apple was resisting.

When the talks collapsed, a federal magistrate judge, at the Justice Department’s request, ordered Apple to bypass security functions on the phone. The order set off a furious public battle on Wednesday between the Obama administration and one of the world’s most valuable companies in a dispute with far-reaching legal implications.

There were two opposing sides to this case.

  1. Apple’s case: To some, this was the pro-privacy side of the case. Why not create a quick backdoor to the phone for the US Government, and then close it up? In Apple’s own words: “Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.” You create one backdoor for the US Government, then what? You’ve created a backdoor for every iPhone running that same version of iOS, and it could be used over and over again. It also sets what should be an obviously dangerous precedent, both for the security of iPhone users and for the power of the US Government. As the Washington Post makes explicitly clear,1 “This is an existing vulnerability in iPhone security that could be exploited by anyone.”
  2. The US Government’s case:2 Create a “key”, essentially a backdoor into the terrorist’s iPhone, to unlock whatever data is in there (if there’s anything to find at all) and, per the concerns in #1, endanger one of the most widely used mobile devices on the planet. If the data helps the case, great. That’s a big if.

Okay, so what happened again? The FBI never got Apple to unlock the phone, but apparently it “may have found way to unlock San Bernardino shooter's iPhone” anyway: specifically, this single iPhone, and not every other one. It isn’t clear what technical means were used, but the maneuver spared all iPhones a massive security risk.

If the FBI had gotten its way, though, the most recent news about both the NSA and the CIA would have hit even harder. And that’s saying something, because a few massive stories have crept out recently that are directly related to the FBI’s request from last year.

As we’ve been finding out, when US Government agencies want tools to monitor terrorists or their own citizens, they rely heavily on finding (or buying) vulnerabilities in software and devices, or on creating exploits (essentially malware) to compromise those devices. The unraveling began in March of this year, when WikiLeaks began posting freshly acquired, redacted documents. Without getting into the weeds (you can read up on it if you so desire), the leaks have been confirmed as legitimate, and they keep unspooling concern among security experts and software developers the world over.

![](https://cdn.uploads.micro.blog/25423/2023/5535e78029.jpg)

The latest concern to come out of this is a series of newly revealed exploits deployed by the NSA to attack computers running pre-Windows 10 operating systems (roughly 65%+ of all desktops on the planet). One in particular, a framework called FUZZBUNCH, automates the deployment of NSA malware and would let an agency operator easily infect a target computer from their desk. As reported by The Intercept:

According to security researcher and hacker Matthew Hickey, co-founder of Hacker House, the significance of what’s now publicly available, including “zero day” attacks on previously undisclosed vulnerabilities, cannot be overstated: “I don’t think I have ever seen so much exploits and 0day exploits released at one time in my entire life,” he told The Intercept via Twitter DM, “and I have been involved in computer hacking and security for 20 years.” Affected computers will remain vulnerable until Microsoft releases patches for the zero-day vulnerabilities and, more crucially, until their owners then apply those patches.

“This is as big as it gets,” Hickey said. “Nation-state attack tools are now in the hands of anyone who cares to download them…it’s literally a cyberweapon for hacking into computers…people will be using these attacks for years to come.”

Yes, the cybertools used by our government’s agencies have been compromised, and they are now available to anyone. While we’re sure Microsoft is working on patches, this is what happens when governments hold exploits and backdoors into software that can, in turn, endanger people’s most valuable information. While this is still about digital privacy, it’s also about security. What will it take for citizens to take notice of the monumental weight of these leaks, these compromises? An attack on their credit cards? Their mortgages? Their identities?

This Doesn’t Seem Fine

A great piece by Vice’s Motherboard elaborates on this topic, essentially warning that it’s foolish and naive to assume any government official or contractor can keep cybertools safe. Here’s another way of thinking about it: consider the master keys TSA agents carry, which let them open any piece of luggage secured with a TSA-approved lock. As you may know, those keys were compromised, and you can now download CAD files and have your own copies 3D-printed. Imagine that: anyone can get into anyone else’s luggage. But who would take the time to print one of these keys? Probably someone with malicious intent. Now apply the same concept to master keys for software, apps, banking systems, and so on. Would you still trust the US Government (or any other government) to keep that key safe? To not misuse it?

Security and privacy in a digital context are becoming intrinsically intertwined, as nearly every compromise of the former affects the latter. As my friend Eric mentioned in a recent email exchange, we may be seeing privacy become a third-rail issue in Washington. As unfathomable as it may seem, privacy doesn’t appear to be a non-partisan issue. We’ve already seen the recent reversal of ISP data privacy restrictions, even as Comcast tries to reassure us that it won’t sell our “individual” data (it will likely sell pools of data so advertisers can build look-alike models and advertise to individuals anyway, or target individuals through its own ad network based on browsing history). Republicans seem more prone to manipulation by telecommunications lobbyists. Or maybe they just don’t give a shit about the digital privacy and security of the American people.

Let’s hope the recent leaks of cyber-tool information make enough headlines to reach the (mostly) non-news-reading American populace, and that people take the time to understand the consequences of putting too much trust and power in the hands of our governments.

Update

Microsoft has reported that "most of the exploits that were disclosed fall into vulnerabilities that are already patched in our supported products", and "of the three remaining exploits [...] none reproduces on supported platforms, which means that customers running Windows 7 and more recent versions of Windows or Exchange 2010 and newer versions of Exchange are not at risk".

As always, keep your software and operating system updated to the latest version.
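
If you want to verify that a particular Windows machine has a given fix installed, here’s a minimal sketch. It assumes a Windows box with Python available, and the KB identifier shown is only a placeholder; replace it with the KB numbers Microsoft lists for your OS version in the relevant bulletin (for the SMB exploits in this dump, that’s MS17-010):

```python
# Minimal sketch (Windows only): list installed hotfixes via 'wmic qfe' and
# check whether the KBs you care about are present. REQUIRED_KBS below is a
# placeholder; fill it from Microsoft's bulletin for your OS version.
import subprocess

REQUIRED_KBS = {"KB0000000"}  # placeholder, not a real patch ID

def installed_hotfixes() -> set:
    """Return the KB identifiers reported by 'wmic qfe get HotFixID'."""
    out = subprocess.check_output(["wmic", "qfe", "get", "HotFixID"], text=True)
    return {line.strip() for line in out.splitlines() if line.strip().startswith("KB")}

missing = REQUIRED_KBS - installed_hotfixes()
print("Missing patches:", ", ".join(sorted(missing)) if missing else "none")
```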

  1. This article is a good read, as it complements Apple’s letter and explains the intricacies of what is really being requested. ↩︎
  2. No, I didn’t finish reading this article, but we’ll assume it covers “both sides of the story”, amiright. ↩︎