Privacy in the Tech Age
Can Lawmakers Crack the Encryption Puzzle?

As the high-profile battle between Apple and the FBI over access to a terrorist’s iPhone makes its way through the courts, Congress is starting to look for a middle ground. Mother Jones reports:

The chairman of the House Homeland Security Committee will introduce a bill on Monday afternoon aiming to help solve the long-running fight between the government and the tech and privacy communities over encryption, which has made headlines recently thanks to the FBI’s attempt to force Apple to help unlock an iPhone used by one of the San Bernardino shooters.

The bill, which will be introduced by Rep. Michael McCaul (R-Texas) and is backed strongly by Sen. Mark Warner (D-Va.), would create a commission of 16 experts with a range of backgrounds—from cryptographers and intelligence officials to privacy advocates and tech executives […]

McCaul and the commission’s backers hope the panel may find a new, previously undiscovered way to reconcile the legal and technical demands of the two sides, but there appears to be little idea of what that could be.

The proposed expert commission might yield helpful insights, but it is unlikely to produce the silver-bullet solution some of its backers are hoping for. The tradeoff between security and privacy is at some level inescapable, and no amount of technocratic tinkering can make it go away. Apple has staked out a hard line—no backdoor into its products, under any circumstances—while the government is demanding one, at least in extraordinary cases like this one. Creative policy innovations probably won’t be able to circumvent what is ultimately a political fight, with civil libertarians and technology companies pitted against security hawks. One side or the other will eventually walk away with a win.

That said, it’s good to see lawmakers at least starting to assert themselves on this vitally important question, rather than surrendering it to the whims of the executive and judicial branches, which has often been Congress’s inclination since the September 11 attacks (as with the NSA surveillance program). New technologies are forcing a redefinition of privacy in America, and it is appropriate for the first branch of government to be at the center of this unfolding debate.

  • Frank Natoli

    John Doe shoots Jane Smith and half the people at Jane’s employer dead. John Doe is later shot dead by police. John Doe had a safe deposit box whose contents the police wish to examine. The bank refuses to permit the police to enter the vault or the box, insisting that the dead mass killer has constitutional rights to privacy?

    • f1b0nacc1

      No, the bank is refusing to provide the police with a skeleton key to enter ALL safe deposit boxes. The dead man has no privacy rights, and even if he was still alive, any court in the jurisdiction could order his (and ONLY his) safe deposit box opened with or without his consent. The government is asking for much, much more than that…they are asking for a blanket right to open everything.

      • Jim__L

        If Apple sends the update to that iPhone and ONLY that iPhone (which can be done), then the key exists but it’s in Apple’s hands. (Apple could, by the way, construct this key any time it wanted to — which means it’s in Apple’s hands already, effectively.)

        If the OS code update isn’t handed over to the FBI and it is only deployed onto specific phones under valid warrant, that would neatly solve the problem.

        • f1b0nacc1

          The point is that the change you are talking about (removing the retries limit) MIGHT be possible through an OS update (there are some experts who disagree with this), but even so…it would have no more than a cosmetic effect in any practical sense. Creating a generic skeleton key would, once created, fatally undermine ALL of Apple’s computers and devices. Do you honestly believe that any foreign government would believe that Apple, once compelled to create this, would only use it once, and not regularly in the service of the government? Hell, it doesn’t even exist, and many ill-informed individuals already believe that Apple (as well as other tech companies) regularly uses secret back doors to violate privacy. The damage to their reputation would be incalculable.
          Once again, once this is done, it will undermine ALL existing devices, not just one single one. Limiting the damage is not possible.

      • Frank Natoli

        Please show a reference to any suggestion by the government that it is asking for a “blanket right to open everything”, i.e., all iPhones, everywhere, without a search warrant.

        • f1b0nacc1

          The security services (including the FBI) have been agitating for a back door into all encrypted devices since the Clinton administration, or does the phrase ‘Clipper Chip’ not ring a bell? The FBI has at least a dozen other cases where it wants access to devices (one is actually for 175 devices in a single case), and has made its arguments against widespread use of encryption abundantly clear literally for decades.
          As for warrantless searches, google ‘NSA’, that should do it for you…
          Come now, you are smarter than this…

          • Frank Natoli

            You are not answering my direct question. Please show a reference, in the context of the San Bernardino iPhone, indicating that the government is asking for something beyond that context.

  • http://www.chaosinmotion.com/blog William Woody

    The problem most people seem to be having with encryption as a way to protect data is that commentators keep using the mental model of a locked safe or a locked front door. And that’s not quite correct.

    There are two ways data can be secured on a computer. The first is using access control; this is software on a computer which will allow you to read data only if certain access control security checks are passed. The data itself is on the disk in a readable format; the computer simply will refuse to read the data without the proper permission. This is akin to a bank deposit box or a secure vault; once you bypass the locks the data can be read.

    The second way to secure data is through encryption. Here, the data could be read by anyone–but without the key to unscramble the data all you have is a scrambled mess. The problem is there is only one way to unscramble the data–and you either have the key to unscramble it, or you do not: there is no middle ground. And unlike access control, where the rules can be made as complex as you like (only Alice and Bob can read the data, Bob can also write the data, and Charles is allowed to copy the data but he can’t read it), with encryption you either have the key, or you do not. You can read the data, or you cannot.
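    A minimal sketch of the distinction, in Python (the password, secret, and keys are all invented for illustration; the encryption half assumes the third-party cryptography package):

        # Access control: the stored bytes are readable; software merely
        # refuses to hand them over without the right credential.
        SECRET = b"meet at dawn"  # sits on disk in the clear

        def read_with_access_control(password: str) -> bytes:
            if password != "hunter2":  # a check that can be bypassed or patched out
                raise PermissionError("access denied")
            return SECRET

        # Encryption: anyone may read the stored bytes, but without the key
        # all you have is a scrambled mess.
        from cryptography.fernet import Fernet, InvalidToken

        key = Fernet.generate_key()
        ciphertext = Fernet(key).encrypt(SECRET)

        print(Fernet(key).decrypt(ciphertext))  # with the key: the plaintext
        try:
            Fernet(Fernet.generate_key()).decrypt(ciphertext)  # any other key
        except InvalidToken:
            print("no key, no data: there is no middle ground")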

    So when Apple “[stakes] out a hard line”, in part their position is driven by mathematics: either you have the function F(x) which can reassemble Humpty Dumpty, or you do not. There is no middle position, no function G(x) that sorta gives you Humpty Dumpty under certain conditions, no way to engineer a way to hand out F(x) to a third party without resorting to access controls.

    And as we have seen repeatedly, access controls on a computer system fail: intentionally or not, passwords can be hacked, security checks can be bypassed, customer service representatives can be tricked into “helping” someone out.

    So long as we continue to think of encryption as a locked door behind which our data sits, unscrambled and in plain text, we will continue to think that somehow a master key can be cut, or a second lock installed, or a door can be bashed in. But there is no door to open; only a special magic formula to put Humpty Dumpty back together again.

    • ljgude

      Excellent. I think having people on the committee who understand the technology is essential. To make one point that hasn’t quite been said explicitly: all the Apple case involves is opening the front door – the pass code to the phone – which in the process unscrambles the data. No encryption will be broken. Encryption is like a safe that cannot be opened. Period. William Woody’s comment here makes this point very well. One compromise suggested is that phones can only be decrypted with a search warrant, but that only works in the US and opens the floodgates to the likes of China and Iran demanding access by their rules. I keep thinking that manufacturers might be better off making their phones unencrypted like they used to and just telling people that if they want encryption they must do it themselves. Like with email today, where if you want or need unbreakable encryption of your email it is available. On the other hand, I don’t think cops should be able to search an unencrypted cell phone without a warrant. If they find some coke or heroin in the car, then get a warrant. If they can just search the phone, we know they take phones off women at traffic stops to see if they have naked selfies on them, as some California cops have been caught doing. Some kind of balance needs to be struck, and with digital technology the math makes encryption unbreakable; we already know that backdoors would break too many things we need, like online banking, not to mention national and trade secrets.

      • Frank Natoli

        and just telling people if they want encryption they must do it themselves.
        Absolutely correct. Of course, Apple doesn’t let anyone “do it themselves”. You can’t install software of your choice on any Apple product. You must go through their App Store, and all products on their App Store pass their acceptance criteria. Ask someone who spent 18 months developing an iPad product, an animated history map, only to be told by Apple that the product was not acceptable and therefore could not be sold.

        • ljgude

          Good point about Apple. So I would suppose the obvious, probably only choice to do it yourself would be Android.

        • wiffle

          Did you redevelop for Android? And did they give you any reason beyond that?

          • Frank Natoli

            Straight answer is “no”. The problem with developing for Android is that there are too many subdivisions of Android that are not compatible with each other.
            I can only guess what Apple’s reasons for rejecting my product were. Apple insisted on forcing me to convert the product into an iBook [their version of an e-book]. The problem is that e-books, iBooks, any-books are confined to an extremely limited, effectively crippled browser language that doesn’t allow getting out of the iBook and onto the Internet, e.g., to use Google Maps. No amount of explanation to Apple about the features beyond what an iBook allows had any effect on their decision, and the developer has no recourse.
            It will be a cold day in Hades before I again use an Apple product. Thankfully, they have plenty of competition.

          • wiffle

            Save you a lot of money too. I did Apple for a while and then I realized I was basically paying money for a luxury computing product. My computer is a PC/phone is Android. You get more for the money.

        • f1b0nacc1

          And your point is? Apple has every right to control what software can be deployed on its products, and while I sympathize with your situation, I must gently suggest that you are letting it cloud your judgement here.
          So Apple has created its own security environment…we agree that this is the case. What relevance does this have to the case at hand?

          • Andrew Allison

            His point is that apps are only available from the Apple Store. The relevance to the case at hand is that if there’s no encryption app in the Store, the data is not encrypted, only password-protected.

          • f1b0nacc1

            No, it means that encryption (when it occurs) is only done through an Apple-provided mechanism….a very different thing indeed. The data is not simply inaccessible because there is a password in front of it, it is unreadable, even if accessible, because it is encrypted. That is a HUGE difference…

          • Andrew Allison

            Both Frank and I made, perhaps not explicitly enough, the point that if there’s encryption, it’s from Apple. This has nothing to do with the fact that what the FBI is asking for is the opportunity to crack the password (which, incidentally, Carter claims would take less than 26 minutes). If/when they crack the password, they may or may not face the further challenge of attempting to decrypt the data. The point which I and others have been trying to make is that the FBI is not asking for a backdoor, but to be provided the opportunity to break down the front door and look at what’s inside. Whether they should be granted that opportunity is debatable, but let’s at least be clear about what they are asking for, which is neither a backdoor password by-pass nor an encryption key.

          • f1b0nacc1

            I have heard a lot of suggestions about what the FBI is asking for, but Apple’s response to the FBI contains a significant collection of specifics, which I recommend for review:
            http://arstechnica.com/tech-policy/2016/02/apple-fires-back-at-doj-this-is-not-a-case-about-one-isolated-iphone/
            That is hardly a simple ‘disable the login limitation’ change. As for the time required to break the password, 26 minutes strikes me as exceptionally low, and I would like to see more detail on how he arrives at that number.

          • Andrew Allison

            Here’s exactly what the FBI is asking for: https://assets.documentcloud.org/documents/2714001/SB-Shooter-Order-Compelling-Apple-Asst-iPhone.pdf. Para. 2 is very specific. The FBI is asking for the opportunity to try to break the passcode and attempt to access the data, i.e. a brute force attack, not a backdoor. Just to be clear, this debate is about what the FBI is asking for, not whether it should or Apple should comply.
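            For scale, some back-of-the-envelope arithmetic in Python (the ~80 ms per attempt is Apple’s published estimate for one passcode derivation; the passcode lengths are assumptions, not details from the order):

                # Worst-case exhaustive search over numeric passcodes, assuming
                # the retry limit and inter-attempt delays are removed and each
                # guess costs ~80 ms of key derivation.
                SECONDS_PER_TRY = 0.080

                for digits in (4, 6):
                    tries = 10 ** digits
                    worst = tries * SECONDS_PER_TRY
                    print(f"{digits}-digit passcode: up to {worst / 60:.0f} minutes "
                          f"({worst / 3600:.1f} hours)")
                # 4-digit: about 13 minutes; 6-digit: about 22 hours. A long
                # alphanumeric passphrase pushes the search out of practical reach.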

          • f1b0nacc1

            Paragraph (2) isn’t quite as specific as you suggest, but your point is well taken. And with that said, while I am LESS resistant to opening the phone to a brute force attack, the original objection still stands.

          • Frank Natoli

            Thank you, but I am going beyond your explanation. By restricting all software that can be installed on any Apple device to the Apple Store, Apple is making it impossible for any Apple device owner to install software that the owner alone determines is “safe” to use. Safe can mean bug-free. Safe can mean virus-free. Safe can mean without any backdoor. Lots of “safe”. An open source AES encrypt/decrypt with private/public key pairs obtained directly from Verisign is inherently “safer” than anything sold through the Apple Store. But you cannot install such software on any Apple device.

          • Frank Natoli

            My point, which I thought was clear, and the relevance to the case at hand, is that the Apple user cannot be certain whether the software provided by Apple is compromised at the factory.
            Now let me ask you a question. If you wished to install an encrypt/decrypt package which you were certain had no backdoor, so that you could further be certain that your files and communications were truly secure, how could you do so with an Apple product?
            Edit: I am NOT suggesting that Apple be compelled by government to change its software policy. I AM suggesting that, certainly in the matter of secure data and secure communications, one must especially caveat emptor with respect to Apple.

          • f1b0nacc1

            Since Apple’s software (actually more than that, since their architecture requires software + firmware + custom chips, a VERY different state of affairs) is not installed in one piece in one process, it isn’t too hard to imagine cross-checking verification and multiple stages, some of which can be done by sealed automated devices, and then a final verification done outside the factory (or inside the physical factory but outside the control of the manufacturing group) by trusted personnel (i.e. NOT Chinese). This is in fact done all the time with high security components manufactured for the DOD, and I cannot see any reason why Apple couldn’t easily do it themselves. So no, I don’t see why we would take the ‘made by the Chinese, it is filled with traps’ trope particularly seriously, despite my extremely low opinion of the Chinese and their non-existent business ethics.
            If I wished to install something without a backdoor, I would use a multiple pass authenticator using several independent keys, each tied to a one-time factor. To be doubly certain, I would encrypt my data before passing it through a communications link, and decrypt it on the other end outside the control of Apple’s encryption routines, but then again, I am paranoid by nature, and old habits die hard. Apple wouldn’t even necessarily have to be aware of this last step, and I wouldn’t go to any particular lengths to make them aware.

          • Frank Natoli

            Please identify where in any of my posts I suggested “made by the Chinese, it is filled with traps”.
            You are clearly a well informed person, no sarcasm intended, but, quoting Cool Hand Luke, “what we have HEE-YAH is a failure to communicate”.
            Apple does not permit individuals to install any software on Apple products except that offered by Apple through the Apple Store, correct?
            Individuals owning Apple products [iPhones, iPads, Macs] thus cannot install open source software of their choice, e.g., encrypt/decrypt packages, correct?
            Individuals owning Apple products are thus entirely dependent on the integrity of Apple for whether or not their data and/or communications are secure, correct?

    • Andrew Allison

      Excellent decryption [sorry, couldn’t resist!]. It’s my understanding that the FBI is asking for the key to the door, not an encryption key. In fact, since they don’t have access to the data, they don’t even know whether or not the information is encrypted. Thus the issue has nothing to do with encryption, but to whether a warrant should provide access to the data. Whilst I’m adamantly opposed to warrant-less snooping (or blanket snooping with a warrant), it seems to me that a legitimate warrant should provide access. As other commentators have pointed out, essentially unbreakable encryption is available, at least on Android phones, to those who feel the need for it.

      • f1b0nacc1

        The FBI isn’t asking for a ‘key to the door’, it is asking for a way to bypass the door entirely, and as such, bypass it for all other doors for all other users. This is very different than simply accessing data in one place, as William so elegantly describes.
        Two other points:
        1) Since the FBI knows that the killers destroyed their personal phones, they have not offered any explanation as to why the killers would not have done the same with their work phone, if that phone did indeed have any useful information on it. This means that their argument for the data on the work phone is very, very weak indeed.
        2) The FBI created this problem in the first place, by encouraging the phone’s legitimate owner (the City of San Bernardino) to wipe the password after the killers’ deaths. Hence they had access to the data if they wanted it, but clearly didn’t think it mattered.
        In both cases, the FBI was either monumentally stupid, or was deliberately trying to create a crisis in order to justify demanding a back door. This is hardly a new demand from the FBI, or the feds in general.

        • CapitalHawk

          The way it seems logical to think about these things is to view the phone as a safe, with access controls (i.e. a password) that allows full access to all of the information on the phone in a non-encrypted state. All data sent to and from that phone (safe) is encrypted. The FBI is not seeking access to the encrypted communications from the phone, nor from anyone else’s phone. They are seeking the ability to have an unlimited ability to attempt to “pick the lock” of the phone/safe and thus gain access to the unencrypted information. So no, they aren’t asking for a key, but a way to unlock the phone without the key.
          As to your other points:
          1) Your argument seems to be “these people were really good, perfect even. As such, because they didn’t destroy these phones, it follows there is no information of value there.” The problem is that we don’t know they were perfect. Criminals screw up all the time and thank goodness for that or most crimes would never be solved.
          2) You seem to think that *ALL* information on an iPhone is backed up to the cloud. This is absolutely not the case. Even if the iPhone had backed up to the cloud, the FBI would still seek access to the phone itself as there is a wealth of (potential) information that is only available with direct access to the phone itself.

          • f1b0nacc1

            With respect, you don’t seem to understand how encryption works, nor what the FBI is asking for.
            The phone is not merely a container here, it is an active participant in the encryption process, and as such you cannot simply ‘grant access’ to an encrypted message without fundamentally undermining the encryption process itself. The FBI (which most likely already knows that there is nothing on this phone….see their testimony in front of Congress today) is simply using this as a way to force what Congress and the tech industry will not give them: a universal back door to snoop on everything on the phone itself. Remember that a smartphone is far, far more than the connection that is established, or even the messages stored within; it is an ‘intelligent’ device that participates in the transmission and encryption of those messages. That is why the digital keys are so crucial, and it is why the FBI wants them so badly. Contrary to what you seem to believe, this has nothing to do with opening a single safe; it is more analogous to forcing the safemaker to provide a back door to all safes, then trusting the FBI with the combination.
            Criminals who take the time to thoroughly destroy their personal phones are rather unlikely to simply ‘ignore’ their other phone just out of carelessness. These criminals in fact showed that they took time and effort to plan their operation, then meticulously destroy useful evidence that would be associated with it. Which is more credible: that the FBI is lying about this (and given their purported instructions to the City of San Bernardino, they didn’t seem to think it was important), or that criminals who destroyed EVERYTHING ELSE neglected this obvious weak link?
            As an iPhone owner, I know very well that most information is not backed up into the cloud…in fact one of the first things that I did when I got my iPhone was to DISABLE this functionality for precisely this reason. The security in an iPhone (particularly the later models, where security is reinforced in both the hardware and the firmware) is quite good, that in the cloud is NOT.
            Look, your previous comments in other threads make it clear that you believe that the government has a right to this data, and thus don’t see any danger in it being able to access the privacy of innocent or unrelated third parties. I reject this completely, and am pleased to see Apple (a company that I typically don’t care for, though they do make some lovely devices for casual use) taking such a strong stand. The FBI has many other ways to perform its investigations….just as I wouldn’t accept the police violating the due process rights of a suspect in a crime, I don’t accept the FBI violating privacy rights here. Some bright lines need to be drawn that not even the government gets to violate.

        • Andrew Allison

          You are mistaken. “A judge last month ordered Apple to develop software that would disable security mechanisms on Mr. Farook’s phone so that the F.B.I. could try multiple passwords to unlock the phone through a “brute force” attack, without destroying any data.” (http://www.nytimes.com/2016/03/02/technology/apple-and-fbi-face-off-before-house-judiciary-committee.html). I agree with you that the FBI was utterly incompetent and is using the case as an excuse to enable law enforcement to break passwords, but let’s be clear about what they’re asking for. It should also be noted that the FBI needs physical possession of the phone to do this. As others have noted, Apple only allows apps from the Apple Store to be installed; ergo, if there isn’t an encryption app, the data is not encrypted. Either way, the FBI is asking only that the 10-strikes-and-you’re-SOL feature be disabled.

          • f1b0nacc1

            I never denied that the FBI wanted this…in fact I made the argument that the FBI wanting this was in fact proof that they aren’t concerned about the data on that phone. A brute force attack will take weeks or months, even if Apple made the change available (they shouldn’t, but that is a different story), by which time anything that they recover would be useless.
            As for your comment regarding encryption, you are mistaken. Access to data can be restricted with a password while the data remains unencrypted, and while restricted access does not require encryption (I am working with some of that data right this minute), unrestricted data can in fact be encrypted. Consider an enciphered message printed on a page: you can read it freely, but you cannot make sense of what is in it, whereas a message in a locked safe cannot be read at all, even though it is in plain English. This difference is vital to understanding what the FBI is asking for, and why their request is so dangerous.
            As a minor point (and I know that this is referring to another thread, I apologize), there are encryption capabilities built into apps that exist in the Apple store, although there are not (to my knowledge, though I concede I haven’t looked carefully) any encryption applications in the store. This does NOT mean however that Apple doesn’t provide encryption services…in fact their encryption capabilities (particularly in the later model iPhones, which incorporate encryption support at the chip level) are especially robust, which is why we are having this whole discussion in the first place.

          • Andrew Allison

            I knew it was too good to last [grin].

  • Fat_Man

    I want to add something on the general idea of backdoors. That is a method by which the government could obtain the key to an encryption program that would allow it to decrypt a cypher text with very little effort. There are a couple of killer problems with the idea.

    1. If there is a backdoor that the US government could walk through, the Chinese government, or the Russian government, or a Russian script kiddie could go through it, for purposes of espionage, or for theft, or just for giggles and grins. If that happens, say goodbye to online banking and online transactions through the computers with back doors.

    2. One-time pads work and cannot be broken. See: One-time pad on Wikipedia. If the key’s entropy is ≥ the entropy of the plain text and is nontrivial (for a conventional cipher, 256 bits looks pretty good right now, and 1024 is probably bulletproof), the cypher cannot reasonably be expected to be broken within the remaining thermodynamic life of the universe. The problem with one-time pads has been their creation and distribution. However, a 32 GB micro SD card can be had for $8.00. My guess is that obtaining the random numbers to be the key will be more expensive than storing and copying them. See Random.org.

    Therefore, even with a backdoor, non-crackable cypher texts can be generated and transmitted at low cost (a minimal sketch appears at the end of this comment).

    3. Once backdoors are known to exist, they will only be useful in catching the unwary. True enemies can be expected to use technologies that the backdoor will not be able to penetrate, such as one-time pads.

    4. Far fewer than all of the people in the world are American citizens. Why any of them would want to open their communications up to the USA is way beyond me. I would expect that non-backdoored devices will be produced and sold all over the world. The US cannot keep drugs out of the country, and it cannot keep illegal immigrants out of the country; why do you think that it could keep non-backdoored communications devices out of the country?

    Beyond that, backdoors are a great idea, if you are a government official who does not know anything, cannot think, and believes that he is smarter than everybody else, which pretty much describes almost all government officials.
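    A one-time pad is only a few lines of Python; a minimal sketch, with key distribution (the hard part) waved away:

        import os

        def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
            # The pad is random, exactly as long as the message, and used once.
            pad = os.urandom(len(plaintext))
            return bytes(p ^ k for p, k in zip(plaintext, pad)), pad

        def otp_decrypt(ciphertext: bytes, pad: bytes) -> bytes:
            return bytes(c ^ k for c, k in zip(ciphertext, pad))

        msg = b"attack at dawn"
        ct, pad = otp_encrypt(msg)
        assert otp_decrypt(ct, pad) == msg
        # Without the pad, every plaintext of the same length is equally
        # consistent with the ciphertext; no backdoor on the wire helps.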

    • Frank Natoli

      Also note that Apple alone provides no mechanism for owners of their products, iPhones, iPads, etc., to install whatever software they please, e.g., an open source encrypt/decrypt package that is known to have no backdoors. Apple alone controls all software installed on their products, so a user can NEVER KNOW whether their product is secure or not. By contrast, Android products provide full freedom for owners to install whatever they please.

      • ljgude

        I see we went the same place in the end.

      • Fat_Man

        You touch on a larger problem:

        “The NSA’s back door has given every US secret to our enemies” John McAfee on Feb. 26, 2016
        http://www.businessinsider.com/john-mcafee-nsa-back-door-gives-every-us-secret-to-enemies-2016-2

        McAfee is the man who invented anti-virus software.

        After reading the McAfee piece about back doors, I wonder how safe any of our devices are.

        If you are really paranoid, should you use OpenBSD? Or should you only use roll your own?

        And what about cell phone software? Can we really trust Apple? Google? Is there a cellphone analog of OpenBSD? Is CyanogenMod trustworthy?

        My guess is that serious security agencies roll their own, perhaps from Open Source beginnings. But if I were they, I would be looking at every bit of software and hardware involved. Do you have any idea what is in the editors and compilers you use? How about the chips? Does NSA make its own chips?

        • Frank Natoli

          Roll your own?
          Given that there is no evidence that AES encryption with private/public key pairs has been compromised, my answer to that question is “no”. And what evidence would there be? Every https transaction would have been compromised; bank accounts and credit cards would have been looted the world over, either by organized crime or by rogue nations. Hasn’t happened. Are they waiting? Why should they, when they don’t know when some change to the algorithm could make their breakthrough unusable? Recall that when the Germans changed the Enigma rotor layout, Ultra decrypts ceased for many months.
          Can we really trust Apple?
          I don’t trust anyone who says “we require you to trust us”. That would be Apple.
          How about the chips
          I recall a story that the ChiComs, very, very careful people, very knowledgeable about cyber-crime, and the most proficient cyber-criminals on the planet, required Intel to provide the microcode for their Pentiums, so the ChiComs could verify that no backdoors existed in the microcode. So, you don’t have to make the chips yourself, if you can be absolutely certain of their contents.
          Note to the Feebs: if you check my secret clearance, you’ll see I had no access to relevant information, ergo I’m not disclosing anything I shouldn’t be…unlike the former Secretary of State.

          • Fat_Man

            The problem is that you don’t know what you don’t know. You can think you are being reasonable, and then you can look down and see that you have walked over the edge of a cliff.

          • Frank Natoli

            you don’t know what you don’t know
            Yes and no. As the much reviled [but I always liked] Donald Rumsfeld noted, there are three degrees of “know”.
            (1) You know what you know. The North Koreans have the Bomb.
            (2) You know what you don’t know. The Iraqis are extracting fissile, bomb-grade U-235, with which in sufficient quantities an uncontrolled chain reaction is relatively simple to create, but how much U-235 they have right now, and how much they are producing per unit time is unknown.
            (3) You don’t know what you don’t know. There’s a threat that U.S. intel is totally oblivious to.
            My point in most of my posts in this thread is that Apple represents case #2. Being prohibited by Apple from installing software on an Apple device that only you approve of, a degree of “don’t know” is introduced into how secure your data and/or communications are. Use a Droid or Windows or Linux machine, install open-source software, and you move to case #1.

  • Jim__L

    Isn’t the issue the fact that the FBI wants Apple to tell the iPhone to automatically update its OS to remove the “brick the phone if the wrong password is entered more than 10 times” feature?

    That’s adjusting the access controls, not encryption. And Apple could do this whenever it wanted to.

    In fact, Apple could already have a backdoor into all iPhones, without telling anyone about it.
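    A toy model of that retry limit, in Python (every name here is invented; the real iOS derivation also entangles the passcode with a device-unique hardware key):

        import hashlib

        MAX_TRIES = 10  # the access control at issue

        def derive_key(passcode: str, salt: bytes) -> bytes:
            # Stand-in for iOS's passcode-entangled key derivation.
            return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

        def try_unlock(phone: dict, guess: str) -> bytes | None:
            if phone["failed"] >= MAX_TRIES:
                raise RuntimeError("auto-erase triggered")  # what blocks brute force
            key = derive_key(guess, phone["salt"])
            if key != phone["correct_key"]:
                phone["failed"] += 1
                return None
            return key  # the data is still only readable via this derived key

        phone = {"salt": b"\x00" * 16, "failed": 0,
                 "correct_key": derive_key("1234", b"\x00" * 16)}
        assert try_unlock(phone, "0000") is None
        assert try_unlock(phone, "1234") is not None

    Removing the MAX_TRIES check changes access control only; the encryption underneath is untouched.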

    • Frank Natoli

      Note that Apple says it WON’T help the government hack the dead killer’s iPhone, not that Apple CAN’T help. If in fact there is no way to bypass login security, Apple could simply state that, and the issue of the dead killer’s iPhone is moot.

      Note also reports that Apple, as a condition of selling to the billions in mainland China, gave the ChiCom government iOS source. Thus, if there is a login security bypass, the ChiComs know it, can use it, and Apple cannot get away with lying about it to Washington.

      • f1b0nacc1

        The source code of iOS is worthless in this case without the digital signature, and Apple hasn’t (and won’t) give that to anyone else. A big chunk of this is in the firmware, and some of it is in the hardware itself, so handing over source code (which is updated frequently in any event) is a non-issue.
        The only way that Apple could help Law Enforcement (other than simply disabling the retry limits, which would let the Feebs try a brute force attack, something that would take weeks, if not months to work through) would be to hand over the various digital signatures as well as relevant software source, which would in turn render useless all security on all of their devices. Clearly this isn’t an option for Apple (nor would it be for any other manufacturer) as it would also open up their devices for attacks by hackers, foreign governments, business rivals, etc. Like it or not, this is simply not something that any company whose customers expect their data to remain safe could ever contemplate.
        The Feebs have shown bad faith in this from the beginning, and have made it clear that they expect Apple to cooperate not only in this case but in at least a dozen others, which gives us an idea of what they have in mind for the long haul. Apple isn’t ‘lying’ about its limitations here (look, you don’t have to like the company – I don’t like them either, for that matter – but suggesting that they are being petulant or dishonest merely because you have some sort of grudge against them is beneath you); they are simply taking a practical stand that in this case happens to also have principled results.
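        The signature point in one small sketch (assumes the third-party cryptography package; the keys and update image are invented):

            from cryptography.exceptions import InvalidSignature
            from cryptography.hazmat.primitives.asymmetric.ed25519 import (
                Ed25519PrivateKey,
            )

            vendor_key = Ed25519PrivateKey.generate()  # never leaves the vendor
            device_key = vendor_key.public_key()       # baked into every device

            update = b"iOS update image"
            signature = vendor_key.sign(update)  # only the signing key can make this

            device_key.verify(signature, update)  # device accepts: no exception raised

            try:
                device_key.verify(signature, b"iOS update image + backdoor")
            except InvalidSignature:
                print("source code alone gets you nowhere without the signing key")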

        • Jim__L

          “Apple could help Law Enforcement (…simply disabling the retry limits, which would let the Feebs try a brute force attack, something that would take weeks, if not months to work through)”

          What’s wrong with this solution?

          • f1b0nacc1

            Simply put, the precedent that it would set. Once Apple does this for a terrorism case, what is going to stop the Feds (or state prosecutors, or local DAs, or ambitious defense counsels…) from demanding the same (and possibly more) concessions? Given the fact that even reasonably good encryption would withstand months, if not years, of determined attack (possibly A LOT more than that, see Fat_Man’s comments for detail), this would be more symbolic than anything else, and clearly designed to be the ‘first crack in the wall’ in any event.

          • Jim__L

            The “slippery slope” argument is by nature opposed to finding compromise. I suspect that compromise here is possible.

          • f1b0nacc1

            I would normally agree with you, but as legal systems are based upon precedent, compromise is typically the basis for one side to establish precedent to advance its position at the expense of the other. The FBI and other security services (including the police forces in almost every municipality and state) are institutionally opposed to encryption, as it makes their job MUCH more difficult. Why would they compromise, and even if individuals would, why would others in the same position respect such compromises?
            As a side issue, if you believe that we are dealing with fundamental rights here, why SHOULD we accept compromises? I freely concede that this is a separate issue (I am not trying to divert the conversation), but it is something to think about.

          • Jim__L

            I understand what you mean when you don’t want a backdoor to all personal information. I don’t want that either. I think there are abilities law enforcement shouldn’t have, in the name of getting information (torture comes to mind.)

            That said, rights (like privacy) are subject to due process. I think that a warrant in a criminal investigation relating to actions that have always been considered crimes is a defensible place to draw the line.

            We’ll see how it all works out.

          • f1b0nacc1

            And as long as this is a question of something limited to a single phone, I have few objections, but what the FBI is asking for is *NOT* limited to a single phone, but rather a generic approach.
            If you were going to require someone to enter a password into a phone, for instance, I have no real objection to that (assuming a warrant was available, that is), but I remain deeply suspicious of anything more ‘automated’…

          • Frank Natoli

            It’s not just a “terrorist case”. It’s a dead man who has no constitutional right to privacy.

        • Frank Natoli

          The source code of iOS is worthless in this case without the digital signature
          What neither you nor I know is whether iOS implements a backdoor login security bypass. Pulse the volume key five times in five seconds followed by password TimCookIsTheGreatest or something like that. Someone with iOS source knows the answer to that, digital signature not required.
          As I have noted in other posts, Apple has said it WON’T cooperate in bypassing iPhone login security, not that it CAN’T bypass iPhone login security. Agreed?
          And one can reasonably infer from the above that there IS a login bypass, else Apple could simply say CAN’T and the issue becomes moot. Agreed?

          • f1b0nacc1

            Come now, you really should know better than that. Building such an obvious bypass (which a determined hacker could easily discover) would be suicide for Apple….even a more sophisticated one would eventually be uncovered, and that would be the end of Apple’s ability to market its product. Worse still, what happens when a disgruntled engineer (and Apple has plenty of those) decides to sell that secret information (let’s assume for a moment that it is something more challenging than what you proposed) to the highest bidder, or just posts it anonymously ‘just cause’?
            Further…why WOULD they leave such a hole in their security? What does it gain Apple to do this? You seem to assume that they would do this, but you have offered no reason whatsoever as to why they would do so. It offers them no marketing advantage (and Apple is nothing if it isn’t a marketing company), it gives them no competitive edge (in fact it actually makes them less competitive), and it invites precisely this sort of bullying by the Feebs. So once again….why would they do this?
            As for Apple saying that it will not cooperate: that is entirely the right way to phrase it. This isn’t a debate about whether or not they can do it (it is unlikely, given their security architecture, that they can do it on the later models anyway, but that is an entirely different debate), but rather whether or not they should do so as a company. So no, I do not agree with this point, and in fact suggest you read their own filings, where they pointed out that the ONLY thing they could do would be impractical for a number of reasons.
            With that said, one can easily conclude that there is likely no login bypass (once again, why would Apple EVER build a hacker highway into their OS that would almost inevitably be discovered and used to compromise their products, thus destroying their most profitable market?), and that Apple simply cannot do what the government wants to do.

          • Frank Natoli

            Building such an obvious bypass (which a determined hacker could easily discover) would be suicide for Apple
            The example I gave was not intended to be final. There could, for example, be a test point on the motherboard which when externally grounded enables the backdoor. Only Apple…and the ChiComs…know. As for suicide, sounds to me like a good reason to publicly refuse to confirm its existence.
            Apple simply cannot do what the government wants to do.
            Then Apple should simply say that under oath and stop pussyfooting around.

    • f1b0nacc1

      Adjusting the access controls isn’t quite as simple as you make it out to be (though it is likely possible, at least on the older 5C phones, such as the one in this case), but it really changes very little. Yes, the Feebs can then brute force their way into the phone, but this won’t be quick or easy, and even in the long run it doesn’t serve the REAL interest of Law Enforcement here….forcing a back door into all devices. The City of San Bernardino did, after all, wipe the password at (they claim) the behest of the FBI, which means (if true) that this particular crisis was created deliberately to challenge the very existence of encryption, not simply gather data from one phone of a dead man.
      Remember too that the killers in this case destroyed their personal phones, which are likely the ones that they really used to transmit information that LE would care about. If they had any data of significance residing on the work phone (the iPhone 5C), then why didn’t they destroy that one too?
      This whole thing smells of the Administration’s mantra “never let a crisis go to waste”.

  • Gerald

    There are at least a few additional issues here. First, the government is implicitly admitting that it does not have the technical expertise to open the phone’s data and has to resort to trying to force a corporation to do it for them. With the massive investments made in NSA, CIA, FBI, etc., etc., this is quite an admission. Next, the history of hackers’ (foreign and domestic) ability to penetrate government data does not lend much assurance that opened data would be kept confidential. See, for example, the rapid increase in identity thefts of taxpayer information from IRS tax returns. Finally, the government agencies have not exactly proven to be reliable in confining their attention to terrorism, but have seemed to cast their nets much more broadly for all kinds of reasons. As to the proposed commission, it is encouraging to see Congress admit its lack of knowledge, but in the end the issue comes down to freedom versus security concerns.

    • Andrew Allison

      The government is not implicitly admitting any such thing. It is simply asking Apple to give it the opportunity to crack the password. Whether or not the data is encrypted is unknown, and irrelevant.

  • FriendlyGoat

    We don’t think it is a cool idea for North Korea to sell nuke technology to the highest bidder, so we shouldn’t think it a cool idea for Apple to sell crime-assisting technology to everyone in the world. Heck yes, I’m a liberal, but this notion that Apple is some sort of “nobility on a stick” for presumably standing with ordinary citizens for THEIR privacy is tantamount to crazy.

    • CapitalHawk

      Agreed. For Apple this is all about marketing and making money, by posing as a protector of privacy. Let’s not pretend that they actually care.
