FBI demands Apple hack iPhone on their behalf


Off-Topic Discussions


1 person marked this as a favorite.
Adventure Path Charter Subscriber; Pathfinder Starfinder Adventure Path Subscriber

Apparently one of the San Bernardino killers had an iPhone 5c, which the FBI now wants to access. Because the encryption key for this phone isn't stored by Apple, the only way in is brute-forcing the password (i.e. trying every possible combination). But the phone's security features are such that, after ten consecutive wrong password entries, it will delete all of its data.
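To put rough numbers on that (illustrative arithmetic only; the passcode lengths below are common examples, not anything specific to this phone), the passcode space itself is tiny, which is exactly why the ten-attempt wipe matters:

```python
def passcode_space(digits):
    """Number of possible numeric passcodes of the given length."""
    return 10 ** digits

# A 4-digit passcode: only 10,000 combinations to try.
print(passcode_space(4))   # 10000
# A 6-digit passcode: 1,000,000 -- still trivial without a retry limit.
print(passcode_space(6))   # 1000000
```

Without the wipe (and the escalating delays between attempts), exhausting even the six-digit space would be quick work for automated guessing.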

Here's where things get tricky. Apparently the FBI has taken Apple to court to order them to build special firmware that will disable this security feature, making the phone accept any number of wrong passwords without deleting anything. Apple isn't willing to do this, pointing out that such firmware could be used to disable that setting on all current and older iPhone models, and so represents a serious security threat.

Yesterday, a federal judge sided with the government, ordering Apple to build the firmware. Apple's Tim Cook has vowed to fight the decision.

Here's a CNN article for those who want more information.


2 people marked this as a favorite.
Alzrius wrote:

Apparently one of the San Bernardino killers had an iPhone 5c, which the FBI now wants to access. Because the encryption key for this phone isn't stored by Apple, the only way in is brute-forcing the password (i.e. trying every possible combination). But the phone's security features are such that, after ten consecutive wrong password entries, it will delete all of its data.

Here's where things get tricky. Apparently the FBI has taken Apple to court to order them to build special firmware that will disable this security feature, making the phone accept any number of wrong passwords without deleting anything. Apple isn't willing to do this, pointing out that such firmware could be used to disable that setting on all current and older iPhone models, and so represents a serious security threat.

Yesterday, a federal judge sided with the government, ordering Apple to build the firmware. Apple's Tim Cook has vowed to fight the decision.

Here's a CNN article for those who want more information.

For those interested, the FBI is on very questionable legal ground in this request. I suspect that this case will end up before the SCOTUS, and would be very surprised if Apple were required to do anything before such a ruling.

In that time, the FBI could -- literally -- clone the memory contents of the phone and brute force the encryption on a simulator. So I'd be rather surprised if this decision survived appellate review.


2 people marked this as a favorite.

Google supports Apple resisting the order.

No doubt thinking about the future where they might face a similar order for an Android phone. Good to see solidarity in this matter.

Grand Lodge

4 people marked this as a favorite.
Pathfinder Adventure, Rulebook Subscriber

The scuttlebutt I heard was that Android already has such a backdoor, so the government doesn't have to force Google to do anything.

Either way, this is certainly government overreach.


What I find fascinating about this is the technology, and how little I know about it, but at the same time, can have opinions about it.

So, the software to do the thing, disable the protection, does not exist, but Apple has, sort of, acknowledged, that it could. Right?

They have said that they will not ask the builders of the software to build the work around, yet, by saying this, aren't they saying that in all practicality, they probably already have?

I mean they could have come right out of the box and said, "What are you talking about. There is no way to do that. It is impossible."

But they didn't, so I bet they already have and now realize that they cannot keep that secret in the bag for long unless they can beat the case.

But if they have already acknowledged that it can be done, what's to stop someone with the know-how who is not an Apple employee from figuring out how to do it? I bet that the government already has.

They now need it to look like it came from Apple, before they go ahead and open the phone up for investigation, so that we don't tilt our heads to the side and say

"Wait, all this security protocol stuff is just nonsense? Nothing that is build to hide our data is actually safe because if it can be built to hide then it can easily be built to show by anyone who knows how the building was made in the first place?"

To me it's like locks. We love locks. They keep us safe. Unless we need to get past them, then we call a locksmith, which is a man or woman who knows how to get around the concept of a lock at its most basic level.


1 person marked this as a favorite.

My understanding of this is not complete, but from what I have read, while they could probably clone the memory contents (although some of what I read suggests that may actually be non-trivial), they would then have to break actual 256-bit AES encryption on an essentially random key, not just guess the user's password, which is effectively impossible. In order to break the encryption by brute-forcing passwords, they need information stored in hardware that is combined with the user's passcode to generate the actual key used to encrypt the data. That means either manually cracking the chip open and trying to read the die with an electron microscope (without breaking it, and with no documentation on which transistors out of millions or billions do what), attempting to find a flaw in Apple's security implementation so they can insert their own patch, or forcing Apple to write a patched firmware upgrade to get the hardware to give up its data somehow. The government cannot trivially tell the hardware to give up its data, because that chip has been built to accept only code written by Apple and signed with their encryption key, which the government does not have.
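A hedged sketch of that key-derivation idea, with made-up names and parameters (Apple's real scheme differs in its details):

```python
import hashlib

# Illustrative only: the encryption key is not the passcode itself, but the
# passcode stretched together with a secret unique to the device's hardware.
def derive_key(passcode: bytes, hardware_uid: bytes, iterations: int = 100_000) -> bytes:
    """Stretch passcode + device-unique secret into a 256-bit key."""
    return hashlib.pbkdf2_hmac("sha256", passcode, hardware_uid, iterations, dklen=32)

key = derive_key(b"1234", b"\x00" * 16)
print(len(key) * 8)  # 256 (bits)
```

Because the hardware secret never leaves the chip, the guessing has to happen on the device itself; off-device, an attacker faces the full 256-bit key space rather than the small passcode space.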

Also note that Apple didn't need to make that hardware able to accept updates at all, in which case Apple couldn't get in either (there was some speculation I read that this was how it was set up, but more recent developments cast doubt on that). But if they did that, then they could never change how it works, which would mean that if they released an iPhone model with a bug in that security, they couldn't fix it, which would have its own dangers for security.

I would not put it past the government to be able to do an invasive analysis of the hardware through electron microscopy or similar techniques to attack the phone's security on a fundamental level, but it is the sort of attack that is very expensive and very hard to try and pull off. It is likely much cheaper and safer to try and force Apple to cooperate through the courts than to attempt to bypass this themselves.


2 people marked this as a favorite.
Terquem wrote:

What I find fascinating about this is the technology, and how little I know about it, but at the same time, can have opinions about it.

So, the software to do the thing, disable the protection, does not exist, but Apple has, sort of, acknowledged, that it could. Right?

They have said that they will not ask the builders of the software to build the work around, yet, by saying this, aren't they saying that in all practicality, they probably already have?

No, they're just saying that it's possible.

I mean, if you asked me if it was possible to make an Angry Birds clone that fired robots out of a slingshot instead of birds... yeah, sure, that's obviously possible. But I have no reason to believe that Rovio has even considered doing it. I also know that I myself could do that, but I'd need to put in a substantial amount of effort (on the order of hundreds of thousands or even millions of dollars) to do it... and I haven't done so, and I don't intend to do so until and unless someone ponies up a huge check.

Similarly, if you asked me if it was possible to use CGI to edit every frame of the Lord of the Rings film trilogy to gender flip all the characters (so Froda and Samantha visit Galadrium, King of the Elves),... yeah, sure, it's "possible." All you need to do is go through all 30+ frames per second of about nine or so hours of film with an appropriate editor. That doesn't mean I've done it or will do it, or that anyone's done it.

Quote:


I mean they could have come right out of the box and said, "What are you talking about. There is no way to do that. It is impossible."

Yeah, that's called "lying," and in this case, that lie would probably be "obstruction of justice" (18 U.S. Code § 1505), and it carries with it a penalty of up to eight years in prison and a fine against Apple so large as to boggle your mind.

Quote:


But they didn't, so I bet they already have

You'd almost certainly lose that bet. Almost anything you can imagine can be created by a skilled programmer, and any other skilled programmer knows that. That doesn't mean they've wasted the money, brains, and time to do it, and usually they won't.

Quote:


But if they have already acknowledged that it can be done, what's to stop someone with the know-how who is not an Apple employee from figuring out how to do it?

Cryptography. iPhones won't accept firmware updates that aren't signed by Apple, and the FBI doesn't have the capacity to crack the keys Apple uses to sign their updates. The NSA might have that capacity, but you can be damn sure that they're not going to advertise that capacity by intervening on such a public case as this -- and simple math says that they don't actually have that capacity, because Apple knows how big keys need to be for this kind of thing.
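The gatekeeping that paragraph describes can be sketched like this (a toy: real iPhones verify an asymmetric signature against a key baked into the boot ROM, whereas this stand-in uses HMAC, and every name here is invented):

```python
import hmac, hashlib

APPLE_SIGNING_KEY = b"secret-known-only-to-apple"  # hypothetical stand-in

def sign_firmware(image: bytes) -> bytes:
    """What only the holder of the signing key (Apple, in the analogy) can produce."""
    return hmac.new(APPLE_SIGNING_KEY, image, hashlib.sha256).digest()

def device_accepts(image: bytes, signature: bytes) -> bool:
    """What the phone checks before running any firmware image."""
    return hmac.compare_digest(sign_firmware(image), signature)

official = b"ios-update"
print(device_accepts(official, sign_firmware(official)))               # True
print(device_accepts(b"third-party-patch", sign_firmware(official)))   # False
```

Without the signing key, a third party can't produce a signature the device will accept, no matter how easy the patch itself would be to write.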

Now, the FBI could of course try to get a court order that Apple release their keys so that the FBI could create and install new firmware, but Apple would resist that order just as hard, and for the same reasons.

But most of the rest of what you wrote was just ill-informed conspiracy theorizing.


1 person marked this as a favorite.
Terquem wrote:


I mean they could have come right out of the box and said, "What are you talking about. There is no way to do that. It is impossible."

But they didn't, so I bet they already have and now realize that they cannot keep that secret in the bag for long unless they can beat the case.

Because something is possible, that means it has already been done? That's weird logic when it comes to coding. Of course it's possible. If it's possible to make, it's possible to break.

Terquem wrote:


"Wait, all this security protocol stuff is just nonsense? Nothing that is build to hide our data is actually safe because if it can be built to hide then it can easily be built to show by anyone who knows how the building was made in the first place?"

To me it's like locks. We love locks. They keep us safe. Unless we need to get past them, then we call a locksmith, which is a man or woman who knows how to get around the concept of a lock at its most basic level.

There's a difference between calling a locksmith to open your one specific door and then change the locks afterwards, and giving a skeleton key to everyone on your block so the locks are entirely meaningless in the first place.

The latter is what's really going on here. The FBI aren't asking for them to dismantle a lock. Nor are they asking them to make a key for this specific lock. They're asking Apple to build a key that will let them open any lock at any time.

Orfamay Quest wrote:


For those interested, the FBI is on very questionable legal ground in this request. I suspect that this case will end up before the SCOTUS, and would be very surprised if Apple were required to do anything before such a ruling.

That it will end up there is almost without doubt. The real question would be, is this going to get to the Supreme Court before or after a new Justice is appointed? That could potentially make a big difference in the way the case swings.


Adventure Path Charter Subscriber; Pathfinder Starfinder Adventure Path Subscriber
Terquem wrote:

So, the software to do the thing, disable the protection, does not exist, but Apple has, sort of, acknowledged, that it could. Right?

They have said that they will not ask the builders of the software to build the work around, yet, by saying this, aren't they saying that in all practicality, they probably already have?

I'm certainly no programmer, so take this with a large grain of salt, but I don't believe that's the case.

To reiterate, my understanding is that the firmware update to disable the iPhone's "ten wrong passwords makes the data delete itself" feature does not exist yet.

What you seem to be asking - if I'm reading you correctly - is if Apple's admission that such firmware could be made, instead of saying that it's simply not possible, is indicative that they've already done so, since nobody would be in a position to call them on it if they had simply declared such a thing impossible.

The flaw in that reasoning is that (from what I understand) other people can determine that such firmware is possible to create, even if they can't do it themselves. While Apple's "signature" is necessary to make the firmware successfully work in conjunction with the iPhone, figuring out whether there's sufficient know-how to create said firmware to begin with is something that other computer experts can determine.

So a claim of "this is literally not possible" from Apple would probably be disproved very quickly.


1 person marked this as a favorite.
Terquem wrote:
So, the software to do the thing, disable the protection, does not exist, but Apple has, sort of, acknowledged, that it could. Right?

My understanding is that Apple can still push down OS & firmware updates to iPhones without the owner needing to be logged in. Which makes sense, especially if an Apple patch/update borks the iPhone. The FBI wants Apple to write an update that it can push down, so they (the FBI) can brute-force the passcode without the device wiping itself after 10 failed attempts.

Terquem wrote:
To me it's like locks. We love locks. They keep us safe. Unless we need to get past them, then we call a locksmith, which is a man or woman who knows how to get around the concept of a lock at its most basic level.

But would anyone buy a safe, no matter how secure, when all the manufacturers were required to install a second combination that the government kept?

Edit: Multi-ninja'd


1 person marked this as a favorite.
Alzrius wrote:
Terquem wrote:

So, the software to do the thing, disable the protection, does not exist, but Apple has, sort of, acknowledged, that it could. Right?

They have said that they will not ask the builders of the software to build the work around, yet, by saying this, aren't they saying that in all practicality, they probably already have?

I'm certainly no programmer, so take this with a large grain of salt, but I don't believe that's the case.

To reiterate, my understanding is that the firmware update to disable the iPhone's "ten wrong passwords makes the data delete itself" feature does not exist yet.

What you seem to be asking - if I'm reading you correctly - is if Apple's admission that such firmware could be made, instead of saying that it's simply not possible, is indicative that they've already done so, since nobody would be in a position to call them on it if they had simply declared such a thing impossible.

In particular... the number ten is suspiciously precise. Somewhere, buried at the Apple orchard, is a file containing a line that looks something like this:

if (number_of_attempts > 9) {
    lock_the_phone();
}

It wouldn't exactly take a rocket scientist to change that 9 to 99999999999, or even more simply, to comment out the middle line, and anyone who's had a first-year programming course in high school both knows how to do it and understands that it's possible -- to the point that Apple saying "oh, that's not possible" would not be credible and would likely get them in serious legal trouble. (It's rather like if I claimed not to be at the crime scene because "I was dead that week." The cops would have a field day with that.)

But this doesn't mean that anyone's actually done it.


1 person marked this as a favorite.
Orfamay Quest wrote:
Alzrius wrote:
Terquem wrote:

So, the software to do the thing, disable the protection, does not exist, but Apple has, sort of, acknowledged, that it could. Right?

They have said that they will not ask the builders of the software to build the work around, yet, by saying this, aren't they saying that in all practicality, they probably already have?

I'm certainly no programmer, so take this with a large grain of salt, but I don't believe that's the case.

To reiterate, my understanding is that the firmware update to disable the iPhone's "ten wrong passwords makes the data delete itself" feature does not exist yet.

What you seem to be asking - if I'm reading you correctly - is if Apple's admission that such firmware could be made, instead of saying that it's simply not possible, is indicative that they've already done so, since nobody would be in a position to call them on it if they had simply declared such a thing impossible.

In particular... the number ten is suspiciously precise. Somewhere, buried at the Apple orchard, is a file containing a line that looks something like this:

if (number_of_attempts > 9) {
    lock_the_phone();
}

It wouldn't exactly take a rocket scientist to change that 9 to 99999999999, or even more simply, to comment out the middle line, and anyone who's had a first-year programming course in high school both knows how to do it and understands that it's possible -- to the point that Apple saying "oh, that's not possible" would not be credible and would likely get them in serious legal trouble. (It's rather like if I claimed not to be at the crime scene because "I was dead that week." The cops would have a field day with that.)

But this doesn't mean that anyone's actually done it.

Yes, that line or something like it almost certainly exists. And if that line of code is in a source file for software, changing it would be helpful for the police's case. If that line of code is in a Verilog source file that compares an internal register counter on a sealed chip on the motherboard to the value 9, commenting it out does exactly nothing to help get into an iPhone that has already been manufactured. That is how it could be technically possible for Apple to be unable to break into a user's phone.

Of course, if they did this, then they can't actually change the logic in any way for any phone they've already manufactured, so if it turned out they had a critical flaw somewhere in their encryption system, they couldn't do anything about it without physically recalling all affected phones and making new ones. I am guessing they decided to keep the check in software for that reason, but it is not impossible for them to have decided to do otherwise.
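The software-versus-silicon distinction above can be mimicked in a few lines (purely an analogy; the class names and the Python mechanism are invented for illustration):

```python
class SoftwareLimiter:
    """Retry limit held in updatable firmware: a signed update can change it."""
    def __init__(self):
        self.max_attempts = 10

class HardwareLimiter:
    """Retry limit fixed at chip fabrication: no field update can change it."""
    def __init__(self):
        object.__setattr__(self, "_max", 10)

    @property
    def max_attempts(self):
        return self._max

    def __setattr__(self, name, value):
        raise AttributeError("fixed in silicon; cannot be patched")

soft = SoftwareLimiter()
soft.max_attempts = 10**9  # the kind of change a patched firmware could make
hard = HardwareLimiter()
# hard.max_attempts = 10**9 would raise AttributeError
```

If Apple had burned the counter check into hardware this way, no firmware the court could order them to write would help.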


1 person marked this as a favorite.
7thGate wrote:


Yes, that line or something like it almost certainly exists. And if that line of code is in a source file for software, changing it would be helpful for the police's case. If that line of code is in a Verilog source file that compares an internal register counter on a sealed chip on the motherboard to the value 9, commenting it out does exactly nothing to help get into an iPhone that has already been manufactured. That is how it could be technically possible for Apple to be unable to break into a user's phone.

Of course, if they did this, then they can't actually change the logic in any way for any phone they've already manufactured, so if it turned out they had a critical flaw somewhere in their encryption system, they couldn't do anything about it without physically recalling all affected phones and making new ones. I am guessing they decided to keep the check in software for that reason, but it is not impossible for them to have decided to do otherwise.

That's your guess, and it's also the guess that the FBI technicians would make (and for the same reason).

That's why Apple saying "it's not possible" would be a very dangerous lie to tell the FBI. The FBI would be within its authority to subpoena relevant design documents from Apple to verify their statement that they really had built the test into hardware, and the documents would quickly reveal the lie.

In general, it's usually a bad idea to lie to cops; you're almost always better off simply refusing to cooperate at all and letting the lawyers sort it out. But it's particularly a bad idea to lie to cops when there's paper out there that will prove you knowingly lied. But the fact that Apple doesn't want to do something certainly doesn't suggest that they've done it.

The fact that three different people on this thread -- none of whom, as far as I know, have detailed knowledge of iPhones or even work for Apple -- have all agreed that it's probably possible, brings it to the point that a claim by Apple of impossibility defies basic credibility. But there's a huge gulf between "could be done" and "has already been done," and the fact that it's so simple a change actually makes it more likely that they've not done it, as it would serve little purpose to have done so.


1 person marked this as a favorite.

I don't think a claim by Apple they couldn't break in would defy basic credibility, but you're right that it would be an extremely bad idea for them to lie to the FBI about it regardless of the plausibility of the lie. Since they've essentially come out and said they could potentially do it, it appears their lawyers agree with that.

I also agree that it is highly doubtful that they've already made a custom image to try and break into phones. They get no benefit from having that created, it costs them money to pay people to write it, and it would damage their business substantially if it leaked. Expecting that software to already exist is to expect Apple to work against their own financial interests, which is not very likely. They're willing to fight a court order to avoid having to make this; it is very unlikely they just decided to create it because they felt like it.

The one good thing about all of this is that the FBI request is actually for a customized version of the software that only works on the specific phone in question, and the FBI is OK with the code never leaving Apple's computers. That means that, while Apple would have the ability to break into anyone's iPhone in the future by modifying the software they're being asked to create, the FBI would have to make a request each time rather than just being handed a skeleton key. It still creates a risk in that, if the code ever leaked, it could be used by malicious hackers, but that seems a good bit better to me than just handing over a custom version of iOS to the government to backdoor anyone's phone they want.
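A sketch of that "only works on the specific phone" constraint, with entirely made-up identifiers (the real request keys the image to the device's unique ID, but everything named here is hypothetical):

```python
# Hypothetical device IDs for illustration; not real identifiers.
TARGET_DEVICE_ID = "ECID-TARGET-PHONE"

def retry_limit(device_id: str) -> int:
    """Patched behavior: effectively unlimited guesses on the one named device only."""
    if device_id == TARGET_DEVICE_ID:
        return 10**12  # no practical limit on the target phone
    return 10          # every other phone keeps the normal wipe threshold

print(retry_limit("ECID-TARGET-PHONE"))     # 1000000000000
print(retry_limit("ECID-ANY-OTHER-PHONE"))  # 10
```

The residual risk is exactly the one described: change the hard-coded ID and re-sign, and the same image works on any phone, which is why a leak of the code would matter so much.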


1 person marked this as a favorite.

They need to access the information in the phone, in order to catch those who committed the shooting, and bring them to justice, before they can strike again. It is a ticking time-bomb![/sarcasm]

I think this shows a fundamental flaw in the operations of the security state. The only logical explanation is that they are exploiting the shootings in order to broaden their spying powers for use in other situations. Given their powers with stingray and other surveillance equipment, they are using the phone thing as an excuse for their piss poor performance at stopping this crime and other similar crimes.

They should fire the guys making the request, or at least put them on a new assignment that isn't kicking the 4th Amendment while it is bloody on the ground.


4 people marked this as a favorite.

Apple CEO gives details.

Apparently it is possible, but it would not be limited to "just this time", and Apple knows it.


1 person marked this as a favorite.
Fergie wrote:


I think this shows a fundamental flaw in the operations of the security state.

Never assume malice when incompetence will suffice.

Quote:
The only logical explanation is that they are exploiting the shootings in order to broaden their spying powers for use in other situations.

Goodness, no. They want to use the phone to find other co-conspirators, if any. (They've already found and charged one person, so it's not unreasonable for them to look for more.) Right now, the only thing that they can find with the (locked) phone is the direction of "down," since it is to all intents and purposes a brick.

Not everything the government does is a huge, overarching conspiracy. All of the email sent on that phone is accessible through it, all of the text messages sent and received on that phone are accessible through it, and so forth. A normal cell phone is a gold mine for investigators, which is why the FBI wants it so badly.

The question is simply whether Apple is obligated to put all of its other customers at risk to satisfy the curiosity of the investigators.

Quote:


They should fire the guys making the request, or at least put them on a new assignment that isn't kicking the 4th Amendment while it is bloody on the ground.

I don't think that the 4th Amendment says what you think it does. The 4th Amendment doesn't prevent searches, simply warrantless ones. It was satisfied when the FBI went to a judge and said "we need to get access to THIS phone," and the judge agreed. This case is actually a textbook example of how the 4th Amendment is supposed to work.

Liberty's Edge

1 person marked this as a favorite.

The case isn't about the 4th Amendment.

It's about whether a magistrate judge can issue a writ compelling Apple to create a version of iOS designed specifically to facilitate brute-force attacks on the iPhone 5c (I think it's a 5c) and earlier models (the ones before the fingerprint scanner in the home button) for a fishing expedition.

Remember that the FBI already has access to the couple's other phones and all of the email, texts, and phone calls made from this iPhone, and has no reason to think there's any additional info on the phone, since this iPhone was the male terrorist's employer-issued phone.


2 people marked this as a favorite.

Sorry, I was in a rush to post before, and tried to jam a few too many thoughts into one poorly written post.

I don't normally assume malice or a conspiracy, but when we are talking about what comes from the J. Edgar Hoover FBI building, it is hard to ignore their extensive, horrific record of abuse. They have my fingerprints and mug shot, even though I was never convicted of a crime, and they have no right to keep that information. They are violating my rights this very minute, and that is only the stuff I know about... The FBI is not to be trusted!

I think it is safe to assume that their motive is broader than attempting to find co-conspirators, who may exist, though I doubt there is any reason to suspect there is evidence of conspiracy on the phone. Like Krensky said, it is a fishing expedition. I think it is also a bad precedent to set, which is why they picked a high-profile, horrific case to publicly do this.

I don't think attempting to access a murderer's phone is a violation of the 4th Amendment, but I think forcing a company to provide a "masterkey" is absolutely a 4th Amendment issue. I also would not trust the FBI not to abuse any info they find, or new tools they are given, in this case or others. I would also take a long look at who is on any court that approves these kinds of searches - many judges are corrupt as hell. My comments in the above post were more about the use of surveillance, spying and citizen monitoring in general. It was poorly phrased, but I intended to say that they should make themselves useful stopping real crimes, rather than trying to weaken privacy laws for what is likely a million-to-one shot.


2 people marked this as a favorite.
Orfamay Quest wrote:


Goodness, no. They want to use the phone to find other co-conspirators, if any. (They've already found and charged one person, so it's not unreasonable for them to look for more.) ...

Taking a quick look at this case, it seems like BS, and I highly doubt that the charges will hold up at trial, although a plea bargain might be taken due to the risk of decades in prison.

I think it is also highly unlikely that the laws he was accused of breaking are actually being applied equally. Are any of the Oregon Wildlife Refuge Idiots going to be charged with "conspiracy to provide material support for terrorism" or "making a false statement in connection with acquisition of firearms"? Did Donald Trump comply with all immigration laws when he got his Russian sham wife? Those are just not crimes white people get charged with in the US.


1 person marked this as a favorite.

When encryption without a back door is outlawed, only outlaws will have encryption without a back door.

Liberty's Edge

4 people marked this as a favorite.

Paranoia (deserved or otherwise) regarding the FBI aside, if Apple is compelled to write the tool, or provide the resources (mainly the private key they sign iOS updates with), then they can't simply refuse to supply it to the security agencies in places like Saudi Arabia or China.


2 people marked this as a favorite.

I look at this as a privacy issue in regards to what they are asking. They basically want a way to create a backdoor.

After what the NSA has pulled, is there ANYONE LEFT ALIVE in the US that actually thinks the US government wouldn't use this to enlarge their ability to spy on their own citizens?

Once that cat is out of the bag, do you REALLY think Russia, or China, or some other nation won't want to have those tools as well, and use this as precedent to force companies to do the same thing for them?

Furthermore, I don't think the US government has the right to force someone to create new software against their wishes. This is basically singling out someone who does NOT OWN the phone and court-ordering an unassociated party to make a program to hack someone else's phone.

The owners of the phone are the ones who should be supplying the password and/or the ability to access the information on that phone.

If they do NOT have that, then there should be bids put out per typical government procurement procedure on who can hack the phone. Sure, those contracts can take YEARS, but the US government is not supposed to favor one company over another, or court-order one to create something in regards to a crime it is not involved with.

There ARE hackers out there that probably could break into the phone.

In addition, depending on how complete the deletion is, if you can access the phone afterwards and it didn't actually shred the data, it may be HIGHLY easy to retrieve.

If it wasn't shredded, but just what typically happens with Apple phones, what I'd do is let it delete itself. I'd then hook it up to a new computer, get a new account, and write over the basic shell with an empty program connected. This would delete a small amount and overwrite it, but the majority of the storage would still be intact (basically the same thing as pushing an OS update, but in this instance it is an account change, which erases the old information on the phone in favor of the new account... though in truth it isn't a real deletion: the information is still there until overwritten, just inaccessible to the new user).

That's easy peasy stuff right there.

Of course, the better solution is to go to the actual owners of the phone. In theory, if they were doing what they should have with their business phones, then the phones were set up using a certain computer by IT. That computer probably still exists. Hook up the phone to it with the update in regards to certain programs you want copied over... and you have access to the phone that way... plus it will be automatically unlocked.

Of course, the government may have already tried that, and the people who actually own the phone didn't use that backup or installation method (instead buying everything individually from Apple rather than a license from a single source, which is far more expensive)...in which case you could still use the account-interchange method.

The only reason I can figure they don't do that is that you DO overwrite SOME information that way, and they could be so paranoid about overwriting something on the phone that they won't risk losing some essential clue.

I think the more likely reason that phone wasn't destroyed was that the terrorists didn't have anything of worth on it in the first place.

Long story short...I support Apple's stance against being forced to hack a phone that was used by a criminal they have NO involvement with. It would be akin to telling the maker of the SUV the terrorists used that it has to design a whole new line of SUVs because it failed to make the SUV's computer stop working during the chase.

Sure, they could do it...but it's a ridiculous charge on an unassociated party.

.
.
.
.
.
.
In regards to the actual case, I would take this portion of the Supreme Court's ruling on a previous related issue as the applicable part, since it names one of the defining problems with what was asked...

Quote:


We agree that the power of federal courts to impose duties upon third parties is not without limits; unreasonable burdens may not be imposed.

Unreasonable in this instance being the loss of security for everyone who has an iPhone. (The irony of my support for Apple is that I don't currently use an iPhone.)


^Mostly agree, except that if the erasure after 10 unsuccessful logins is halfway competent, it will actively overwrite whatever it is erasing. Utilities that do this have been available for ordinary computer hard drives for a LONG time (with the added complication that they need to overwrite several times to account for slight misalignment on a mechanical hard drive that might cause incomplete erasure the first time).
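For the curious, the core of such a multi-pass overwrite utility is easy to sketch in Python. This is illustrative only, not any real tool's code (real utilities like GNU `shred` also handle raw devices and write specific bit patterns), and the function name is made up:

```python
import os

def multi_pass_overwrite(path, passes=3):
    """Overwrite a file's contents with random bytes several times
    before unlinking it, making remnant recovery harder."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())  # push each pass out to the device
    os.remove(path)
```

One caveat relevant to phones: on flash storage, wear leveling can redirect an "in-place" overwrite to fresh cells, so the original data may physically survive. That is a big part of why designers prefer destroying an encryption key over scrubbing the whole disk.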


2 people marked this as a favorite.
Krensky wrote:
Paranoia (deserved or otherwise) regarding the FBI aside, if Apple is compelled to write the tool, or provide the resources (mainly the private key they sign iOS updates with), then they can't simply refuse to supply it to the security agencies in places like Saudi Arabia or China.

I can actually see that happening. One thing in one country, more in another country, until the whole thing's full of holes... I don't think it would be unlikely for an authoritarian Government to demand some kind of access for itself.


GM Rednal wrote:
Krensky wrote:
Paranoia (deserved or otherwise) regarding the FBI aside, if Apple is compelled to write the tool, or provide the resources (mainly the private key they sign iOS updates with), then they can't simply refuse to supply it to the security agencies in places like Saudi Arabia or China.
I can actually see that happening. One thing in one country, more in another country, until the whole thing's full of holes... I don't think it would be unlikely for an authoritarian Government to demand some kind of access for itself.

Except I don't think that the FBI will have anything to do with that process.

Turning it around, do you really think that the Chinese government would refrain from demanding a back door on the grounds that the FBI had not asked for it (or had asked for it and been refused)? The Chinese have been notorious for demanding these things as a condition of allowing various international technology companies to do business in China, irrespective of what US privacy laws say. I can't see the Chinese suddenly developing enough respect for US laws to allow US magistrates to influence Chinese law.


1 person marked this as a favorite.

John McAfee offers to decrypt the iPhone with his team of L33T hackers so Apple doesn't have to.


1 person marked this as a favorite.

I hope they take him up on it. This would be interesting. XD


1 person marked this as a favorite.
BigDTBone wrote:
John McAfee offers to decrypt the iPhone with his team of L33T hackers so Apple doesn't have to.

That's a hell of a boast. Especially the "will primarily use social engineering" part.

Which is certainly normally the best approach, but I'm not sure how to apply it to a given phone, where the phone's user is dead and the technical owner is already fully cooperating with authorities.

Can he guarantee to do it without triggering the "10 tries" thing and rendering the information either impossible or at least even more difficult to access?

In theory, he should be able to perform the hack on a test phone in the same condition. If he can do that, then he can give the FBI the solution and they can apply it to the phone. Handing his team the evidence to experiment with would be foolish.

Liberty's Edge

Orfamay Quest wrote:
GM Rednal wrote:
Krensky wrote:
Paranoia (deserved or otherwise) regarding the FBI aside, if Apple is compelled to write the tool, or provide the resources (mainly the private key they sign iOS updates with), then they can't simply refuse to supply it to the security agencies in places like Saudi Arabia or China.
I can actually see that happening. One thing in one country, more in another country, until the whole thing's full of holes... I don't think it would be unlikely for an authoritarian Government to demand some kind of access for itself.

Except I don't think that the FBI will have anything to do with that process.

Turning it around, do you really think that the Chinese government would refrain from demanding a back door on the grounds that the FBI had not asked for it (or had asked for it and been refused)? The Chinese have been notorious for demanding these things as a condition of allowing various international technology companies to do business in China, irrespective of what US privacy laws say. I can't see the Chinese suddenly developing enough respect for US laws to allow US magistrates to influence Chinese law.

Well, US privacy laws AFAIK offer no protection to the privacy of foreign citizens where US companies are concerned :-(

Google, Apple and similar companies should shed the US "nationality". I do not know why they have not done it yet.


1 person marked this as a favorite.

...Tax benefits? There's also that thing about Corporations being People in the law, and maybe they wouldn't have as many local legal protections if they gave up US nationality. *Shrugs* Given the sheer volume of legal activity tech companies tend to get involved in - not to mention patents, copyrights, and so on - it does make sense to be local to wherever is your main focus...

Liberty's Edge

3 people marked this as a favorite.
thejeff wrote:
BigDTBone wrote:
John McAfee offers to decrypt the iPhone with his team of L33T hackers so Apple doesn't have to.
That's a hell of a boast.

Cocaine is a hell of a drug.


3 people marked this as a favorite.
UnArcaneElection wrote:

^Mostly agree, except that if the erasure after 10 unsuccessful logins is halfway competent, it will actively overwrite whatever it is erasing. Utilities that do this have been available for ordinary computer hard drives for a LONG time (with the added complication that they need to overwrite several times to account for slight misalignment on a mechanical hard drive that might cause incomplete erasure the first time).

From what I've read, I don't think the "erase the phone" functionality actually erases anything in memory at all, at least on newer iPhones. All of the data on the phone is encrypted with a key that is a combination of a key stored in the Secure Enclave coprocessor (a piece of hardware on the motherboard) and the user's PIN. You enter the PIN, the Secure Enclave combines it with the key it holds and creates the actual encryption key used to decrypt the phone's data. Without this process, pulling the raw data from the phone's memory is useless, since you would have to break 256-bit AES encryption to get anything from it, which is currently probably impossible.

You enter the PIN wrong 10 times and the Secure Enclave deletes its key. Now all the data is still there, but no one can derive the encryption key, even with the correct PIN, rendering it unrecoverable; and you never had to overwrite anything.

I haven't independently verified this, but it was the explanation I read online, and it made sense to me from a design perspective. Why try to guarantee you completely overwrite the entire phone's data irrecoverably when you can just clear a 256-bit register and get the same effect?
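To make that design concrete, here's a toy model in Python. To be clear, this is NOT Apple's actual implementation; the KDF choice, iteration count, and class shape are stand-ins of mine, and only the overall idea (PIN entangled with a non-exportable hardware key, key wiped after 10 failures) comes from the description above:

```python
import hashlib
import os

class ToyEnclave:
    """Toy model of PIN/hardware-key entanglement with a 10-strike wipe."""

    def __init__(self):
        self.hw_key = os.urandom(32)  # 256-bit key that never leaves the chip
        self.failures = 0

    def derive_data_key(self, pin):
        """Combine the PIN with the hardware key to form the data key.
        Without hw_key, the PIN alone is useless against the ciphertext."""
        if self.hw_key is None:
            raise RuntimeError("key wiped; data is permanently undecryptable")
        return hashlib.pbkdf2_hmac("sha256", pin.encode(), self.hw_key, 100_000)

    def record_bad_pin(self):
        self.failures += 1
        if self.failures >= 10:
            self.hw_key = None  # the "erase": data untouched, key gone
```

The "erase the phone" step is just that last line: throw away 256 bits and all the still-intact ciphertext becomes noise, with no overwriting required.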


1 person marked this as a favorite.
7thGate wrote:
UnArcaneElection wrote:

^Mostly agree, except that if the erasure after 10 unsuccessful logins is halfway competent, it will actively overwrite whatever it is erasing. Utilities that do this have been available for ordinary computer hard drives for a LONG time (with the added complication that they need to overwrite several times to account for slight misalignment on a mechanical hard drive that might cause incomplete erasure the first time).

From what I've read, I don't think the "erase the phone" functionality actually erases anything in memory at all, at least on newer iPhones. All of the data on the phone is encrypted with a key that is a combination of a key stored in the Secure Enclave coprocessor (a piece of hardware on the motherboard) and the user's PIN. You enter the PIN, the Secure Enclave combines it with the key it holds and creates the actual encryption key used to decrypt the phone's data. Without this process, pulling the raw data from the phone's memory is useless, since you would have to break 256-bit AES encryption to get anything from it, which is currently probably impossible.

You enter the PIN wrong 10 times, the secure enclave deletes its key. Now all the data is still there, but no one can get the password to it even with the PIN, rendering it unrecoverable, and you never had to overwrite anything.

I haven't independently verified this, but this was the explanation I read online somewhere, and it made sense to me from a design perspective. Why try and make sure you completely overwrite the entire phone's data in such a way that it is not recoverable when you can just clear a 256 bit register and get the same effect?

Well, you do have to clear the key in such a way that it's not recoverable, but that's a simpler, or at least smaller job.

Liberty's Edge

2 people marked this as a favorite.

It's a 5c, so no secure enclave.

If it was a later phone the FBI wouldn't need to brute force the password, bolt-cutter cryptanalysis would suffice.


1 person marked this as a favorite.

Yup. However, that is further aided by the fact that you have to figure out how to read the register's value at all, which will be extremely nontrivial if it isn't exposed in any way. You have to figure out how to dissect a chip without breaking it and get some sort of detection device on the individual transistors or whatever makes up the register storage. That is a lot harder than stripping the enclosure off a hard drive and running the raw disk platter through a machine with a custom read head that tries to recover all the erased data.

Liberty's Edge

3 people marked this as a favorite.

I'm sorry, you misunderstood.

Bolt cutter cryptanalysis is when you take a pair of bolt cutters down to the morgue, remove the corpse's finger, and use that to unlock the biometric fingerprint scanner.


7 people marked this as a favorite.
Krensky wrote:
Bolt cutter cryptanalysis is when you take a pair of bolt cutters down to the morgue, remove the corpse's finger, and use that to unlock the biometric fingerprint scanner.

♪ ♫ It's beginning to look a lot like Shadowrun... ♪ ♫


1 person marked this as a favorite.
Krensky wrote:

I'm sorry, you misunderstood.

Bolt cutter cryptanalysis is when you take a pair of bolt cutters down to the morgue, remove the corpse's finger, and use that to unlock the biometric fingerprint scanner.

I'm pretty sure that doesn't work, at least on the newer versions. The finger has to be living - pulse, blood flow, etc.

Silver Crusade

2 people marked this as a favorite.
thejeff wrote:
Krensky wrote:

I'm sorry, you misunderstood.

Bolt cutter cryptanalysis is when you take a pair of bolt cutters down to the morgue, remove the corpse's finger, and use that to unlock the biometric fingerprint scanner.

I'm pretty sure that doesn't work, at least on the newer versions. The finger has to be living - pulse, blood flow, etc.

*shrugs*

So strap it to a generator.


1 person marked this as a favorite.

Development: The Government apparently changed the Apple ID password on it. Had it not done so, accessing the data might have been very easy.

Deliberate change or accidental screwup that made stuff harder later? I have no opinion on that. Either is plausible.


GM Rednal wrote:

Development: The Government apparently changed the Apple ID password on it. Had it not done so, accessing the data might have been very easy.

Deliberate change or accidental screwup that made stuff harder later? I have no opinion on that. Either is plausible.

Of all the ridiculous, freaking...errggg.

This is something the FBI should be taking up with the employer in other words.

At least that's what it sounds like.

There IS one little gem there that makes it sound like there might be another possibility. IF they know the old iCloud password, and switching it back on the account syncs everything up again, I suppose that's a possibility (never tried it myself, but hey, it could work).


If I were Apple and the government, I'd have the back door and then loudly complain and sue that no such door was available. Best of both worlds from their point of view.


7thGate wrote:
UnArcaneElection wrote:

^Mostly agree, except that if the erasure after 10 unsuccessful logins is halfway competent, it will actively overwrite whatever it is erasing. Utilities that do this have been available for ordinary computer hard drives for a LONG time (with the added complication that they need to overwrite several times to account for slight misalignment on a mechanical hard drive that might cause incomplete erasure the first time).

From what I've read, I don't think the "erase the phone" functionality actually erases anything in memory at all, at least on newer iphones. All of the data on the phone is encrypted with a key that is a combination of a key stored in the secure enclave coprocessor, i.e. a piece of hardware on the motherboard, and the user's PIN. You enter the PIN, the secure enclave combines it with the key it has and creates the actual encryption key to use to decrypt the phone's data for use. Without this process, pulling the raw data from the phone's memory is useless, since you would have to be able to break 256 bit aes encryption to get anything from it, which is currently probably impossible.

{. . .}

Until somebody figures out a way to break 256-bit AES encryption (or builds a sufficiently colossal computer to do it by brute force), and then they can just run their discovery on the copied data that you didn't spend the few seconds or minutes of battery life to overwrite.

Of course, somebody who figures out where the key is stored before it gets auto-erased could copy this and all the data from the chips, and then run a simulated version of the phone as many times as they want to brute-force the PIN.
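That offline attack is easy to demonstrate with a toy model in Python (again with a made-up stand-in KDF, not Apple's real construction): once both the hardware key material and the encrypted data are copied off the device, nothing enforces the 10-try limit, and a 4-digit PIN space is only 10,000 guesses.

```python
import hashlib
import os

def brute_force_pin(hw_key, target_key):
    """Try every 4-digit PIN offline; no retry counter applies once
    the hardware key has been copied out of the device."""
    for n in range(10_000):
        pin = f"{n:04d}".encode()
        candidate = hashlib.pbkdf2_hmac("sha256", pin, hw_key, 100)
        if candidate == target_key:
            return pin.decode()
    return None

# Demo: derive a key from a "secret" PIN, then recover that PIN.
hw_key = os.urandom(32)
secret_key = hashlib.pbkdf2_hmac("sha256", b"0042", hw_key, 100)
print(brute_force_pin(hw_key, secret_key))  # prints 0042
```

This is why the passcode-stretching in real designs is tuned to be slow, and why a long alphanumeric passphrase is so much stronger than a short PIN.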

BigNorseWolf wrote:
If I were Apple and the government, I'd have the back door and then loudly complain and sue that no such door was available. Best of both worlds from their point of view.

A disturbingly plausible hypothesis . . . .

Liberty's Edge

thejeff wrote:
Krensky wrote:

I'm sorry, you misunderstood.

Bolt cutter cryptanalysis is when you take a pair of bolt cutters down to the morgue, remove the corpse's finger, and use that to unlock the biometric fingerprint scanner.

I'm pretty sure that doesn't work, at least on the newer versions. The finger has to be living - pulse, blood flow, etc.

That's what Apple said, but it's been broken using a gummi bear. You may have to peel the skin off, and making a fake out of latex, silicone, gel, etc. is far more practical.


1 person marked this as a favorite.
UnArcaneElection wrote:
Until somebody figures out a way to break 256-bit AES encryption (or builds a sufficiently colossal computer to do it by brute force), and then they can just run their discovery on the copied data that you didn't spend the few seconds or minutes of battery life to overwrite.

At which point the entire world economy collapses because this hypothetical hacker now has access to everything. All the banking data, everything.

As I understand it, you can't build a computer big/fast enough to brute force it. Even theoretically. Quantum computing might change that (though Grover's algorithm only cuts the effective key length in half), but hopefully by the time that's practical, we'll have upgraded the encryption.

So yes, if you want your data safe forever, you'd want to actually delete it. And considering that advances in data recovery are also possible, I'd focus on physically destroying the chip. OTOH, most data is time sensitive and it won't really matter if it comes out decades from now.
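The back-of-envelope arithmetic behind "not even theoretically" is worth doing once. Grant an attacker a wildly generous 10^18 key trials per second (far beyond any real machine) and the numbers still don't budge:

```python
# Rough feasibility check for brute-forcing a 256-bit keyspace.
keyspace = 2 ** 256                     # possible AES-256 keys
rate = 10 ** 18                         # very generous guesses per second
seconds_per_year = 60 * 60 * 24 * 365
years = keyspace // rate // seconds_per_year
print(f"{years:.1e} years")  # roughly 3.7e+51 years
```

For comparison, the universe is around 1.4e10 years old, so even trillions of such machines running in parallel would not make a dent.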


1 person marked this as a favorite.

Welp, time for me to re-watch season 1 of Mr. Robot.


2 people marked this as a favorite.
BigDTBone wrote:
John Mcafee offers to decrypt iphone with his team of L33T hackers so Apple doesn't have to.
article wrote:
And why do the best hackers on the planet not work for the FBI? Because the FBI will not hire anyone with a 24-inch purple mohawk, 10-gauge ear piercings, and a tattooed face who demands to smoke weed while working and won't work for less than a half-million dollars a year. But you bet your ass that the Chinese and Russians are hiring similar people with similar demands and have been for many years. It's why we are decades behind in the cyber race.

Loved that part...

If I was a US citizen, I would say: "Screw Hillary and Donald, I'm voting for that Guy!"

Sovereign Court

The facts in this thread, if true, are beginning to overtake fiction. [Turns up the volume on Ambrosia Slaad's little tune...]


1 person marked this as a favorite.
Pathfinder Maps, Starfinder Adventure Path, Starfinder Maps, Starfinder Roleplaying Game, Starfinder Society Subscriber; Pathfinder Roleplaying Game Superscriber

From what I saw reading the linked article, it wasn't the FBI that changed the password but the local government agency that the terrorist couple worked for -- and the effect of their doing that is uncertain because of lack of knowledge of whether that phone was ever synced up afterwards.

So the FBI may be overreaching here, but at least they are not asking Apple to help them fix their own mistake.
