Feb 25 2016
This is about Apple's fight with the FBI. I, by the way, am siding with Tim Cook, because I don't want the FBI having a back door to look at my
porn phone. (Even though I stopped being an iPhone user two phones ago, and I have no doubt that Android has already given away the farm. P.S. All models are over 21.)
Security does not just mean protecting people from terrorism, but also warding off the threat of rogue espionage agencies, cybercriminals and enemy governments. If Apple writes a new piece of software that could circumvent its password systems on one phone, that software could fall into the hands of hackers and be modified to unlock other devices. If the capability to unlock iPhones exists, so will the temptation for the authorities to use it repeatedly. And if tech firms are forced to comply with this sort of request in America, it is harder for anyone to argue against similar demands from more repressive governments, such as China’s. This newspaper has long argued against cryptographic backdoors and skeleton keys on these grounds. It is possible to imagine a scenario that might override such concerns: if information is needed to avert a specific and imminent threat to many lives, for example. But in this instance, Apple’s case is the stronger.
So that part I get.
But why can't Apple unlock the phone in the privacy of its own labs, with its own people, and just hand the FBI a "transcript" of the phone's contents?
No software revealed; in fact, no FBI agents allowed anywhere near the lab.
What am I missing here?