The brief is that the FBI cannot access the data on the iPhone of one of the San Bernardino terrorists. The phone is set to erase its data if too many incorrect passcode attempts are made. The FBI wants access to see who the terrorist may have been in communication with, to further investigate terror networks. The FBI requested that Apple provide new technology to access the iPhone through a back door of sorts. Apple refused, citing privacy concerns and arguing that giving the FBI this sort of technology would enable it to spy on more people.
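For anyone who wants a concrete picture of the "erase after too many wrong passcodes" setting being described, here is a rough sketch in Python. This is purely illustrative: the class, names, and logic are my own assumptions, not Apple's actual code. The only grounded detail is that iOS, with this option turned on, is widely reported to wipe the device after 10 failed passcode attempts.

```python
# Illustrative sketch (NOT Apple's implementation) of an
# "erase after too many wrong passcodes" policy.

MAX_ATTEMPTS = 10  # iOS reportedly wipes after 10 failures when the setting is on


class PasscodeLock:
    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failed_attempts = 0
        self.wiped = False  # once True, the data is treated as gone for good

    def try_unlock(self, guess: str) -> bool:
        """Return True on success; count failures and wipe at the limit."""
        if self.wiped:
            return False  # data already destroyed; nothing to unlock
        if guess == self._passcode:
            self._failed_attempts = 0
            return True
        self._failed_attempts += 1
        if self._failed_attempts >= MAX_ATTEMPTS:
            self.wiped = True  # stand-in for destroying the encryption keys
        return False


lock = PasscodeLock("1234")
for _ in range(10):
    lock.try_unlock("0000")  # ten wrong guesses in a row

print(lock.wiped)               # True: the wipe has triggered
print(lock.try_unlock("1234"))  # False: even the correct code fails now
```

The point of the sketch is why brute-forcing the phone doesn't work: after the tenth wrong guess, even the correct passcode no longer helps, which is exactly the feature the FBI is asking Apple to disable.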
All that spying stuff aside, for now, I have a question about the technology elements.
If the FBI wants the data on ONE SINGLE iPhone, couldn't Apple give them what they need in this instance to gain access to the data on this one cell phone and then retain the technology if/when the FBI needs it again?
If I understand smartphones correctly, there is some data which is stored on the Cloud or in a nebulous "out there" place somewhere, which could be hacked into by criminals and/or the FBI? But don't smartphones also have something like individual data storage in them, so that accessing one of them is just like accessing one house (not the whole neighborhood)? So couldn't the FBI just get what they need off one iPhone and be done, and not have this be a huge privacy issue for the rest of us?
I need education on the technology.
Re: Apple and San Bernardino
From what I understand, the FBI is requesting that Apple create something (back-door access, or whatever they want to call it) that currently does not exist. Apple feels that if it creates this hack (aka special software to break the encryption), it will open the floodgates to more potential hacking. If this software is created, it will work on every iPhone.
The FBI is relying on a law that is more than 200 years old. I believe it's called the All Writs Act? But I would have to read up more on it. It basically lets federal courts issue whatever orders are necessary to carry out their duties, so I'm not 100% sure how it applies here. Apple can object to this and claim it is too far removed from this case and that compliance would place an unreasonable burden on Apple. Right now that's all I know.
(Note: This information is third party from a friend that works for Apple, so if anyone else knows more about the technology, correct me if I'm wrong).
I agree with you. I wonder, though: couldn't Apple do something on an as-needed, case-by-case basis to help solve crimes and/or track terror networks, while remaining in control of the technology, or create it so that it's their proprietary stuff that they lend to the government when/if times like these arise?
Admittedly, I'm not in the know on this, but with knowledge and technology constantly growing out there, it seems like they could do something and just be called in, along with their technology, to aid in investigations. Wouldn't it be cool for Apple to create a technology-crimes division of their company, in which they hire more people, make more jobs, and use the division to make stuff that helps only as needed, and even send their own men and women into the field with the FBI, technology in tow, to help out?
A rough example would be: my neighbor wants to cut down a rotted tree in his backyard and asks DH for his awesome nice chain saw. DH doesn't want the neighbor to have access to the saw (it's THAT nice, and he's worried it could get messed up), but DH wants to meet a need of the neighbor. To remedy the situation, DH goes to the neighbor's house with his chain saw and helps the neighbor cut down the tree. DH retains control of his technological tool, and the neighbor gets the help he needs to solve the tree problem.
Another issue is that terror groups are indeed using the unfettered access of technology to do evil things.
At what point do we lose a little bit of our privacy for safety? I'm thinking of the TSA. I hate that, now that I'm pregnant and won't do the scanners, I have to have the back of a woman's hand "meeting resistance" in my crotch. Isn't this now the world in which we live?
I was mostly going for the chain saw example for the idea that DH would retain his tool and retain control of his tool. It's difficult to come up with a superb example because most things have little to nothing to do with IT, as you pointed out. The point of the example is the retention of the tool by its original owner/creator, only to be used in the presence of the original owner/creator.
The "integrity" point is interesting. Because it goes both ways, right? I'm mostly playing devil's advocate, because I too am a proponent of privacy. But, for the sake of argument, do Apple and other companies for that matter, have a societal duty to help protect the nation (if their product/service could indeed render aid)? One could argue that "integrity" means ALSO working to protect the nation and its citizens by giving help to the government.
Slippery slope, but companies have a duty to protect their employees, customers, and society by not making or doing things that cause danger. For example, my DH's company, a major airline, has a duty to get passengers to and from locations safely, to not crash into populated areas, and to maintain whatever environmental controls necessary to lessen their footprint on the environment. Could one argue that not only does a company have a duty to not DO harm, but to also work to prevent harm if it's especially in a field in which they could do a lot to prevent harm?
Although I am not a big fan of Apple in general, I did like their response letter. They point out that they have ALWAYS complied and worked with the FBI and police when they have warrants and have needed information. That if Apple themselves could go in and get the data from the phone, they would. But they can't, so that is not what is being asked of them. Instead, they are being asked to create a backdoor to disable the "data is erased after 10 failed passcode attempts" feature (something like that).
The gist I got from reading the two links @vlagrl29 provided was that Apple probably could create this backdoor but it does not exist now.
And that is something I have a problem with. Although Apple's concern seems to be just the privacy and security issues with what they are being asked to create, I don't like it because I feel it opens the door for the government to start demanding that private businesses create this product or that product.
About the government making requests, because that's an interesting point: isn't the government already doing that, and aren't we already letting them? Most of our safety features in cars, cruise ships, and planes, as well as things like scrubbers and other cleaning tools for manufacturers, likely came into being because of the government "asking" (more like legislating) that businesses comply. The government is already making demands of the private sphere, right?
At least for now the government is "asking" that Apple help out. This could get ugly. Is that where we want it to go - more regulations, red tape, etc.?
I don't want the government in my business. But, also I feel like Apple is in a unique place to help out...making the new backdoor technology, but keeping it in their own control and for use if and only if it's a national security sort of endeavor.
I feel very in the middle on this. I know that the federal government already has a tough time keeping up with all the real terror threats to our nation and I certainly don't want to see another attack.
That's a very good point. But it is usually safety/environment related, and it applies to products that already exist, not to creating a product out of whole cloth. I'll use seat belts as an example. Seat belts were a product that had already existed for many, many years before the government required auto manufacturers to put them in all cars.
With that said, I don't remember the exact stipulations, but the government has also put requirements on auto manufacturers that cars have to be "x" gas efficient by "x" year. And that falls back into the "it doesn't exist now" category. So I guess it is not totally unprecedented. Which is still not to say it's okay.
I actually discussed this post with my H last night because MIS/computers has been his career for the last 30 years. He verified this "backdoor" is something Apple could easily do but agrees with Apple that it is not something they should do. When you create a backdoor, you are creating a backdoor that other people can figure out how to use, and thereby compromise the security of those cell phones.
Well, the DOJ is legally pushing Apple to create this backdoor technology. BUT the compromise they are asking for is that Apple gets to retain the technology and may even do the hack themselves. And once it's done, they may destroy the technology.
This sounds awfully familiar to what I suggested earlier in the thread about the retention of the tool.
Honestly, as a practical matter, I think the information on it is probably not that useful at this point. It's likely that any contacts on there, or places used for meetings, etc., have scattered like roaches in the light and might not prove useful at all.