mersenneforum.org

mersenneforum.org (https://www.mersenneforum.org/index.php)
-   Soap Box (https://www.mersenneforum.org/forumdisplay.php?f=20)
-   -   Government snooping, backdoors and #necessaryhashtags (https://www.mersenneforum.org/showthread.php?t=18271)

xilman 2016-02-26 18:06

Found this on Twitter.

[url]https://pbs.twimg.com/media/Cb2muR8WIAA36hp.jpg[/url]

The joke went down well at the security meeting today.

kladner 2016-02-26 19:36

[QUOTE=xilman;427495]Found this on Twitter.

[URL]https://pbs.twimg.com/media/Cb2muR8WIAA36hp.jpg[/URL]

The joke went down well at the security meeting today.[/QUOTE]
Yeah. That is a good one, right down to changing Siri into HAL.

only_human 2016-02-26 19:42

There was this rickroll suggestion:
[URL="http://www.slate.com/blogs/future_tense/2016/02/23/this_comic_perfectly_captures_the_element_of_absurdity_in_apple_s_standoff.html"]The Essence of Apple’s Standoff With the FBI, in One Comic[/URL]

rogue 2016-02-27 00:07

Am I missing a thread on the "Apple vs FBI" issue? I would have expected a poll or at least a number of people arguing one side or the other.

ewmayer 2016-02-27 02:11

[QUOTE=rogue;427533]Am I missing a thread on the "Apple vs FBI" issue? I would have expected a poll or at least a number of people arguing one side or the other.[/QUOTE]

I suspect most folks around here consider this existing thread just fine for the current Apple-vs-USgov discussion; we can open a separate thread if the volume gets frickin' huge, but that seems unlikely.

retina 2016-02-27 02:18

[QUOTE=rogue;427533]Am I missing a thread on the "Apple vs FBI" issue? I would have expected a poll or at least a number of people arguing one side or the other.[/QUOTE]I don't actually care about the outcome. The important point is that if you use a short passcode, you are essentially relying upon the goodwill of Apple not to allow others access to your data. And any security that relies upon the actions (or inaction) of others is not the sort of security that I want to use.

[size=1]Use a long, strong passcode and you'll be good.[/size]
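Retina's point about short passcodes can be made concrete with some back-of-the-envelope arithmetic (the guess rate below is an illustrative assumption, not a measured iPhone figure):

```python
# Why a short PIN only survives because of rate limiting / wipe-on-failure.
# GUESSES_PER_SECOND is an assumed, illustrative figure.
PIN_DIGITS = 4
GUESSES_PER_SECOND = 12.5            # assumption: hardware-assisted guessing

keyspace = 10 ** PIN_DIGITS          # 10,000 possible 4-digit PINs
worst_case_minutes = keyspace / GUESSES_PER_SECOND / 60
print(f"{keyspace} PINs -> ~{worst_case_minutes:.0f} minutes worst case")

# A 10-character passcode drawn from ~70 printable symbols is another story:
keyspace_long = 70 ** 10
worst_case_years = keyspace_long / GUESSES_PER_SECOND / (3600 * 24 * 365)
print(f"long passcode -> ~{worst_case_years:.1e} years worst case")
```

At the assumed rate the 4-digit PIN falls in minutes without Apple's wipe-after-ten-failures feature, while the long passcode is out of reach regardless of what Apple does, which is exactly retina's point.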

rogue 2016-02-27 13:12

The ramifications are significant both ways. I am highly concerned about the "slippery slope" that will happen if Apple loses.

Can the government force companies to make back doors into their operating systems? Can they outlaw OSes that don't have back doors? Even without a subpoena, we know that the government will use those back doors. The government can't be punished for doing things that are illegal, can it? Normally one or two people might be fired, but they probably won't go to jail. In any case, it doesn't mean that the law or behaviors will change. Even if the government had to get a subpoena, they also hold all the cards WRT whom they call a terrorist. To me this is far worse than the idea of the government taking away all of our guns (which we know will never happen).

On the other hand, I recall one case where a child pornographer had many GB of encrypted child pornography on his computer, and the government was trying to force that person to decrypt it so that they could try to determine who the victims were and track down other child pornographers. The defendant (who was already convicted) refused, pleading the Fifth. I don't recall what happened in that case.

What scares me is that if there is a back door, we know it will eventually become public, because our government is not great at keeping secrets. I can understand Microsoft's position on this issue: Windows already has so many security problems that Microsoft has little to lose. Forcing the competition to become as insecure as their own product is a good thing for them.

I would err on the side of saying "No" to the government, for the simple reason that they have other means to get much of the information they are looking for: interrogation (read: water-boarding), etc. And for those of you afraid of "underground" organizations taking advantage of my position, they already exist, and there are more of them than any of us know about.

ch4 2016-02-27 21:36

[URL="https://bgr.com/2016/02/23/fbi-vs-apple-iphone-mdm-software/"]FBI battle over locked iPhone could have been avoided with a $4 piece of software[/URL]

[quote]. . .

“The county government that owned the iPhone in a high-profile legal battle between Apple Inc. and the Justice Department paid for but never installed a feature that would have allowed the FBI to easily and immediately unlock the phone…,” the report reads in part. “The service costs $4 per month per phone.”

. . .[/quote]I heard that if the Justice Department had just asked politely and quietly for Apple's assistance, Apple might've cooperated ... [I]just as it has 70 times in the past, without publicity[/I].

But noooo..., this time the government had to be heavy-handed, invoking an 18th-century law to try to force Apple to cooperate in a particular way that would make other devices' encryption vulnerable.

only_human 2016-02-28 00:34

[QUOTE=ch4;427624][URL="https://bgr.com/2016/02/23/fbi-vs-apple-iphone-mdm-software/"]FBI battle over locked iPhone could have been avoided with a $4 piece of software[/URL]

I heard that if the Justice Department had just asked politely and quietly for Apple's assistance, Apple might've cooperated ... [I]just as it has 70 times in the past, without publicity[/I].

But noooo..., this time the government had to be heavy-handed, invoking an 18th-century law to try to force Apple to cooperate in a particular way that would make other devices' encryption vulnerable.[/QUOTE]
But they [I]want[/I] to do it this way. They want to attack privacy provisions. That this is a high-profile case is exactly what they want, because if it were something smaller, people would be saying "WTF are you doing destroying rights over a two-bit case?"

ewmayer 2016-02-28 01:30

A view from Oz (as in down-under-stan, not emerald-city-stan) - my only quibble is with the "has gone" tense, and with the hero-worship w.r.t. Apple, which only recently decided to get out of bed with the NSA, probably less on principle than because it had suddenly become bad for business (OK, I admit that's two quibbles):

[url=https://medium.com/@jamesallworth/the-u-s-has-gone-f-ing-mad-52e525f76447#.wvrcvlb9k]The U.S. has Gone F&*%ing Mad[/url] — Medium

ch4 2016-02-28 03:48

Folks,

I just stumbled across an [I]ars technica[/I] article that made me realize that I hadn't been using enough imagination to envision what the government might have in mind.

[B]This is much, much worse than I had previously imagined.[/B]

[QUOTE=only_human;427642]< snip > They want to attack privacy provisions.[/QUOTE]

That's what I had been thinking, too ... until I read this [I]ars technica[/I] article.

No, the target is much bigger than mere privacy provisions.

[quote=only_human]But they [I]want[/I] to do it this way.[/quote]

Yes, but "this way" (as described in the [I]ars technica[/I] article) is much more sinister than what _I_ had previously envisioned.

_I_ had been thinking only in terms of the FBI's trying to force Apple to give them software that was a more general encryption-breaker, that could be used on a wider variety of devices than only the particular type of iPhone in this case.

But, as the article explains, what the government is after is something that doesn't merely decrypt, but renders all Apple software on any device vulnerable not just to have its encryption broken, but to allow the government to change [U]any software it wants to change in any way it wants to[/U].

How?

By subverting the software update capability.

[quote]That this is a high profile case is exactly what they want because if it was something smaller, people would be saying "WTF are you doing destroying rights over a two bit case?"[/quote]Exactly.

Very, very few of the general public will understand how dangerous this is.

Think: [B]Suppose the government could force Apple to give it a way to fool any software that had an update capability into accepting the government's own updates to that software.[/B]

That doesn't just let the government decrypt stuff, but [B]lets the government install [U]any software[/U] it wants[/B] onto a device with automatically-updatable software.

So why does it endanger NON-Apple devices?

Because if it can force Apple to allow this, [I]it can force any other company to allow it to subvert [U]that other company[/U]'s auto-updatable software[/I], via malicious updates.

That's why Apple cannot afford to give in. It's not just Apple's future at stake!!

- - -

This may have started to look a bit like mad ravings, so it's time for me to direct your attention to the explanatory article:

(Forgive me for quoting almost the entire article. [U]This is very important.[/U])

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

[URL]http://arstechnica.com/security/2016/02/most-software-already-has-a-golden-key-backdoor-its-called-auto-update/[/URL]

[B]Most software already has a “golden key” backdoor: the system update[/B]

Software updates are just another term for cryptographic single-points-of-failure.

[quote]. . .

... here is a sad joke that happens to describe the reality we presently live in:

[quote][INDENT]Q: What does almost every piece of software with an update mechanism, including every popular operating system, have in common?


A: Secure golden keys, cryptographic single-points-of-failure which can be used to enable total system compromise via targeted malicious software updates.
[/INDENT][/quote]I'll define those terms: By "malicious software update," I mean that someone tricks your computer into installing an inauthentic version of some software which causes your computer to do things you don't want it to do. A "targeted malicious software update" means that only the attacker's intended target(s) will receive the update, which greatly decreases the likelihood of anyone ever noticing it. To perform a targeted malicious software update, an attacker needs two things: (1) to be in a position to supply the update and (2) to be able to convince the victim's existing software that the malicious update is authentic. Finally, by "total system compromise" I mean that the attacker obtains all of the authority held by the program they're impersonating an update to. In the case of an operating system, this means that the attacker can subvert any application on that computer and obtain any encryption keys or other unencrypted data that the application has access to.


A backdoored encryption system which allows attackers to decrypt arbitrary data that their targets have encrypted is a significantly different kind of capability than a backdoor which allows attackers to run arbitrary software on their targets' computers. I think many informed people discussing [I]The Washington Post'[/I]s request for a "secure golden key" assumed they were talking about the former type of backdoor, though it isn't clear to me if the editorial's authors actually understand the difference.


From an attacker perspective, each capability has some advantages. The former allows for passively-collected encrypted communications and other surreptitiously obtained encrypted data to be decrypted. The latter can only be used when the necessary conditions exist for an active attack to be executed, but when those conditions exist it allows for much more than mere access to already-obtained-but-encrypted data. Any data on the device can be exfiltrated, including encryption keys and new data which can be collected from attached microphones, cameras, or other peripherals.


Many software projects have only begun attempting to verify the authenticity of their updates in recent years. But even among projects that have been trying to do it for decades, most still have single points of devastating failure.


This problem exists in almost every update system in wide use today. Even my favorite operating system, Debian, has this problem. If you use Debian or a Debian derivative like Ubuntu, you can see how many single points of failure you have in your update authenticity mechanism with this command:


sudo apt-key list | grep pub | wc -l


For the computer I'm writing this on, the answer is nine. When I run the apt-get update command, anyone with any one of those nine keys who is sitting between me and any of the webservers I retrieve updates from could send me malicious software and I will run it as root.


How did we get here? How did so many well-meaning people build so many fragile systems with so many obvious single points of failure?


I believe it was a combination of naivety and hubris.[/quote]


Sorta like Y2k ... but this is more sinister.


[quote]They probably thought they would be able to keep the keys safe against realistic attacks, and they didn't consider the possibility that their governments would actually compel them to use their keys to sign malicious updates.


Fortunately, there is some good news. The FBI is presently demonstrating that this was never a good assumption, which finally means that the people who have been saying for a long time that we need to remove these single points of failure can't be dismissed as unreasonably paranoid anymore.


I won't write much about the specifics of the FBI/Apple situation, because there are already plenty of [URL="https://arstechnica.com/series/apples-encryption-battle/"]in-depth accounts[/URL] of the many details of the case. The important thing to understand is that the FBI is demanding that Apple provide them with a signed software update which will disable an iPhone feature which deletes data after a certain number of failed attempts at guessing the PIN (which, along with a per-device secret, is the seed from which the encryption key is derived). On iPhones with relatively short PINs, this effectively "breaks" the encryption because a small key space can be quickly searched.


(On my Debian system, such a feature doesn't even exist. If someone has my encrypted hard drive, they can freely attempt to brute-force my disk passphrase—but hopefully most people's disk crypto passphrases on computers with keyboards are stronger than a short PIN. If an attacker can convince my computer to run arbitrary code while the disk is decrypted, the key can be exfiltrated and the strength of the passphrase becomes irrelevant.)



So when Apple says the FBI is trying to "force us to build a backdoor into our products," what they are really saying is that the FBI is trying to force them to use a backdoor which already exists in their products. (The fact that the FBI is also asking them to write new software is not as relevant, because they could pay somebody else to do that. The thing that Apple can provide which nobody else can is the signature.)


Is it reasonable to describe these single points of failure as backdoors? I think many people might argue that industry-standard systems for ensuring software update authenticity do not qualify as backdoors, perhaps because their existence is not secret or hidden in any way. But in the present Apple case where they are themselves using the word "backdoor," abusing their cryptographic single point of failure is precisely what the FBI is demanding.


Apple might prevail in their current conflict with the FBI, but the fact that they could also lose means they may have already lost to someone else. Imagine if some other murderous criminal organization wanted to access data on a PIN-encrypted iPhone. What if they, like the FBI has now done, found some people who understand how the technology works and figured out who needs to be coerced to make it possible? Having access to a "secure golden key" could be quite dangerous if sufficiently motivated people decide that they want access to it.


I'm optimistic that the demands the FBI is making to Apple will serve as a wakeup call to many of the people responsible for widely-used software distribution infrastructures. I expect that in the not-too-distant future, for many applications at least, attackers wishing to perform targeted malicious updates will be unable to do so without compromising a multitude of keys held by many people in many different legal jurisdictions. There are a number of promising projects which could help achieve that goal, including the DeDiS [URL="https://github.com/dedis/cothority"]Cothority[/URL] and the Docker project's [URL="https://github.com/docker/notary"]Notary[/URL].


Being free of single points of failure should be a basic requirement for any new software distribution mechanisms deployed today.[/quote]"Being free of single points of failure should be a basic requirement for any new software distribution mechanisms deployed today."
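The article's closing recommendation, requiring a quorum of independent keys rather than a single one, can be sketched in a few lines. This is a toy illustration only: HMACs stand in for real public-key signatures (real systems such as Notary or Cothority use asymmetric cryptography and a great deal more machinery), and all key names are made up:

```python
# Toy sketch of k-of-n update verification. An update is accepted only if
# at least `threshold` distinct keys have signed it, so compromising (or
# coercing the holder of) any single key is no longer enough.
import hashlib
import hmac

def sign(key: bytes, payload: bytes) -> bytes:
    """Stand-in 'signature': HMAC-SHA256 over the update payload."""
    return hmac.new(key, payload, hashlib.sha256).digest()

def verify_update(payload: bytes, signatures: list[bytes],
                  keys: list[bytes], threshold: int) -> bool:
    """Count how many distinct trusted keys vouch for this payload."""
    vouching = sum(
        1 for key in keys
        if any(hmac.compare_digest(sign(key, payload), s) for s in signatures)
    )
    return vouching >= threshold

# Hypothetical keys held by parties in different jurisdictions.
keys = [b"vendor-key", b"auditor-key", b"mirror-key"]
update = b"linux-image-5.10.0.deb"

two_sigs = [sign(keys[0], update), sign(keys[1], update)]
print(verify_update(update, two_sigs, keys, threshold=2))   # True: 2 of 3
print(verify_update(update, two_sigs[:1], keys, threshold=2))  # False: only 1
```

The design point is the one the article makes: with a 2-of-3 (or k-of-n) scheme, an FBI-style demand served on the vendor alone cannot produce an installable malicious update, because the vendor's signature by itself no longer satisfies the client.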



Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.