This is ridiculous. Prosecutors now have far more information left as breadcrumbs (GPS locations, tower pings, metadata, emails, social media posts, etc.) than they ever had before the rise of smartphones.
It used to be easy for people to make anonymous phone calls from phone booths - try making an anonymous call today.
Strong on-device encryption of data at-rest has the nice effect of making mobile device theft much less attractive.
Strong encryption of data in transit, including auto-deleting messages, can prevent lots of other crime as well, and protects vulnerable people such as victims of domestic violence who may have to show their phone to their abuser at any time.
This is really nicely put. It should be hammered home in response to pieces like this. Particularly the bit about people in abusive relationships, and also children of abusive parents. Vulnerable people need privacy much more than the rest of us. Even a shred.
The Austin bomber is a perfect example. They caught him because he brought his cell phone with him when dropping off the bombs. He didn't use it--it was just powered on and with him.
Imagine that happening before cell phones: criminals carrying a tracker 24x7.
The Unabomber sent mail bombs for 18 years. The Austin bomber gets caught in 3 weeks.
I used to work with the local Sheriff's Dept doing IT stuff. The detectives said they were amazed at how much data they could get from phones / cell providers. It's helped solve so many crimes like bomb threats, abuse cases, and even some murders.
They also said that if someone were smart, they would just leave their phone at home or somewhere else as an alibi and do whatever. Then the detectives would have to go back to the drawing board, which they admitted they haven't had to do in a long while.
If you're smart, not impulsive, and able to plan and consider other points of view, you tend not to be the murdering, bomb-threatening, abusing type. People who can lay plans and carefully follow through with them tend not to rob liquor stores or deal drugs. If you don't care about people, or about hurting them, but can plan and have good impulse control, there's a world of white-collar crime, politics, business, and law for you. In short, the police tend to deal with idiots, or people who have poor planning/impulse control.
In other words, the guy who plans out a careful alibi tends not to be the guy who needs one. The previous post mentioned the Unabomber, who was a rare example of someone who had excellent planning skills, high intelligence, and excellent impulse control. It took almost two decades to catch him, and only because his brother recognized his writing.
I wish I could find the article, but a few years ago I read a story that was very well written and went through in detail all the ways the cops have much more access to our private lives and data than they did even 20 years ago, and how the entire "going dark" excuse is really, more accurately, about going back to the level of information they had around the year 2000.
I used to work with some detectives who told me this exact thing.
The "little tracking devices" helped them solve a lot more crimes than they ever dreamed of, but they said that it also requires a lot more paperwork, desk work, warrants, etc to do anything now. So even though it's easier, they have to jump through a lot more hoops to get the data.
They also said that that work is 90% of their jobs now. The other 10% is "non-IT computer work."
It's a small local sheriff's dept (if that makes any difference).
I keep hearing phrases like "debate" or "the issue", but there is none. Mathematicians and computer scientists universally state the fact that crackable encryption is not encryption. This is totally understood. Across the aisle sits a group of prosecutors and legislators whose argument devolves to "but if we wish hard enough, maybe...?" and who insist on trying to plow ahead with a discussion as though it were something to be discussed.
There is no "debate", or "the issue". There are only those who understand reality and those who refuse to do so.
Matt Blaze said it best: 'When I hear "if we can put a man on the moon, we can do this," I'm hearing an analogy almost as if we are saying, "if we can put a man on the moon, well surely we can put a man on the Sun."'
That's a really good analogy for the encryption situation as it contains the sort of "fuzziness of definition" that loses the attention of 'the common folk'.
We could say that we put a man on the Sun, but by what measure would it still be a 'man' by the time it arrived, and upon what part of the fusion reaction would we land in order to define 'on'?
Framing it as "the encryption debate" instead of "the eavesdropping debate" is telling, in terms of the inherent bias in the article.
The free ride enjoyed by law enforcers in terms of solving crimes by following the money and listening to unguarded communications may be coming to an end, thanks to cryptography. But end-to-end encryption by default isn't going to do anything but raise the lowest-hanging fruit a bit higher. Stupid criminals will still be easily caught, because they cannot avoid doing stupid things. And no one has perfect op-sec. Everyone makes mistakes, and mistakes lead to convictions. The difference is that police resources will have to be prioritized and targeted to uncover those mistakes, rather than continually picking up cheap finds from a surveillance dragnet.
This is no longer a matter for debate, at least in the US. We already went over this with Clipper Chip, and rehashed it several times. Encryption is protected by the 1st Amendment. Back doors can be used by adversaries in addition to law enforcers. Law enforcers cannot be trusted to use their granted powers responsibly. Indeed, encryption has advanced this far in part because governments cannot be trusted.
Maybe it is, but the actual point of the 1st Amendment seems to me to be the ability to express ideas, opinions, etc. publicly and without fear of retaliation by the government, which is a much higher grade of freedom IMHO than merely allowing people to exchange those ideas and opinions in a form that no one but the recipient can access.
It would somehow feel more appropriate if encryption were protected by (an extension of) the Fourth and/or the Fifth.
Yes. More specifically, encrypted data cannot be decrypted without the key.
So your choices are:
1. Only the user has their key
2. Someone else has the user's key
If a company, law enforcement, or anybody else has a trove of everyone's keys, that trove will be extremely valuable to hackers, organized crime, domestic and foreign intelligence, etc, and it will be stolen. Whether we'll know it's been stolen is another question.
The debate typically continues along the government's right to compel the holder to reveal the key, or to reveal the decrypted message, as a loose analogue of the right of governments, where it exists, to compel production of physical writings or access to a strongbox.
If we accept that mechanisms in common use, like warrantful search of physical belongings under the Fourth Amendment of the US Constitution, are legitimate and rightful functions of government, then warrantful key disclosure is a logical compromise that protects the individual's privacy, except when the government compels them with good cause to reveal information. Balancing this with protections against self-incrimination is one complication that's currently playing out.
This may be nitpicky, but I don't think we should use the phrase "rights of the government." It implies that the government has some innate justification for those actions, and I think it leads to a slippery slope whereby we create rights of the collective. For example, if the government has the right to jail people, the jail is there to protect all of us. As a society, that gives us the right to beg for anyone to be thrown into jail "for the good of society." It also gives legitimacy to any government action: if they have a right to throw people in jail, you can't really complain about it when they do so even if they're clearly in the wrong.
My proposal for this, which I doubt either side would support, is what I call the "piggybank protocol". It has four key elements (rough sketch after the list):
- hardware key storage, iphone/TPM style
- mechanism for the manufacturer to authorise key release from a device
- important 1: this process should be irreversibly tamper-evident, such as IC "fuses" or one-time PROM. This should be permanently visible in the UI, so it cannot be applied invisibly.
- important 2: an obligation on the state to pay for replacement devices which have been compromised in the previous step but not used in court. This is to prevent the process from becoming routine.
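A minimal software sketch of the idea, purely for illustration: the names below (PiggybankDevice, release_key) are made up, and a real implementation would live in silicon (TPM / Secure Enclave style), not Python.

    import os, hmac, hashlib

    class PiggybankDevice:
        """Toy model: a hardware-held key behind an irreversible 'fuse'."""

        def __init__(self, manufacturer_secret: bytes):
            self._storage_key = os.urandom(32)  # normally never leaves the "hardware"
            self._fuse_blown = False            # one-time, tamper-evident state
            self._mfr_secret = manufacturer_secret

        def release_key(self, authorisation: bytes) -> bytes:
            """Manufacturer-authorised key release; permanently blows the fuse."""
            expected = hmac.new(self._mfr_secret, b"key-release", hashlib.sha256).digest()
            if not hmac.compare_digest(authorisation, expected):
                raise PermissionError("invalid manufacturer authorisation")
            self._fuse_blown = True             # irreversible by design (a real fuse or OTP bit)
            return self._storage_key

        def ui_banner(self) -> str:
            # "important 1": permanently visible in the UI, so access can't happen invisibly
            return "KEY RELEASED UNDER LAWFUL ORDER" if self._fuse_blown else "secure"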
How would tamper evidence help if the device is stolen?
If nobody but me has the device encryption key, and my device is stolen, I can be sure that nobody will have access to whatever data it contains. If anybody else has the device encryption key, however, and the device is stolen, I can no longer know whether anybody else had access to the data, no matter how much tamper evidence the device has.
In general, I agree. However, as a mild counterpoint, I present Dual_EC_DRBG, where the NSA inserted a private kleptographic backdoor into the standard via the conveniently provided P and Q points.
Brilliantly awful.
You realize, of course, that, of your two choices, the DOJ would probably choose the former (i.e. general insecurity).
First off, I agree that, in principle, a backdoor is a backdoor, and vulnerable is vulnerable, period. However, there's the A5/3 kind of weak, where everyone can hack it, and there's Dual_EC_DRBG, where the right secret is necessary.
What was interesting about Dual_EC_DRBG was that the default points served as sort of a secret master key to generator state leakage. It was harder than usual for another party to gain illicit access, leaving the vulnerability largely in the hands of the NSA.
It was some pretty interesting math, harder for unintended (by the NSA) people to take advantage of, and still, of course, a crippling vulnerability.
At this point, I'd be hard pressed to trust the NSA with anything security related, as they have demonstrated different priorities.
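For anyone who wants to see the shape of that trapdoor, here's a toy analogue in an ordinary multiplicative group instead of an elliptic curve, and without the output truncation the real standard uses; the constants are made up and this is emphatically not the real Dual_EC_DRBG, but it shows how whoever chose the relationship between P and Q can predict every future output after observing a single one.

    # Toy Dual_EC-style generator: the state update uses P, the output uses Q.
    p = 2**61 - 1          # a prime modulus (toy size)
    Q = 5                  # public parameter Q
    d = 123456789          # the designer's secret trapdoor
    P = pow(Q, d, p)       # public parameter P, secretly chosen so that P = Q^d

    def step(state):
        new_state = pow(P, state, p)     # advance the internal state
        output = pow(Q, new_state, p)    # emit "random" output
        return new_state, output

    # An honest user seeds the generator and produces outputs.
    state = 987654321
    state, out1 = step(state)
    state, out2 = step(state)

    # Anyone who knows d turns one observed output into the next internal state:
    # out1^d = Q^(state*d) = (Q^d)^state = P^state, which is exactly the next state.
    predicted_state = pow(out1, d, p)
    assert predicted_state == state               # the attacker now holds the state
    assert pow(Q, predicted_state, p) == out2     # ...and can predict all future output

The honest user sees nothing unusual; the weakness exists only for whoever generated P from Q.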
That’s an insecure system. All Eve needs to do is get hold of that third-party key. It’s like requiring all homes to be master keyed with a key held by the police - you need to find an officer willing to take a photo of the key for a few thousand dollars and now you can get in to any house.
Any plan that relies on NOBUS ("NObody But US") information - like a global private key - is a brittle design that will fail catastrophically when the keys become public (either Snowden-style, or accidentally disclosed like the TSA master keys[1]). Why should we ignore prior experience and assume such an important, linchpin secret will somehow stay known to "nobody but us" this time?
Google HTTP security relies on Google's TLS private key. Does that make it a brittle design? Keys are supposed to be protected. In this case the key could be assembled from a few subkeys, with different people holding control over them, so they would have to come to an agreement before deciphering anything (toy sketch below).
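A toy illustration of that split-key idea, using the simplest possible n-of-n XOR splitting; a real escrow design would more likely use Shamir's secret sharing so that only k of n custodians need to agree, and nothing here refers to any real product.

    import os
    from functools import reduce

    def xor_bytes(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    def split_key(key: bytes, n: int) -> list:
        """Split key into n shares; every single share is needed to rebuild it."""
        shares = [os.urandom(len(key)) for _ in range(n - 1)]
        shares.append(reduce(xor_bytes, shares, key))  # last share completes the XOR
        return shares

    def combine(shares: list) -> bytes:
        return reduce(xor_bytes, shares)

    master_key = os.urandom(32)
    shares = split_key(master_key, 3)      # e.g. held by three separate custodians
    assert combine(shares) == master_key   # only all three together recover the key

Any subset short of all three learns nothing about the key, which is the property being gestured at.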
Yes it relies on Google keeping their key (or keys - there's nothing wrong with having multiple valid certificates for a domain, especially if you're the registrar) private. Google keep their key private, Amazon keep their key, Stripe keep their key. If you want to attack any provider you have to recover that provider's key.
Now if you mandate everyone use a government key or escrow their key with the government there's now a single place to attack. Also this place is affected by government whims, staff pay freezes, cutbacks, layoffs. All you need is one disgruntled employee or contractor with access and you now have access to _everything_.
Even assuming this is possible, which it isn't, whose government would hold whose keys? It's not as if there were only one government, or as if international crime didn't exist.
What about multinationals? What about subsidiaries, like a Brazilian company which is fully owned by a USA company? Who would get the key, Brazil or the USA?
Or a contractor can take it home, against policy of course, and then it gets submitted to an online anti-virus scan.
Or it gets left on a USB key in a taxi
Or it gets left on an un-scrubbed laptop that's put up for government auction.
Or any of the other which-ways that humans get around policy limitations that don't have technical barriers, or are socially engineered into getting around the technical barriers.
One can only make keys so large before they stop being useful. My point is the same, still: one would only have to break/find/buy/steal a single key instead of millions.
Anyone can leak a private key. That's not a concern of the secure protocol itself. Of course there should be a means to migrate to a new key if that happens. Important keys are unlikely to be leaked.
The more important the key, the more sought after it will be by hackers, especially one which could theoretically decrypt an entire country's communications. It's only a matter of time.
If all we cared about was solving crimes, we could make US law enforcement much easier by doing away with a lot of things, such as certain amendments to the US Constitution.
But law enforcement is only one component of a functional society.
And the rest of us, from time to time, have to deal with the criminals who have yet to be caught by our acceptably-efficient law enforcement apparatus.
This includes people who like to steal information, many of whom are probably out of US law enforcement's reach.
The author doesn't address that problem, which I can understand, as he's a lawyer and former fed prosecutor. And while it's tempting to tell him to stay in his lane, we're stuck with him and others like him.
And it doesn't seem possible for the people (whom government doesn't really trust, despite platitudes uttered for generations) and the government (whom the people do not trust, for reasons I have a hard time impeaching) to agree on a system for safeguarding information.
Which is too bad, because most people would probably like to help enforce laws, especially those involving children and other vulnerable types.
BUT, we live in a world where even the most open and free societies do the following:
1) Use the police (more accurately, the threat of legal trouble) to silence, bankrupt, and intimidate problematic people.
2) Defeat or ignore the controls designed to prevent abuses of power.
3) Refuse to hold government institutions accountable for malfeasance.
It might help Dr. Rozenshtein's case if he would apply his gifts to solving the political problems listed above, which might restore the trust between government and governed, which in turn might get him the tools he thinks government should have for enforcing the law.
It does not matter if busted encryption would solve crimes. It would also solve some crimes if police went door to door and searched every house for any signs of law-breaking. The plaintive cries of 'but it would stop criminals, terrorists, whatever' are not anywhere close to being sufficient reasoning.
No, it's about enabling complete surveillance of everything.
For those who don't know, Lawfareblog has always been pro-mass surveillance. They only started doubting themselves a little when Trump won the election. But it seems they're back on the pro-mass-surveillance horse now.
> When we founded this site more than six years ago, I never in my wildest dreams imagined myself writing these words about a man who will take the oath of office as President of the United States. We began Lawfare on the assumption that the U.S. federal executive branch was a tool with which to confront national security threats. While I accepted that its manner of doing so might threaten other values—like civil liberties—or prove counterproductive in protecting national security goods, I never imagined I would confront the day when I ranked the President himself among the major threats to the security of the country.
Huh, no kidding? So a "bad" president could abuse the extensive powers that are given to him thoughtlessly to "fight crime and national security threats"? That's only the argument privacy activists have always made.
Lawfareblog seems to always work with the assumption that the government is always the "good guys" and therefore will never abuse the powers given to them. That is wrong.
“Solving crime” sounds great as a sound bite or headline. But it is a heavy handed government action. It almost directly translates to removal of free will and the choice to dissent.
Most of the comments here seem to be about the cons of removing encryption or backdooring encryption, but I don't think the author is suggesting that.
I think the article is only advocating for a checkbox in the settings menu to enable encryption for any would-be communication medium, which is turned off by default. I'm not especially against this, so long as there are no drawbacks to leaving it as an opt-in.
- Most non-technical users won't know of this setting, and thus be unprotected.
- The consequence of forgetting to enable this setting is severe (forget it even once, and you risk leaking a secret to eavesdroppers).
- When security is optional, having security enabled is suspicious. When everyone has security enabled all the time, an attacker can't know whether someone is sending secrets or cat pictures.
This is why we're currently seeing a push towards encrypted by default everywhere. Not because everything needs encryption, but because then there's no risk of forgetting.
Bullshit. We already know the FBI is acting in bad faith and doesn't care about actually solving crimes as much as it does about having access to all data. If this wasn't the case, they would have never demanded a backdoor from Apple when they had other alternatives (https://www.macrumors.com/2018/03/27/fbi-san-bernardino-ipho...). If any trust in the FBI, and consequently in all other law enforcement organizations, somehow still existed, it should no longer exist after these latest lies. The FBI isn't interested in solving crimes; it's interested in exploiting the secrets of its enemies for political reasons, as it did to Martin Luther King Jr. and many others, and in working towards goals that are not publicly declared. One would have to be completely ignorant of history to think that the FBI (and by extension all US law enforcement organizations) are benevolent institutions that only care about stopping crime. History tells us otherwise. Articles like this by former prosecutors are just more propaganda from the FBI's propaganda machine. Bullshit and more bullshit, pandering to the "law and order" types who have never studied any history and likely any other subjects as well.
Just to play devil's advocate: does democracy require privacy? It's not like the early 20th century, where if you were a part of the Communist Party you were suspect. We can pretty much say whatever we want (and we do) without fear unless we are actually criminals. I know the "if you have nothing to fear" argument doesn't hold up, but privacy is actually a relatively new thing in the human experience. And I'm not sure why democracy would fall apart without it.
But boy do I like privacy / hope it sticks around.
Alternative history, if perfect surveillance existed in 1950:
The first private conversation that Martin Luther King, Jr. had about the injustice of his time was captured and automatically analyzed, and as a result he was identified as a disturber of the peace and arrested for unrelated violations long before his message ever reached an audience.
Rosa Parks was detained for disorderly conduct long before she made her stand, and as a result her stand never happened.
There was never a zeitgeist of change in the oppressed community, because there were never any stories of people resisting. They were all prevented from acting in any significant way, or from getting together to share their stories and find support.
The issue of racial discrimination never made it to court because all of the people who could have brought it there were stopped before they could get started.
Perfect surveillance can stop any threat to the status quo while it's in its infancy, long before it amounts to anything. It can probably do it without introducing any new, draconian laws by simply looking through recent history for "questionable" actions and harassing people over them.
The argument that it's okay now basically amounts to saying that there is no longer any injustice and that today's system is a perfect example of virtue and fair play, and that anyone who disagrees is a malcontent who really does deserve to be silenced.
Sometimes the majority is wrong, and it takes brave people willing to break the rules in order to point it out and change things. I'd say that democracy does depend on privacy for this reason.
The argument I have against this is we now have near perfect surveillance and this isn't happening. I can be a part of a far left socialist movement and spread my ideas as easily as I can be part of this crazy new nazi movement. And both of those ideologies are rampant on the internet, but nobody is being arrested for it (at least not until they do something stupid). I understand in the past this wasn't the case, but as of now it's pretty difficult to stop ideas. And when/if you do stop them the internet history is right there for martyrs to be made.
I'd argue with little to no privacy the government actually has less power in situations like this. Because the lack of privacy extends to what the government is doing also. Cameras/microphones/eyes everywhere means it's much more difficult to get away with a lot of the covert insanity that was happening before.
I would say things like Snowden/Panama Papers/WikiLeaks/many other examples are showing that having privacy can actually benefit the people in power more than the average Joe. Joe Blow doesn't want his dildo collection known about; the government doesn't want (insert insane things like funding drug cartels/terrorist organizations) to be known. All in all I'd say sacrifice knowledge of your dildo collection for the greater good.
That said I do believe privacy still needs to be a thing. I hope it is a thing again in the future. But I actually don't know how horrible it is that it's gone away. People might just have to tell the truth for a while, which might benefit society in its current corrupt state?
Yes, it absolutely does. The most effective way to fight a political ideology is not to meet it fairly in the marketplace of ideas and fight it directly. If the opponent is allowed an opportunity to fight, you might lose. To guarantee a win, you have to stop the opposing political ideas before they organize into an ideology that people can support and fight for.
Political movements need time to mature. It's easy to disrupt proto-ideologies if you can identify the people that might bring together separate parts of the social graph before they knit together into a self-sustaining political movement. This is what the FBI did to disrupt civil rights groups under COINTELPRO[1]. This is what GCHQ's JTRIG[2] group is probably doing today.
While the Freedom of Assembly doesn't get as much discussion compared to speech/religion/etc, it's one of the most important rights. Assembling with people in public is how an idea moves into public view. Ideas you only discussed cautiously in private ("in the closet"?) are a lot easier to discuss openly in public when you realize other people have the same beliefs.
I guess technically democracy doesn't _require_ privacy, but a central tenet of most western democracies is the secret ballot. The government shouldn't know who anyone voted for. Losing privacy will allow the connections to be made even if the vote itself is secret. That can easily lead to voter intimidation etc., if not by the government then by third parties.
Follow your own argument. The government has spent its entire history, from 1801 (president John Adams) til today, persecuting people for suspicion of disagreement with the government.
I'm not sure I agree with what I'm about to say myself, but wanted to see how others feel....
The advent of the internet has caused us to question a lot of our rights as a people. We originally felt strongly about our right to free speech, and our social networks embodied that to its logical conclusion; now here we are, at a dire state of hate speech and vitriol that threatens the (perceived) stability of our democracy. While the printing press and television have contributed to our partisanship, it's almost undeniable that the internet - and the ability for anyone anywhere to publish anything - has pushed us over the edge.
There have been serious considerations about what we can do about this. One of the issues is that, as a people, we might not be ready to have all this power ourselves.
I'm wondering if that extends to encryption and privacy. Can we, as a people, be trusted with this level of autonomy, power, and security?
Yes, things are now so good for the average person that you can create mass hysteria over mean words. Not a satanic child murdering panic, not teenage violence, sex and substance abuse... just words.
Once you stop consuming the outrage crack pipe, it all goes away. It's all a big game of signaling and sophistry. Ask yourself why the average journalist is qualified to whip you into a frenzy on topics that should rightly require years of study to become authoritative on.
You can be trusted, you just gotta turn your brain back on. When there is a breakdown in the accepted mechanisms for spreading social consensus, you go back to primary sources. It shouldn't be a surprise: nature has told us that monocultures are eminently vulnerable. The solution is always diversity, the actual kind, not the ideological puritanism that has appropriated that concept for themselves.
>> Can we, as a people, be trusted with this level of autonomy, power, and security?
It's a philosophical question worth considering for fun, but it has little practical value. There is no way to stop people from using mathematics to protect their privacy and their data, and any debate around that comes from people who are innumerate.
We can discuss the should/could stuff as a matter of interest to waste taxpayers' dollars and time, I suppose, but that's not what lawmakers generally try to discuss. They want to figure out how to do the impossible, or at least lie about it.
It's not impossible; you are confusing math with opsec in reality.
Stop it. Pretending it's possible to encrypt effectively is going to give the government an excuse to apply rubber hose decryption under cover of deniability.
tl;dr Common criminals go with default security settings.
This is about to get even worse for law enforcement. To date, iCloud backup still offers a way for law enforcement to access data with a warrant. However that is about to change. Apple is about to roll out iMessages on iCloud, which sounds innocuous, but actually will premiere a major step forward in security: end-to-end encrypted data in the cloud by default.
This is huuuuuge.
The thing that has been holding back E2EE as the default is that it has sacrificed recoverability. That is, it has required a strong password, which is easy to forget, and if you forget it, no one can help you. So it's something you had to opt into. Not a default.
Now, Apple has created a backup solution that -- get this -- removes the need for a strong password. Sounds crazy, right? All you need to remember is your iPhone passcode. Obviously, this is brute forceable and can't secure your cloud data...
Except yes it can. They've implemented hardware security modules that prevent brute forcing. Then they destroyed the signing key for the HSM firmware. Neat.[1]
So, iMessages are going to transition to this, which probably means they won't be in the iCloud backup, and thus not available to law enforcement. And then it's probably just a matter of time until the whole backup is secured with E2EE.
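For anyone curious what "HSMs that prevent brute forcing" means in the abstract, here's a toy model of an attempt-limited passcode escrow. Everything here is a simplified assumption (the class and method names are invented, and the real design involves dedicated hardware, an escrow protocol, and the destroyed firmware signing key mentioned above), but it shows why a weak passcode can still protect a strong key when the guess counter lives somewhere an attacker cannot reset.

    import os, hmac, hashlib

    class EscrowHSM:
        """Toy escrow: a 32-byte backup key wrapped under a passcode, with a hard attempt limit."""
        MAX_ATTEMPTS = 10

        def __init__(self):
            self._records = {}  # user_id -> [salt, wrapped_secret, verifier, attempts]

        def enroll(self, user_id: str, passcode: str, secret: bytes):
            salt = os.urandom(16)
            kek = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)
            wrapped = bytes(a ^ b for a, b in zip(secret, kek))            # toy wrap
            verifier = hmac.new(kek, b"verify", hashlib.sha256).digest()
            self._records[user_id] = [salt, wrapped, verifier, 0]

        def recover(self, user_id: str, passcode: str) -> bytes:
            rec = self._records.get(user_id)
            if rec is None:
                raise KeyError("record wiped or never enrolled")
            salt, wrapped, verifier, _ = rec
            kek = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)
            check = hmac.new(kek, b"verify", hashlib.sha256).digest()
            if not hmac.compare_digest(check, verifier):
                rec[3] += 1
                if rec[3] >= self.MAX_ATTEMPTS:
                    del self._records[user_id]  # wipe the record: brute forcing is over
                raise PermissionError("wrong passcode")
            return bytes(a ^ b for a, b in zip(wrapped, kek))

    hsm = EscrowHSM()
    hsm.enroll("alice", "1234", os.urandom(32))   # escrow a 32-byte backup key

The limit only means something if the counter and the wipe can't be bypassed, which is why destroying the ability to re-flash the HSM firmware matters.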
I'll believe it when I see it. One simple privacy-focused solution Apple could have implemented years ago is keeping iMessages E2EE, instead of backing them all up to the cloud automatically while not allowing users to disable backups for iMessages unless they disable the whole iCloud backup (which is enabled by default).
> One simple privacy-focused solution Apple could have implemented since many years ago is keeping iMessages E2EE, instead of backing them all up to the cloud automatically
Which means they would not be recoverable or even restorable to a new iPhone. Not a good default for a consumer device.
Apple made the right choice, letting you opt into stronger security at the sacrifice of recoverability. And now they're working on the best of both worlds.