The broader problem is that the government, courts, police and everyone else who is otherwise in charge of enforcing rights in this country does not know the first thing about technology.
We have dealt with this problem plenty of times in the past by setting up specialized bureaus and giving them broad authority to regulate players in the space and deal with infractions (the SEC is a great example of this), but there is no such tech authority. Imagine a world where the smartest programming minds aren't just joining Google but also the agency set up to regulate Google. Imagine if in a post-scandal hearing you don't have politicians who can't figure out an iPhone ask the questions but someone who knows how to set up a network firewall and what 2FA means. Until that happens, just passing more and more laws is going to be meaningless.
You might be right, from a US-centric perspective. I'll respond with a European perspective. I feel like the new laws and the first wave of huge fines are causing a cultural shift amongst smaller companies and the wider public. People starting new businesses know that they can't base their success on the ability to track people and sell the data anymore. It'll take time for the culture to change - there's a lot of people making a lot of money from invading your privacy, after all. And they'll fight back, so it's not a foregone conclusion which side will come out on top. But I'm cautiously optimistic about our side of the pond.
But this is business as usual: after the incumbents secure their dragon hoard doing unethical things, they pull the ladder up behind them by supporting legislation that makes anyone smaller than them need to think twice about doing the exact same thing they did (thus preventing competition).
Unless the same rules get enforced on the big players, it's just a moat around the status quo. In Europe's defense, they have been fining the big players, but on the other hand at this rate the fines are starting to feel more like a kickback than actual punitive measures intended to change behavior.
It's now the game-over state for personal and family privacy. All the sites and apps and even Casio watches have telemetry always enabled (via mobile apps). Even cars. No one in governments or agencies cares about privacy anymore. Call it whatever one wants, corruption or lobbying; the name does not matter.
The possible solutions are out of reach for anyone who isn't a software engineer (a Pi-hole and custom DNS on the home Wi-Fi, a personal OpenVPN gateway configured on every 4G/5G-connected device). Younger generations don't care. Older ones just don't understand.
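For what it's worth, the core of the Pi-hole approach is just a DNS sinkhole: answer queries for known tracker domains with a dead address so the telemetry never leaves the network. A minimal sketch of the matching logic (the blocklist domains below are illustrative examples, not a real list):

```python
# Minimal sketch of DNS-sinkhole matching, the idea behind Pi-hole.
# The blocklist entries are illustrative examples, not a real list.
BLOCKLIST = {"telemetry.example.com", "ads.example.net"}

def resolve(domain: str) -> str:
    """Return a dead address for blocked domains (and their subdomains),
    otherwise signal that the query should be forwarded upstream."""
    parts = domain.lower().rstrip(".").split(".")
    # Check the domain and every parent domain against the blocklist,
    # so "x.telemetry.example.com" is blocked too.
    for i in range(len(parts) - 1):
        if ".".join(parts[i:]) in BLOCKLIST:
            return "0.0.0.0"   # sinkhole: client gets a dead address
    return "FORWARD"           # pass to the real upstream resolver

print(resolve("telemetry.example.com"))   # 0.0.0.0
print(resolve("x.ads.example.net"))       # 0.0.0.0
print(resolve("news.ycombinator.com"))    # FORWARD
```

A real deployment wraps this in an actual DNS server and pulls community-maintained blocklists, but the filtering decision itself is this simple.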
It's the dark age of privacy. Literally no hope.
I wonder what the consequences will be in the next 5-10 years, when the collected data gets analyzed by AI and sold to anyone at an ever-cheaper price. What would have to happen for people to push this unprivacy back hard enough to stop or reverse it?
It’s not just the big companies. The tiny company I work at doesn’t even think about collecting data, because everyone knows an inspection will look into it and impose fines if it’s not properly stored. Afaik, if we were to collect data, that would require advice from GDPR lawyers and other resources. So GDPR doesn’t achieve the entire goal of killing data collection; it’s just a deterrent, as I understand it.
Meta was fined over a billion euros last year alone under GDPR. Maybe one can disagree about that being large, but it's still a relatively new law showing a trend.
Yes, I disagree that fining a company with nearly $117 billion in revenue a billion euro is “large.” That’s “cost of doing business” money, not any kind of punishment or deterrent.
This "cost of doing business" meme is becoming annoying. Collectively, Facebook, WhatsApp and Meta platforms have been fined in 2022 for $687 million by a single country alone, Ireland[1]. $117 billion was the global revenue, the revenue generated in Europe was $25.8 billion[2], so the fines amount to 2.6% of what they make by accessing the European markets. Also their net income was $23.2 billion, globally, so the fines from a single country made 3% of that.
Do you still think it's a negligible cost? These costs are also going up: in January 2023 Ireland issued two more fines for $0.4 billion.
Ireland's data protection authority serves as Meta's main regulator in the European Union because the company's European headquarters are in Dublin. Making it out as if Ireland's GDPR fines are "just one country" is disingenuous.
Then why did you put numbers in your first post if you don't find them relevant? You find local fine/global revenue somehow relevant but you don't find local fine/local profit or even local fine/local revenue relevant? Could you explain why?
Go to https://www.enforcementtracker.com/ and select, say, Italy. There are plenty of multi millions € fines issued to national companies: ISPs, electric utilities, food delivery services and even city administrations.
I'm old enough to have seen the internet evolve in the UK, and as usual the US dominates it and screws it up for the whole planet.
I'm surrounded by USAF personnel and pro-American Brits (employed by the US military), and their extremist mentality grinds you down.
So I really do hope the EU starts putting the US and UK in their place, and failing that, Russia or even China, because there are some very toxic people running these countries, getting away with murder.
Maybe I’m a cynic, but I believe there is too much conflict of interest here to gain any footing. The major government agencies want companies collecting any and all information possible on people. The agencies can then buy that data without needing any warrants or additional oversight. Find some useful data through the back channel? Get the warrant and go through the hoops to publicly request the data needed to bring to the courts.
Actually, finance is quite a similar shitshow, but people feel safe if they think it is regulated. The SEC employs some intelligent people, but the smartest always go where the real money is. The end result is that the SEC heavily polices small funds and investors but gives a wrist slap to the big players (who caused almost every mess in finance history), because the SEC doesn't have the resources to fight them.
Resources exist to fight them. It’s just a choice not to.
Elizabeth Holmes was indicted 5 years ago. She was sentenced last year. She’s still walking around freely.
A normal person would’ve spent at least 4 of those years locked up, if not all of them. They most certainly wouldn’t be out and about after being convicted. These people simply have different rules applied to them.
She received an 11-year sentence last November but she's apparently a free woman until April 27th of this year.
Can someone who knows more about the American legal system explain this to me? Why is this possible? I assumed that when you're sentenced you get taken to jail pretty much immediately.
(Yes, I'm sure the short answer is just "because she has lots of money", but what are the details? What exactly did she spend it on to buy an extra 5 months of freedom?)
A long time (~20 years) ago I used to work at a large bank in The Netherlands. I was told there were some software engineers who stole a lot of money, but it was never reported, because the bank didn't want to get negative publicity. Instead, the bank made a deal with these ex-engineers and they'd keep their mouths shut.
> The broader problem is that the government, courts, police and everyone else who is otherwise in charge of enforcing rights in this country does not know the first thing about technology.
Politicians might not, but the government apparatus does. There's a reason 3GPP/ITU for years have built backdoors in telecommunication networks. It is simply that the IETF folks building atop Packet Switched Networks are putting up brave resistance: https://archive.is/iawrM
> The broader problem is that the government, courts, police and everyone else who is otherwise in charge of enforcing rights in this country does not know the first thing about technology
In the USA, doesn't know? Or doesn't want to know? Things like the Patriot Act are more effective (?) when there's an endless buffet of personal data available without the need for a wiretap, court order, etc. Probable cause? Much easier with phone metadata and CCTV footage. Etc.
There's too little incentive for the gov to get it right. The current situation is no accident.
It’s worse than that. As shown by the Fourteen Eyes alliance, our governments themselves spy on us. The Fourteen Eyes alliance overrides the GDPR, which makes it seem more like protectionist legislation than anything else.
> Imagine a world where the smartest programming minds aren't just joining Google but also the agency set up to regulate Google.
So you'd get a revolving door between Google and this supposed agency regulating Google where engineers and policy makers would work at the agency, be friendly to Google and then switch jobs into a cushy job at Google with a nice pay bump as a thank you for years served and services rendered.
HIPAA seems to prove that, although flawed, the threat of fines provides an incentive for most entities handling health data to take it seriously. Twenty years ago your health data was not protected and was accessible, unanonymized, to any grad student who wanted to run a study, or any Hollywood reporter who wanted a scoop on why actor Y or actress Z was hospitalized.
I don't necessarily disagree that we could use some regulation in this space, but do you really think engineers that would have joined Google would choose a government agency instead? How many people that would have become investment bankers at Goldman go to the SEC?
There are many engineers (I'm one myself) who wouldn't even consider working for Google. It very definitely is not the case that all other options are second-bests chosen by someone who failed to get into Google.
> The broader problem is that the government, courts, police and everyone else who is otherwise in charge of enforcing rights in this country does not know the first thing about technology.
It’s beyond that. It’s micromanaging. Regulation cannot and should not be able to keep up.
Based on previous behaviour I've got zero faith that any regulations will stop these bastards from taking the piss. Fines are just a cost of doing business and they're well versed in playing the legal system, dragging out proceedings for indefinite periods of time if necessary. The only way around it is for individuals to take privacy into their own hands. I'm talking adblockers, network level filtration, VPNs and burner credit cards. If you're not already having "the talk" with your family every year (especially with those less technically literate) you really should.
Most people don’t care. They don’t care if Amazon, Facebook, and Google know what they are doing online. They opt in to tracking by joining loyalty programs to save a few bucks. Privacy is just not a concern most people have.
>They don’t care if Amazon, Facebook, and Google know what they are doing online. They opt in to tracking by joining loyalty programs to save a few bucks. Privacy is just not a concern most people have.
I disagree. Most people have no idea what's going on behind the curtain.
Not based on scientific evidence, but we can infer this from several facts:
- The massive user bases of social media platforms. If most people cared, or were even aware of what's going on, they would choose not to consent to give up their personal data in exchange for using the service. Yet clearly they value the service over their privacy.
- Blind acceptance of cookie consent forms. Even with all the publicity around cookies and tracking on the web, if you've ever shoulder surfed a non-technical person, you'd notice that they blindly dismiss any consent forms by accepting the terms. There's a reason why most websites use dark design patterns on these: they work.
- Anecdata: if you've ever tried educating a social media user about privacy concerns, you'd be familiar with several dismissals: "I just log in occasionally for <minor use case>", the popular "I have nothing to hide", and flat out "I don't care". I've heard these from non-technical people and from educated, smart, technically literate people alike. After many years of making an effort here, I've yet to convince a single person to change their habits, let alone abandon these services, and I'm still perceived as a radical technophobe.
So most people seem to "care", but apparently not enough to change their usage habits, or they feel powerless and lack the technical skills to protect themselves. So it could be a matter of education after all, but _most_ of the people I've talked to about this stop me well before we get to the point of discussing what they can do about it.
They "opt in" the same way they "agree" to 30 pages of terms and conditions. The concept of "consent" used here is totally unrealistic because it is based on perfect knowledge and infinite time on the side of the consumer, or alternatively, the idea that a consumer must invest lots of money for an expert just to understand what they consent to.
Regulations (and consumer protection groups) are actually the right way to deal with this because it reduces the redundantly wasted time for dealing with such nonsense.
Your rights are likely not what you think, even if they seem cut and dried. They are determined by the courts. Then you would also need someone to enforce them when they are violated. But oftentimes nobody cares.
Do you actually have those rights if they aren't enforced?
It's not that nobody cares, it's that the people with the power to both pass regulation and enforce that regulation aren't affected when they don't use their power to help the people (even if the majority of their constituents/voters support it).
While you could put this all on the evil politicians, it could also be linked to two things: first, the culture of the country (countries) as a whole, in that voters aren't willing to give up voting for someone that doesn't protect them from harm in their daily life (or otherwise overthrow such politicians), and second, because there are a limited number of politicians they can vote for, meaning that both/all of the candidates on the ballot can be disinterested in solving the problems at hand. The only real solutions I see to this are things like ranked choice voting, or more likely, more laws and regulations being represented by having voters vote on those laws separately from their candidates.
Given this is all intrinsically linked to politics, the only real solution us consumers can adopt is aligning ourselves with products and services (or people) that solve the problems we care about via either logistical or technological means. For example, we can't trust Facebook not to track us, so we enlist browsers and extensions to block ads. Importantly, we can't trust the government to enforce "if you hack a computer you go to jail", so we enlist multi-billion (or trillion) dollar companies to keep us safe, since we know their incentives (money) align with our need for safe and useful products and services that prevent bad things from happening in the first place.
Even leaving out politics you still have issue with people not caring.
Look at judges. I've personally witnessed judges and magistrates saying things that aren't true. Like thinking you're calling them prejudiced when asking to dismiss with prejudice. Or explaining that a trial de novo is a "complete do-over" and then turning around and saying they won't hear a motion because there's no record of it happening at the prior trial. I've even had a lawyer say that civil rights were violated but that the judges don't care and will see your case as a nuisance unless there was serious bodily injury or severe financial impact.
The politicians passed these laws. The people think they have these rights. The judges don't care. So the minority affected realize that they don't have those rights while the rest live in ignorant bliss.
I kinda believed in things like ranked choice voting before but lately it seems like these just create more situations where people don't trust the outcome of elections.
It seems the average person today is just looking for any excuse to complain. Someone can propose that ranked choice voting is flawed (some implementations might legitimately be) and a drove of people will latch onto that without even understanding it.
One of the main problems is that the party basically tells politicians how to vote on a bill. Very few break with party lines. We're seeing a relative lot of that deviation on the Republican side, although maybe not in a good way. The main problem in this area is that you essentially end up with 2 diametrically opposed positions with no in between. Even the stuff pitched as a middle ground is usually just step 1 in a side's plan and doesn't legitimately address/protect the other side.
The worst thing is one-party rule in places like Russia, China, the UK and New York. Any of those places would be better off with two extreme parties that fight like cats and dogs.
Considering all the tech companies are conveniently based in superpowers who will throw a temper tantrum if anyone messes with their golden goose we're all fucked.
Anyone wants to start a trillion dollar/rmb trade war over cookies?
> Do you actually have those rights if they aren't enforced?
Not functionally, no.
It's worth noting that one reason banks take AML regs so seriously is that the fines are so large, and regulators check compliance so actively, that you don't want to get caught falling short.
Opting out largely does not work unless there are devastating consequences for not complying. It also does not work even with those consequences, because government power can sneak in and force data collection even when the companies don't want it. The only way to ensure privacy is with adversarial clients, where false data, indistinguishable from real data, is provided whenever possible to pollute what's collected. This makes the data less valuable, and also protects the individual from being tracked.
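The data-pollution idea can be sketched in a few lines: generate synthetic events whose fields have the same shape and statistical texture as real ones, so an observer can't cheaply filter them out. All field names, categories, and distributions here are hypothetical illustrations, not any real tracker's schema:

```python
import random
import time

# Sketch of "data pollution": emit plausible-looking but fake events
# alongside real traffic so collected profiles become untrustworthy.
# All field names and category lists are hypothetical examples.
SITES = ["news", "sports", "cooking", "finance", "travel", "gaming"]

def fake_pageview(rng: random.Random) -> dict:
    """Build one synthetic pageview event with realistic-looking fields."""
    return {
        "category": rng.choice(SITES),
        # Log-normal dwell times are skewed the way real ones tend to be.
        "dwell_seconds": round(rng.lognormvariate(3.0, 1.0), 1),
        # Some moment within the last 24 hours.
        "timestamp": int(time.time()) - rng.randrange(86400),
    }

def pollution_batch(n: int, seed: int = 0) -> list[dict]:
    rng = random.Random(seed)
    return [fake_pageview(rng) for _ in range(n)]

batch = pollution_batch(5)
```

The hard part in practice isn't generating the noise but making it indistinguishable from a real session (timing, ordering, fingerprint consistency), which is why tools in this vein run inside a real browser.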
I would definitely believe that some parties in the advertising ecosystem are ignoring opt outs, but I'm not convinced that this methodology demonstrates that. The problem is, I'm not seeing anywhere where they describe setting up a history for these simulated users. For example, if they had said that they first navigated the user to a site about mattresses, cars, or some other highly marketable category, then I would expect to see much lower bids when opt outs were respected. But if they aren't doing anything like that then it isn't surprising that an opt-out fails to decrease bids, because there's nothing to be targeting on.
Their framework is quite close, however, to something that really could do a good job here. Compare these cases:
* Visit some sites that do not indicate commercial interest, then compare bidding behavior opt-in and opt out. That's what they did, and you shouldn't see much of a difference.
* Visit some sites that indicate specific commercial interest, then compare bidding behavior opt-in and opt out. You should see higher bids and ads that are related to that commercial interest for the opt-in category. If you don't, something went wrong. If you see those same higher bids in the opt-out case, then consent is not being respected.
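The comparison in the second case boils down to a simple statistic: after seeding the same commercial-interest history for both cohorts, compare observed bid levels with and without the opt-out signal. A sketch with made-up illustrative bid values (not real measurements):

```python
import statistics

# Sketch of the proposed methodology: compare observed ad bids for
# simulated users who first built a commercial-interest history
# (e.g. browsing mattress sites), with and without an opt-out signal.
# The bid values below are made-up illustrative data, not measurements.
bids_opt_in  = [2.4, 3.1, 2.8, 3.5, 2.9]   # history present, no opt-out
bids_opt_out = [0.6, 0.9, 0.7, 0.8, 0.5]   # same history, opt-out sent

ratio = statistics.mean(bids_opt_in) / statistics.mean(bids_opt_out)
# If the opt-out is respected, targeted bids should collapse toward the
# untargeted baseline; a ratio near 1.0 would suggest consent is ignored.
print(f"opt-in/opt-out bid ratio: {ratio:.2f}")
```

A real study would also want untargeted-baseline cohorts and enough sessions per cohort to separate targeting effects from ordinary bid variance.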
(I used to work in an adjacent area; speaking only for myself)
It's problematic. Over time, companies that abuse opt-outs and consent preferences likely will need to be punished via a combination of regulatory fines, the marketplace taking business from them, and media shaming. We're deploying tools and technology to help audit and hold accountable such entities but I expect it's a long road ahead.
(co-founder of DeleteMe here)
Ban unsolicited marketing entirely and watch as suddenly no one cares enough to pay money for this data. That's how you get rid of it. Put violators in prison.
Bonus: you'll be able to answer phone calls again.
There are some movements in the US to get federal privacy legislation (this is getting more likely now that 4+ states have local legislation). While a lot of the press coverage is about what types of data are regulated, the inside-baseball chatter is much more focused on the how of enforcement. In particular: is there a private right of action? Or in layman's terms: Can I, just some citizen, sue over private violations? This will impact what's actually "allowed" far more than what the law says.
The actual impact of GDPR in Europe is hamstrung by the enforcement mechanisms. All enforcement happens via regulators or government agencies, just like how most CCPA enforcement in California must be undertaken by the Attorney General. Private citizens can lodge a complaint, but cannot actually force action. Despite their increased mandate, most agencies did not receive additional funding post-GDPR and effectively act as a bottleneck to enforcement actions.
(It's even worse because American companies are HQd in Ireland for tax haven purposes, so they get regulated by the Irish agency, which is strategically underfunded so as not to scare away the revenue streams.)
Only severe punishment is enough to protect our privacy. The assembly line perverts who run the tools of corporate surveillance must be personally corrected, in the harshest manner possible.
Maybe it's time to fight back, and write programs that flood the data collectors with data that looks legit, but is in fact bs.
Sooner or later their customers will notice that the data they buy doesn't help sales, and they'll stop paying for it.
Regulation is most effective when used to prevent the widespread commercial exploitation of personal data. Such exploitation involves multiple parties. Conspiracies are more fragile.
Regulating the widespread abuse of personal data for commercial gain is basically a prerequisite to other privacy improvements. Once you get such abuse under control, then other privacy invasive activity becomes more obvious.
No, and regulations are some kind of a problem because they give people the false sense of security. People should not use things they don't trust. Yes proving something is false(/does not happen) is hard, but that's one more reason why critical services should be selected carefully, obviously if one cares deeply about things like privacy and security.
This faux-shock type of comment is infuriating and tired beyond belief.
Pop-ups and cookie banners have absolutely nothing to do with the regulations. At all. They're simply a mechanism to collect consent for the data that they do store, and they're intentionally obnoxious in order to push you towards accepting immediately rather than dealing with the faff.
Uh, ok. So the result of the regulation is a bunch of obnoxious as you can be stuff that’s not effective (per the service). If the regulation didn’t exist, neither would the obnoxious banners that indicate compliance but in fact don’t mean compliance (according to the article). What is truly shocking is people say “it would work if only companies just followed the spirit of the regulation instead of the minimum required to avoid sanction” - sure, it’s true and we also wouldn’t need prisons if people would just follow the spirit of the law.
ePrivacy is a stupid regulation, no matter how well intentioned. It’s ended up polluting everything with obnoxious garbage that everyone (except, I’m sure, the random HN privacy nerd) clicks through blindly, and that isn’t really adhered to anyway, even for the HN privacy nerd who spends more time reading popups about privacy than the actual content they were after.