> You almost always know how the formers’ work will be broken.
The thing is that there are enough people who blindly trust ChatGPT's answers. They don't know in which ways those answers could be broken, and they don't have the knowledge to verify them, because they are asking about things they themselves know very little about.
But that doesn't need new peripherals: I could do that on my home WLAN if they'd just install standard software for it on the phone (which you can fix yourself by installing it from F-Droid etc.).
You can decompress the streams with mutool and edit the PostScript-like content-stream code in any editor, if you want. (That doesn't help a lot with editing text, of course...)
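A minimal sketch, assuming mutool (it ships with MuPDF) is installed; the filenames are placeholders:

```shell
# Rewrite the PDF with all streams decompressed (-d), turning the
# PostScript-like content streams into editable text
mutool clean -d input.pdf editable.pdf

# ...edit editable.pdf in any text editor...

# Recompress (deflate) the streams when you're done (-z)
mutool clean -z editable.pdf output.pdf
```

Text is stored as positioned show-text operators, which is why moving words around by hand is still painful even once the streams are readable.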
Same with programming: You just copy some old code and modify it, if you have something lying around.
Same with frameworks (Angular, Spring Boot, ...). The tools even come with templates to generate new boilerplate for people who don't have existing ones somewhere.
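For instance, the scaffolding step looks roughly like this (project, component, and dependency names are made up, and both commands fetch their templates over the network):

```shell
# Angular: generate a fresh project, then a component, from built-in templates
npx @angular/cli new my-app
cd my-app && npx ng generate component widget

# Spring Boot: download a pre-wired project skeleton from the Spring Initializr
curl https://start.spring.io/starter.zip \
    -d dependencies=web,data-jpa -d type=maven-project \
    -o demo.zip && unzip demo.zip -d demo
```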
Don't forget that other manufacturers collect the same data, they just didn't have a collection of blunders that allowed access. No malicious intent this time, but it clearly highlights that companies need to learn to put policies in place to make double sure PII is actually protected to the degree the law already requires. We also need laws that force companies to make opting out of this kind of data collection much easier, or to make it opt-in in the first place.
I may be hallucinating, but IIUC there is a philosophical loophole that makes tracking a car through factory-installed means a total wild west: because you don't know who owns or drives the car (it's sold to dealerships, resold to households, and then used and maintained by anyone), the data technically isn't tied to an identifiable person, only to whatever the car appears to be doing. Is that (still) correct?
More importantly, also give people a way to check and use that data. Is my employee using the car personally (a violation of the laws allowing me to deduct the car)? Did my employee really take the correct route? Is my wife cheating on me? Did my kids really go to the library, or were they racing across town?
The above is all I can come up with. For many people none of that applies, and so the data can only be used against us.
> ... need to learn to put policies in place to make double sure blablal is actually tralala.
no. they already have too many policies. and this and that. adding one more is how we got there. (of course they have policies for making sure PII is kept safe.)
You can use git subtree to convert between mono-repo and separate repos, without losing your history. You can even keep both styles up to date concurrently.
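A runnable sketch of the split direction, done in a throwaway repo (all the names are made up):

```shell
set -e

# Build a tiny mono-repo with a library living under lib/foo
repo=$(mktemp -d)
cd "$repo" && git init -q
git config user.email demo@example.com
git config user.name demo
mkdir -p lib/foo
echo 'hello' > lib/foo/readme.txt
git add . && git commit -qm 'add lib/foo'

# Rewrite lib/foo's commits into a standalone history on branch foo-only;
# the library's files sit at the root of that branch
git subtree split --prefix=lib/foo -b foo-only

git show foo-only:readme.txt   # prints "hello"
```

Pushing `foo-only` anywhere makes it a separate repo. The reverse direction is `git subtree add --prefix=lib/foo <url> <branch>`, and `git subtree pull`/`git subtree push` with the same `--prefix` keep the two styles in sync afterwards.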