> How do software developers understand which parts of their software are being used and whether they are performing as expected?
By asking nicely. Not by siphoning data out of my system.
Where is this world going? Do I really have to treat every program as spyware and restrict its internet access? And anyway, why would a compiler need internet access?
Asking nicely doesn't work in practice because the people willing to answer are not a randomly sampled group and are usually highly biased. You can look at how polling works and why it fails.
The problem is always how to get a random sample, and AFAIK there is no good way to do that short of sampling the installed base directly (a.k.a. telemetry). You can argue that developers shouldn't care that much about how their software is being used and whether it is performing as expected, but asking nicely really is not a solution.
APPENDED: PLEASE NOTE I'm saying asking nicely is not a solution for getting statistically correct data. You can argue that data accuracy is not important in this context.
LLVM, GCC, Rust, Zig, D, etc. don't have telemetry and they seem to be doing fine. What makes Go special? They aren't trying to be more efficient than Rust. It isn't more widely used than GCC. I would argue they don't need this telemetry at all.
FWIW, there is a proposal to add telemetry to LLVM [0] and Rust used to have telemetry [1], both off by default. Some things in the node.js world have telemetry enabled by default, like Next.js [3].
Some people are posting here as if this is already decided -- AFAICT, that's not the case. It's not even a formal proposal yet, and the stated intent was to start a conversation around something concrete. For context, this is standard for how the Go project approaches large topics -- for example, I think there were something like ~8 very detailed generics design drafts from the core Go team over ~10 years.
It sounds like the Go team is going to take some time to look into some of the alternative approaches suggested in the feedback collected so far.
In any event, this is obviously a topic people are very passionate about, especially opt-in vs. opt-out, but I guess I would suggest not giving up hope quite yet.
When did I say it isn't useful? I said they didn't need it. They also don't need to install a kernel module that records every keystroke and memory access, but it would certainly be useful data.
Have you asked core developers of those languages how they feel about this? I'd bet that you'd have near unanimous agreement that this class of information would be useful for informing support decisions.
Where I'd expect disagreement is precisely over the question of whether it's worth the cost & trust issues. That makes me wonder whether there might be some middle ground here where some third party like the Linux Foundation could run a telemetry service which is highly public in both its design and collected data, and whether that would be enough that many people would be comfortable with this kind of service without Google's business reputation entering into the equation.
Hence the second paragraph in the comment you're replying to. There's no question that this could be implemented legally but the ethical considerations are legitimate and warrant extensive discussion for an open source community. I think that the Go proposal has been derailed by a lot of hot takes based on scenarios far in excess of what's actually described, but I also think it's quite likely that many languages' core developers would look at that and agree that they don't want that information badly enough to spend time trying to negotiate those hazards.
What if your plumber installed a camera in your bathroom for informed support decisions? Sure would be useful for him to see where you put those "biodegradable" baby wipes after the deed.
Do you really think that's anything like the scenario being proposed?
Here's what I think would be a closer analogy to what's actually being proposed: the water company installs a smart meter allowing them to read usage data more frequently. They randomly sample 2% of customers' data and prepare a histogram showing the top daily usage hours and total usage over a monthly period.
The water company already has a connection to you (the pipe) and the smart meter measures that connection. The smart meter does not collect information about how you use that water inside your house. This analogy would match a website collecting anonymous usage statistics for connections, but would not match an otherwise offline program making connections to collect data.
I always wondered how much intelligence they gather from the Go playground.
Probably the data is too wild to make any use of, but then again, all the folks who have a clue use the playground as a kind of proof-of-concept play area... that would be great for analysis.
Anyways, I'm not sure how well it would work, regardless of whether it should be a thing at all. The majority of stuff I do, and the folks at work, we all have build pipelines that control internet access, use cache systems, etc., so I'm not sure it even matters.
I was actually wondering how much this would affect many build pipelines — if you're doing your builds in a container, the 7 day threshold would never even be reached unless you enabled caching for that file.
The main thing I think this would do is help them flesh out the non-mainstream configurations. For example, I've used Go on AIX, but I'd bet bug reports on that platform catch basically every maintainer by surprise, since statistically nobody tests on it regularly or even has access to do so. I'd think most of the value of systems like this would be finding out that the change they tested exhaustively on everything Google normally uses is also going to break 100% of some niche community.
I wonder if there's some trend where corporate-backed, large "open source/source available" projects are more open to telemetry compared to more "open community" projects.
For non-corporate projects, the general thought process might be it's fine to leave efficiency gains and improvements on the table if it means violating some foundational principles. While in a business setting, those efficiency gains could be very tempting since it can translate to more money, market share or promotions and that way of thinking gets applied to the open source projects as well.
I think your analysis is correct, and would also add that corporate projects probably have both the culture of monitoring (they use it for everything else) and the expectation that they have to hit deadlines, so they prioritize being able to ship something with the confidence that it either won't break anyone or they'll be able to know & react quickly if it does. Open source didn't use to have that same focus, since it was rarely someone's job and there was more understanding that time was in short supply (“if it breaks, let me know or send a patch”).
It probably also helps that there's easier access to infrastructure for all of this: servers and sysadmins. Ignoring everything else, that's much harder for your more classic volunteer-effort open source project: servers cost money, and maintaining them costs time (usually better spent on actually writing code and such).
> GCC is more used than Go and GCC does not have telemetry so Go doesn't need it is completely unrelated.
I mentioned that because an argument could be made that they're the most popular, or they're trying to be the most efficient language and that's why they might "need" this telemetry data.
> Most dev working on core tools want telemetry
Yet somehow, almost all open source, popular, efficient tools don't have telemetry. Like the ones I listed + python + Linux + coreutils + bash.
Of course they want this. But there are mountains of successful projects that work very well without it, and every one serves as evidence that this isn't as important as they let on. We shouldn't subject our personal computers to Google tracking because they want more tracking data.
In case anyone wants to mention that it's Google's language and they can do what they want with it: this is a perfect example of why people aren't happy with Google's control of the language.
Which Java distribution? Maybe the Oracle JDK/JRE? I highly doubt the Oracle JDK/JRE has serious usage numbers because of its (quite commercial) license.
Well, too bad. Because if you don't, you've built a piece of malware, by definition. You need to ask, live with the bias, and draw your conclusions accordingly.
I'm confident that you're not jumping from "asking nicely is not a solution" to "using force or deception is fine."
It's just that, if the argument in my last paragraph did become widely accepted, it would lead to a pretty brutal society. (And, of course, it is becoming pretty widely accepted where software is concerned.)
While true in general, it is really not that necessary to have a representative sample for checking whether your software works. With large enough sample size a convenience sample will find almost all of the bugs that telemetry would have found.
The bigger question is why do they suddenly need this to such a degree that they need to do a backhand tactic to get the data? Most other programming languages have no such data and have survived and thrived for years, decades even before Go existed.
It's neither sudden nor backhanded — if either of those characterizations were accurate, we'd have learned about it from something shipping. Announcing it publicly before implementing anything and having a number of privacy protections built-in suggests neither is a fair description.
Reading through the use-cases at https://research.swtch.com/telemetry-uses one thing to consider is that these are the kinds of things people care about as a project matures — once you have a large number of people depending on your project, you have to worry more about unintended consequences as you evolve and since the user community has grown larger and the project more stable it's also harder to know whether the people you hear from regularly are representative. When a project is small or mostly used by enthusiasts you can be more confident that your understanding of what people are doing is relatively accurate but that just isn't true over time. For example, how many millions of Java developers never contact Oracle or the OpenJDK developers throughout their entire careers?
The same argument could be made by many other entities, including government bureaucrats and LEOs. Justifying data collection with "it makes my job easier" is a very slippery slope IMO.
My code is on GitHub as is a large number of other peoples. They can try to compile it themselves with all the profiling and probes they want, on their own machines.
That's not the problem they're talking about. It would be useful in some cases (and I'd be surprised if they haven't already considered experiments along those lines), but what you have on GitHub doesn't easily tell them about your build or runtime environments.
Here's a long list of questions Russ Cox was interested in - notice how many of them would really want something like “what is your CI configuration?”, and consider that while they could sample GitHub Actions configuration that doesn't tell them anything about the much more varied set of projects who don't use that service:
That's a very forced, contrived, un-impressive list. It reads like someone told them "hey, we want telemetry, please make up a list of uses that will sound plausible to programmers to justify it. Be sure to use big words like histogram a lot to shut down objections by sounding real smart."
Most of these things are in the category of bug prioritization. Why don't you... you know... talk to users?
If you want some statistics (oh sorry, histograms), maybe pull down the gigabytes and gigabytes of Go code residing on public repositories and places like GitHub and analyze all that. That'll tell you a ton about how people structure their projects, what architectures they target, etc.
Biased sampling only matters if the bias has some relevance to what you’re investigating. There is zero reason for software telemetry to care about privacy bias.
This may be by design, as it punishes those who dare to value their privacy.
eg, some options:
a) No telemetry. All users of the Go language get equal treatment/consideration by the developers ("best guess").
b) With telemetry. The users who opt out are thereby excluded from consideration ("punished").
It seems like over time, option b) would lead to problems for the Go language as the developers base their decisions on incomplete information, whilst believing that information is somehow more complete than option a).
I'm not entirely sure how biased the data would be; there needs to be a correlation between "frobs with telemetry" (whether it's opt-in or opt-out doesn't even matter) and any other metric. I'm not so sure there is one, although perhaps there could be – we need telemetry to demonstrate that :-)
Questions about bias in reported data are legitimate and a genuine topic of concern, but this:
> This may be by design, as it punishes those who dare to value their privacy.
> Questions about bias in reported data are legitimate and a genuine topic of concern
The problem is an obvious one but doesn't seem to be considered. :(
> is just veering off into the conspiratorial.
Punishing people who value their privacy seems like it'd be strategically useful to Google, an ad company with demonstrably intelligent but sometimes ethically challenged people. ;)
You seem to have a problem understanding the concept of consent.
Allow me to help you. You ask first. If the answer is no, you don't do it. If you cannot master this skill, or refuse to, you are embarking on becoming one of the most problematic demographics on Earth.
There is no excuse for not asking, and honoring the answer. Sending data is not free, and I guarantee your telemetry is not that important, neither is your vaunted sampling. All you want is to know who your Users are, and the fact is, that is fundamentally gated behind them wanting to even bother telling you.
The world was already there, your bubble just burst. In recent decades tech companies and their programmers have done everything they can to collect data about users. Are you now surprised that the same companies apply the same practices for technical users of their software?
The solution, at least for me, is to remove that software from my life. Personally I vowed a long time ago to never work for a company whose business is to track people. It's no big deal for me, but I understand at least some may have hard decisions to make.
Being able to monitor the system is one of the big reasons to move to web based applications (another being e.g. ease of deployment).
If monitoring of the application isn't possible (and by possible I mean statistically reliable, so opt-out, not opt-in), then that will drive the shift to centralized/web-based applications even faster.
Ironically, those who are most wary of telemetry tend to be the same people who also appreciate being able to run local software over web based.
The irony I mean is that the tracking will be there regardless; the suppliers of software will use "we can't monitor the app sufficiently at end users unless we centralize it" as the argument for not making client-side software at all.
I write some paid local-only software, and I have no trouble not tracking users or usage. For example, here's the entire privacy policy for one of them:
> App does not collect any data.
> App works completely on-device. Messages do not leave your iPhone and are not shared with other apps.
> The essence of this privacy policy will not change.
--
For another piece of software, the privacy policy includes details such as:
> App never uses the Internet, except to check for updates.
> Automatic update checks can be turned off in the application's preferences. Requests to check for and to download updates happen only through the domain domain.example. The updates server does not store or forward potentially identifying information, such as request IP addresses or user agent strings.
> License keys, obtained upon purchasing App, do not contain personal information, such as names or email addresses of license holders. License keys are validated locally; App does not use the Internet to validate license keys.
I mean that as a group, people who crusade against monitoring of client software on end users' machines, will in the long run be accelerating the shift towards more software being web based.
If I'm deciding between a desktop and a web client for my software, monitoring is a concern. If I suspect I'll either a) not get enough opt-ins for monitoring or b) end up in a publicity shitstorm if I use opt-out, then I'll just opt to run the software on my server instead.
The amount of monitoring and tracking that is done on server-side software is obviously going to be a LOT more than what "anonymous usage data" would be for the client version. And that's a bit ironic (that it might have the opposite effect).
Obviously the scenario isn't about this software specifically. A compiler isn't going to be "web based" tomorrow. It was more a comment about the software landscape and the privacy debate in general.
Microsoft has been pumping out proprietary crapware for decades, while those of us who cared about freedom and security have continued to ignore it, despite it even being pop culture at times.
The lack of basic ethics on this topic is a great example of what people mean when they assert that software engineering isn't real engineering. Software should represent the interests of its prospective users, period. Inserting anti-features that directly go against the interest of the user to benefit the author is a violation of trust, and is blatantly unethical.
But sure, keep convincing yourself that if users don't like you violating them a bit, it's just naturally pragmatic to abuse them even worse. Either way whatever you create can't be considered trustworthy.
As for the original topic, the real problem is that it would be a rug pull - classic Google. If this were a new language built with surveillance in the compiler, nobody would use it. But if the maintainers decide to add in hostile features after the language has gained wide adoption, it will take a lot of churn to sort out, fork, sandbox, etc.
OpenSnitch won't help if the dependencies you pull in come through the same server as where the tool uploads the telemetry to. Go works with a Google proxy server for this purpose by default.
You can disable the Go proxy and go for a more direct source fetch with an env variable, but in the same way you can also disable the telemetry.
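For reference, both knobs are just toolchain settings; a minimal sketch (the telemetry toggle here is an assumption — at the time of the proposal no command had shipped, and the final spelling may differ):

```shell
# Bypass the Google-run module proxy and fetch modules directly
# from their origin servers:
go env -w GOPROXY=direct

# Hypothetical opt-out switch for the proposed telemetry; the
# proposal discussed a toggle along these lines, but the actual
# command name was not settled:
go telemetry off
```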
Yeah, I think the only way around that would be for OpenSnitch to block all http/https requests and have a system-wide proxy that sees all the URLs, asks the user, does the approved requests and returns the results.
> Do i really have to treat every program as spyware and restrict its internet access ?
My feeling about Google is that they do not respect privacy, no matter the product. Invasion of privacy and tracking their customers are behaviors deeply ingrained in Google DNA.
Doing so for all other products has been incredibly profitable, so why not Go? It’s not like Go the language has a history of respecting the PL community (the resolution of Google stepping on the Go name was Google throwing their weight around and told the Go! author to pound sand), so why should anyone expect they respect their users?
> Do i really have to treat every program as spyware and restrict its internet access ?
If you're using an application from Google or Microsoft or any big tech company for that matter: yes.
> And anyway, why would a compiler need internet access ?
The go build tool will fetch dependencies for you. If you don't use dependencies or use a local network proxy instead of the Google one you can safely disable WAN access, but otherwise you'll end up with a broken tool.
That's true. However, because of the design of the Go tooling, you still call the same command-line tool to get it to compile your code.
This leads to the compiler having access to the internet and possibly exercising its access. In a perfect world of separated concerns the compiler binary wouldn't have any networking code at all, necessitating a go-get before calling go-build.
I understand why the Golang team decided to go with this approach, but it does have side effects that people coming from different tooling (say, C) wouldn't expect.
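To approximate that C-style separation of concerns with today's tooling, one approach is to split the network step from the compile step; a sketch, assuming a module-based project:

```shell
# Step 1: fetch all dependencies into the local module cache.
# This is the only step that needs network access.
go mod download

# Step 2: compile with network fetches disabled. GOPROXY=off makes
# the toolchain fail loudly instead of reaching out if anything is
# missing from the cache.
GOPROXY=off go build ./...
```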
Sounds like a great reason to adopt Nix or the like! Run each tool in a sandbox appropriate to what it's supposed to do, and store the entirety of the inputs and outputs so they can be audited if the need arises.
Nix got there via reproducibility, but it's exactly what's needed to mitigate compilers with surveillance features or other backdoors.
It doesn't. It compiles without internet access. If it has internet access it can also send telemetry data. If the compiler refused to work in all cases where it couldn't send telemetry data, I'd see the point...
Somehow we managed to create very high-quality software before all this telemetry got larded all over every single thing.
Languages like Go have enormous test code bases with unit tests to ensure that every corner of the language is compiled and executes correctly. We don't need gobs of telemetry to phone home about how people are using a compiler.
There's no guarantee that anyone is actually using the edge cases some tests cover, and there's never 100% coverage.
The tests are the map, the actual use of the language is the terrain. The map is assembled largely by blindly guessing what the terrain is likely to look like.
That isn't very scientific.
The result is a situation where users will scream at the language authors if they break something, but they'll refuse to share any information about what they're actually doing with the language, which is perverse.
Absolute nonsense. Go is an unbelievably simple straightforward language, and any mature language always has exhaustive test suites to exercise implementations of it.
There are numerous compilers for C++, one of the hairiest languages around, that were developed with no telemetry and work just fine and cover the entire language spec. Some of these compilers, like G++ and Clang++, manage to target many different architectures with very few issues.
The obsession with telemetry is a mix of laziness and I strongly suspect user-hostile surveillance motives.
We're just going to have to lock down OSes. Giving any application carte blanche network access is like when apps had open access to all of system RAM and the whole filesystem back in the MS-DOS / Windows 3.x days.
Maybe unauthorized (by the user) telemetry should be like invalid memory access and trigger something akin to SIGSEGV. That'll send a message.
Paying how? People tick a box saying "If you tick this, we'll report how many cache misses your compilation passes did, and in return you'll get a gift card after 10k compilations"?
Why give away a gift card when they can use the data collected from your computer via the compiler to sell the opportunity to sell you a targeted gift card?
Not sure if serious, we are talking about data collection with zero chance of any practical targeted use.
The discussion would be a wholly different one if the data could be used for anything to e.g. sell, target ads or even deduce how/whether an individual has even used the application.
You sounded like you meant it could be used to target individuals, and not just draw statistical conclusions about the software. Those are quite different things. Every company that does "anonymous usage statistics" (which is basically the norm these days at least for larger apps) obviously have some use for the data or as you say they wouldn't be paying for the functionality.
or 'tick this box or you're not allowed to access X, Y and Z parts of the tooling".
There's a big difference from functionality not existing to do this, and functionality existing that can change at any time. rsc's attitude about this comes with an air of "I want to alter the deal. Pray I don't alter it further"
> 'tick this box or you're not allowed to access X, Y and Z parts of the tooling'
That would be asshat-y but as long as no PII is transmitted, it would stop short of being a legal issue at least.
This is the thing though I think: people don't like to give up control (in this case, about what their software does). It's not about privacy, it's about control.
>By asking nicely. Not by siphoning data out of my system.
THIS !!!
I suspect like many people, I feel that if they had said "We're going to introduce this ... but don't worry, it will be opt-in", then they would have avoided 99.99999999999999% of the negativity.
Instead, by making it opt-out, they are basically doing a classic Google shit move.
And that's before we start getting into questions about GDPR and opt-out.
Yeah fwiw I am strongly opposed to the proposal, and Russ Cox acknowledged that they considered many of the posts against telemetry or for opt in only as valid discussion, and gave examples of some of the inappropriate comments they referred to as being in "bad faith". I appreciated his clarity there in answering about what he considered bad faith posts.
There were definitely comments that got quite heated and probably fell outside good discourse practices, largely because people are very passionate about privacy and also tend to get very upset when they feel betrayed.
> There were definitely comments that got quite heated and probably fell outside good discourse practices, largely because people are very passionate about privacy and also tend to get very upset when they feel betrayed.
We're supposed to be adults, not 13 year olds. I'm not perfect either and a jerk at times – most people are – and that's okay, but I'm not making excuses for it. Have some standards.
Most of the hidden comments were hugely disrespectful and provided no argument or anything of value. If you start comparing opt-out telemetry to rape then you're simply a jerk (yes, really: https://i.imgur.com/eLVs9ev.png). I don't care about "yeah but people have strong feelings". I'm tired of excuses for jerkery; I find this enabling of it even more egregious than the jerkery itself. I have strong feelings about a lot of things, but I also manage to keep them in check most of the time on account of being an adult with some standards. Nowhere is this as easy as in online discussions, because you can literally step away from the computer at any time and come back to it an hour or day later.
Personally, I found the link between sexual assault and tracking to be expected: they're the two scenarios in which consent even gets mentioned or discussed in daily life. When it comes to sex, it's critical and mentioned all the time in media and internet posts; in tracking, it's in the forms of "I consent" buttons on every other badly designed web page. In how many other places will you find a consent form? Medical procedures, perhaps?
Obviously, Google's data hunger is worlds less harmful than sexual assault, but I think the difference in how consent is treated is actually relevant for the discussion.
When it comes to sex, it has taken many decades (or even centuries) of hard work, but at this point society is finally starting to get the message across: you don't have consent if it's not given explicitly. Society is still in the process of getting everyone on the same page here (sadly) but progress is being made.
While compiler data collection isn't as important, the idea that consent means "you haven't tried saying no enough" is completely absurd. This is why there is no flattering comparison for opt-out here: every scenario in which a normal person deals with consent will make the opt-out suggestion seem horrible. Most actions happening to a person without consent are or should be illegal, with perhaps lawful arrests and the judicial system being the exception.
If we, as a society, really do value explicit consent, we shouldn't mix and match what is and isn't consent based on what serves us. Consent is important in software too, even if some companies or developers don't see it that way.
Google devs especially should know this. People I know have felt a real sense of violation when they found out how intensely Google tracks its users. I've seen people get creeped and grossed out by their phones because they were asked to review a restaurant they were near, realising that Google knew exactly where they were and how long, and that the tracking had gone on for much longer before that. I, myself, felt a little violated when I found out how much data Microsoft's dotnet tool has been sending out after getting "consent" during an automated install and first use, despite knowing the data can't track me down as a person.
The impact may be incomparable, but developers seem to severely underestimate the emotional impact their blatant disregard for other people's privacy can really have just because they don't feel the same way. Perhaps another way the comparison is more apt than it would seem at first glance.
People can choose to use Go or they can choose not to use Go. They can choose to disable telemetry if they do use Go. Nothing is forced upon anyone. Rape is. You do not choose to not get raped. The entire premise of rape is that any choice is taken away.
You can write long philosophical stories about it and none of that matters anyway. The accusation is pretty clear, and it should be pretty obvious to everyone that going off on a rant about "this is as dirty as rape!" is unconstructive and inappropriate.
Furthermore, there are countless things that happen to everyone every day without their explicit consent, and consent is far less valued than you think it is; it would be unworkable if we required it everywhere. Did I consent to air pollution? Noise pollution? Billboards next to the road? The laws of my society? Having a passport of the country where I happened to have been born? You replying to my comment? People emailing me? In practice consent is often either implied and assumed, or doesn't operate at the individual level (e.g. it operates via voting).
You trust your distro to make choices that align with your preferences. Personally I've always preferred distros that change upstream as little as possible, but other distros make other choices on this, which is also entirely reasonable.
Assuming this proposal is implemented as stated – which I don't expect it will – then I'd expect some distros to disable it by default. Go already provided the tools to set these kind of things globally in a recent release (1.20 or 1.19 IIRC) with a global go.env file.
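For illustration, the two levels of configuration look something like this (the `GOTELEMETRY` key name is hypothetical — at proposal time no spelling had been fixed):

```shell
# Per-user settings live in the file named by $(go env GOENV),
# typically ~/.config/go/env on Linux; `go env -w` writes there:
go env -w GOPRIVATE=*.corp.example.com   # example value, placeholder domain

# A distro wanting telemetry off by default could instead ship a
# line in the global $GOROOT/go.env file, e.g.:
#   GOTELEMETRY=off    # hypothetical key; the final name may differ
```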
I don't think the accusation "this is as dirty as rape" is being made at all. It's easy to overreact when someone brings up a topic like sexual assault, because nobody in their right mind likes to be associated with such horrendous behaviour. However, the framework society has set up around consent is still useful.
Consent is implied (or assumed) in most day-to-day events. However, privacy law in the EU has brought the topic of consent onto the internet and onto computers, whether you want it or not. That's because digital privacy and all manner of things it accompanies have been regulated, because the industry failed to implement the necessary safeguards by itself.
You may feel like such telemetry doesn't need explicit consent, like you or me commenting on posts here, and that's a valid opinion to have. Others viscerally disagree.
I personally think digital consent is a field of computer ethics that has been ignored for way too long. I'm very happy with the GDPR and only wish government would start enforcing privacy laws more often. You're free to disagree with me and I can understand your position if you do.
For those that do consider software tracking something that requires consent, there is a real emotional impact to seeing your privacy be violated by software and devices you otherwise trust. Ignoring that because you disagree with the underlying notion can only lead to misunderstanding and exaggerated flame wars. It'll also generate distrust and friction within your community.
To all of the points you have listed where consent is usually implied or assumed, there are groups that disagree. Air and noise pollution are regulated for a good reason; laws around those are often vague because these topics are extremely situational. I do actually think most billboards should be banned from public spaces, for both ethical and safety reasons. There are large groups that consider the laws of the places they were born in to be unjust, are angry at being subject to them, and claim (usually on a misinterpreted or made-up legal basis) that they and their peers never agreed to any of those rules. There are even laws in place about who is allowed to email you, because you never expressed a desire to receive ads from random companies, though enforcement is usually completely absent.
For what it's worth, I 100% believe that the author behind this proposal is well-intentioned. The proposed methodology has been carefully constructed to respect privacy as much as possible. I also completely understand the wish for opt-out as most people simply won't choose to act on requests for opting in. However, a project like Go isn't run by a single person, and the company that does run it cannot be trusted when it comes to privacy concerns.
> Air and noise pollution are regulated for a good reason
Regulated, yes, but that doesn't mean "I consented to [...]". Even a single person driving a car creates air and noise pollution, and none of the people who involuntarily come into contact with that car consented to it, much less explicitly. This applies to almost all motorized forms of transportation to at least some degree.
But that doesn't mean these are unreasonable things to do, or that explicit consent needs to be given. The basic rule for public life is "do what you will, unless you unreasonably inconvenience others". Although there is wide disagreement on what exactly "unreasonably inconvenience others" means, almost everyone across the ideological spectrum agrees on that basic principle. Generally speaking, if you don't unreasonably inconvenience others, then it's accepted that consent doesn't need to be given. "Unreasonable" is key here, because many things can (and do) inconvenience others, but that alone doesn't make them unreasonable, because we all have to live our lives.
> You may feel like such telemetry doesn't need explicit consent
I never actually said that; I mostly wanted to comment on "I think consent is far more complex than what you're saying".
I don't actually know what I think of this. Mostly I'd like to move towards a future where "consent" is de-emphasized because if something is considered bad by the overwhelming majority of the population no one should be doing that at all in the first place.
Once the invasive tracking stops (i.e. becomes regulated) there probably isn't much need to worry about fairly innocent telemetry as this mostly seems to be a matter of trust ("I don't know what you will do with this data").
I don't really know what that would concretely look like in practice (e.g. what the exact laws should look like, how to enforce it, etc.)
If a ruling class or inner circle has already decided on a course of action, you rarely change anything with good discourse practices.
CoCs are always protecting the inner circle and their opinions. In the early days of open source, flame wars were considered a legitimate way for voicing opinions. Now, in corporate controlled "open" source, the dominance hierarchy must be protected.
None of that is true in the absolute terms you used. Flame wars also drove away potential contributors and even if you weren’t personally involved they could sap your desire to participate or set negative reputations for an entire community.
Similarly, codes of conduct are set by many groups as a way to be welcoming, letting prospective participants know that they’re not going to get flamed or hit on for trying to participate. Yes, some corporations might use moderation for ulterior reasons but that happens anyway – it’s like arguing that we shouldn’t have cars because people will use them to commit crime.
> If a ruling class or inner circle has already decided on a course of action, you rarely change anything with good discourse practices.
Quite true. Especially in the workplace, where RFC/design meetings are only held to validate what has already been decided.
> CoCs are always protecting the inner circle and their opinions
CoCs are, 99% of the time, not doing that. Of course people can abuse them, but I don't see it in practice most of the time. It's just that the abuse/drama gets more attention.
> In the early days of open source, flame wars were considered a legitimate way for voicing opinions
Flame wars have always been discouraged. As soon as one starts, mods usually step in and shut it down.
> Now, in corporate controlled "open" source, the dominance hierarchy must be protected.
Yes, very few corporate-backed open source projects are truly open to collaboration, especially when it changes the feature set. If you want to submit bug reports, help users, or send a PR for a tiny bug, that's all good.
In their defense, the employees of those companies do the most work so they get to decide. But they do benefit enormously from having the code open (something they very often forget).
Every Hundred Flowers Campaign is followed by an Anti-Rightist Campaign.
Chairman Mao instituted the Hundred Flowers Campaign purportedly as a way to solicit community participation in building China's new communist society. The CCP assured everybody, hey, we're totally open to criticism, feel free to be open and honest, we could use your ideas for how to improve.
The reality was made clear in the Anti-Rightist Campaign, in which critics of Mao's plan or regime were put on a list of those to be arrested as "rightists" and imprisoned or killed.
I mean, some folks in the thread were absolutely arguing in bad faith.
A lot of people just go ahead with the "telemetry=bad" message + critique of telemetry in general without even taking a look at the actual proposal[0] (which has some important points regarding collection frequency, data shared, and architecture). Not saying there should be no critique, but blanket statements that often criticize aspects that are irrelevant here don't help.
They're really trying to approach this in a way that, to me, is as reasonable and transparent as effectively possible (yes, I'm on the side of "opt-in telemetry doesn't work because of the majority of people who don't care either way; won't turn off if prompted, won't turn on either", so being opt-in is in fact unreasonable to me).
The Go toolchain already in a sense "calls home" by default to the Go module proxy hosted by Google.
Anyway, it's open source, they're the maintainers, it's up to them, while if you want, you can fork it. Telemetry is immensely useful to understand what features or technical improvements are worth prioritizing to benefit the actual average user the most, and not just the vocal minority. Telemetry being opt-in breaks this by once again prioritizing the vocal minority that turns it on.
> I mean, some folks in the thread were absolutely arguing in bad faith. A lot of people just go ahead with the "telemetry=bad" message + critique of telemetry in general without even taking a look at the actual proposal
It is still telemetry, other platforms' toolchains don't have it, and it's not bad faith to reject it.
> Anyway, it's open source, they're the maintainers, it's up to them, while if you want, you can fork it.
I've heard plenty of back and forth about adding telemetry to the Rust compiler because of hard to spot bugs.
LLVM, GCC, etc. mostly depend on the crash reporter from distributions, and I know several Clang devs who have been thinking about adding some additional telemetry.
No I won't mention names here because I do not want the innocent to be harassed because I posted on fucking hacker news.
> The Go toolchain already calls home by default to the Go module proxy hosted by Google.
I don't think it's fair to describe the Go proxy as "calls home"; no one says that about Rubygems, PyPI, npm, or any other package tool. If anything, the Go proxy is the only package system that actually allows disabling it quite easily and pain-free. Try using Ruby without rubygems or React without npm: no doubt possible, but it will take a lot more effort.
> I don't think it's fair to describe the Go proxy as "calls home"; no one says that about Rubygems, PyPI, npm, or any other package tool.
The Go equivalent of `gem`, `pip`, `npm`, `apt`, `cargo`, etc, is `go get`. There's no issue with `go get` connecting to internet.
On the other hand, `go build`, `ruby`, `python`, `node`, `gcc`, `rustc`, have no business doing anything that requires the network (obviously `node` etc are interpreters, so the user's program could have networking code, but that's the user's code, and not the interpreter's code).
So, `go get` using Google's proxy by default, fine, this is Google's tooling, we can just avoid using `go get`.
But the problem is that this proxy sneakily creeps into the `go build` command as well, so you can't just "not use `go get`" because you have to set flags for the compiler as well.
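For anyone in that position, a sketch of the knobs involved (these are real `go` environment variables; whether you want to flip them is a separate question):

```shell
# One-off: fetch modules directly from their origin repos, skipping
# both the Google-run module proxy and the checksum database.
GOPROXY=direct GOSUMDB=off go build ./...

# Or persist the choice for all future invocations:
go env -w GOPROXY=direct GOSUMDB=off
```

Note that `GOSUMDB=off` also disables checksum verification entirely; for excluding only specific module paths, the `GOPRIVATE` pattern list is the more targeted tool.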
You're going to have to run "go get" anyway, or "gem", "pip", etc. I don't really see what concrete difference it makes, except that the current approach is *much* more convenient.
Well, it allows you to track which packages are used how much and gather ip addresses of your users.
IP addresses being the only "personal" data the new telemetry proposal would plausibly be able to gather as well, which a lot of folks are complaining about.
Just to be clear, I don't think the Go module proxy is a bad thing, just saying that we've already crossed that "threshold" of potential personal data collection long ago.
> Well, it allows you to track which packages are used how much and gather ip addresses of your users.
So does Rubygems, PyPI, npm, or any other package tool.
If you want to have a broader discussion about package tools: be my guest. But people misrepresenting and banging on about the Go proxy specifically got boring a long time ago.
I haven't misrepresented anything. I have also mentioned I don't think the Go module proxy is a bad thing. I think it's a good thing. I understand why it's there. We share the same opinion I think, in general.
What I am arguing is that the people who are saying "this new telemetry design could allow them to gather ip addresses - which are 'personal' data" are arguing over a ship that has sailed years ago.
You're right; I probably read too much in your comments while they're more nuanced than I took them for. My apologies. I think we actually mostly agree. It's just that people bring up Go proxy every time anything is discussed about Go (and some distros like Debian even disable it) with what I can only describe as unfair accusations that are levelled against Go uniquely and not against any other system with similar behaviour.
No, but the go tool/package manager doesn’t necessarily, by default, have to proxy each requested module through Google run servers. It can fetch modules directly from the source.
That said, I don’t think that the go tool’s current behavior in regards to the module proxy server can be considered as phoning home.
The proposed metrics collection however absolutely would be bad phoning home behavior.
Opt-in telemetry not executing for users who would like it to be executing, because they don't know about it or forgot, makes it an under-advertised feature.
Opt-out telemetry executing for users who would not like it to be executing, because they don't know about it or forgot, makes it spyware.
The rest of the discussion only makes sense to have if you agree there are circumstances when implementing spyware is justified.
It's doing that to provide a functionality required by the user, which results in informed consent. If you do computing without a user's informed consent, you're building malware.
The problem with opt-out telemetry is that as soon as you admit people won't voluntarily turn it on, you've admitted it's a user hostile behavior. The usability of opt-out doesn't override the reality that it's not ethical to do it.
I think if the Go authors do move forward with this proposal in any form, a fork is guaranteed. I've heard multiple sources express interest in doing so already.
> The problem with opt-out telemetry is that as soon as you admit people won't voluntarily turn it on, you've admitted it's a user hostile behavior.
That is in fact not the case. This is just one possible explanation for users not turning it on, one that I don't believe is even close to a meaningful amount in the grand scale of things.
As I wrote in the first message, a huge amount of people don't care either way. They won't turn it off if they're prompted, but they also wouldn't bother turning it on. They just don't care. I'm one of these people.
> As I wrote in the first message, a huge amount of people don't care either way
Do you have any data to support that claim?
Anecdotally, almost all programmers I've spoken to have expressed some kind of complaint about how much data all our devices collect these days. Many of them even saying that they're afraid to speak their mind on the Internet because they know someone might connect their words to their true identity.
A large number of non-programmers have mentioned to me how freaked out they are by their Android phones showing them ads for things they've just mentioned in proximity of their phones for the first time.
Considering the massive surveillance we're under, I think that "most people care, and don't want telemetry if given the choice" is much more precise than "most people don't care".
If we consider the dictionary definition, "spying" carries a sense of personal and private information being collected. For telemetry, one of the first questions would be whether the data being collected is the same kind — for example, something like the version of the C compiler you have installed doesn't seem nearly as sensitive.
The other part is the question of exactly what consent means in this context. If I sneak into your house, I'm spying. If I record what you do when visiting my house, that's arguably still spying. If I say I'm giving you a free meal but at the bottom of the menu there's a warning that I use statistics about which meals are the most popular unless you ask me not to, that doesn't feel like it's in the same category as the previous two even if there's a pretty strong argument that it's best to ask you the first time you show up.
I don't think such real-world analogies serve any useful purpose, they only make things more confusing.
I'd define "spying" as any data collection that users are not informed of.
Go tool is an open-source development tool - people expect open-source development tools to serve the users, not the developers (to the detriment of the user). That's the whole point of the whole open-source (and free-software) movement. [A significant number of] users do not want their tools to send any data from their machines that they themselves haven't allowed it to.
To create an open-source development tool that sends data by default is a deception tactic - it works because most people trust open-source tools, and because most people won't know the data is being sent by that particular tool.
The deception part is what bugs me the most. I may or may not be against telemetry in general, but I am definitely against deception of users. And providing a few lines in the release notes counts as "informing the users" about as much as changing a few lines in the TOS counts as "informing the users": in my opinion, it doesn't.
> I don't think such real-world analogies serve any useful purpose, they only make things more confusing.
Isn’t it the same as using a term like “spy” with significant outside connotations rather than more common technical terms like “telemetry” or “anonymous usage data collection”?
Similarly, “deception” implies malice or subterfuge. One typically doesn’t associate that with extensive public comment so using that term seems like it’s contributing more heat than light.
> Isn’t it the same as using a term like “spy” with significant outside connotations rather than more common technical terms like “telemetry” or “anonymous usage data collection”?
"Spying" is not an analogy; it's a verb that describes the act of collecting information that you aren't authorized to collect. In the case of opt-out telemetry without user consent, it's a perfectly fitting word.
One could also argue that "telemetry" is under-descriptive, since it does not account for the fact that it's turned on by default.
> Similarly, “deception” implies malice or subterfuge.
Deception is the act of giving people false information for the sake of achieving a goal, the ethics of the underlying motive is irrelevant.
Again, the reason I think "spy" is an editorial choice to express an opinion is that in common English usage that word implies hostile intent. It's not just that some data is being recorded, but that it's done with the understanding and desire to harm the subject. Contrast with "telemetry", which is value-neutral and simply describes the mechanism of collecting data and transmitting it to be recorded somewhere else.
I don't think it's appropriate to use “spying” because there's no evidence of hostility here and considerable evidence to the contrary in the form of the public discussion and the proposal starting with various mechanisms limiting what is collected, how frequently it's reported, and preventing covert manipulation of the telemetry system.
The very act of collecting information without consent is the act of hostility.
The rest of the technicalities are irrelevant, because none of them have anything to do with consent. There is no class of data whatsoever that is okay to collect without users' consent.
This is an ideological statement. You're free to believe it's true, but it's not how those words are used in normal discourse, and you'll probably find your evangelism more productive if you clearly state this up front.
> If I see you in a shop, then I'm collecting information about you, yet you didn't consent to anything.
This isn't a good analogy. By going in public I'm implicitly consenting to being seen, because it is impossible for me to go in public without being seen - being seen is a necessary condition for the action of me going out in public.
There's no such necessity when it comes to running a development tool on my own machine in my own home.
>Go tool is an open-source development tool - people expect open-source development tools to serve the users, not the developers (to the detriment of the user). That's the whole point of the whole open-source (and free-software) movement. [A significant number of] users do not want their tools to send any data from their machines that they themselves haven't allowed it to
Sorry, but I think you just made it up.
OSS makes stuff somewhat transparent and auditable, that's it.
Also users do benefit indirectly by telemetry because they get better tools.
Anyway, what would be the other benefit to the dev? That he gets to spend his unpaid free time fixing yet another crash report?
> as you admit people won't voluntarily turn it on, you've admitted it's a user hostile behavior
That is presuming people aren't turning it on because they don't like it, while the reason they aren't turning it on could be they are lazy and don't want to do any extra work.
If you had a HTML sanitizer that most people weren't turning on, you might just turn it on by default to improve safety (like Rails did at one point). Just because people weren't turning something on themselves doesn't make turning it on by default "hostile".
> That is presuming people aren't turning it on because they don't like it, while the reason they aren't turning it on could be they are lazy and don't want to do any extra work
The solution in these scenarios is to simply ask the user, with no default, whether they want telemetry to be enabled or disabled. To do so in a modal or required dialogue in fresh install and the update in question.
If the telemetry is worth changing defaults for, then IMO it should be worth a bit of friction to introduce it more ethically.
I think a lot of users don't actually interact directly with the CLI. Think IDEs, editors, CI/CD. You'd be cutting them out, again skewing the stats heavily.
That's true. I think this would therefore need to be a bit of a breaking change for that tooling - systems that integrate with Go would need to be updated to deal with this prompt. Whether that's showing it to users or autofilling an answer with a --telemetry=x flag, that's up to them and their community.
Is that painful? Yes. Would it turn some people away from Go through this kind of friction? Sure. IMO those kinds of costs and pain points should be necessary considerations of adding telemetry.
Worth at least a minor version change, and therefore not too crazy that it would cause some annoyance.
Because the 90/9/1 rule applies to most things. 90% of people are content with the defaults, whatever they are. They have no opinion on telemetry, especially telemetry that is privacy respecting. They're not going to go to the effort of modifying configs for that purpose, but that's true either way.
They wouldn't turn it on if asked, but they also wouldn't turn it off if you told them about it.
I said most things. "Privacy-preserving, minimally invasive telemetry" and "literal fascism" are obviously not comparable, and it's disingenuous to try to equate them.
If telemetry is opt-out, there's a good chance most people won't be aware that it's even there - that's deceptive. At the very least, there should be a big scary banner saying "your data is being sent to our servers" on every single command - that's the only way you can assure that everyone who uses the tool will be informed.
> Are you assuming most won't if asked to?
The big argument to opt-out telemetry is that "opt-in doesn't work", so yes, I'm assuming most won't.
If my assumption is wrong, why not just ask people? They'll say "yes" anyway.
> At the very least, there should be a big scary banner saying "your data is being sent to our servers" on every single command - that's the only way you can assure that everyone who uses the tool will be informed.
It should probably say: "Ensure that the functions and libraries you use continue to be supported by sharing your usage statistics."
> If my assumption is wrong, why not just ask people? They'll say "yes" anyway.
There's not really a great place to ask, but it could be a checkbox in the installer, sure.
Too bad the linked article's arguments are of poor quality and unconvincing.
It cites a bug on macOS that ended up requiring Xcode to be installed in order for the Go compiler to work. That example is bad because:
1. it was going to be very obvious real quick
2. it did get discovered and fixed without telemetry.
... and both of my points above are irrelevant. Why? Because the problem raised here is not if telemetry is useful or not. It's not even if telemetry is good or bad. It's that discussions about telemetry silenced people who were against it.
Your point is that people should not be allowed to raise arguments that have not been pre-approved as valid. I mean, the simplest reformulation is "if we think you're wrong, you will not be allowed to speak."
For real? Being wrong means you're not allowed to participate? (And by wrong, I mean "deemed wrong by the people in power to ban others.")
The Go team's vehement refusal to simply have this telemetry off by default rubs me the wrong way. Sure, that means fewer people will turn it on... but that's the point. Running telemetry on your machine should not be a requirement for A COMPILER.
It could even just be a prompt at first use, shown when an environment variable (e.g. GO_DISABLE_TELEMETRY=1) is not detected. Example:
Do you want to enable anonymous telemetry? yes [no]
Your choice will be remembered and we won't ask you again.
See our commitments and details on what we collect here: https://go.dev/telemetry
In any good development org, yes. But I'm sure there are orgs out there where the team that manages the environment and the team that does Go builds are two separate teams.
Now the team managing CI (or QA, which could actually be in a TTY) needs to notice these failures and communicate them (via a JIRA ticket, of course) back to the development team. That team doesn't know what it's about and sticks it on the backlog. Eventually they set the right environment variable, but environment variables are a security risk so it needs to go through security review, and because this mentions telemetry data the privacy and legal teams want sign-off...
This might be an exaggeration, but I think it's worth thinking about the worst case scenario when building things like this into development tools that are used so widely.
Hey, if it can help highlight dysfunctional Kafkaesque processes in companies, that's a bonus.
If you can't update an environment variable yourself, or ask directly someone who can, all that on a CI system that's somehow bound to TTY, get out of here.
Isn't that a very strong argument against telemetry? If it's generally understood that users don't want opt-out telemetry, why are people insisting on it?
Not necessarily; many people just don't know they can enable it, or can't be bothered, or don't really know the details and have better things to do than investigate so opt-out "just in case" but wouldn't mind opting-in if they were familiar with the details.
At this point I think it's very hard to say what "most users" do or don't want exactly.
It's not a requirement. Something being "a requirement" would mean that it didn't actually work (compile) without it switched on. That's not the case. It's not required for the program to function and it's not required by the license. So it's not required in any way at all.
Even more so when they claim they don't need to sample more than 16,000 reports! Like, dude, you can easily find 16,000 opt-ins across millions of users. Hell, if everyone at Google opts in, that's more than a sufficient sample. It's really a non-issue if you consider their problem statement. The fact that they are so adamant in the face of substantial feedback makes me wonder whether there is some other ulterior motive at play.
That was a bit hyperbolic tbh, but it's not far-fetched to expect about a 1% opt-in rate from your user base (also considering beta channels are generally opt-out).
A more reasonable way to go about this would be to launch it as opt-in first, and only bring forward the opt-out proposal if not enough people opt in. It doesn't have to be opt-out on day one. This kind of requirement validation is not uncommon, and I would be very surprised if the idea hadn't already crossed their minds. It would have been the far easier choice to make, especially in the face of strong feedback, but the fact that they just chose to close the discussion instead of trying to move forward on the proposal doesn't bode well.
> the fact that they just chose to close the discussion instead of trying to move forward on the proposal doesn't bode well
The way I read that wasn't as a "buzz off, we don't want to discuss this", but rather as a "okay, we've heard all that was said, we will think about it". At some point everything that can be said has been said.
Did they need to close the discussion? Probably not, but it avoids the work and overhead of having to moderate things because it attracted quite a few comments that did need moderation (and will likely continue to attract them).
Right, "this thread is not productive, we will deliberate in private and come back with a revised proposal" is the vibe I get, which is probably the most constructive approach possible in a case like this.
I think a compiler that phones home is a great addition to my AI-powered terminal (https://www.warp.dev) and toaster with a camera that I never changed the default password on. /s
So, is it time to fork the go codebase and have a telemetry-free version?
I understand that there are uses for such data to help make the tooling better.
I don't understand how Google and its cult can be so disingenuously surprised that people are wary of it in 2023. This makes me trust them even less than normal - I suspect they'll follow the MS playbook and "accidentally" make the opt-out fail, and eventually the idea of opting out will be removed from the product entirely because "everyone seems to be happy with it... no one turns it off anyway".
Opt-in greatly reduces the statistical quality of the data. That said, there are many forms of opt-out. It's one thing to show a checkbox in the setup of a desktop application saying "send anonymous usage data" with the checkbox pre-checked. That's a form of opt-out, but it's not hidden.
It's quite different to have a system enabled which is only possible to switch off by googling a solution or digging in a settings menu. That's also opt-out, but it's a much darker pattern than the pre-checked checkbox.
Obviously, a command line tool doesn't have the luxury of an interactive setup, so the same patterns don't apply, but whether or not something is "opt out" or "opt in" isn't the entire story. I find the "default" to be the important thing. Developers need "the default" to be enabled to get reasonable coverage. That doesn't mean it needs to be "secretly" enabled.
This doesn't change what the parent comment said. Opt-out is inherently a dark pattern. The hope is that the person installing the software doesn't actually read and uncheck the box. It's no different to the "install Bing toolbar" checkbox that came pre-checked and snuck into DirectX installers, for example.
> The hope is that the person installing the software doesn't actually read and uncheck the box
Yes. Because the "don't care" group is the important group. Users are split into a small group that is positive (the would-opt-ins), a small group that is negative (the people arguing it must be opt-in), and a large group who don't care.
The central "don't care" group is so large that the data is almost useless without them - i.e. you could just as well not have the system at all if you were going to do a full opt-in. So the unchecked box alternative really isn't an alternative.
Xcode command line tools require a one-time license agreement. FreeBSD ports will open an ncurses based configuration dialog if there are new options. Plenty of Java tools will prompt for JAVA_HOME if they can’t figure out where it is and the JVM forces providing arguments for enabling some license restricted features.
Telemetry from a private system should always be opt-in. I find it concerning that developers think they have some default right to data on systems they don’t own without explicitly getting permission. It may be technically possible, it may be statistically necessary, but it’s morally bankrupt and erodes the peer relationship that should exist in open source software.
> Telemetry from a private system should always be opt-in.
I have to disagree. I think it should be visibly communicated (not snuck in under the radar), but in such cases default-on should be acceptable so long as it is clearly communicated. Command-line systems ARE inherently more difficult than the installer checkbox I'm used to (but by no means impossible to work with).
> Xcode command line tools require a one-time license agreement
> plenty of Java tools will prompt for JAVA_HOME if they can’t figure out where it is
This is easy enough if done from the start, but it's hard to do without breaking existing users' scripts if you have the requirement that "We don't stop and ask for permission in v2.0 if we didn't in v1.0 given the same arguments and environment, as that will break people's builds". And as a user of a command line tool I have to say that requirement seems much more important... This is why it's a good idea to have a /noninteractive switch for command line tools. You can state from v1.0 that without it, the tool might block at any point in any later version, so any script that doesn't use it made a mistake.
> developers think they have some default right to data on systems they don’t own without explicitly getting permission
What is explicitly asking here? Is the checked state of a checkbox part of whether it's explicit?
> it’s morally bankrupt and erodes the peer relationship that should exist in open source software.
Yes, OSS has its nuances, command line tools have their nuances. Having Google be the company at the receiving end has its own set of complications. I don't think all the objections apply in all cases.
> The checkbox being pre-checked is still not legal in some countries exactly because it favors the non privacy protecting option
I think you are confusing this with systems that gather any PII, and regulations such as the GDPR and similar. This is about collecting data without any PII so any privacy regulation wouldn't apply here.
If the telemetry data is associated with the IP or contains anything that can be used to identify or track the particular Go installation (similar to a browser fingerprint), then it can be considered PII.
I don’t know how they can collect useful statistics when they cannot distinguish between 1000 invocations by the same user/environment vs. one invocation by 1000 different users/environments. And when they can distinguish, then that tracking is prone to fall under personal data processing.
Most off-the-shelf systems work with random pseudonyms: you generate a unique random number and store it only on the end user's system. Then you tag each data value with that random id. These random identifiers do not count as PII so long as it is not possible to tie them to a real identity.
So while you are of course required to follow data processing regulations, pseudonymous usage statistics with random identifiers is not against any regulation I know about at least. The lifespan of the pseudonym can be as long or short as you want e.g. last one session, one week, one version etc. But even when completely persistent, they will not constitute PII.
Seems like the simplest compromise is to just force the user to answer the question one way or the other. No default - ask them "do you want to enable telemetry? We'll use it for X..." and only continue the upgrade or install or first-run once they have provided an answer one way or another.
Will it cause friction, with users and with tooling and automations? Absolutely. But I think the cost of that friction is not a tremendous ask in terms of avoiding the dark pattern that is defaulting to yes.
In a perfect world, at least. In this one, end users are never going to win an argument about telemetry with google-borne systems, and definitely not for actually benign telemetry systems like this.
> Obviously, a command line tool doesn't have the luxury of an interactive setup,
Why would the command line tool not have the option of an interactive setup?
Many command line tools ask you to say yes/no to things the first time they are run. Even if you want to reconfigure a tool it shouldn't be difficult to retrigger a decision tree.
> Why would the command line tool not have the option of an interactive setup?
Typically because (at least if it's already shipped) there are hard expectations about what it will do. E.g. that it shouldn't block waiting for input.
That might screw up headless/CI uses. A shipped product probably also doesn't have the luxury of using a /noninteractive or /noTelemetry switch, or requiring an env var or similar to block it, because the CI scripts are already written.
You can have it ask the user (block waiting for input) if its given a NEW command line option. But you can't do it in the absence of that option. That's why, for a shipped product, it's so much easier to make it opt out. The alternative would be to make it opt in via command line switch or env var - which would obviously be so rare that there is no point in doing it at all.
How many CI environments are (or at least should be) connected to the internet to begin with? Internet access almost certainly eliminates the possibility of reproducible builds, and it also opens all kinds of security issues.
(I know sandboxed builds are not commonplace, but they really should be.)
Unfortunately it's not been widely adopted. My guess is this is running into the same problem that DNT=1 in the browser did, which is that different people mean different things by "tracking" and it's very hard to come to agreement on where to draw the lines.
This argument is nonsense. Everywhere you have to remember to use the non tracking compiler and everywhere you have to remember to set the variable are at best the exact same set. More likely it's actually more places that require remembering to set the variable since various build environments and tools tend to strip out variables in all sorts of places.
It also assumes setting the variable actually works - even without malicious intent, it's an approach that assumes there are never any bugs.
The complexity of the patch isn't the issue, it's keeping up-to-date with the upstream code. This can be done but it requires constant work, discipline, and it probably would need funding.
But the worst part is that it sends the message that we as a community will tolerate megacorps injecting telemetry tracking into our compilers, and our response will be not only to stay in their ecosystem, but to do extra work for them to keep us there. That's not the message I want Google to hear; I want them to hear us tell them where they can take their telemetry and shove it. We do that with our feet.
This hypothetical Go fork/distribution (with telemetry disabled by default), should have an option to send bogus telemetry data to the Google servers. :)
So the occasional person that doesn't mind sending data to Google can leave it on, but enabled in that way instead.
Not quite sure what the logic of "I want to use this tool because it provides me value, but I also want to sabotage the development process of this tool" is meant to be.
To answer both questions: because the variables don’t always work.
If the code doesn’t exist, it can’t run. It’s the only way.
At the very least, it should be a compile-time flag, and all code with telemetry should be in a separate library to make it easy to be sure no mistakes caused it to sneak its way in.
In practice, I can imagine the software packagers of various Linux distributions would set an environment variable to disable the offending behavior. This may be good enough for the majority of the software's stakeholders (users).
I hope software packagers for the various Linux distributions will disable this behavior, one way or another, with or without the assistance of the Go developers. The official Go documentation can say whatever it wants, but in my experience the installation on Linux mostly happens via the distribution's package manager. The result would be that the software running for the majority of users will not send information to Google. It's not perfect, and won't do for some people (like me; I will remove Go and not write Go code in the future), but I think that'll remove the need for a fork for many people.
I've met quite a few (junior) devs who will follow guides instead of using their package manager because they come from Windows and aren't used to using their distro's tools.
I expect many professionals who do use Linux to run on stable possibly LTS versions of their distribution of choice and those rarely come with up to date tools for languages like Go or Rust or even Docker.
I wish installation would only come through package managers, but the amount of people I've had to help upgrade their Ubuntu to a newer version because they added a whole bunch of external repositories without realising the impact has taught me otherwise.
It's possible that distros will come out of the box with a /usr/share/go/env file even if Go isn't installed, but that'll take a while.
Good points, I could definitely be wrong in my estimate of the proportion of installations coming from the package manager. It may be that enough utility will be found in a fork. Nothing but to wait and see ;)
> So, is it time to fork the go codebase and have a telemetry-free version?
Or you know, be reasonable and set the GO_DISABLE_TELEMETRY=1 env variable, IF telemetry actually ends up being added some day.
I'm not in favor of any kind of telemetry in my devtools but there's no reason to go nuclear at this point.
Honestly the proposal by rsc is even reasonable to me, I'm mostly against it because I have no trust in Google regarding any kind of data, personal or not.
> eventually the idea of opt-out will be completely removed from the product
That's never happened anywhere, and it's entirely illegal.
Google isn't particularly well known for caring about the laws related to privacy.
Besides, opt-out via env var is a broken idea. It forces me to go through all sorts of Dockerfiles, CI config, dev machine scripts, etc. to make sure it's set, not to mention the ongoing maintenance of keeping it that way. I'd rather just use the go-with-at-least-some-minimal-respect fork and write scripts to verify that's the base docker image, installed golang, etc. It actually seems easier to do that than to verify every environment that runs golang has that env var set.
That's my problem too. I have a website, and so A LOT of companies know a lot about my users. From their ISP to Microsoft (I host on GitHub), they all know when my users poop, when my users go on vacation, or when my users are cheating on their husbands and checking in from someone else's computer.
It is so hard to create something, because of the dangers lurking everywhere. I think I need a reasonable creators license.
I see a lot of pro-opt-out arguments along the lines of "it needs to be on by default, because the people we'd like to hear from won't bother opting in, but wouldn't care either way".
Whilst I wouldn't argue that's an inaccurate description of the people you want data from, counterpoint: tough?
We're used to these liberties being taken in IT, so let's imagine a less technical scenario. You buy some lightbulbs from ShopX, and they'd really like to know your lightbulb usage habits in order to improve the efficiency of their lightbulbs (and nothing else, honest guv! but okay, let's give them the benefit of the doubt).
They could send you a questionnaire, and as rightly pointed out in these arguments, it's unlikely you'd bother with it. Or they could ... what?
Anything that involves them getting that information is a slippery slope at best, and a bit creepy already. This is the only industry where we've normalised this kind of liberty taking.
> This is the only industry where we've normalised this kind of liberty taking.
Or the only one where you're personally aware of it. The kind of low-frequency limited data the Go team proposed collecting is basically the same as, say, McDonald's telling store managers to report how many people ordered mayonnaise on their sandwiches so they can decide whether to continue stocking it. It's worlds better than the kind of individual patron tracking large retailers or credit card companies routinely do, but they don't advertise that.
There are two differences here: one is where data collection happens — since this is running locally rather than in the cloud we know it's happening — and the other is the larger question of how Google's corporate reputation is affecting reactions. A lot of the takes on social media have been reacting to that larger issue, many by people who don't even use Go, and are alleging scenarios far beyond what would be possible with the proposed system. I think some of this also goes back to the risk of being open: I would bet that more than a few of the people who are very concerned about Go use tools like VSCode, which collect far more data, but since there was no big public proposal there, everyone wasn't talking about it at the same time.
Ay but come on, likening it to stock keeping is a bit disingenuous. A company's perfectly entitled to keep track of their stock, and they don't need anything from me to do it.
The credit card companies are more like it but I opt out of that game too, by not using them. Their practices are well known to me.
Yeah, that’s kind of what I was trying to get at in the second paragraph about the difference in where data is collected. In our hypothetical restaurant example, I think most people’s reactions would depend heavily on what specifically is collected and the granularity: almost nobody cares about someone saying they track what sells best, and only a few would even care about their specific combination of items being tallied, but more people care if it can be linked to them in a loyalty program, and still more would care if it was something like an app tracking your attention on the menu. As described, I think the proposed system is more like the second of those four examples, but a fair fraction of people are talking like it’s the fourth.
To be clear, “what if they extended the scope later?” is a fair question but I do think that needs to be carefully expressed as a significant hypothetical future change from the actual proposal.
As one of the commenters in the original GitHub discussion pointed out, the package managers of most open source operating systems ultimately will/should package Go with telemetry turned off by default, regardless of the Go project’s default behavior.
Good luck getting non-operating system biased data after that.
Collecting non-operating system biased data seemed to be one of the stated goals to making the telemetry turned on by default. Otherwise it was posited that Go users on open source operating systems would disproportionately choose to keep telemetry off when prompted in an opt-in model.
Leaving the (probably incorrect?) claim of suppression in the title aside for a moment, I just want to express my deep sadness about this decision.
I'm an ex-Googler who has admired Golang for many years and have been considering it for use in my hardware startup.
The fact that opt-out-only telemetry was even considered, much less appears to be already decided(?), makes me far less likely to use this tool chain at any time in the future.
The way detractors are being treated and the unwillingness of the maintainers to accept "no" as an objection indicates that the decision has been made already and what we are seeing now is attempting to socialize it and hammer out the details.
You see this pattern in other "free in license only" software where the important decisions are made without community input.
I see no such indication. This is tea leaves reading.
The Go team has come back on some other decisions in the past after they proved controversial. I can't predict the future, I'm not a member of the Go team, but I'll take the bet that the proposal will either be abandoned or that it becomes opt-in.
Thank you. This is an important point. It's not even an official proposal yet. And even if it becomes a proposal, there's no reason to think it'll actually happen.
The speed at which this topic has escalated since the idea was raised last week is astonishing. The biggest mistake Cox has made is assuming people would at least read the three blog posts and debate the topic as presented.
I'm not thrilled at the idea of opt-out telemetry but some of the rhetoric is well and truly beyond the pale.
> The biggest mistake Cox has made is assuming people would at least read the three blog posts and debate the topic as presented.
Yes, I agree he was perhaps a bit naïve about this. On the other hand: I also don't really know how to do this type of discussion better. Russ obviously went out of his way to come up with a nuanced solution, which one could reasonably disagree with, but I have the impression many people didn't bother reading it, especially on HN here but also in the Go discussion about it. This thread is full of basic misunderstandings, or, if we want to be less generous about it, misinformation. I don't think most people do it on purpose, but that is what it comes down to in the end.
One of the really great things about internet discussions is that anyone can join in. One of the bad things about internet discussions is that anyone can join in.
> Russ obviously went out of his way to come up with a nuanced solution, which one could reasonably disagree with, but I have the impression many people didn't bother reading it,
Telemetry is one of those topics that comes with a lot of baggage. Coupled with Google's baggage and we see the inevitable response. I have a similar impression as you - people saw the word "telemetry" and thought they'd read enough.
I thought Cox's posts were great. I came in thinking, "telemetry? hell no!" but came away thinking he made a good case. People can still disagree over it but some of the misinformation is absolutely unbelievable.
> I think it's pretty obvious that opt out telemetry rubs most people the wrong way.
This is basically re-stating the opt-in challenge: we know that some people are vocally upset by it but we have to guess at the relative percentages of the groups of people who know and do not mind, don't know but would be upset, or don't know and wouldn't mind if they did. People who are upset are far more motivated to comment about it, so anyone looking at social media is going to see a lot of messages from them but still not know what percentage of the total community feels similarly. I've even seen people who aren't primarily in the Go community but very active on privacy issues commenting on this, which is perfectly legitimate but also not representative of the median Go developer.
The exact percentages are irrelevant. Once you know that there are any people who do not want you to collect their data, the only ethical thing to do is to obtain informed consent before collecting that data.
The percentages clearly do matter because otherwise people wouldn't be trying to pad their arguments by claiming large numbers support their position.
It's also important to remember that "data" covers a range of sensitivities, and vague assertions don't help the conversation. For example, the proposed system records less identifying information than visiting Golang.org but relatively few people would say it's unethical for that site not to have a mandatory opt-in requirement before loading a page - most of the interest tends to be in things like disclosure and retention policies.
DNS isn't enough, applications have been sneaking off data to hard-coded IPs and through other means for at least 15 years now. These days we have DoH as a standard, but way before that I spotted a Windows application execute HTTP requests to resolve IP addresses for their tracking domain.
With some effort you can MitM your phone with your own certificate authority and a tool like mitmproxy in transparent mode. It still won't detect all of the trackers (detecting MitM is quite easy with certificate pinning), but it'll show you even clearer how much you're being tracked.
Expect every API to also be a tracking endpoint, and expect every application you use to upload your location. There are apps that do well, but they've become the exception rather than the rule.
Especially with so much interest and momentum seeming to have shifted to Rust of late, I wonder whether Golang support is on or near Google's famously bloody chopping block, and adding analytics would help support a "data-driven" case for keeping it funded.
I don't understand why people insist on comparing Go and Rust, when they clearly don't compete against each other. Rust competes with C++, Go competes with Java. The only thing they have in common is being relatively modern and recent languages, but they just don't share the same goals and don't have the same promises.
I think in part it's because those historical separations were driven by multiple factors: not just things like whether you manage memory directly but how cumbersome the language was, the range and portability of libraries, etc. Some of Java's appeal was no longer fighting the myriad of different libc misimplementations across various operating systems in the 90s.
Rust is interesting because not only is it good at low-level control but it has very advanced language structures and rich libraries (unlike C), better ergonomics than Java, a great async implementation, etc. That expands the range of problems you might consider a “low-level” language for considerably since you no longer have the tedium of basic things like primitive data and control structures, non-ASCII text processing, etc. steering you towards other choices.
Go is an interesting middle language because it has some higher level features but also surprisingly limited types and C-style error handling, so you had the somewhat unusual case where people who wanted either low-level control or better robustness, advanced typing, packaging, friendlier compiler error messages, etc. might reasonably conclude Rust is a better choice. I'd still say Rust is harder to learn but the situation feels somewhat unique compared to the past where e.g. you really had to want the advanced features in C++ to pay the productivity tax of using it.
Okay, but in my circles at least - primarily professional fullstack and mobile dev, plus a wide range of hobbyists and aspiring pros - they've competed for the same mindshare for about as long as they both have existed, and Go seems to have lost out pretty decisively. The most recent SO developer survey seems to bear that out.
I don't know what goals Google, as opposed to the Golang team, has for Go. I do know Google has notoriously opaque and complex internal politics, and a long history of summarily axing projects that fail to perform over some set of internal metrics, no matter how large and invested a userbase they happen to have.
So I don't think it is unreasonable to consider the question. Sure, the technologies aren't fungible, but it isn't a technological question I'm asking; every longstanding human enterprise necessarily involves politics, and it's the politics I'm wondering about here.
> they've competed for the same mindshare for about as long as they both have existed, and Go seems to have lost out pretty decisively
I have no idea what you're talking about. I agree for the mindshare, because I started with Go and thought Rust looked more elegant, but after actually trying it it became pretty clear that they don't compete.
I don't know where you're getting that Go somehow "lost"; it sounds like bubble reasoning. Again, it didn't "lose", because they don't compete with each other.
It's like saying Nadal lost against Messi because Messi is more popular when they're not playing the same sport, it doesn't make sense.
I wish I hadn't included the comparison with Rust. It's totally severable from the conversation I was trying to start, but no one seems to have noticed anything else.
While I sympathise with this sentiment, it isn't really feasible, since there are too many useful programs written in Go. Since Go is Free Software, both users and distros like Debian can take advantage of that and just patch out the telemetry or disable it by default. Same goes for any new telemetry that is discovered. That is a better way forward here, since we get the programs implemented in Go but none of the telemetry.
As always, these discussions get brigaded by trolls so it's no wonder lots of comments were hidden.
However, several hidden comments were, as far as I could tell, made with good intent. That does create an impression that certain opinions are not welcome. The argument for hiding these comments seems to be that they've already been used in other discussions (i.e. "opt-out is illegal under GDPR" being used in a discussion about opt-in/opt-out design despite not being mentioned in that thread before).
When I read the comments by the Go people, it looks like the implementation of opt-out telemetry has already been decided on. Objections like "have you checked if that's even legal" are ignored with a simple "I'm not a lawyer". As rsc puts it, "This is a technical discussion, not a negotiation.".
The discussion is clearly set up wrong and I doubt it will matter in the end, now. The Github discussion is focused on technical implementation, with the moral discussion already having been decided upon; that way, people with arguments against telemetry have no place to voice their concerns. Maybe the moderators simply couldn't imagine being anti-telemetry or maybe they don't consider the position relevant, but this approach was going to stress out the moderators and cause unproductive discussion from the very start.
Personally, I'm somewhat disappointed (but not surprised in the least) that this is being suggested, but I suppose this is what one should expect when running any kind of program from Google (or Microsoft, for that matter). You simply can't trust these companies and the open source developers that work for them to respect your privacy.
I have to say, though, the proposal doesn't seem to take the risk of bad data injection seriously. I myself would definitely run the theoretical AdNauseam style fork of Go if the team behind the tool choose to go through with an opt-out mechanism.
If I want to understand my program and where it spends time, I use a profiler. If Go maintainers want telemetry, they should provide an optional profiler that adds telemetry and whoever wants to opt in can use it after twiddling the appropriate knobs. Just claiming violation of CoC and forcing it on the entire userbase is atrocious.
If I have not made my code open source, stop snooping. If you want to know how your features are used, go and analyze open source codebases.
And it must be default turned off. Don't use dark patterns like default telemetry on and hoping that x% of users will forget to turn it off in a week and we will get our data. Smells bad.
To be honest, I'm pretty against telemetry in Go, but it's not about the compiled code itself. They want to add telemetry to the go command and compiler itself to collect data about which language features are used (I assume) and how the go command is used.
Again, I'm pretty against it. My red-line is opt-in telemetry, and even that's a place I don't want to arrive.
I think we are in agreement. The problem I have with the principle of forced telemetry is I have no way (or more precisely, time) to understand that their telemetry red-line stops with command line logging/analysis. So, I have to assume the worst with the toolchain.
Microsoft .NET also introduced opt-out telemetry (ie., turned ON by default and you could opt-out by setting an environment variable DOTNET_CLI_TELEMETRY_OPTOUT=1|true|yes). These are just dark patterns by big-tech and erode developer trust.
Both of us are completely on the same page. Funnily, I have made the same reference to the same paper [0].
The comment has a "let's talk about how we stop there and not move any further" tone, because frankly, I don't think the Go maintainers and Google will say "Oh, we overreached, sorry. Let's abandon it and never talk about it again".
However, even discussing telemetry and being so adamant about it made me reconsider the extent I'm going to use Go in the future.
The first remark to start everything was sensational, and I'd love more specific citation to learn more about the claimed allegations.
Then, seemingly out of nowhere, one individual went on a major rant about "intersex" or something like that, and something-something Ada Initiative...
What I want is a technical discussion about "Google's opt-out telemetry proposal on the Go compiler", which raises some alarm... the word "opt-out" seems to suggest Google would require people to unsubscribe from their intelligence gathering? Seems sensational at face value. But is that even true? And what, if anything does gender-identity drama have to do with anything about that?
I don't understand the almost religious opposition to the telemetry the Go team are proposing. It involves no PII data, would be easy to disable for those who want to do so, is minimal in scope, and the use cases are well articulated.
Data collection always starts out like that. An independent company may get away with that approach. However, this is Google.
The problem is, we've seen it all before. When Microsoft added telemetry to dotnet, they stated the following:
> The feature collects the following pieces of data:
>
> The command being used (e.g. “build”, “restore”)
> The ExitCode of the command
> For test projects, the test runner being used
> The timestamp of invocation
> The framework used
> Whether runtime IDs are present in the “runtimes” node
> The CLI version being used
>
> The feature will not collect any personal data, such as usernames or emails. It will not scan your code and not extract any project-level data that can be considered sensitive, such as name, repo or author (if you set those in your project.json). We want to know how the tools are used, not what you are using the tools to build. If you find sensitive data being collected, that’s a bug. Please file an issue and it will be fixed.
All very useful information that doesn't tell them anything about you or your machine.
Since then, Microsoft has been steadily increasing the amount of data it's been collecting, including personal identifiable information in the form of a pseudonym based on unique machine identifiers. Once data collection starts, it only ever gets worse.
It doesn't have to involve PII data to be invasive. The heuristics applicable can be used to, in a roundabout way, fingerprint the type of software. And since it's being shipped to Google through the Internet they also know where the software is being developed and where it's being used. And this is just for starters. Once they've tackled the hurdles of people's resistance; having one foot through the door; it's easier to slide more and more telemetry functionality in. You may recognize this behavioral pattern from, uh, well, almost all of their other products.
"PII" is often a straw man based on a few narrow categories defined by corporate interests. I consider IP addresses and what code I'm working with much more personal and sensitive than say the association between my name and my social security number.
Doesn't matter, someone looking over your shoulder while you're using your own machine, without asking first, is spyware. Period. Doesn't matter a bit what they're collecting.
Objectively speaking, how valuable would that information be if it were to be leaked? Random weekly aggregated and sampled machine data associated with random IPs that in most cases would no longer be yours (and that’s in the worst case that both servers were compromised or misused by an employee during a specific time window, because the IP is not stored in the analytics server but only temporarily in a proxy to prevent abuse).
I don’t know, I’m all for privacy but people get passionate too easily the moment someone even mentions the word telemetry and I think we should focus on actual privacy issues, like analytics in the context of ads, social networks (tiktok, fb, ig…), etc.
I wouldn't have thought of this before this issue came up, but I just searched for "software written in go", and sadly, there seems to be a lot of it. I wonder if all the folks outside of Google will remember to turn this off. If your motto is "ok, don't be evil didn't pay well", this is a genius move. Craft a language good enough to appeal to others, give it away, let a community grow, then hoover up data from everyone using anything written in our language. Unless, of course, they remember to opt out.
Other projects I have seen have done opt-out telemetry on the beta/testing/development versions of their software, and it's disabled on stable builds. Win-win.
Blabs on about the bureaucratic/policy ends of collecting data from me and NOT what they want to impact? Geez, not even one example of where a little data could've helped Go compiler devs and users? My immediate reaction? Definitely negative.
I think this assertion and the various sentiment around it is nonsensical and partly demonstrates why "they" were actively moderating as they attempted to have a rational discussion about a topic with human beings online.
Some of the moderator decisions were stupid; many were not ("You can stick your telemetry where the sun doesn't shine" is no doubt sincere but it's not LKML I suppose); much of it seems to have been attempting to keep stuff organized and not-repetitive.
This article from The Register [1] has a good general overview, and it leads you to the GitHub discussion [2]. Searching for "bad faith" and "conduct" in that thread yields some results, but not strong enough for me to agree with the poster. There are hidden comments too, but not enough nor eloquent enough (IMHO) to support the claims of bad faith.
I look forward to tools uploading random garbage telemetry to their collection endpoint (or a dependency doing it on every build). Not endorsing it, but it's the logical next step if you try to do this in an open tool aimed at a programming community. They have to be more careful than Google is used to being with its other products.
I think lads is only gender neutral in some groups and as a more recent usage. In the UK we used to have 'lads mags' which were magazines aimed at a young and specifically male audience.
In my mind, when people say "lads", "guys" or "mankind", I just think "people". Does anyone really think they're literally talking about men only? I feel like this kind of comment is just used to dismiss people.
I feel like this kind of comment is exactly the problem, isn’t it?
From what you’ve said - in your mind, addressing a group of people as “lads” makes no comment on their gender. Some people will definitely agree with that and not be bothered at all. But I don’t find it even slightly hard to imagine that if I were a woman in a group being addressed as “lads” I might kinda bristle at it, as I know some people do.
It’s just weird to me that anybody’s first reaction to this sort of thing would be to double-down on “actually guys is gender-neutral” instead of just thinking “good point” and making a mental note for the future.
There’s not really anything to stop me saying ‘out loud’ on HN that I don’t like working class people, if that were something I wanted to say. In fact I think it would be rather less likely to get me downvoted than posting in support of gender inclusive language :)
I’d be intrigued to know who ‘these people’ are, though, as I am apparently one of them.
I thought this too, but apparently there is a whole palette of female versions for dude, guy, lad etc. [1]. I had only heard “gals” before, never lasses or dudeens. And “gals” also only occurred in spoken form; I’ve never read it anywhere (although I consume a lot of English media).
Maybe it’s because I’m not a native speaker. In Germany we have a huge gap between daily/business communication and the communication from authorities. The latter don’t write „Sehr geehrte Studenten“ (“Dear students”, the generic masculine), but „Sehr geehrte Student:innen“ (the gender-inclusive form), so I suspect something similar is happening here too.
That’s why a “lads’ mag” is a magazine aimed at all genders, right? Not sure if you are British, but ‘lads’ has a much stronger gender connotation than ‘guys’ etc. There’s also a female counterpart of the word (lass).
American English speaker here. At least in the region where I live, "lads" is definitely gendered (plus uncommon and old-fashioned sounding and, applied to adults, a bit insulting), but "guys" isn't—which leads to friction when we interact with people from areas where, I gather, "guys" remains strongly gendered.
I have in my entire British lifetime literally never heard “lads” refer to a mixed-gender group of people, in the way that “guys” often is. Not to say it doesn’t exist, but I have just never encountered it at all.
No, it doesn't. It refers to the subset of Go maintainers who are (allegedly) using the CoC to suppress opposition. I ask again: which specific Go maintainers have both suppressed opposition with the CoC and are also non-lads?
Anyone can read the tweet and decide for themselves. To me the intended reference is clear. The “Go maintainers” are accused of doing various things in the first paragraph and are then referred to in the second paragraph as ‘lads’; no salient subset is introduced as a potential discourse referent. In any case, my comment was obviously based on my own understanding of what was said.
> the Go maintainers are now claiming that objectors to Google's opt-out telemetry proposal on the Go compiler - yes, really - are arguing in bad faith and violating the Code of Conduct, and their comments are getting hidden.
> well done lads, you just keep doing that as hard as possible. i'm sure it'll work out great.
> The “Go maintainers” are accused of doing various things in the first paragraph and are then referred to in the second paragraph as ‘lads’; no salient subset is introduced as a potential discourse referent.
The accusation does not state that each and every Go maintainer is personally doing this. The accusation is targeted at "the Go maintainers" who are (allegedly) doing this.
The poster is vague about exactly who they’re accusing of what. I can’t read their mind, so I can’t tell you if all the people they are thinking of are men or not. But I can tell you who they refer to in their tweet: the Go maintainers. If at any point they wish to make a more specific accusation, then the full resources of the English language are available to them.
Uhhh... What does it tell you, exactly? That the OP is from the UK where "lads" is a common expression? Or that the OP is not politically aligned with leftist agenda, since they're using gendered expressions?
"Left-Right" is not a gender issue; it's a class issue.
I think the term "leftist" is used mainly in US political debate, to refer to whatever US conservatives don't like. For heaven's sake, they refer to Biden as a leftist.
Not sure what exactly you're trying to say. I asked if you objected to the term "lads" for UK reasons or political reasons. You could have just said political reasons - which is the answer to the question I posed - but no, you went off on this wild tangent about how you oppose the term "leftist". OK, you're on the left politically; that's why you strongly oppose the gendered term "lads" and also the term "leftist". Got it.
May be irrelevant here, but I stumbled across a new programming language called V [1] that is heavily inspired by Go and does not rely on LLVM, instead generating C directly.
Operating system is already a spyware and next arms race is for everyone to arm their compilers. Good going, product managers.
I agree that I (in theory) quite like some of the design decisions, but as far as I know nothing of this actually works in practice. Like, it's not even close to being production ready, and people are seriously doubting whether the language will ever even come close to being a viable option.
I'm not up to date on all of the issues (in the code or in the community), but the last thing I heard is that the developers of V are still working on the "auto-free" memory management model (which is supposed to infer when 'free' needs to be called), but it's not working (and afaik proven to be undecidable in general).
Edit: Another reason why I'd be rather skeptical of this language is the claims it makes on the 'Compare' page: https://vlang.io/compare
It starts off stating that "V was created because none of the existing languages had all of the following features:" and then goes and lists an incredibly ambitious list of features, which essentially amount to "As safe as Rust, but Go's simplicity and fast-compilation time, and a small compiler with zero dependencies", and more. These are hard problems, especially if you're also working on developing an experimental memory management technique.
This is the purpose of "Codes of Conduct": to silence dissent. They are sold to you by people who say they'll only silence the bad sort of dissent, but lo and behold, the definition of 'bad' isn't locked to the seemingly reasonable scenarios with which CoCs are promoted. Try to oppose the CoC now that you see what it's really for, and the CoC will be used to silence you as well.
The telemetry they’re going to add is going to help the Go team improve the language and its ecosystem.
It’s going to be utterly inconsequential for everyone who leaves it enabled.
People who are raging about the change are not crusaders for privacy and freedom, they’re petulant children who “don’t like the idea of it” and are throwing tantrums.
If it’s that important, there are plenty of languages that put idealism over pragmatism to choose from.
> Well the problem is the privacy bait and switch isn't it?
What is the privacy bait and switch? The whole idea behind "anonymous usage statistics" is that it's statistical and anonymous, so that it by definition has no privacy consequences.
If that doesn't hold up (e.g. it leaks a username, a path, anything) then there should be outrage. But this discussion isn't about that. Or about the fact that it could be worse, or have a bug.
The interesting question is: is it acceptable to have telemetry that is anonymous and statistical? And if not - why? The argument "it doesn't respect my privacy" can't apply then.
De-anonymization is a proven success, both in academia and among state agencies.
It could be called anonymous if it were sent through Tor, etc… which isn’t even being considered AFAIK.
If you rely on the receiving party not to store your IP and the other details you send, then it’s not anonymous. Sadly, some companies turn evil, get compromised, or face internal threats…
And yeah, opt-in is the minimum, or the software is called spyware. And yes, by this definition recent Windows is spyware.
If you want an argument about privacy, watch the insightful talk by Halvar Flake on security vs. death squads… chilling but sadly true.
If we set aside small command-line utilities and look at a larger software package that already makes network connections (or won't function without them), then the key thing is TRUST. That can be achieved by organizational transparency, open protocols, open source. It's hard and slow to build and easy to tear down. But if you use the software, you trust the provider.
The simplest argument for this is: if you don't trust the provider then why on earth would you trust that the "no" actually means no!? The compromised or evil organization would send your data anyway. And if they are really evil they wouldn't do it as part of the telemetry they'd just exfiltrate it in whatever way necessary. That's why I think the telemetry thing is a much smaller issue than it seems when it comes to privacy. It's already blind faith.
> When you install a compiler who's respecting your privacy, and without opt in it starts sending telemetry, you can't expect that to go well.
Are the currently deployed and used versions of the go compiler going to suddenly start sending telemetry? Or is it just a future version that someone would have to deliberately upgrade to?
Personally, when a new version of golang comes out, I like to read the release notes and then adjust my build environments with the necessary environment changes for the new features.