Author of the post here—as another commenter mentioned, this is indeed a bit dated now; someone should probably write an updated post!
There's been a ton of evolution in dev tools in the past 3 years with some old workhorses retiring (RIP Phabricator) and new ones (like Graphite, which is awesome) emerging... and of course AI-AI-AI. LLMs have created some great new tools for the developer inner loop—that's probably the most glaring omission here. If I were to include that category today, it would mention tools like ChatGPT, GH Copilot, Cursor, and our own Sourcegraph Cody (https://cody.dev). I'm told that Google has internal AI dev tools now that generate more code than humans.
Excited to see what changes the next 3 years bring—the pace of innovation is only accelerating!
> I'm told that Google has internal AI dev tools now that generate more code than humans.
Given that code is not an asset but a heavy liability, the elephant in the room is the question: who is going to maintain all those huge piles of generated code?
Wake me up when there's an AI that can safely delete code… That would be a really disruptive achievement!
> Wake me up when there's an AI that can safely delete code… That would be a really disruptive achievement!
Big tech already has Dependabot-like bots doing this: dead-code removal, refactoring, and other things a linter can warn you about all get turned into automated pull requests (or PR comments). In other words, things a linter would tell you, if you had the patience to wait several hours to lint a gigantic monorepo. There will probably be more tooling support built on LLMs trained on these huge code bases.
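A toy illustration of the idea, as a deliberately naive single-file heuristic (the real bots do whole-repository analysis; everything here is made up for the sketch):

```python
import ast
import sys

def possibly_dead_functions(source: str) -> list[str]:
    """Report top-level functions never referenced elsewhere in this file."""
    tree = ast.parse(source)
    # Names of functions defined at module top level.
    declared = {
        node.name
        for node in tree.body
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef))
    }
    # Every name that is *read* anywhere in the file (calls, decorators, ...).
    used = {
        node.id
        for node in ast.walk(tree)
        if isinstance(node, ast.Name) and isinstance(node.ctx, ast.Load)
    }
    return sorted(declared - used)

if __name__ == "__main__":
    with open(sys.argv[1]) as f:
        print("\n".join(possibly_dead_functions(f.read())))
```

A bot then just wraps something like this in a loop over the repo and opens a PR deleting whatever it finds, once tests (or a human) sign off.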
I'm not talking about dead-code elimination (that's something compilers have done for decades, without AI), and not about something like Scalafix's automatic refactorings (which are actually deterministic and correct, precisely because no AI is involved), but about some true AI that could simplify code—and in the process remove at least 80% of it, because that's usually just crap that piled up over time.
Like I said: wake me up when you can show me metrics where the use of AI shrank code-bases by significantly large amounts. For example, going from 500 kLOC to 100 kLOC, or something like that. (Of course without losing functionality, and at the same time making the code-base easier to understand.) That would be success.
Everything that goes in the opposite direction is imho just leading inevitably to doom.
It's absolutely fantastic. I feel handicapped coding without it in my spare time. Imagine a pair-programming session with someone who can read your mind and knows the entire codebase. I don't know how much I'm allowed to talk about, so I'll leave it at that.
I don't have access to copilot so I can't compare but I'd wager it works a lot better for Google internally because the training data is customized to Google.
Interesting, I didn't find it that impressive when I tried it a couple months ago and fairly quickly went back to regular old-fashioned autocomplete. What language(s) were you working with out of curiosity?
I'm at Google and am not entirely sure what you are referring to, but I'd love to try it. Could you provide an internal codename I could search for? Or is it integrated somewhere in Cider (the name is public knowledge so I'm not leaking anything by using this) that you could guide me to (e.g. X dropdown -> 3rd option) while preserving ambiguity?
I’m fascinated that of two people working at the same company, one raves about how an internal tool is a complete game changer and indispensable while another isn’t even aware it exists.
I'm fascinated that at a 175k employee company, two employees I know nothing about, in business areas or positions I know nothing about, could possibly use different tooling for their day to day duties.
*Surely* at your place of work, the cleaning staff is familiar with the CTO's dearest tools and vice versa?
That’s an unnecessarily snarky answer given the two people above are clearly both in similar roles. The parent comment was asking where to find the option in the tooling they both shared.
Snarky replies comparing CTO to cleaning staff don’t even begin to apply.
It's a conversation that isn't relevant to anyone outside the company. One commenter, who may or may not work in Google, is asking for an internal codename.
You're the downvoter here so why does my comment bother _you_ so much?
A close friend worked at Google Brain, now DeepMind. He said he presses tab 70% of the time and it just works. When shown my paid version of Copilot, he said he felt his autocompletes had higher accuracy.
Besides the tools mentioned, what are some other AI tools that can be used to accelerate coding for established codebases? I have some money to try out new tools, but am wondering if there are any that are less "black boxy" and would work with a company's private instance of Azure ChatGPT.
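For what it's worth, most tools that speak the OpenAI API can be pointed at a private Azure OpenAI deployment. A minimal sketch using the official `openai` Python SDK (v1+); the endpoint, key, and deployment name below are placeholders, not real values:

```python
from openai import AzureOpenAI

# All three values below are placeholders for your company's private instance.
client = AzureOpenAI(
    azure_endpoint="https://my-company.openai.azure.com",
    api_key="<key from the Azure portal>",
    api_version="2024-02-01",
)

resp = client.chat.completions.create(
    model="my-gpt4-deployment",  # Azure takes the *deployment* name here
    messages=[{"role": "user", "content": "Explain what this function does: ..."}],
)
print(resp.choices[0].message.content)
```

Any "bring your own endpoint" assistant that exposes these settings should work the same way.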
With powerful high-level languages you can reduce boilerplate to almost zero. (And for the rest you can use code-gen.)
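To make the first point concrete, a minimal sketch in Python (any expressive language with derived methods or macros gives the same effect; `Point` is just an illustrative type):

```python
from dataclasses import dataclass

# One decorator generates the __init__, __repr__, __eq__, and __hash__
# that a more boilerplate-heavy style would have you write (or paste) by hand.
@dataclass(frozen=True)
class Point:
    x: float
    y: float

p = Point(1.0, 2.0)
print(p)                      # Point(x=1.0, y=2.0), the generated __repr__
print(p == Point(1.0, 2.0))   # True, via the generated __eq__
```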
Copy-paste on steroids (AI code assistants) OTOH will lead to doomed code-bases even faster than manual copy-paste did in the past. People are going to learn this really soon. Maybe two or three years left until the realization will kick in.
> One of your competitive advantages after leaving Google will be to apply these experiences to bring great new dev tools into your new organization to boost your own productivity and the productivity of your teammates
Please don't do this blindly. Your new organisation likely won't be similar to Google. If a xoogler onboarded with the attitude that any other tooling and process must be inferior, they wouldn't last long.
tbh the vibe I get is that Google seems to be great at building tools that teach people nothing about how to collaborate with anything outside their current company... so they're hamstrung when they leave, and then they go on building new tools that don't work together with anything else because they don't understand the community at all.
Bazel is a perfect example. Bazel projects are horrifically (arguably impossibly) hard to use to collaborate with other Bazel projects outside of your control, and its abilities encourage you to do (useful!) things that don't hold up outside its little isolated world. It's a super great system in a lot of ways, but it is anathema to open source development.
At times it feels like it's probably extremely destructive in aggregate, particularly as more and more companies blindly mimic it, and that worries me.
I think that's quite an uncharitable interpretation of what's happening - Google doesn't really have bad intentions here imho, but with every tool you suffer from a conflict of goals: Should it be as good as possible for the current environment or should it work "well enough" in as many environments as possible?
That's also the conflict between external libs/frameworks vs. NIH. Yeah, sure. Sometimes NIH is a matter of arrogance, but it's also a fact that something you build yourself can work far better in your environment and with your constraints than something you took in and then banged on until you have a reasonable fit.
The corollary is of course that open-sourcing such a thing is often rather useless at first, especially if the company that does it (and Google imho is one of the worst offenders here) doesn't really put in the work to make it useful for a wider audience. Which is partially a function of the company's devs being used to working with very well-fitting tools. With the tools that are useful enough, you then often see a period of adjustment: a few releases which don't bring much in the way of new features, but "only" the things you need to make it work in more environments.
To your last sentence: I think that in general it's a bad idea to blindly use or emulate what some (especially big) company did. Just because it works well in their use cases doesn't mean it will work well for others. It's cargo cult programming, which is unfortunately rather popular in our hype-driven industry ("New fancy library by X, we have to use this too or we will be done for in six months!")
> I think that's quite an uncharitable interpretation of what's happening - Google doesn't really have bad intentions here imho...
OP didn't talk at all about intentions. OP talked about results.
> ...but with every tool you suffer from a conflict of goals:
Sure. But it seems to me that OP notices that Google's really, really bad at choosing the "Make it so that people who aren't working in a megacorp who can afford teams of dedicated tool wranglers can easily use the tool and interoperate with others who use the tool." option when presented with it.
And like, why would they ever choose that? They have zero need to, because they are a megacorp who can (and do) hire teams of dedicated tool wranglers.
What about Bazel do you think makes it hard to collaborate with other projects (any more than other build systems)?
I mean, very few people use Bazel so you're likely to have to implement Bazel support for any projects you use, but that's no different to other build systems (e.g. until it became a de facto standard I frequently had to implement CMake support for third party C++ libraries I used).
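For anyone who hasn't done it: "implementing Bazel support" for a third-party dependency usually means writing the BUILD files yourself and overlaying them onto the upstream source. A hedged sketch in Starlark (Bazel's Python dialect); the library name, URL, and labels are all made up:

```python
# WORKSPACE
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "somelib",  # hypothetical third-party library
    urls = ["https://example.com/somelib-1.0.tar.gz"],
    sha256 = "<fill in after downloading>",
    strip_prefix = "somelib-1.0",
    # Upstream ships no BUILD files, so you write one and overlay it:
    build_file = "//third_party:somelib.BUILD",
)
```

Every Bazel shop ends up maintaining its own private pile of these overlays, which is part of the friction the grandparent describes.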
Some of the in-house stuff was built before alternatives were available or scalable, and now there'd be a high cost and friction to trying to change it. Happens in many big companies that have been around for a few decades, it's not because of bad intentions.
When I left Facebook, I missed all the internal tools the company developed. I currently work on dev tooling at my company (though leaving soon). So I have some thoughts as well.
Having been at Facebook led me to believe that we could definitely 2x productivity if the tools made by these large organizations were open-sourced and maintained.
I will probably write up my thoughts like the author did (if anyone cares), but the bottom line is that I consistently find many existing open source projects missing critically useful features and/or not focusing as much on real developer productivity (I admit that can be very subjective).
E.g. Bazel is fantastic when it works (all deps accounted for), but it is not unusual to find GitHub comments telling users to add patches because the maintainers (core and contrib) haven't fixed the problems (some 2-3 years old) or because they hardcode a version of Go…
Another example is VS Code. Facebook (now Meta) offered everyone its heavily extended VS Code. While I didn't use it that much, since I am a terminal and Vim developer, now that I use VS Code almost daily at work for writing Go, I really do appreciate the customization Facebook engineers did. Too often someone (engineers and non-engineers alike) would come to us and complain about issues with the built-in git UI, try to figure out how to construct working workspace settings, or struggle to switch between git and arcanist. Facebook's version had all this figured out really nicely: with minimal training, someone could submit code very quickly without ever touching the terminal.
I could go on and talk about how awesome ODS is compared to Prometheus and its query UI. Finally, I miss Scuba too (and the Honeycomb founders were involved in building Scuba).
Sure, Facebook tools don't always work and there are issues and bugs like in any software, but overall I felt more productive.
Maybe it is because I did’t maintain them - now I do as a full time engineer at work, I feel the pains. But then again, if google and facebook maintain these as open source would be really nice.
How would you rate VS Code for Go? I use the official Go extension but am not impressed. My main gripe is that I often have to manually add a local import that should have been found and suggested when I typed the name of a variable or function exported from an adjacent package. I much prefer GoLand, but use VS Code because it is the only editor my company's internal AI code assistant supports.
VSCode for Go is pretty much just gopls (the language service). So this is all true for Go plugins in vim, emacs, etc because they all use the same base, but not GoLand.
IMO: gopls is only just barely good enough to stop people from building an actually good replacement. Highlighting and build error stuff is reliable, but it regularly misses extremely straightforward type lookups, navigation, and autocomplete hints and that makes it untrustworthy in pretty much all cases. Basically everyone I know who uses it (somewhere between 50 and 100 people that I've talked with about this) relies on plain text search to get around and find things rather than doing type-driven stuff, because otherwise they fail to discover a lot.
All of which means it's insane and awful to me, barely better than "it doesn't support Go but Go looks close enough to C that it mostly works"-style ctags. And sometimes worse: even unsupported-language-fallback stuff in random tools sometimes gives much better autocomplete than gopls, which for me semi-routinely fails to hint things defined in the same file.
---
GoLand by comparison is wildly more capable, stable, and controllable. It actually works, and e.g. if you "find usages" you get a complete and precise result. Highly recommended if you do a lot of Go work. The single exception is that it doesn't support GOPACKAGESDRIVER (because it isn't gopls), so if e.g. you've got a complicated/abnormal Bazel setup it won't integrate well.
Surprisingly, it works pretty well for me. Where I work we have a weird Go and non-Go module setup, but everything works with almost no additional tweaking on the VS Code side, just exporting some env variables.
But Python? I hate the testing setup. We have a monorepo, and while it's not as big as Meta's, full test discovery eats CPU. So we had to tweak our settings, per team, to only discover team-specific dirs. Also, there is no way to kill a running test (at least no obvious shortcut or button, wtf).
Have you ever had VS Code add an import prefixed with your vendor path instead of the real module path? It doesn't happen too often, but it's goofy when it does.
Is Google still an org whose methods are to be emulated? Maybe small parts of it? In general the vibe I get from Google is of a sclerotic company completely overrun by bureaucracy. And it's been that way for a decade.
Google's problems are in leadership, planning, and prioritization. They aren't in their developer tools and technical foundation. I think there is some connection between the monorepo and aggressive deprecations, but that's not the root cause.
Any organization that designs a system will produce a design whose structure is a copy of the organization's communication structure.
— Melvin E. Conway
But the technical and developer layer does not operate independent of the leadership, planning, and prioritization layers, right? They don't strike me as separable.
Google didn't copy Microsoft, and Netflix didn't copy Amazon, and Apple and Microsoft and Yahoo do their own thing also. Every big tech company claims its culture and tooling is the best and yet they mysteriously share none of it.
And in any case, your company isn't Google. And it doesn't succeed or fail based on using this or that tooling.
Identify your own pain points and solve those instead. Maybe you need a code search tool and buy it from the author, maybe you don't.
Nobody at Microsoft claims they have the best tooling. Culture, for sure.
But tooling is a sad joke. Hell, we can’t even paste a git diff into a Teams chat because the crappy RTF box mangles the output. And Microsoft literally owns GitHub. But we go home early, the stock is up, everyone is Mormon-level nice to each other. Hard to complain.
Google dev tools serve Google-scale needs. Not only does your 2-person startup have no need for these, they introduce friction that might kill your company.
Not just google scale needs, but google-specific needs, regardless of scale. Having your entire company in a monorepo is something that Google went all in on and makes work, but it’s arguably not really necessary, even at Google scale.
Nope, that's completely incorrect: Google does "compartmentalization of confidential, sensitive, or secret business logic and programs" in the monorepo. It has also massively increased how often it does this, and expanded how it works, as management trusts employees less and less over time.
If you really care about speed of iteration, connect a REPL to a running process and develop with it. Enterprise development practices are not about speed.
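A minimal Python sketch of that loop, assuming the handler logic lives in a module named `handlers` (a hypothetical name):

```python
# app.py
import importlib

import handlers  # hypothetical module containing handle(request)

def serve_one(request):
    # Look the function up at call time, so a reload takes effect immediately.
    return handlers.handle(request)

# From a REPL attached to this process (e.g. started with `python -i app.py`),
# after editing handlers.py on disk:
#     >>> importlib.reload(handlers)
# Subsequent calls to serve_one() run the new code, no restart needed.
```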
Can a developer at Google run a copy of a service in the production environment and interactively develop it via a REPL or some other hot-code-reload technique? It won't break any users if the load balancer doesn't route real traffic to your instance.
But I left Google early this year to work at a startup. I can tell you that the confidence I had with Google's tooling, testing infra, presubmits, etc. is miles ahead of the confidence I have with our current pre-commits. But at Google we had millions of daily actives, and at the startup we have dozens, so if something breaks it's not too bad.
Does any other company start as many new things as Google? Google builds things extremely quickly for being that large. Then they shut them down just as quickly, which is another matter, but it is hard to argue that they are slow at building products.
I wonder why Prometheus/Grafana are so popular in the monitoring stack. They seem to promote a culture of 'Hang a screen over there and stare at it'. I always have to do a lot of work fine-tuning them, and some important metric will be missing.
My experience with Zabbix was just the opposite, BTW. Ugly, but practical. The defaults let you quickly set up monitoring, and then nagstamon or other alerting allows you to forget it exists until bad things happen. Then you browse through the templated graphs, where a spartan but interesting indicator of what happened will exist somewhere.
I don't think they promote the "Hang a screen over there and stare at it" approach. In our company we use both, but Prometheus has an Alertmanager, and I only ever open the UI to debug when an alert is raised.
In fact, I truly hope no one has a screen with Grafana in their office. I agree: if anyone has those, you are probably not doing it right.
This is from 2020 and it is out of date. Phacility is no longer offering hosted Phabricator, and Phabricator itself is no longer actively maintained. Also, graphite.dev is a solid option in the code review space. It provides better GitHub-integrated code review like reviewable.io, but also provides a CLI with a more Mercurial-like workflow, and replicates workflows available inside Google and Facebook.
Which BigTech (10k+ employees) has the worst internal dev tooling? I have heard horror stories about some build systems at Microsoft but curious how others fare.
Not sure this counts, but Adobe had no org-wide tooling afaik. Each team could pick its CI, code review, repo, issue tracker, etc. I was on a smaller product, but it was chaos. I especially felt it when we had some CI issues and no expertise to help. It was a lot of time out the window trying to get the basics working.
I'd be interested to hear these horror stories about Msft (are they from the modern era? 2010s? 2020s?). On my old team there, I really liked our build system (Azure DevOps/Azure Repos, basically an all-in-one tool from issue tracking to Git to CI/CD deploying to Azure).
The horror story I used to hear was from the pre-cloud era: when you set up a new computer to build, you'd need to wait a day while the initial pull & build finished before you could really do anything.
That's not true; I was able to get working environments with all of the tools I used at Google within a month.
That being said, most of them were overkill for startups or small teams. Found it kind of funny that even though the tools are "open source", all the names are different and the documentation is non-existent.
Some of the open source stuff is just better anyway.
A good many of the less critical internal tools were have-a-nice-day tested. Meaning, if you looked at them cross-eyed they fell over. Teams put them out, gave a tech talk, got promoted, and then lost interest and moved on.
Note I said "less critical." Some of the important ones were really excellent, e.g. Dremel.
Almost 1.5 years in, here is my opinion on Critique:
GitHub < Critique < Azure DevOps (or actually, I think it's called Azure Repos)*
*might have changed because I was at MSFT until mid 2021, but development and review used to look sort of like this: https://www.youtube.com/watch?v=dGCid5W-HK0. I just felt the UX was really clean and clear, plus it immediately tied into the walled garden of tools like private package repos, CI, and Azure deployments.
To answer your question, Azure DevOps / Azure Repos (at least at the time) was not inside the Azure portal. It was its own website. Bit of a confusing name, eh?
Based on the comments in this thread, it seems that Google is using AI to replace software engineers. That explains the continuous layoffs. Let's see how long they can keep moving from an engineers' company to a managers' company.
That's how they get the good jobs after being laid off. Nowadays Google engineers have nothing special besides being easy to brainwash and fitting into a big machine.
How on earth did your day lead you to decide to slander en masse 10,000 people who had some bad luck, whom you know nothing at all about, including their skills?
How does not thinking Google engineering (and hence their engineers) is special amount to slander? You guys really take "not being special" as an insult? Too bad for those 10,000 ppl.