A major reason agentic LLMs are so promising right now is because they just Figure It Out (sometimes).
Either the AI can figure it out, and it doesn't matter whether there is a standardized protocol; or the AI can't figure it out, and then it's probably a bad AI in the first place (not very I).
The difference between those two possibilities is a chasm far too wide to be bridged by the simple addition of a new protocol.
Having A2A is much more efficient and less error-prone. Why would I want to spend tons of tokens on an AI "figuring it out" when I can get the same effect for less using A2A?
We can even train LLMs with A2A in mind, further increasing stability and decreasing cost.
A human can also figure everything out, but if I come across a well-engineered REST API with standard OAuth2, I am productive within 5 minutes.
I happen to live in one of the few districts in CA that has a republican representative. I was looking forward to voting him out but then CA got gerrymandered and now we'll likely have a Democrat representative next term.
I didn't like our republican representative but it seems kinda shitty that the folks who did like him and voted for him suddenly didn't get a say in who their representative ought to be. I mean, sure they probably voted No on 50 but most of the yes votes came from outside of our district.
Edit: I strongly hate gerrymandering, but I also acknowledge the need for the Democrats to play dirty because the Republicans are, and "being the better person" doesn't seem to be a viable political strategy anymore.
Yeah, that's true, never thought of it that way. The way CA did it is certainly better, but I think it's still solving the wrong problem: political power should not be swayed by politicians moving borders around, with or without a vote. Redistricting is important because every district should have equal population, but districts should be drawn by independent committees (and in many states they are; just for "some reason" a bunch of states decided to do it with a partisan spin).
Correct me if I'm wrong, but you'd still have to compile it from source on nix, no?
On my relatively powerful workstation, Erlang/BEAM takes about 7 minutes to compile.
We're working around this currently by having a fat devcontainer image, pre-built with Erlang inside (from source) by our CI. It unavoidably chews through CI minutes due to how Docker layer caching works.
It would be awesome to just download and unpack a tarball, regardless of which distro you're using.
Nix is centered around the local Nix store and binary caching.
As long as the specific version of Erlang you’re using is present in either your Nix store or the global cache for your OS and arch (at cache.nixos.org), you should not need to compile anything.
And if you rely on custom builds, you can just set up your own binary cache. This is similar to remote caching in Bazel.
We do exactly this at my dayjob - we have (multiple) very specific combinations of (erlang, elixir, hex, rebar3) that we use which are pinned to exactly the versions we need. We have a private Nix cache so we only have to build them once.
That said, learning nix and setting up a nix cache is still a lot of work. Docker buildx might offer you some more knobs to cache portions of your build in a finer-grained manner without having to take the nix plunge.
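For reference, the pinning described above can be sketched as a small flake; the channel and package names here are illustrative, not the exact setup the commenter uses:

```nix
{
  # flake.nix — pin nixpkgs so the whole team resolves the same derivations.
  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-24.05";

  outputs = { self, nixpkgs }:
    let
      pkgs = nixpkgs.legacyPackages.x86_64-linux;
    in {
      devShells.x86_64-linux.default = pkgs.mkShell {
        # These come prebuilt from cache.nixos.org (or your private cache),
        # so nobody compiles the BEAM locally.
        packages = [ pkgs.erlang_26 pkgs.elixir pkgs.rebar3 ];
      };
    };
}
```

A private cache is then just extra `substituters` (plus their public keys) in `nix.conf`, and CI pushes build outputs to it once.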
Nix is enormously complicated, kind of unstable and not well documented.
I get that if you've gone through the pain of learning it you get a system with some very nice properties. But casually suggesting "maybe try nix" is a bit like telling someone who wants to listen to Mozart "maybe try playing a piano".
OP is already trying to do something pretty un-casual:
> If you want to control the exact version that's being used across your team (via `asdf` or similar), this practically means you'll end up compiling the BEAM over and over...
So I think it is perfectly appropriate to suggest a sharp tool.
Zig being able to (cross-)compile C and C++ feels very similar to how uv functions as a drop-in replacement for pip/pip-tools. Seems like a fantastic way to gain traction in already-established projects.
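Concretely, Zig exposes a clang-compatible C compiler frontend, so cross-compiling an existing C file looks roughly like this (target triples are examples; exact flags may vary by Zig version):

```shell
# Build hello.c for a Linux/glibc target from any host OS.
# Zig bundles the headers and libc, so no separate cross toolchain is needed.
zig cc -target x86_64-linux-gnu -O2 -o hello hello.c

# Same source, Windows target:
zig cc -target x86_64-windows-gnu -O2 -o hello.exe hello.c
```

That "one download gets you a working cross toolchain" property is a big part of the drop-in appeal.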
Jellyfin and a domain name with a dynamic DNS update will do that for you, no problem.
In the house: NFS read-only for desktops and laptops; Owntone to send music to Wiim Mini or stereo receivers (Yamaha, Denon, Marantz, Onkyo -- all of them are compatible).
Yeah, disappearing songs (or content in general) is the part that I HATE about streaming. I'm happy to pay for the service, but unhappy to have content yanked at any moment. Especially with songs I used to love, it's like having part of my identity erased against my will.
If anyone has any tips on how to "back up" a Spotify account (and systematically detect yanked content) I would love to hear them.
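One approach (a sketch, not a full tool): periodically export your saved tracks via the Spotify Web API, for example by paging through `current_user_saved_tracks()` with a client library like spotipy, and then diff consecutive snapshots. The fetching part needs credentials and is omitted here; the yanked-content detection itself is just set arithmetic over track IDs:

```python
# Sketch: detect tracks present in an older library snapshot but gone
# from the newer one. Snapshots map Spotify track ID -> title.

def diff_snapshots(old: dict[str, str], new: dict[str, str]) -> dict[str, str]:
    """Return {track_id: title} for tracks that disappeared between snapshots."""
    return {tid: title for tid, title in old.items() if tid not in new}

# Example with made-up IDs and titles:
old = {"t1": "Song A", "t2": "Song B", "t3": "Song C"}
new = {"t1": "Song A", "t3": "Song C"}

print(diff_snapshots(old, new))  # {'t2': 'Song B'}
```

Run the export on a schedule (cron, CI), commit each snapshot to a git repo, and the diff doubles as a permanent record of what you had.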
A similar thing happened to me. I borrowed a Steam Deck and realized that I don't like handhelds, but I liked gaming on Linux. So I bought a laptop that replaced my MacBook for work, and it games better than the Steam Deck.
I use my Steam Deck as a game console, connected to my projector and with 4x PS5 controllers. Works great for that with the Steam game mode. But yes, it's been a long time since I played anything on the big screen.
If they aren't making money either way, I'd prefer they focused on the core product.
Or charge for an actually useful feature like Firefox sync which is currently free.