We're at a point in the LLM curve where there are two huge, polarized groups of developers:
- the ones who don't see any value in AI for coding and dismiss it as a fad at every chance they get
- the ones who are in love with the new tools and are adopting as many of them as they can into their workflows
I know the arguments of the second bunch well. But I'm very curious about what the "AI is a fad" bunch thinks will happen. Are we going to suddenly realize all these productivity gains people are claiming are all lies and go back to coding by typing characters on emacs and memorizing CS books? Will StackOverflow suddenly return as the most popular source of copy-paste code slop?
> Are we going to suddenly realize all these productivity gains people are claiming are all lies
I'll grant you that many have become adamant that LLMs suddenly, out of the blue, became useful just last week, which is much too soon to have any concrete data for. But coding agents in some shape have been around for quite a while, and the data we do have offers no suggestion of productivity gains yet.
And I'm not sure many are even claiming that they are more productive, just that the LLMs have allowed them to carry out a task faster. Here's the thing: at least in my experience, coding was never the bottleneck. The bottleneck has always been the business people squabbling over what the customers and the business need. They haven't yet figured out how to get past their egos.
The most promise for productivity seems to come from lone startup founders who aren't constrained by the squabbling found in a larger organization and can now get more done thanks to tasks taking less time. However, the economic conditions are not favourable to that environment right now. Consumers are feeling tapped out, marketing has become way harder, and, even when everything else is in place, nobody is going to consider your "SaaS" when they believe the foundational LLMs will be able to do the same thing tomorrow.
> Are we going to suddenly realize all these productivity gains people are claiming are all lies and go back to coding by typing characters on emacs and memorizing CS books?
If you have not learned CS, how do you expect to separate the LLM wheat from the chaff?
> Will StackOverflow suddenly return as the most popular source of copy-paste code slop?
Coding sites manually populated by humans are dead.
Confusing any law with "moral principles" is a pretty naive view of the world.
Many countries base some of their laws on well-accepted moral rules to make it easier to apply them (it's easier to enforce something the majority of the people want enforced), but the vast majority of laws were always made (and maintained) to benefit the ruling class.
Yeah I see where you are going with this, but I think he was trying to make a point about being convinced by decree. It tended to get people to think that it should be moral.
Also, I disagree with the context of what the purpose of law is. I don't think it's just about making it easier to apply laws because people see things in moralistic ways. Pure Law, which came from the existence of Common Law (which relates to what's common to people), existed within the framework of what's moral. There are certain things which all humans know, at some level, are morally right or wrong, regardless of what modernity teaches us. Common laws were built up around that framework. There is administrative law, which is different and is what I think you are talking about.
IMHO, there is something moral that can be learned from trying to convince people that IP is moral, when it is, in fact, just a way to administrate people into thinking that IP is valid.
I don't think this is about being confused out of naivety. In some parts of the western world the marketing department has invested heavily in establishing moral equivalence between IP violation and theft.
The battery lasts for 12-15 hours, and how long you manage to stretch that out is up to you. The site says "On average, I use it 10-20 times per day to record 3-6 second thoughts. That’s up to 2 years of usage."
So, you can use it for 30 seconds a day to get 2 years of usage. They also mention using it to control your music and the lights in your home, and they assume 100% STT accuracy from a local model. The fallback of having to manually transcribe from the audio recording when the STT fails is going to happen often enough that it'll get annoying. And you're not going to be sending any messages to people, as that would take a lot longer than 30 seconds a day.
I suspect the battery will last about 3 months if you just use it how you'd think it can be used. You would need to really discipline yourself to keep to a limit of 30 seconds of use a day.
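Rough numbers, just multiplying out the recording time (this ignores idle drain entirely, which will only make things worse):

```python
# Back-of-envelope battery lifetime: total recording budget divided by
# daily recording time. Assumes the vendor's 12-15 h figure is pure
# recording time and ignores standby drain completely.
def lifetime_days(battery_hours: float, seconds_per_day: float) -> float:
    return battery_hours * 3600 / seconds_per_day

for daily in (30, 60, 120):  # seconds of recording per day
    lo, hi = lifetime_days(12, daily), lifetime_days(15, daily)
    print(f"{daily:>3} s/day -> {lo:.0f}-{hi:.0f} days (~{lo/365:.1f}-{hi/365:.1f} years)")
```

Even by that generous math, the 2-year claim only holds if recording stays at roughly a minute a day or less; idle drain and anything like message-length dictation will eat it much faster.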
I think they will need to end up adding some charging contacts to the surface once they start sending these things out into the wild or sales will be very limited.
How do you know how much time you have left on the ring's battery? Do you have to keep checking the app? Why not just use your phone for notes, then?
> Servant leadership seems to me a lot like curling parenting: the leader/parent anticipate problems and sweep the way for their direct reports/children.
That's not what "Servant leadership" is. It's about _letting the team lead_ - and they can come to you if they need help - instead of _pushing the team_. So in practice it's the opposite of anticipating problems. If anything, servant leadership gets a bad rep for being used as an excuse to let people fall on their sword instead of protecting them.
The rest of the post is just describing the role of "Management".
Google APIs in general are hilariously hard to adopt. With any other service on the planet, you go to a platform page, grab an API key, and you're good to go.
Want to use Google's Gmail, Maps, Calendar, or Gemini API? Create a cloud account, create an app, enable the Gmail service, create an OAuth app, download a JSON file. C'mon now…
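For contrast, here's roughly what the "hello world" of the Gmail API looks like once you've done all of that, assuming the standard google-api-python-client and google-auth-oauthlib packages and the client-secrets JSON saved as credentials.json (file name is illustrative):

```python
# Minimal sketch of the OAuth dance required before a single Gmail API call.
# Assumes a Cloud project with the Gmail API enabled and an OAuth client
# whose downloaded JSON is saved as credentials.json.
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/gmail.readonly"]

# Opens a browser for the consent screen and spins up a local redirect server.
flow = InstalledAppFlow.from_client_secrets_file("credentials.json", SCOPES)
creds = flow.run_local_server(port=0)

# Only now can you actually talk to Gmail.
service = build("gmail", "v1", credentials=creds)
labels = service.users().labels().list(userId="me").execute()
print([label["name"] for label in labels.get("labels", [])])
```

Compare that with the "put this key in a header" that most other services get away with.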
Don't forget the tradition of having to migrate to a new API after a while because the current one gets deprecated for "reasons". Not just a newer version, but a completely non-backwards-compatible new API that also requires its own setup.
To be fair, that might have changed in recent years. But after having to deal with that a few times for a few hobby projects I simply stopped trying. Can't imagine how it is for companies making use of these APIs. I guess it provides work for teams on otherwise stable applications...
Yeah, I'm not a dev and not using AI at all, but I needed to create OAuth keys and enable some APIs for a project... Sometimes it works, sometimes it doesn't, and it's so complicated... but I got it working in the end, though it stops working after some time. It was like, Google, really?
I know it's not available across all APIs, but the point of AI Studio is that you can sign up and we just make an API key for you automagically, no extra button clicks or the like.
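Roughly the whole flow with that key, assuming the google-generativeai Python package (the model name here is just an example):

```python
# With an AI Studio key there's no Cloud project or OAuth dance:
# configure the key and call the model.
import google.generativeai as genai

genai.configure(api_key="YOUR_AI_STUDIO_KEY")  # the auto-generated key
model = genai.GenerativeModel("gemini-1.5-flash")  # example model name
print(model.generate_content("Say hello").text)
```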
> IQ probably doesn't mean much of anything. But it is one of only a handful of ways we have to benchmark intelligence.
IQ means a lot of things (higher-IQ people are measurably better at making associations and generating original ideas, are more perceptive, learn faster, and have better spatial awareness).
It doesn't give them the power to predict the future.
It is less meaningful than that. It identifies who does well at tests for those things. That is not the same thing as being "better" at such things; it often just means "faster". IQ tests are also notorious for cultural bias. In particular with the word associations, they often just test for "I'm a white American kid who grew up in private schools."
And I say this as one of the white American kids who did great on those tests. My scores are high, but they are not meaningful.
When I was a young kid my eldest sister (who was 17 years older than me) was an educational psychologist and used to give me loads of intelligence tests - so I got pretty good at doing those kinds of tests. I actually think they are pretty silly, mostly because I generally come out very well in them...
It somewhat indicates better pattern recognition, so it might give them an advantage at predicting things in general. Not that it will make them prophets or oracles, but a prediction from a higher-IQ person is more likely to be correct. Not that the world cannot be illogical and go against those predictions.