I like the engineering part at the top, but projecting AI perspectives solely through the lens of LLMs is effectively "looking backwards".
So this is nice
> productionizing their proof-of-concept code and turning it into something people could actually use.
because it's so easy to glamorize research while ignoring what it actually takes to turn ideas into products.
This is also the problem. It's a looking-back perspective, and it's so easy to miss the forest for the trees when you're down in the weeds. I'm speaking from experience, and it's the feeling I get when reading the post.
In the grand scheme of things our current "AI" will probably look like a weird detour.
Note that a lot of these perspectives are presented (and thought through) without a timeline in mind. We're actually witnessing timelines getting compressed. It's easy to see the effects of one track while missing the general trend.
This take focuses on (arguably over-focuses on) the LLM timeline, while missing everything else that is happening.