I always kind of figured that AGI would need to be modeled somewhat like a brain, with LLMs at least filling the role of language. Meaning AGI won't be LLM-based, but parts of it could be.
Kinda. Our brains shifted once we could write things down instead of having to memorize everything; similarly, when the internet happened we could find answers quickly and stopped remembering or writing down things that were easy to look up. So now thinking will shift again. "Dumber" might be the right word, or it might not: our thinking would shift away from computation-type knowledge and lean more toward making good judgments or having clearer goals.
I'm not sure how it's confusing. Is it any more confusing than "V8" also being a type of internal combustion engine, or a blender also being a kitchen appliance?
I've seen quite a bit of this too. The other thing I'm seeing on Reddit is that a lot of people really liked 4.5 for worldbuilding and other creative tasks, so a lot of them are upset as well.
There is certainly a market/hobby opportunity for "discount AI" for no-revenue creative tasks. A lot of r/LocalLLaMA/ is focused on that area and on squeezing the best results out of limited hardware. Local is great if you already have a 24 GB gaming GPU. But maybe there's an opportunity for renting out low-power GPUs for casual creative work. Or an opportunity for a RenderToken-like community of GPU sharing.
The great thing about many (not all) "worldbuilding or other creative tasks" is that you can already get quite far using some dice and random tables (or digital equivalents). Even very small local models you can run on a CPU can improve the process enough to be worthwhile, and since it is local you know it will remain stable and predictable from day to day.
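For anyone unfamiliar with the "dice and random tables" approach: a random table is just a list of prompts you pick from with a die roll. A minimal sketch in Python (the table and its entries are invented here purely for illustration):

```python
import random

# A hypothetical 6-entry "settlement" table, equivalent to rolling a d6.
# The entries are made up for this example.
SETTLEMENT_TABLE = [
    "fishing village built on stilts",
    "walled market town past its prime",
    "abandoned mining camp",
    "monastery carved into a cliff",
    "caravan stop around an oasis",
    "river port run by a ferrymen's guild",
]

def roll(table, rng=random):
    """Roll once on a table: a uniform pick, like a die with one face per entry."""
    return rng.choice(table)

# Seeding the generator makes results reproducible, which is handy
# if you want to regenerate the same region of your world later.
rng = random.Random(42)
print(roll(SETTLEMENT_TABLE, rng))
```

A small local model can then take the rolled entry as a seed and expand it into a paragraph of description, which is where even a CPU-only model earns its keep.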
Working on a rented GPU would not be local. But, renting a low-end GPU might be cheap enough to use for hobbyist creative work. I'm just musing on lots of different routes to make hobby AI use economically feasible.
The gpt-oss-20b model has demonstrated that a machine with ~13 GB of available RAM can run a very decent local model; if that RAM is GPU-accessible (as on Apple silicon Macs, for example) you can get very usable performance out of it too.
I'm hoping that within a year or two machines like that will have dropped further in price.
You are absolutely right that a rented GPU is not local, but even so it brings you many of the benefits of a local model. Rented hardware is a commodity: if one provider goes down, there will be another, or in the worst case you can decide to buy your own hardware. This ensures continuity and control. You know exactly what model you are using, you will be able to keep using it tomorrow, and you can ask it whatever you want.
I mean, I'm quite sure it's going to be available via API, and you can still do your worldbuilding if you're willing to go to places like OpenRouter.
I think for a lot of our diseases, if we looked at them genetically instead of by symptoms, we would probably find that they're actually multiple conditions we just group together for manifesting in similar ways. I've felt this in my own life with ADHD, where it seems to me there are at least 3-4 different types of ADHD that respond differently to treatments/medications, and this makes me think many other conditions might be similar, especially if we start looking at them genetically.
This is such a deep hole of complexity. (My wife is a pharmacist, I make computers do what I tell them to, and she has convinced me that her field is far more complex than us software people can imagine.)
To pick one story my wife has told me, take the example of Multiple Sclerosis. In the 1970s and 1980s, thanks to the MRI machine, there was finally a good diagnostic tool for MS: can you see the lesions in the scan? If you can, congrats, you have MS. If you can't, it might be early MS, where the lesions are small enough that we can't see them (generally they become visible several years after initial symptoms). But there were a lot of people who had MS-like symptoms and no lesions, so diagnoses of Chronic Fatigue Syndrome (named just in 1970, in contrast to MS, which was identified in the 19th century) started to rise. Most of those people would have been diagnosed with MS in 1950, but now we can rule that out, so they go into the new bucket instead. What is going on with patients in that CFS bucket? It's a mystery. Is it one thing or many? Who knows! Is it genetic or environmental? Who knows!
And how does one get original-flavor MS? Comparing identical and fraternal twins, we can see that it's not purely genetic, but there must be a genetic influence. The current most accepted theory, according to my wife, is basically Long COVID, but for the Epstein-Barr virus (which causes mononucleosis) instead of COVID-19. So if your immune system is somehow susceptible to this (the genetic component, which we apparently do not understand), and it encounters EBV (and there is a bit of bad luck? Who knows!), then somehow the immune system gets confused and starts attacking your own nervous system.
We computer people are simply attacking much more tractable problems.
There are multiple brain conditions that are increasingly being suspected of being caused by viruses. They are the ultimate DNA editors. And some can remain dormant for decades before reactivating.
Which is why it was so puzzling to see the response during the last pandemic. More so with people concerned about mRNA vaccines and conflating that with 'DNA changes'. If one is concerned about their DNA, they should avoid viruses and do whatever they can to help their immune system fight them as quickly as possible.
The odds of a virus giving us something beneficial like a placenta are minimal, the drawbacks are just enormous.
More likely the profit would then be in dealing with animal husbandry types of modifications instead. Cows/chickens that don't get sick, that kind of thing.
If Llama goes away, we would still get models from China that don't respect the laws that shut down Llama; at least until China is on top, they will continue to undercut using open source/open models. Either way, open models will continue to exist.
Yes, that was the gist of Ira Glass's quote, but he also added that it makes you feel frustrated when you have taste but are not creating things that live up to that taste, and that as a young artist you should push through that.
Here is a copy-paste of the quote:
“Nobody tells this to people who are beginners, I wish someone told me. All of us who do creative work, we get into it because we have good taste. But there is this gap. For the first couple years you make stuff, it’s just not that good. It’s trying to be good, it has potential, but it’s not. But your taste, the thing that got you into the game, is still killer. And your taste is why your work disappoints you. A lot of people never get past this phase, they quit. Most people I know who do interesting, creative work went through years of this. We know our work doesn’t have this special thing that we want it to have. We all go through this. And if you are just starting out or you are still in this phase, you gotta know it’s normal and the most important thing you can do is do a lot of work. Put yourself on a deadline so that every week you will finish one story. It is only by going through a volume of work that you will close that gap, and your work will be as good as your ambitions. And I took longer to figure out how to do this than anyone I’ve ever met. It’s gonna take awhile. It’s normal to take awhile. You’ve just gotta fight your way through.”
― Ira Glass
I can't remember its name because I never got around to reading it, but there was one sci-fi book where something like Gattaca happened, but in a very corporate way: you could buy preset options (the wealthy would do custom), everyone became mostly the same height and look, and people could tell which preset your parents chose. The problem came later, when diseases hit: it was catastrophic, since the genetics were too much the same.
Looks like the Chemical Garden trilogy by Lauren DeStefano might be a fit, but as it seems to be a cross between typical young-adult fiction of the 2010s and a poorly made romance novel, I do hope there's something less... pulpy... with the same premise.