Very minor and tangential request for future recordings - show the shell prompt and/or put the commentary in shell comments (prefix with "# "). A step closer to IRL.
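For example, a clip could read something like this (an invented snippet, not what's in the actual video):

    $ # check the model is available locally before the demo
    $ ollama list
    $ # then ask it something on camera
    $ ollama run gemma3n:e2b 'Describe pelicans in one sentence'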
Speaking of which, an interesting thing to contemplate is whether it's worth automating what you did, or if making the videos happens rarely enough that you'd just start from scratch with a new Manus or other AI session.
It uses the llm library, so you do the plugin and model management through that. Let's say you've already got Ollama installed and pulled the `gemma3n:e2b` model. Then you use the llm CLI to add the Ollama plugin:
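From memory it's roughly this (double-check the plugin name against the llm plugins directory, and adjust the prompt to taste):

    llm install llm-ollama
    llm models      # the Ollama-served models, including gemma3n:e2b, should now be listed
    llm -m gemma3n:e2b 'Say hello in five words'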
... also I don't know why it kept saying my value is None :). The "6" is doubled because one must have been a print and the other the return value.
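If it's the classic REPL gotcha, a toy reproduction looks something like this (made up for illustration, not the code from the session):

    >>> def add(a, b):
    ...     result = a + b
    ...     print(result)   # prints the first "6"
    ...     return result   # the REPL then echoes the return value: the second "6"
    ...
    >>> add(2, 4)
    6
    6
    >>> def add_print_only(a, b):
    ...     print(a + b)    # prints, but the function implicitly returns None
    ...
    >>> value = add_print_only(2, 4)
    6
    >>> print(value)
    None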
…I read the whole article at OP’s link, many comments off this thread…I even clicked into the college course material in https://news.ycombinator.com/item?id=44468452 …and not once did it occur to me why it was called “wet mode”…not once…
…until your comment. Here! Take my "lived through the '80s and '90s" card.
Hm. I'll improve the documentation to make it slightly more obvious. When I add `after_midnight` with malicious compliance mode and `bright_light()` to freeze implementations, it should be clearer still.
I helped Chris Callison-Burch design a class at UPenn called Interactive Fiction, which is a similar context to what Simon suggested. The real magic is that it reframes hallucinations as creative storytelling. The use case is SUPER fun if you imagine the LLM as a dungeon master telling a story that gets expanded over time.
The framework he and I built kept track of the game state over time and allowed saving and loading games as JSON. We could then send the full JSON to an LLM as part of the prompts to get it to react. The neatest part, imo, was when we realized we could have the LLM generate text for parts of the story, then analyze what it said to detect any items, locations, or characters not in the game state, and then have it create JSON representations of the hallucinated objects so they could be inserted into the game state. That sealed the deal for using hallucinations as creative storytelling inside the context of a game.
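To give a flavor of the loop, here's a rough sketch (not the actual framework code; call_model is just a stand-in for whatever returns text from the LLM, and the JSON shapes are invented):

    import json

    game_state = {
        "locations": {"tavern": {"description": "a smoky, low-ceilinged room"}},
        "characters": {"innkeeper": {"location": "tavern"}},
        "items": {},
    }

    def narrate(call_model, state):
        # Send the full game state as JSON and ask for the next beat of the story.
        prompt = ("You are the dungeon master. Current game state:\n"
                  + json.dumps(state)
                  + "\nContinue the story in one short paragraph.")
        return call_model(prompt)

    def extract_new_entities(call_model, story_text, state):
        # Ask the model which items/locations/characters it just invented,
        # i.e. anything it mentioned that isn't already in the state,
        # and have it emit them as JSON records.
        known = (list(state["items"]) + list(state["locations"])
                 + list(state["characters"]))
        prompt = ("Story passage:\n" + story_text
                  + "\nKnown entities: " + ", ".join(known)
                  + "\nReturn any NEW items, locations, or characters as JSON "
                  + "with top-level keys \"items\", \"locations\", \"characters\".")
        return json.loads(call_model(prompt))

    def merge(state, new_entities):
        # Hallucinations become canon: insert them into the game state,
        # which is also what gets saved and loaded as the save file.
        for kind, records in new_entities.items():
            state.setdefault(kind, {}).update(records)

Saving and loading a game is then just json.dump / json.load on that one dict.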
I wasn't officially part of UPenn at the time, so my name isn't listed on the site, but we wrote a paper about some of the things we did, such as this one, and you'll see me listed there: https://www.cis.upenn.edu/~ccb/publications/dagger.pdf
Sounds similar to AI Dungeon, which I believe ran on a fine-tuned version of GPT-2 "all the way" back in 2019. And it honestly kind of reminded me of the "Mind Game" in the novel Ender's Game.
Just want to say that I'm not an AI guy at all, but this has made me more excited about it than anything in a while. Really cool! Did you also do the one where you put "spells" in your code?
Sort of. Compact NMC Li-ion cells from laptops and phones often use materials like cobalt, supplies of which are much more limited and problematic than those of lithium. The newer LiFePO4 chemistry does not use it and, importantly, is rather hard to ignite. Its energy density per unit mass is lower, but that's not that important for stationary installations.