
arguably you could reduce latency even further by keeping the model on-device as well, but that would mean revealing the weights of the fine-tuned model.

If the user preferred reduced latency and had the RAM, is that an option?


This is true, but only if you have a GPU (/accelerator) comparable in performance to the one backing the service, or at least comparable after accounting for the local benefit. This is an expensive proposition because it will be sitting idle between completions and when you're not coding.


Not for this fine-tuned model yet, but Cody supports local models: https://sourcegraph.com/docs/cody/clients/install-vscode#sup....

I just used Cody with Ollama for local inference on a flight where the wifi was broken, and it never fails to blow my mind: https://x.com/sqs/status/1803269013310759236.


Looking at their GitHub page, it seems like they are using existing LLM services. It should be possible to modify Cody to make it work with a local LLM.


You don’t have to modify anything. We support local LLMs for chat and completions with Ollama.

https://sourcegraph.com/blog/local-code-completion-with-olla...
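
For anyone curious what the local path looks like, here's a minimal sketch of querying a locally running Ollama server directly, the kind of endpoint a client like Cody can point at. It assumes a default Ollama install on port 11434 and that you've already run `ollama pull codellama`; it is not Cody's actual integration code.

    # Minimal sketch: hit a local Ollama server's generate endpoint.
    # Assumes `ollama serve` is running on the default port and the
    # `codellama` model has been pulled. Not Cody's implementation.
    import json
    import urllib.request

    def local_complete(prompt: str, model: str = "codellama") -> str:
        payload = json.dumps({
            "model": model,
            "prompt": prompt,
            "stream": False,  # one JSON response instead of a token stream
        }).encode("utf-8")
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    print(local_complete("def fizzbuzz(n):"))

Everything stays on the machine, which is why it keeps working on a plane with broken wifi.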


the model is probably most of the "secret sauce" of cody, so if they gave that away people could just copy it around like mp3s. my guess


Completely incorrect, as Sourcegraph has not historically trained models and Cody swaps between many open source and 3rd party models.


it seems like this is only for YC companies to launch


does anyone know how the artist creates these drawings (Procreate, etc.)? They're super impressive. big fan of the Metroid ones.


You can do this with pretty much any semi-professional painting program and a drawing tablet/display: Procreate, Photoshop, Krita, Corel PaintShop Pro, etc. I looked through his website and couldn't find anywhere he said what he used, and there isn't enough evidence to say one way or the other.

My favorite little-known factoid is that when I googled his name, Wikipedia says he's the guy who drew the Pastafarian parody of The Creation of Adam, titled "Touched by His Noodly Appendage".


I also put up lots of ref pics on it when painting in Photoshop on the main (the Wacom is mapped to main).

http://androidarts.com/ProfileFAQ.htm


From my time hanging out with Arne on IRC I would assume he's using Photoshop (or opencanvas version oc11b72 ;-) )... I'd be surprised if he's on a PS version newer than CS2 though. I think Adobe added a subscription model to later versions and back in the day (ca 2000) he was mostly just using regular round brushes. If that's still how he's working, I don't see any reason at all why he'd have upgraded. Perhaps he's written his own raster editor by now, though (another low-odds bet tbh).

You can read his art tutorial on his main page. But essentially, I think he strives to get a clean result (no scribbling to "find" the form) with as few brush strokes as possible, which takes a lot of practice. He also does a lot of pencil drawing, obviously.


I would bet Photoshop + Wacom or Procreate given the simple brushwork.


Nice! glad to hear CBT-I worked for you, and thank you!


subscribed to the channel. great video. i would be interested in your iOS learning journey.


i do throw a paywall on most of my projects. i'll probably remove the paywall and take this project down entirely. but i was curious to see how people behaved with this bot, and my question is answered! still, no one has compared it to Character AI, which is what i was most interested in


that's really interesting. thanks for trying it. yeah, it doesn't seem to have a consistent notion of time and hallucinates regularly.


Would be happy to take it down if asked! this is an experiment for me, and don't expect it to be a product


>Would be happy to take it down if asked! this is an experiment for me, and don't expect it to be a product

Putting a pricing page up was almost certainly an extremely bad idea.

By putting the pricing page up, you moved yourself out of the "hey, I'm just a fan, it's just fan fiction" category and into the "unlicensed commercial use of a registered trademark" category. The latter is a much more serious category to put oneself in, and it involves a much more serious set of lawyers than the bots who send out cease-and-desist and takedown letters.

Removing the pricing page will not make the fact that you had shown a pricing page go away.

IANAL but I'd take the site down very quickly and never put it back up again. Just receiving a letter from one of the brand's many law firms will almost certainly end up costing you thousands of dollars drafting an appropriate response in hopes that it doesn't escalate. You really don't want that to happen.


Having a pricing page with disclosed pricing tiers, one of which has an active (even if broken) signup link, might produce a very different impression on, say, the legal team of an IP owner who came across this site.


good insight! I use a lookback window of one response, because it gets pretty expensive to do more.


Why is it expensive? Can't you just paste the plain text of the whole interaction as input?


OpenAI charges per token, so whatever I send as context is counted. That easily 2x's, 3x's, or 5x's the cost, depending on the length of the chat.
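
To make the scaling concrete, here's a rough, illustrative sketch of why a longer lookback window multiplies prompt cost. The ~4-characters-per-token ratio and the per-token price are placeholder assumptions for illustration, not OpenAI's actual tokenizer or pricing.

    # Rough illustration: prompt cost grows with how much history you resend.
    # CHARS_PER_TOKEN and the price are placeholder assumptions, not real rates.
    CHARS_PER_TOKEN = 4                 # rough heuristic
    PRICE_PER_1K_PROMPT_TOKENS = 0.01   # hypothetical rate, USD

    chat_history = [
        "user: hi",
        "bot: hello, how can I help?",
        "user: tell me about yesterday",
        "bot: " + "a fairly long reply... " * 40,
    ]

    def prompt_cost(messages):
        tokens = sum(len(m) for m in messages) / CHARS_PER_TOKEN
        return tokens / 1000 * PRICE_PER_1K_PROMPT_TOKENS

    # Lookback of one response vs. resending the whole conversation:
    print("last turn only:", prompt_cost(chat_history[-1:]))
    print("full history:  ", prompt_cost(chat_history))

Every extra turn you include gets billed again on every request, so a long conversation with full history quickly costs several times what a one-response lookback does.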


Yeah, the OpenAI API regularly takes >10 seconds, and I use a serverless backend. It's unfortunate!

