Hacker News
[flagged] Deploy your own GPT-3 API with Docker and fly (github.com/queercat)
24 points by mayyue on Feb 4, 2023 | 9 comments


So this builds a 100 MB Docker image that proxies an HTTP request to OpenAI? Am I missing something? I don't want to discourage new engineers from tinkering, but can someone please explain why this is reaching the front page of HN?
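For context, "a thin proxy in front of OpenAI" boils down to something like the sketch below. This is illustrative only -- it is not the project's actual code; the endpoint and header shapes are just the standard OpenAI completions conventions, and the placeholder API key is hypothetical.

```python
# Minimal sketch of an HTTP proxy that forwards a client's JSON body to
# the OpenAI completions endpoint unchanged (illustrative, not the
# project's actual code).
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

UPSTREAM = "https://api.openai.com/v1/completions"

def build_upstream_request(body: bytes, api_key: str) -> urllib.request.Request:
    """Wrap the client's JSON body in an authenticated upstream request."""
    return urllib.request.Request(
        UPSTREAM,
        data=body,
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )

class ProxyHandler(BaseHTTPRequestHandler):
    api_key = "changeme"  # a real deployment would read this from the environment

    def do_POST(self):
        # Read the client's body, forward it upstream, echo the reply back.
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        with urllib.request.urlopen(build_upstream_request(body, self.api_key)) as resp:
            reply = resp.read()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(reply)

# To serve: HTTPServer(("0.0.0.0", 8080), ProxyHandler).serve_forever()
```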


I agree with you; the caching mechanism and ways to communicate with the API from your scripts would have been far more interesting to read about.


Somewhat agree re: these GPT novelty hacks/projects; however, note the kitty is doing more than just proxying a request:

https://github.com/queercat/node-chatgpt-api#features

(I feel like we're back in the '90s and someone has just released something called Mosaic ... same vibes. A frenzy is coming, and right now we're sitting on potential low-hanging-fruit riches, just like those HTML (yes, just HTML) millionaires -- I know at least two.)


I thought it was interesting and that a few (3-4) people would like it. I absolutely did not expect anyone to actually look at this with any amount of thought or care.

It is a hack that this works at all, and it is almost certainly something that does not make OpenAI happy. Rewriting to reflect this.


The real work happens here: it looks like this library has figured out the secret model name that allows you to make API calls to the ChatGPT model in the same way you would usually make calls to the other GPT-3 models: https://github.com/waylaidwanderer/node-chatgpt-api/blob/mai...

That project's README (and the commit history to that README) tells the full story: sounds like people have been hanging out on Discord collaborating on reverse engineering ChatGPT: https://github.com/waylaidwanderer/node-chatgpt-api

One thing to note: you still have to provide your own OpenAI API key in order to call the model, so if/when OpenAI decides to shut this hole down, it will likely be easy for them to restrict who can call the model and potentially take measures against accounts that have used it in this way.
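To illustrate the point: the request is the standard completions payload with only the `model` field changed. This is a sketch, not the library's code -- the undocumented model name lives in the linked source and is deliberately left as a placeholder here.

```python
# The discovery in a nutshell: the completions payload is unchanged;
# only the "model" field differs. "<undocumented-model-name>" is a
# placeholder -- the real name is in the linked node-chatgpt-api source.
import json

def completions_payload(model: str, prompt: str) -> bytes:
    """JSON body POSTed to https://api.openai.com/v1/completions."""
    return json.dumps(
        {"model": model, "prompt": prompt, "max_tokens": 256}
    ).encode()

# A normal GPT-3 call:
gpt3_body = completions_payload("text-davinci-003", "Hello")

# The ChatGPT call per the linked library -- identical shape:
chatgpt_body = completions_payload("<undocumented-model-name>", "Hello")
```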


Why like this, and not just use the API directly through Python or something?


Because they are trying to get inference for free by scraping ChatGPT. Of course, "free" here also means "mostly unavailable", given the reliability of ChatGPT recently.


Even then, it could just be used as a library / piece of code. No need for a separate Docker instance, right?

Unless it's being used to spin up on multiple servers so IP addresses change often?


There's also the option of caching and sessions with https://github.com/HazyResearch/manifest



