
Hope the author responds, because that stuck out to me too!

My assumption was that their prompt was something like "typescript cloudflare function" and they just used the resultant code to see types in action inside their IDE.



Author here.

I just got access to OpenAI Codex. I used the edit function and asked it to modify the JS hello world to add TypeScript annotations.

This is the git commit following that GPT conversation :)

    -export async function onRequest(context) {
    +export async function onRequest(context: {
    +  request: Request;
    +  env: { [key: string]: string };
    +  params: { [key: string]: string };
    +  waitUntil: (promise: Promise<any>) => void;
    +  next: () => Promise<void>;
    +  data: { [key: string]: any };
    +}) {
Not as minimal as the code I posted, but it got me over the stumbling block.
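For anyone who wants to try the typed version standalone: the context shape from the diff above can be pulled out into its own interface. A minimal sketch (the `EventContext` name and the handler body are my own illustration, not the author's code):

```typescript
// Assumed shape of the Cloudflare Pages Function context, taken from the
// diff above. Request/Response are the standard Fetch API globals.
interface EventContext {
  request: Request;
  env: { [key: string]: string };
  params: { [key: string]: string };
  waitUntil: (promise: Promise<any>) => void;
  next: () => Promise<void>;
  data: { [key: string]: any };
}

// Hypothetical handler body, just to show the types in action.
export async function onRequest(context: EventContext): Promise<Response> {
  const name = context.params.name ?? "world";
  return new Response(`Hello, ${name}!`);
}
```

With the interface in place, the IDE autocompletes `context.` members instead of treating everything as `any`.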


lol, I was 50% sure that you were being sarcastic about using GPT-3, hehe

But if anyone is edgy enough to use AI as a shortcut, I guess it would be the guy who did telemetry ingestion on a t2.micro :D

Regards, Jonas


GitHub Copilot is GPT-3 under the hood, so maybe that's how?


I use Copilot, but not for this. I find the raw GPT-3 conversational interface much more useful for getting over various small programming puzzles.


Prompted correctly, I find that Copilot can do pretty much everything GPT-3 can except generating long-form prose.

You're probably right that it's more flexible, but then again I don't want to constantly switch out of IDEA.


I think we need to work conversational interfaces into IDEs. Tab completion is insufficient.





