My assumption was that their prompt was something like "typescript cloudflare function" and they just used the resultant code to see types in action inside their IDE.
I just got access to OpenAI Codex. I used the edit function and asked it to modify the JS hello world to add TypeScript annotations.
This is the git commit following that gpt conversation :)
-export async function onRequest(context) {
+export async function onRequest(context: {
+  request: Request;
+  env: { [key: string]: string };
+  params: { [key: string]: string };
+  waitUntil: (promise: Promise<any>) => void;
+  next: () => Promise<void>;
+  data: { [key: string]: any };
+}) {
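For context, here's roughly what the typed handler from that diff looks like as a standalone sketch. The `EventContext` interface name, the env var, and the response body are my own stand-ins (the real types ship in `@cloudflare/workers-types`); only the field shape comes from the commit above.

```typescript
// Hand-rolled stand-in for Cloudflare's Pages Function context type.
// Field names/shapes mirror the diff; the interface name is an assumption.
interface EventContext {
  request: Request;
  env: { [key: string]: string };
  params: { [key: string]: string };
  waitUntil: (promise: Promise<any>) => void;
  next: () => Promise<void>;
  data: { [key: string]: any };
}

// Typed version of the hello-world handler; body is illustrative.
export async function onRequest(context: EventContext): Promise<Response> {
  const name = context.env["NAME"] ?? "world";
  return new Response(`Hello, ${name}!`);
}
```

With the annotation in place, the IDE can autocomplete `context.env`, `context.params`, etc., which is presumably the whole point of the commit.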
But if anyone is edgy enough to use AI as a shortcut, I guess it would be the guy who did telemetry ingestion on a t2.micro :D
Regards, Jonas.
You're probably right that it's more flexible, but then again I don't want to constantly switch out of IDEA.