Hacker Newsnew | past | comments | ask | show | jobs | submitlogin

These particular instructions make me think interesting stuff might happen if one could "convince" the model to generate JSON in these calls.


Escaping Strings is not an issue. It's guaranteed about UX. Finding a json in your bio is very likely perceived as disconcerting for the user as it implies structured data collection and isn't just the expected plaintext description. The model most likely has a bias of interacting with tools in json or other common text based formats though.


Most models do, actually. Its a serious problem.


I remember accidentally making the model "say" stuff that broke ChatGPT UI, probably it has something to do with that.


Why? The explanation given to the LLM seems truthful: this is a string that is directly displayed to the user (as we know it is), so including json in it will result in a broken visual experience for the user.


I think getting a JSON formatted output costs multiples of a forced plain text Name:Value.

Let a regular script parse that and save a lot of money not having chatgpt do hard things.


Strict mode, maybe, I don’t think so based on my memory of the implementation.

Otherwise it’s JSONSchema validation. Pretty low cost in the scheme of things.


Now I wanna see if it can rename itself to Bobby Tables..




Consider applying for YC's Winter 2026 batch! Applications are open till Nov 10

Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: