What on Earth are you storing in JSON that this sort of performance issue even comes up?

How big is 'large' here?

I built a simple CRUD inventory program to keep track of one's gaming backlog and progress, and the dumped JSON of all 500+ game statuses is under 60 kB and imports in under a second on decade-old hardware.

I'm having difficulty picturing a JSON dataset big enough to slow down modern hardware. Maybe Gentoo's Portage tree, if it were JSON-encoded?
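For a sense of that scale, here is a rough sketch of the kind of round trip being described, using Python's stdlib json; the record fields are invented for illustration, not taken from the actual program.

    import json, time

    # ~500 small records, roughly the shape of a gaming-backlog dump
    games = [{"title": f"Game {i}", "status": "backlog", "hours_played": 0}
             for i in range(500)]

    blob = json.dumps(games)
    print(f"dump size: {len(blob) / 1024:.0f} kB")

    start = time.perf_counter()
    json.loads(blob)
    print(f"parsed in {(time.perf_counter() - start) * 1000:.2f} ms")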



> What on Earth are you storing in JSON that this sort of performance issue even comes up?

I've been in the industry for a while. I've probably left more than one client site muttering "I've seen some things ...".

If it can be done, it will be done. And often in a way that shouldn't have even been considered at all.

Many times, "it works" is all that is needed. Not exactly the pinnacle of software design. But hey, it does indeed "work"!


Insurance price-transparency data can include 16 GB of compressed JSON that represents a single object.

Here is the Anthem page; the table-of-contents (TOC) link is 16 GB:

https://www.anthem.com/machine-readable-file/search/

They're complying with the mandate, but not optimizing for the parsers.
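A document that size can't realistically go through json.load(); a streaming parser is the usual workaround. A minimal sketch using the ijson library, assuming the file is gzipped and holds an array under a top-level "reporting_structure" key; the filename and key are placeholders, not the actual Anthem schema.

    import gzip
    import ijson  # pip install ijson: incremental, event-based JSON parser

    count = 0
    with gzip.open("toc.json.gz", "rb") as f:  # hypothetical filename
        # Yields one array element at a time, so memory is bounded by the
        # size of a single element rather than the whole 16 GB document.
        for entry in ijson.items(f, "reporting_structure.item"):
            count += 1
    print(count, "entries")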


I've seen people dump and share entire databases in JSON format at my job...


I've seen tens of millions of market data events from a single day of trading encoded in JSON and used in various post-trade pipelines.
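A sketch of what that can look like in practice, assuming (this part is a guess, not stated above) the events arrive as newline-delimited JSON, one object per line; at tens of millions of lines, the per-event parse cost is what the pipeline ends up paying for.

    import json
    import time

    def parse_events(path):
        # One JSON object per line; yields dicts lazily
        with open(path) as f:
            for line in f:
                yield json.loads(line)

    start = time.perf_counter()
    n = sum(1 for _ in parse_events("market_data.ndjson"))  # hypothetical file
    print(f"parsed {n} events in {time.perf_counter() - start:.1f}s")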


Ah, that's a dataset of genuinely intimidating size, and in an environment where performance means money. Thanks for pointing that out!


In my case, Sentry events that represent crash logs for Adobe Digital Video applications. I’m trying to remember off the top of my head, but I think it was in the gigabytes for a single event.


Chrome trace format files are also JSON; they can grow quite large and are a pain to work with.
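For reference, the trace event format is one JSON object per event inside a "traceEvents" array (a bare array of events is also accepted), which is exactly why real traces balloon. A minimal hand-written example:

    import json

    trace = {
        "traceEvents": [
            # "ph" is the event phase: "B"/"E" bracket a duration event;
            # "ts" is a timestamp in microseconds.
            {"name": "ParseHTML", "ph": "B", "ts": 100, "pid": 1, "tid": 1},
            {"name": "ParseHTML", "ph": "E", "ts": 450, "pid": 1, "tid": 1},
        ]
    }
    print(json.dumps(trace, indent=2))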



