
200 MB of data is not a large file, and Chromium tabs have a fairly low per-tab memory cap, so actually large 20-100 GB datasets render this useless.


Sometimes you need a scooter, sometimes you need a truck.

I think a snappy interface for <1 GB datasets is really neat and super useful for certain kinds of data.


This echoes my thoughts exactly. Right now we're actually more limited by the JS UI: a couple hundred MB is the most you can do in the browser before the UI becomes really slow. There's a lot of room for improvement - we're using React and it's causing a bunch of unneeded re-renders right now. We'll probably need to build our own DAG-based task management system and render everything with Canvas; with all that, workflows on much larger files will hopefully become usable.
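To illustrate the kind of unneeded re-renders I mean, here's a minimal sketch (hypothetical component names, not our actual code) where memoizing rows keeps typing in a filter box from redrawing every row:

    import React, { memo, useState } from "react";

    // Hypothetical sketch: memo() skips re-rendering a row unless its
    // props change, so typing in the filter box doesn't redraw every row.
    const Row = memo(function Row({ cells }: { cells: string[] }) {
      return <tr>{cells.map((c, i) => <td key={i}>{c}</td>)}</tr>;
    });

    export function Grid({ rows }: { rows: string[][] }) {
      const [filter, setFilter] = useState("");
      // Each surviving row keeps the same `cells` array reference,
      // so memo's shallow prop compare can skip it.
      const visible = rows.filter((r) => r.some((c) => c.includes(filter)));
      return (
        <div>
          <input value={filter} onChange={(e) => setFilter(e.target.value)} />
          <table>
            <tbody>
              {visible.map((r, i) => (
                <Row key={i} cells={r} />
              ))}
            </tbody>
          </table>
        </div>
      );
    }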


You can now use the File System Access API to work with terabyte-sized files if your disk can handle it.

https://developer.chrome.com/docs/capabilities/web-apis/file...

Browser is pretty powerful nowadays.
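For example, here's a rough sketch (untested, Chromium-only at the time of writing) of streaming through a picked file chunk by chunk, so the whole thing never has to sit in tab memory:

    // Sketch: read a huge local file in chunks via the File System Access API.
    // showOpenFilePicker() is Chromium-only and not yet in TS's DOM typings.
    async function byteCount(): Promise<number> {
      const [handle] = await (window as any).showOpenFilePicker();
      const file: File = await handle.getFile();
      const reader = file.stream().getReader();
      let total = 0;
      for (;;) {
        const { done, value } = await reader.read();
        if (done) break;
        total += value.byteLength; // replace with real per-chunk processing
      }
      return total;
    }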


This is certainly true - I'm not saying "large file" in the colloquial sense of "big data", but rather a file you might want to open in Excel/Google Sheets. I've worked with actual large datasets before - upwards of 500 GB - pretty often, and I really wouldn't think about using my laptop for such a thing!

We are thinking of building data connectors to major DBs though, so you should be able to do a similar style of visual analysis while keeping the compute on your DB.


I looked up the limit: as of 2021, tabs seem to have been limited to 16 GB, which is moderate for an in-memory dataset. However, WASM has a hard limit of 4 GB without Memory64. Data size is all relative.
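The 4 GB figure falls out of wasm32's 32-bit addressing: memory grows in 64 KiB pages, capped at 2^16 pages. A quick sanity check:

    // wasm32 pointers are 32 bits, memory is sized in 64 KiB pages,
    // and the page count is capped at 2^16 = 65536.
    const PAGE = 64 * 1024;
    const MAX_PAGES = 65536;
    console.log(PAGE * MAX_PAGES); // 4294967296 bytes = the 4 GiB ceiling
    // Requesting the cap is legal; Memory64 is what lifts it.
    const mem = new WebAssembly.Memory({ initial: 1, maximum: MAX_PAGES });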



