
GP is likely saying that “building with AI” these days is mostly prompting pretrained models rather than training your own (using PyTorch).
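Roughly this, in code, with no training loop at all (a minimal sketch using the transformers library; the gpt2 checkpoint is just for illustration):

    # "Building with AI": prompt a pretrained model, train nothing.
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2")  # illustrative checkpoint
    out = generator("Summarize: PyTorch is", max_new_tokens=40)
    print(out[0]["generated_text"])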


Everyone is fine-tuning constantly, though. Training an entire model in excess of a few billion parameters is on pretty much nobody's personal radar; you have a handful of well-funded groups using PyTorch to do that. The masses are still using PyTorch, just on small training jobs.
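For scale, a "small training job" here means something like the following toy loop (a sketch with synthetic data, not anyone's production setup):

    # A minimal PyTorch training job of the kind the masses actually run:
    # fit a linear model to synthetic regression data.
    import torch

    X = torch.randn(256, 4)
    y = X @ torch.tensor([1.0, -2.0, 0.5, 3.0]) + 0.1 * torch.randn(256)

    model = torch.nn.Linear(4, 1)
    opt = torch.optim.SGD(model.parameters(), lr=0.05)
    loss_fn = torch.nn.MSELoss()

    for epoch in range(100):
        opt.zero_grad()
        loss = loss_fn(model(X).squeeze(-1), y)
        loss.backward()
        opt.step()
    print(loss.item())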

Building AI, and building with AI.


Fine-tuning is great for known, concrete use cases where you already have the data in hand, but how much of the industry does that actually cover? Managers have hated those use cases since the beginning of the deep learning era: huge upfront cost for data collection, long training and validation cycles, and slow reaction to new requirements and conditions.
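For comparison, the fine-tuning path that managers balk at looks roughly like this once the labeled data exists (a hedged sketch; the model name and two-example dataset are illustrative stand-ins):

    # One fine-tuning step on a pretrained classifier; in practice the
    # expensive part is collecting the labeled data, not this loop.
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tok = AutoTokenizer.from_pretrained("distilbert-base-uncased")
    model = AutoModelForSequenceClassification.from_pretrained(
        "distilbert-base-uncased", num_labels=2)

    texts = ["great product", "terrible support"]  # stand-in data
    labels = torch.tensor([1, 0])
    batch = tok(texts, padding=True, return_tensors="pt")

    opt = torch.optim.AdamW(model.parameters(), lr=2e-5)
    loss = model(**batch, labels=labels).loss
    loss.backward()
    opt.step()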



