
An Nvidia GPU with CUDA would've been nice...



Would definitely have been nice.

But if you're thinking about scientific programming, it's not much effort to use AWS GPUs. I do that, and I don't think I would buy an eGPU even if it were CUDA-capable.


Isn't AWS expensive in the long term? I want to have a 1080 Ti but I don't even have a desktop to begin with.


AWS is expensive if you are an extremely heavy user.

Checking out whether the pricing works for you is an easy exercise. A p3 instance costs $3.06 per hour at the time of writing. It runs an Nvidia V100, which costs around $10k on its own. So you've got roughly 3,300 hours of use before buying your own would make sense.

If you want something cheaper, you can go with the p2 instances at $0.90 an hour. These cost around $2k, so you're looking at around 2200 hours of use before it might become economical to buy your own.

I don't want to sound like an AWS fanboy, but I do believe it's a good democratising catalyst for deep learning and scientific computing.

EDIT: p2 instances run on K80 GPUs, which I neglected to mention above.
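If you want to plug in your own numbers, here's a minimal sketch of that break-even arithmetic in Python. The prices are just the ones quoted above and will drift, and it only counts the GPU itself, not the rest of the machine:

    # Back-of-the-envelope: hours of on-demand rental at which the cumulative
    # cost matches buying the GPU outright. Prices are illustrative, taken
    # from the figures above; check current AWS and GPU pricing yourself.

    def break_even_hours(gpu_price_usd, hourly_rate_usd):
        return gpu_price_usd / hourly_rate_usd

    print(f"p3 / V100: {break_even_hours(10_000, 3.06):,.0f} hours")  # ~3,268
    print(f"p2 / K80:  {break_even_hours(2_000, 0.90):,.0f} hours")   # ~2,222

A fuller comparison would also factor in the rest of the workstation, electricity, and how bursty your workload is, which tends to push the break-even point further out in AWS's favour for intermittent use.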


It depends. Small instances are a few bucks an hour, so if you just need a GPU for like a hundred hours then it's great. If you are running the instance 24/7 for months on end, the equation changes. The big instances are cool, though, because they give you access to hardware that would be pretty unobtainable otherwise.

I really wish they had a Lambda for GPUs.



