
>> Image hosting is expensive at scale, and someone's got to pay for the compute/storage/network.

BitTorrent would beg to differ.



That's a neat idea but probably unworkable in practice. Container images need to be reliably available quickly; there is no appetite for the uncertainties surrounding the average torrent download.


> That's a neat idea but probably unworkable in practice. Container images need to be reliably available quickly; there is no appetite for the uncertainties surrounding the average torrent download.

BitTorrent seems to work quite well for Linux ISOs, which, for obvious reasons, are about the same size as container images.

IMO, the big difference is that with BitTorrent it's possible to add lots of semi-reliable bandwidth very inexpensively.


Nobody is going to accept worrying about whether the torrent has enough seeders in the middle of a CI run. And your usual torrent download is an explicit action with an explicit client: how are people going to seed these images, and why would they? And what about the long tail?


Nobody needs to be seeding if only one download is active. You could self-host an image at home on a Raspberry Pi and have it available in a minute.
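
Something like this should work, assuming mktorrent and transmission-cli are available on the Pi (the file name and tracker URL are made up):

    # Create a torrent for the image tarball
    mktorrent -a udp://tracker.example.org:6969/announce \
        -o myimage.torrent myimage.tar

    # Verify the local copy and start seeding it
    transmission-cli -w . myimage.torrent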

Nobody's CI should be depending on an external download of that size.


We are talking about replacing Docker Hub and the like; what people "should" be doing and what happens in the real world are substantially different. If this hypothetical replacement can't serve basic existing use cases, it is dead at the starting line.


> enough people seeding

The .torrent file format, and clients, include explicit support for HTTP mirrors serving the same files that are distributed via P2P (BEP 19, "web seeds").
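
For illustration, mktorrent can embed a web seed URL at creation time via its -w flag (the mirror URL and file names here are hypothetical):

    # The same tarball is also served by a plain HTTP mirror;
    # clients can mix P2P pieces with ranged HTTP downloads from it.
    mktorrent -a udp://tracker.example.org:6969/announce \
        -w https://mirror.example.com/images/myimage.tar \
        -o myimage.torrent myimage.tar

    # transmission-show dumps the metadata, including web seeds
    transmission-show myimage.torrent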


Archive.org does this with theirs. If there are no seeds (super common with their torrents; maybe a few popular files of theirs do have lots of seeds and that saves them a lot of bandwidth, but sometimes I wonder why they bother), it basically does the same thing as downloading from their website. I've seen it called a "web seed". That's the only place I've seen use it, but evidently the functionality is there.


I'm pretty much convinced the people at Docker explicitly made their "registry" something other than plain downloadable static files purely to enable the rent-seeking behavior we're seeing here...


Cache images locally. Docker has enough provisions for image mirrors and caches.

Downloading tens or hundreds of megabytes of exactly the same image, on every CI run, at someone else's expense, is predictably unsustainable.
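
For reference, a sketch of a local pull-through cache using the stock registry image; REGISTRY_PROXY_REMOTEURL and registry-mirrors are documented Docker/registry options, while the host and port here are arbitrary:

    # Run a pull-through cache that proxies Docker Hub
    docker run -d --name hub-mirror -p 5000:5000 \
        -e REGISTRY_PROXY_REMOTEURL=https://registry-1.docker.io \
        registry:2

    # Point the local daemon at the mirror, then restart dockerd
    sudo tee /etc/docker/daemon.json <<'EOF'
    {
      "registry-mirrors": ["http://localhost:5000"]
    }
    EOF

After that, pulls of Docker Hub images go through the cache and only fetch layers it hasn't already stored.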


People who need "reliably available quickly" can pay or set up their own mirror. Everyone else can use the torrent system.


Not a bad idea. Have the users seed the cached images.



