I was starting to think this was impressive, if not impossible. 1B vectors in 48 MB of storage => less than 1 bit per vector (48 MB = 384 Mbit, so roughly 0.38 bits each).
Maybe not impossible using shared/lossy storage, if they were sparsely scattered over a large space?
But anyway - minutes. Thanks.
Edit: Gemini suggested that this sort of (lossy) storage size could be achieved using "Product Quantization" (split each vector into sub-vectors, cluster each sub-space, and store only the cluster indices), giving an example of 256-dimensional vectors stored at an average of 6 bits per vector, with ANN search being one application that might use this.
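For the curious, here's a minimal sketch of product quantization in Python (assuming numpy and scikit-learn are available; the function names and parameters are illustrative, not whatever Gemini had in mind). It uses the common setting of 8 sub-vectors with 256 centroids each, i.e. 64 bits per vector; getting down to ~6 bits per vector would need much smaller codebooks and/or extra sharing on top:

    import numpy as np
    from sklearn.cluster import KMeans

    def pq_train(X, n_subvectors=8, n_centroids=256):
        # Learn one k-means codebook per sub-vector slice of X.
        n, d = X.shape
        sub_d = d // n_subvectors
        return [KMeans(n_clusters=n_centroids)
                    .fit(X[:, i*sub_d:(i+1)*sub_d])
                for i in range(n_subvectors)]

    def pq_encode(X, kms):
        # Replace each sub-vector with the index of its nearest centroid.
        sub_d = X.shape[1] // len(kms)
        codes = np.empty((X.shape[0], len(kms)), dtype=np.uint8)  # 8-bit indices
        for i, km in enumerate(kms):
            codes[:, i] = km.predict(X[:, i*sub_d:(i+1)*sub_d])
        return codes

    def pq_decode(codes, kms):
        # Lossy reconstruction: concatenate the chosen centroids.
        return np.hstack([km.cluster_centers_[codes[:, i]]
                          for i, km in enumerate(kms)])

    # 256-dim float32 vectors (1024 bytes) -> 8 bytes of codes: 128x smaller.
    X = np.random.randn(10_000, 256).astype(np.float32)
    kms = pq_train(X)
    codes = pq_encode(X, kms)
    X_hat = pq_decode(codes, kms)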
Yeah, it's listed under "Non-SI units accepted for use with SI", so it's more ISO than SI, but there are standards for how it's used with SI units, especially since "M" already has a meaning there.