I was musing this summer about whether I should get a refurbished ThinkPad P16 with 96GB of RAM to run VMs purely in memory. Now 96GB of RAM costs as much as a second P16.
I feel you, so much. I was thinking of getting a second 64GB node for my homelab, and I thought I'd save that money… now the RAM alone costs as much as the node, and I'm crying.
Lesson learned: always listen to that voice inside your head that says, "but I need it…" lol
I rebuilt a workstation after a failed motherboard a year ago. I was not very excited about being forced to replace it on a day's notice, and cheaped out on the RAM (only got 32GB). This is the third or fourth time I've taught myself the lesson not to pinch pennies when buying equipment/infrastructure assets. It's the second time the lesson was about RAM, so clearly I'm a slow learner.
The thing that's supposed to happen next is high-bandwidth flash. In theory, it could let laptops run the larger models without being extortionately costly, by streaming weights directly from flash into the GPU (not by executing in flash).
But I haven't seen figures for the actual bandwidth yet, and no doubt it will be expensive to start with. The underlying flash technology has much higher read latency than DRAM, so it's not really clear (to me, at least) whether they can deliver the speeds needed to remove the need to cache in VRAM just by increasing parallelism.
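To put rough numbers on "the speeds needed", here's a back-of-envelope sketch. The model size, quantization, and target speed are all assumptions on my part, not from any datasheet; it assumes a dense model whose full weight set is read once per generated token, with nothing cached in VRAM:

    # Back-of-envelope: sustained read bandwidth needed to generate tokens
    # straight from storage, with no weight caching in VRAM.
    # All three inputs below are assumptions.
    model_params = 70e9       # dense 70B-parameter model
    bytes_per_param = 0.5     # 4-bit quantization
    tokens_per_sec = 20       # target generation speed

    weights_gb = model_params * bytes_per_param / 1e9   # ~35 GB of weights
    required_gbps = weights_gb * tokens_per_sec         # read once per token

    print(f"{weights_gb:.0f} GB of weights -> {required_gbps:.0f} GB/s sustained reads")

Under those assumptions you need roughly 700 GB/s of sustained reads, versus something like 14 GB/s for the fastest PCIe 5.0 NVMe drives today. That gap is why the parallelism question is the whole ballgame.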
Video games have driven the need for hardware more than office work ever did. Sadly, games are already being scaled back, with more time spent on optimization instead of content: consumers can't be expected to have the RAM they normally would, and everyone will be forced to make do with whatever RAM they have for a long time.
That might not be the case. The kind of memory that will flood the second-hand market may not be the kind we can stuff into laptops or even desktop systems.
By "we" do you mean consumers? No, "we" will get neither. This is unexpected, irresistable opportunity to create a new class, by controlling the technology that people are required and are desiring to use (large genAI) with a comprehensive moat — financial, legislative and technological. Why make affordable devices that enable at least partial autonomy? Of course the focus will be on better remote operation (networking, on-device secure computation, advancing narrative that equates local computation with extremism and sociopathy).
I can't dynamically resize my desktop to an arbitrary resolution in Wayland, whereas under X11+NVIDIA it's just "nvidia-settings --assign CurrentMetaMode=..."
This is mandatory for me, as I'm constantly connecting to my workstation from other devices via Moonlight or Chrome Remote Desktop, and I want the streaming resolution to match my client's resolution.
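For what it's worth, a minimal sketch of how that could be scripted on the X11 side. The display name "DP-0" and the exact MetaMode string are hypothetical and will differ per setup; compare against the output of "nvidia-settings --query CurrentMetaMode" on your own machine:

    # Hypothetical helper: render the desktop at the remote client's
    # resolution via the MetaMode ViewPortIn option, letting the driver
    # scale to the panel's native mode. "DP-0" is an assumed display name.
    import subprocess

    def match_client_resolution(width: int, height: int, display: str = "DP-0") -> None:
        metamode = f"{display}: nvidia-auto-select {{ ViewPortIn={width}x{height} }}"
        subprocess.run(
            ["nvidia-settings", "--assign", f"CurrentMetaMode={metamode}"],
            check=True,
        )

    match_client_resolution(1920, 1080)  # e.g. when streaming to a 1080p client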
Your "bunch of projectiles" is what, 20? You should be able to do 20,000 on even the lowliest integrated GPU these days.
As a senior game/graphics programmer looking at the screen caps, I see a game that should never be using more than 1% of the CPU and a few percent of the GPU.
> You should be able to do 20,000 on even the lowliest integrated GPU these days.
It all depends on how things are organized. If each object needs to run its own logic every frame, you can run into severe performance issues well before 10k. 60fps means you have 16 milliseconds to process everything in the scene; that can be plenty of time, but one small mistake is enough to lose the advantage.
We have to reach for data-oriented techniques like ECS if we want to keep the CPU near idle while managing 10k+ entities in the scene. This stuff is a big pain in the ass to use if you don't actually need it.
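A minimal sketch of the data-oriented idea, in Python/numpy; the entity count and fields are made up for illustration. The point is that per-frame work becomes a few bulk passes over contiguous arrays instead of 20k individual update() calls:

    # Struct-of-arrays projectile update: state lives in contiguous arrays,
    # and each frame is a handful of vectorized operations rather than a
    # per-object method call for every projectile.
    import numpy as np

    N = 20_000
    pos = np.zeros((N, 2), dtype=np.float32)                   # x, y
    vel = np.random.uniform(-1, 1, (N, 2)).astype(np.float32)  # units/sec
    ttl = np.random.uniform(1, 5, N).astype(np.float32)        # seconds left

    def update(pos, vel, ttl, dt):
        pos = pos + vel * dt    # one vectorized pass over all projectiles
        ttl = ttl - dt
        alive = ttl > 0.0       # compact away expired projectiles
        return pos[alive], vel[alive], ttl[alive]

    pos, vel, ttl = update(pos, vel, ttl, 1 / 60)  # one 60fps frame of work

On a modern CPU an update like this costs on the order of tens of microseconds out of the 16ms frame budget, which is the gap the parent comment is pointing at.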
Definitely the first time I've been asked to buy an expiring access pass just to zoom in on an image.
I get that this is cool and it was difficult to make, but there is zero chance I would purchase without knowing how far I can actually zoom in. Maybe allow people one full zoom before the paywall?
The bulk of the issues mentioned in the article seem to fall into the "Why doesn't the BBC give equal time to both objective reality and my bullshit right-wing talking points?" category.
While I agree that the BBC did some shit that a news organization shouldn't have done, disproportionate coverage does not by itself imply bias - thinking otherwise is classic both-sides-ism.