TH3P4G3 Thunderbolt eGPU Dock Review (2022) (egpu.io)
59 points by mvolfik on July 21, 2023 | hide | past | favorite | 40 comments


Just as this technology is being commoditized in a high quality way, Apple decides to drop support :(


Devil's advocate: the issue is that there are no OEM ARM drivers, afaik.

I am quite convinced Apple would have loved to put a Radeon GPU in their Mac Pro, but the limiting factor is the drivers themselves.

You can effectively use eGPU docks for other PCIe devices, like 10GbE cards or NVMe enclosures. It's a waste of space though.

FWIW, I have a Razer Core X Chroma and I can't even fit modern GPUs into it anyway.


Did Apple ever have eGPU support? And even if they did, they had a nasty falling-out with Nvidia some time ago.


I have a 2020 Intel Mac Mini where it works reasonably well with a Razer Core X and some midrange 2018-era AMD card. Nvidia cards were never supported, but AMD got to the point where they were reasonably plug and play for a while. It isn't without its warts, but I even got it working with Boot Camp to the point that I was playing Windows games natively on Mac with the eGPU. Not enough juice to run Cyberpunk on full settings, but it can run 2016-2018 games that take a decent amount of power pretty well. No Man's Sky is an example. I find it especially helpful for running CAD software.

Support has always been somewhere in between the hacking required to get a hackintosh to work and the “it just works” of native Mac stuff. Usually just requires a specific sequence of keyboard presses, clicks, or events to get it working


Yep!

I have a 2018 Macbook Pro and a pretty interesting/fun/skeletal eGPU setup made out of a Thunderbolt-to-M.2 adapter (intended for SSDs), M.2-to-PCIe adapter for the card, and a fanless 600W ATX power supply. I use it with an AMD 5700 XT and 6800 XT. Hooked up to a 55” LG OLED TV. It’s been very fun to operate.

It Just Works to a degree, and takes some tinkering as well. Quite a bit of tinkering actually.

The dedicated cards make windowing around in macOS snappier and smoother. It’s noticeable everywhere. IntelliJ is way, way snappier (it has Metal-based GPU acceleration these days.) The 6800 XT is noticeably smoother and faster than the 5700 XT.

For me, the responsiveness is worth it working as a programmer. 100%. Caveat: I enjoy the tinkering. Bought the setup to learn things and mess around.

Also, it's a bit unstable. The GPU driver will lock up from time to time; I think it's power supply fluctuations in my setup having a bad effect on a card running super tight video timing to push 4K 120Hz over a DisplayPort-to-HDMI-2.1 adapter. The wall power circuit breaker has too many devices on it (my fault). What happens is that the GPU or GPU driver sometimes locks up, the card becomes unresponsive, and I can't "safely remove" the device from the macOS menubar. When I unplug it, the MacBook Pro hard-poweroffs with a PFFFFT-gasp from its fans. It's pretty clearly a systemic weakness, and might even be unfixable at the motherboard chipset level given how brutal the hard-poweroff is. I'm not that surprised that they dropped support for eGPUs, given that adding support in their ARM M-series platform would have needed some different core design decisions.

PlayStation 3 emulation runs great in macOS.

That said, if you’re biologically wired to be sensitive to responsive and smooth GUIs everywhere, then a fast GPU works really well with macOS. What you really want if you’re like this is not an eGPU but an M2 pro or better with HDMI 2.1 output and an LG OLED TV. This gives you 4K at 120Hz, HDR, variable refresh rate, and super low output latency. Full viewing angle too. It’s magical.
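To put numbers on why 4K 120Hz is such "super tight video timing", here is a rough back-of-the-envelope estimate of the uncompressed signal bandwidth. The blanking overhead is an approximation; real HDMI 2.1 FRL timings and optional DSC compression change the details:

```python
# Rough uncompressed bandwidth for a 4K 120 Hz 10-bit (HDR) signal.
# Blanking overhead is approximated at ~20%; real CVT-RB timings differ.
width, height, refresh_hz = 3840, 2160, 120
bits_per_pixel = 30            # 10 bits per channel, RGB
blanking_overhead = 1.2        # approximate timing overhead

gbps = width * height * refresh_hz * bits_per_pixel * blanking_overhead / 1e9
print(f"~{gbps:.1f} Gbit/s")
```

That lands in the mid-30s of Gbit/s, which is why the signal needs HDMI 2.1's 48 Gbit/s FRL modes: HDMI 2.0 tops out at 18 Gbit/s.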


Yes, when they first switched to Thunderbolt 3 they were promoting eGPUs and co-developed one with Blackmagic.

https://mashable.com/review/blackmagic-egpu-review


For a while I enjoyed having an eGPU, which allowed me to share my AMD graphics card between my laptop and my Linux-based desktop.

Nowadays I'm only using it for my Linux-based desktop, and I'm thinking of getting rid of the eGPU enclosure.


Not only did they have support, they offered an eGPU/VR goggle pair around 2017.

The support eventually improved, and was dropped with Apple Silicon.


I am sure they will come out with a lightning plug based aGPU standard soon :P


This is a badass barebones eGPU dock! Had no idea this kind of thing was available. So good, thanks for sharing!


There are many options for these eGPUs on the market. Have used a few different ones over the years for a variety of reasons. Let Aliexpress be your guide.


The link doesn’t load for me but in my experience eGPUs are buggy whether it’s a Mac, PC, Lenovo, Dell… in fact Thunderbolt in general is a buggy experience everywhere. It had so much promise.


> …in fact Thunderbolt in general is a buggy experience everywhere.

On Macs at least, Thunderbolt is extremely dependable. I don’t have as much experience with it on other platforms, but I’ve never personally experienced a problem using external SSDs on my Intel NUC with TB3 support.


Like Thunderbolt SSDs?


I think the problem is ultimately that there was never going to be a benefit, even if it was assumed to work perfectly.

You spend $150+GPU+PSU cost, tether your laptop to a wall outlet, and add size and weight to your “portable” kit.

That doesn’t compete all that well with just buying a higher end laptop or waiting a few years to buy the next generation of laptop. If you’re willing to go through all that expense and compromise of portability you’re probably willing to have a separate desktop system and reap the benefits of having a second more fully modular system.

It reminds me of that Voyager III concept car where you could detach a small car from the rear minivan section. Cool idea for an auto show, but why not just own a second vehicle?

https://www.hotcars.com/plymouth-voyager-iii-concept-car/


I moved to eGPUs a few years back and it has been great. I have a desktop when I need one for gaming and a laptop when I am on the go.

Anyone who uses a second monitor is also "tethering" themselves to a desk, so I'm not sure that is a strong argument.

Further, CPU speed increases have kind of plateaued, while GPUs have not.

Also, most laptops max out around ~150-200 watts of available power. While that $3000 Alienware laptop may have a 4080, it's not a real desktop GPU, which in an eGPU enclosure can pull 450 watts.

Finally, a good eGPU box is also a dock: one cable into the laptop, plug and go.


If you're willing to carry around a second monitor and an eGPU, what's the point of even having a laptop as part of that setup?

For example, consider a small form factor desktop in a case like the DAN C4-SFX: https://www.youtube.com/watch?v=XiCwDRzLHDk

What percentage of extra weight/space is this compared to carrying a laptop, eGPU, and second monitor? Could you not just carry two larger sized portable displays and the ITX desktop system and have the same outcome (but with better modularity and price/performance value)?

In my opinion, most people who want portable gaming are fine with buying a typical gaming laptop which they'll probably get many years of above-console quality gaming out of. That laptop-wattage 4080 you mentioned is more than good enough for playing literally any game. The people who feel like "as long as it fits in a duffel bag it's portable" and want the most performance possible should probably skip the eGPU and just travel with an ITX build desktop PC.

In other words, an eGPU fits in a weird middle zone between those two solutions that doesn't make a lot of sense. It's saving you a very slight amount of kit over just carrying a desktop around with you, and it's the most costly solution.


You don't carry a second monitor and eGPU around. Those sit at your desk and your mobile laptop becomes a gaming desktop / dev workstation whenever you want it to.

Then when you need your laptop back, you unplug 1 cord, put it into your backpack and go about your day.


I really wish these would work with Apple Silicon.


I was going to ask if this would be good to use with a Framework laptop... but it doesn't look like you can even get Thunderbolt for the Framework...

Update (I can't reply to the reply because of HN's "posting too fast" bullshit): Thanks for the reply. I thought a TB port would have to be one of the modules, and none is listed. But I guess TB ports are built-in, based on this: https://frame.work/blog/framework-laptops-are-now-thunderbol...


No, they do (unless I'm misunderstanding what you're saying).

Framework laptops with 11th gen Intel boards seem to unofficially support it, and 12th-gen and beyond boards are officially thunderbolt certified. This applies to all four ports on the Framework 13.

Source: https://frame.work/blog/framework-laptops-are-now-thunderbol...

If you're seeing something otherwise let me know, though!


Unfortunately, the BIOS update for the 12th-gen for full TB compliance has been in beta (and has known issues) for more than 6 months now, with no final release in sight. Most TB devices work fine without the update, though, from what I understand.


Thunderbolt isn't a port type, it's an interconnect. You can read a summary of this here:

https://www.reddit.com/r/UsbCHardware/comments/1225cru/how_d...

But, the takeaway is, you wouldn't see a "thunderbolt" port separate from a USB-C port; the physical interface doesn't differ, just how the data is marshalled along the wire.


I never said you would see a separate port. But there ARE USB-C ports that do not support Thunderbolt. Therefore I expected TB capability to be called out in the module description for the USB-C ones.


You're missing the entire point.

If I have a laptop with two USB-C ports, but only one supports thunderbolt, that is because one is wired to the xHCI hub and the other is wired through PCI-e via the thunderbolt interconnect. The physical connection is the same, so there would be no reason for Framework to sell a "thunderbolt-capable USB-C".

If you're going to argue about something you don't seem to have comprehension of, at least try to consume the provided answers before you stubbornly dig in.


I'm not missing the point, blowhard. I stated a verifiable fact: The description does not confirm that the Framework's USB-C modules support Thunderbolt. So you're bloviating on something that nobody brought up or is relevant.

Maybe you're replying to the wrong person. If you don't understand comment threading on Internet forums, you need some remedial instruction. I have no idea where you should turn for that. But step 1 is to stop lashing out against random people because you're frustrated.


And yet, you still don't understand a single thing, and just want to argue. Which seems to be common for you, based on your comment history.

So here, in elementary bulletpoint recaps of this conversation.

A) all USB-C ports are physically capable of handling Thunderbolt. There's no such thing as a Thunderbolt USB-C.

B) the most recent Framework laptop is Thunderbolt certified and tested, so your USB-C ports will support Thunderbolt.

If that simplification isn't enough for you, maybe avoid more technical communities. Just stick to TomsHardware.


Accidentally leet name.


is it an accident though?


the page


This is quite cheap compared to name brand eGPU docks—is there any reason to buy a $400+ Razer dock or other similarly expensive ones over this?


It's cheap because it doesn't come with a case or a PSU, and appears to only be available through AliExpress (not necessarily a bad thing, but could make manufacturer support tricky).

A case might be nice if you prefer not to have a bare GPU on your desktop and if you want to slow down the dust buildup. SFX power supply units appear to run you about $100 for something like 650W. The power delivery is limited to 60W (in terms of how much power it'll deliver to charge a laptop), which is sufficient for many laptops but probably not all of them (I think the last generation of Intel MacBook Pros might want more than 60W).
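As a rough sanity check on PSU sizing for a barebones dock like this, here's a sketch with illustrative numbers. The card TDP, dock overhead, and the 1.5x transient-headroom rule of thumb are all assumptions for the example, not figures from the article:

```python
# Rough PSU sizing check for a barebones eGPU dock (illustrative numbers).
# Modern cards can spike well above rated board power, hence the headroom.
gpu_tdp_w = 320                # e.g. a high-end card's rated board power
board_and_fan_w = 25           # dock logic, fans, Thunderbolt controller
headroom = 1.5                 # rule-of-thumb margin for transient spikes

required_w = (gpu_tdp_w + board_and_fan_w) * headroom
print(f"recommended PSU: ~{required_w:.0f} W")
```

Under those assumptions you land just over 500 W, which is why a ~650 W SFX unit is a comfortable pairing for a card in that class.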


There are some downsides to this kind of barebones design at the end of the article. The Razer enclosure (and many others?) include a power supply (and presumably fans?) as well as an outer casing that hides the cables and protects from dust and other hazards (i.e. making it safer to move around than an open design).



PCIe 4.0 looks promising for completely eliminating the large bottleneck with eGPUs.


Does anyone have any benchmarks of using this or similar eGPUs for ML workflows with Nvidia cards on Linux laptops?


No, but you should get excellent results. Once the model is loaded, it should be able to take full advantage of the GPU, assuming the model fits within the GPU's VRAM. Many higher-end ML workstations that offer multiple GPUs on the motherboard (e.g. six- and eight-way GPU boards, such as those from ASUS) use a PLX controller, which effectively splits the bandwidth of one or two PCIe ports across multiple devices. An eGPU dock is effectively a PLX for a single device talking to Thunderbolt. Device-to-device communication should still retain the full PCIe bandwidth capability, and you can see that reported in the screenshots of the article. The biggest issue is obviously going to be, as always, driver support. You could even consider an 8-way GPU "dock" such as the Cubix, where you could plug your $1,500 laptop into 8x A100 80GB GPUs. 16 GPUs if you cascade them.
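To put the "once the model is loaded" point in rough numbers, here's a sketch comparing the one-time weight-load cost over Thunderbolt 3 versus a desktop PCIe 4.0 x16 slot. The usable-bandwidth figures are ballpark assumptions, not measurements:

```python
# Approximate one-time cost of loading model weights through the link.
# Usable bandwidths are rough real-world figures, not spec maximums.
model_gb = 14                       # e.g. a 7B-parameter model in fp16
tb3_usable_gbps = 22                # Thunderbolt 3, after protocol overhead
pcie4_x16_gbps = 200                # desktop PCIe 4.0 x16, usable estimate

tb3_seconds = model_gb * 8 / tb3_usable_gbps
pcie_seconds = model_gb * 8 / pcie4_x16_gbps
print(f"TB3: {tb3_seconds:.1f}s, PCIe 4.0 x16: {pcie_seconds:.1f}s")
```

A few seconds versus a fraction of a second: after that one-time transfer, a workload that stays on the card sees little difference from the narrower link.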


Does anyone have the real lowdown on why no Apple Silicon support for eGPUs? Can’t Apple write their own drivers?


This is ATX and not PCI express. It’s been designed to use old cards…


It does appear to be a PCIe slot; it's also written on the board.



