Blackmagic eGPU – Thunderbolt 3 external graphics processor (blackmagicdesign.com)
260 points by gregpower on July 13, 2018 | 271 comments



From the FAQ:

> Can I upgrade the GPU chip in the Blackmagic eGPU?

> No, the design has been optimized for quiet operation so it’s better suited for creative customers. This means the design is not a simple chassis with a PCIe card plugged in, but an integrated electronics, mechanical and cooling design that cannot have the GPU chip upgraded or changed.

$699 and no ability to upgrade makes it pretty clear that they're targeting business customers.


> $699 and no ability to upgrade

Sounds like a perfect fit for the Apple ecosystem.


I'd bet 99.9% of people buying PC laptops never upgrade a single chip... Apple's non-pro lines are likely not much different. Apple largely sacrificed that ability (in a few product lines) in favour of creating some of the thinnest laptops available. I don't remember them ever doing that before the MacBook Air, for whatever reason.

That said, in this case, I don't see what is being gained by not making the GPU swappable or at least accessible (although it might very well still be). I'm assuming they could simply offer official support only for the GPU it ships with but let people do whatever they want with it (which again might be the case in practice, they just aren't announcing it officially).


> I'd bet 99.9% of people buying PC Laptops never upgrade a single chip...

Some huge percentage of that is because many if not most laptops shipping today have everything soldered down. You can't even upgrade the wifi chip in most laptops today, even though the whole reason we put wifi on a daughtercard in the first place is because we knew the standards were going to have to move...

Meanwhile, I'm typing this from a five year old laptop that I would have happily upgraded by adding more RAM if possible, and I know a lot of people that would be more than happy to do the same, since CPU technology hasn't really been improving in leaps and bounds and neither has laptop design in the past five years.

Let's not be afraid to call it what it is: It's planned obsolescence. They want you to have to buy a new one in 18 months when they ship version 2.


You are the first person I've ever heard complain about not being able to upgrade the wifi in their laptop.


Make it 2 then. I want a good chipset with good drivers that does not crash or have bad performance for weird reasons.

Until last year, I physically removed the default cards in any of my laptops to put a "known good" Dell ath9k in place. I bought them in bulk, and I may still have some spares.

I am now investigating "modern" alternatives.


Make it three, mostly because the wifi card in my laptop was a nightmare to get working in Linux.


I have an 8-year-old laptop. I'd love to upgrade the wifi to something that can connect to 5GHz APs.


They do make USB wifi dongles.


Ooh I actually did this a long time ago! Upgraded the Airport card in my PowerBook.

But, you know, once in 30 years of owning laptops.


It would be great if both wifi and Bluetooth were upgradable.


> They want you to have to buy a new one in 18 months when they ship version 2.

My laptop is 4.5 years old now and I'm starting to think about replacing it. It was upgraded the week it was purchased, with a bigger hard disk and 16 gigs of memory (and a US keyboard, but that doesn't count), and that was the last upgrade it had. The next one will probably never be upgraded, as the manufacturer offers the configuration I want (at the time, Dell didn't). I expect it to last about 5 years just like this one.

Would I love to have a faster laptop? Of course I would. Can I justify the cost in terms of productivity gained? No way.


HP laptops are a huge exception to this rule. We commonly upgrade RAM and drives on our laptops.

One of the EliteBooks I saw recently doesn't even need a screwdriver to open.


Same with the Lenovo T-series. You can upgrade both batteries, both RAM modules, the LTE card (or a second SSD in that slot instead), the hard disk, wifi, until a couple of years ago the CPU, plus the monitor, keyboard, touchpad etc. All parts are available from Lenovo.

I guess the business Dells are the same?


Lenovo x230 checking in here. I've upgraded/replaced: battery, RAM, display panel, wifi, ssd (both in mSATA and SATA slots), keyboard (x220 keyboards FTW), BIOS (coreboot). The only thing I haven't replaced is the motherboard/CPU, because it doesn't seem worth the effort, even though it's technically possible.


There's a tradeoff of laptop size/weight and expandability. The market builds what most consumers want, which is aesthetics.


Why is this comment being down voted?


Because it’s intentionally obtuse to assume the only reason you would ever sacrifice upgradeability is for planned obsolescence.


But is it really far fetched? I am certain some people somewhere have considered this a big plus and weighed it more heavily in their decision than many of the other reasons for making hardware impossible to upgrade.


> But is it really far fetched?

Yes, making things upgradeable generally adds weight and size to components. Sockets aren't "free", so calling it planned obsolescence is absurd. They've been designed to have soldered-in components for a reason. You may not agree with that reason, say size constraints, but those are the constraints it was designed under. Weighing any decision comes down to: how much will this cost our design? (Note, cost here is not just money; it's time, size, weight, manufacturing cost, complexity, etc...)

Upgradeability is a feature, and if nobody is willing to pay for it, guess what will happen to the feature? Same thing that happened to our own species' tails: they became obsolete.


> Same thing as happened to our own species tails, they became obsolete.

And no ape had its tail cut in the name of evolution. It happened slowly, as older apes were replaced by newer models.


To hit their noise target they'll have substantially modified the GPU package to have much more passive cooling than is normal for this GPU. Making that work in a way that allowed end users to slot in another GPU from an off-the-shelf package would be really hard.

Maybe their next iteration might have replaceable GPU cartridges or something? But I can totally see why they wouldn't prioritise this. Not everything companies do is done out of malice.


Curious: how exactly do you see that the comment is being downvoted?


You must be an app user, like myself. It lacks functionality. You can tell if you log into the website.


TIL that there is an app!! I always log in from the website.


I work in a PC environment with 2k+ clients and even more side customers. 90% of PC customers upgrade their old laptops with more RAM and an SSD. Also some upgrade their network card, replace the DVD drive with an HDD/SSD, etc.

Because the CPU is not a problem and there's no noticeable difference in performance from generation to generation, it makes no damn sense to replace their laptops.


That's just some silly underspecced nonsense then. I have a PC environment of 5000, and it's definitely in the single digits. PCs are specced appropriately prior to purchase, with a lifetime of 5 years. Sure, occasionally users will request, and IT will approve, an upgrade based on individual demands. But who the hell has time to upgrade 90% of the machines, after you've already spent the time migrating / deploying them? At 5 years, you're already turning over 20% of the environment yearly.


> I have a PC environment of 5000, and it's definitely in the single digits. PCs are specced appropriately prior to purchase, with a lifetime of 5 years

You can easily get more than 5 years at less cost if you spec for upgradeability up front (good CPU + motherboard specs) and then do a mid-lifecycle RAM, storage, etc. upgrade. Rolling upgrades can save even more money if done properly, because the hardware coming out of use becomes your service parts for the stuff that's getting up there in age.

Regardless, the best solution is whatever fits your specific needs best.


Of course you can, but that's not how depreciated assets work in the corporate world. Especially when warranty/support ends after 5 years with companies like Dell.


I agree... It's a laptop. A 5-year refresh cycle is plenty of time for the device. You're looking at a few hundred dollars to get small gains to keep a device running an extra year or two, when by that time the market will have things 90% better, ha.


I miss my 2009 MacBook Pro so much; you could take out the DVD drive and replace it with an additional SSD for a performance boost.


Ha, I just did that this week. I have a late 2011 MBP and I noticed it was dragging a bit, so I upgraded it from 8 GB of RAM to 16 GB and bought a 500 GB SSD to replace the factory HDD. The next day I decided I wanted more space for my pictures, so I bought the little enclosure, swapped out my CD-ROM drive, and just reused the factory 1 TB HDD. Like $220 later and it feels like a brand new machine.


You were using a 7-year-old laptop with an HDD in 2018? Some people don't seem to have any hardware requirements at all... While I could tolerate a CPU from 2011, using it without an SSD feels so slow today, it's barely usable tbh. I switched to SSDs in 2009 in all my machines and never looked back, despite the price at the time.


Yeah, I don't have a good excuse for waiting, except it was my wife's primary computer for the last few years while she completed grad school and I didn't use it enough to notice how much slower it seemed compared to a modern SSD-based machine. Definitely wish I had done it earlier now.

It was an old work machine that I got to take with me when I left that job, so it was far overspecced for casual use... though it feels so much better now.


Not everyone can afford a new MacBook when their old one works fine, no matter how much HN shames them for it.


I did pretty much the same thing on my late 2011 MBP. And then the graphics went tits up. Argh!


Did the same thing for my 2012 MBP.


Any chance you've got a link to reliable instructions?



Yep, I used their instructions for the hard drive replacement and then again for the CD-ROM swap out. I used to build my own machines, so I was pretty confident going in; on my scale the HDD was about a 3 in difficulty but the CD-ROM drive was maybe a 6... you need to disconnect several tiny connectors to clear enough space, which always makes me nervous about breaking one, but it went fine in the end.


I'm still using mine, with a patched version of OSX High Sierra installed. No problems.


How has your experience with that been? My MBP (early 2011) won't be supported by MacOS Mojave, so I'm anticipating a similar patch.

I may end up staying with High Sierra, but I'd really like to try out the dark mode


Only problems have been after the occasional update, when the patches haven't been updated. A few times it's taken 5-6 reboots to start OSX properly. The best way is to delay updates as long as possible!

Apart from that, no problems at all.

http://dosdude1.com/highsierra/ and the related forum posts were excellent.


You mean the old PowerBook series where you could easily swap the batteries on the go and swap the DVD drive for an additional battery?


Still have my G3 500MHz somewhere - used that for my entire first year at university in 2008. Battery life was amazing.


Oh yeah, let's make the MBP double thick so there is space just in case you need an upgrade...


I recently bought an Aero 15. It's less than half a cm thicker than the new MBP and only 300g heavier. It's got a free SODIMM and NVMe slot once you get the right screwdriver. There's plenty of reasons not to buy this specific laptop, but it's one of many in the same weight/size category that are both cheaper than the top-end MacBook Pro and outperform it.


What % can be upgraded? This seems like a self fulfilling prophecy, just like mobile phones have moved towards non-user serviceable batteries. Great way to bootstrap an accessory ecosystem into existence.


> I'd bet 99.9% of people buying PC Laptops never upgrade a single chip...

I've added RAM and replaced the HD with an SSD on a laptop I've owned since 2005, and would have upgraded more chips if it were possible.


You are the 0.01%


No, I'm really not. In fact, the only reason this isn't done more often is because some OEMs defraud their customers by forcing obsolescence into their products, such as intentionally designing them so that basic stuff like cleaning air ducts or replacing an HD is very impractical if not impossible.

If you asked any macbook pro owner if they were willing to spend money to double its RAM or HDD capacity I'm sure you wouldn't get many negative answers.


But for what cost???

The price of upgradability is a heavier, thicker machine with probably less battery life (because they now squeeze the cells into every nook and cranny).

The ability to upgrade a machine is not a no-cost argument.

If given a blanket question of - ‘would you like to be able to upgrade your laptop?’ I’m sure the answer would be 100% Yes!

But, if you said ‘would you like to be able to upgrade your laptop but it will be half an inch thicker and a pound heavier?’ You’d get quite a different answer.


It's not. People buy Thinkpads all the time, with their actual screws, separate drive bay, and swappable RAM. Half an inch and a pound is right - such a low price to pay.


They're not forcing anything. You want an upgradeable laptop? Buy one. I'm sorry the fact that a lot of people like thin and light designs is too painful to you, but those people do have a choice.


That is still an extremely small minority who would upgrade.

Based on what I've seen from non-technical friends and family, the standard approach to laptops is to buy a $300 netbook, fill it with crappy browser extensions and malware over a few years, complain that the computer is "too old and slow", throw it out and buy a new one.


> That is still an extremely small minority who would upgrade.

It's really not, and it's quite obvious that the baseless assertion that no one wants to upgrade hardware is rather mind-numbingly absurd.

There are businesses dedicated to repairing, upgrading and refurbishing computers, including laptops. There are businesses dedicated to selling used hardware that offer clients a choice of processors and other components, including different makes and models. It's utterly absurd that we're supposed to believe this basic need simply vanishes in a specific form factor.


Coming from a 2012 rMBP (from which this post is being typed out), the ability to double the RAM/HDD by myself would probably be the be-all and end-all.


0.1%

but even that is obviously wrong, because he pulled the 99.9% out of thin air.


You can still buy other thunderbolt 3 enclosures. https://support.apple.com/en-us/HT208544


Indeed - exactly.

And it's wasteful - once the GPU is out of date you have to dispose of a perfectly good TB3 enclosure.

Another anti-consumer decision made under the guise of design and practicality that consumers want.

I can't imagine a single customer has ever asked for a non-upgradable eGPU.


That thing is upgradeable. Just replace the box with a new one. Just plug the cables. No screwdriver required!

Who cares that 50% of the components in both boxes are functionally identical? Certainly not all the not overly technical folks out there.


No, they asked for a quiet one.


Wire up a fan speed control.


I upgraded the ram and disk on my 2012 13in MacBook Pro.

The lack of upgradability on the newer machines is a real turn off.


The option is to buy it fully specced. A new 6-core i9 with 32 gigs of DDR4 and 4TB of flash storage will probably keep you happy for the next 5 years. That means your budget is slightly above US$ 1200/year of happy computing with a Mac.

If you don't need macOS or an ultra-elegant thin computer with an incredible screen, you'll probably be happier with a PC from Dell or Lenovo with similar specs and an updated machine every 2 years. But they are neither elegant, nor thin. At least they run Linux very well.

My current personal laptop is 4.5 years old and I am now considering replacing it with a machine with specs similar to the PC I described - non-thin, non-elegant, decent-but-not-incredible screen, a non-bleeding-edge CPU, as much memory as possible, some flash storage and a hard disk and I don't expect to upgrade it in less than 5 years.


If happy computing includes trips to the Apple Store every few months because the keyboard got stuck... Again... Maybe. If happy computing means no physical escape or function keys... Maybe. If happy computing means no warranty after 3 years... Maybe. If happy computing means no accidental damage warranty whatsoever... Maybe. If happy computing means an oleophobic coating that will almost definitely fail at least once over the lifetime of the device... Maybe. If happy computing involves not being able to access the data on that 4TB SSD when the laptop bricks itself, necessitating constant backups... Maybe. If happy computing involves knowingly spending double the price of the storage and the RAM otherwise because Apple knows that you won't be able to replace them... Maybe.

But hey, it's thin and light and aluminum, so all is forgiven, right? Don't get me wrong, the MBP is a pretty gorgeous device, but it's pretty hilarious to say that all is good and well just because it's thin and light and aluminum.


> The option is to buy it fully specced. A new 6-core i9 with 32 gigs of DDR4 and 4TB of flash storage will probably keep you happy for the next 5 years. That means your budget is slightly above US$ 1200/year of happy computing with a Mac.

The problem is that not everyone has the budget to buy a fully-specced machine at the time of initial purchase. Also, many laptop vendors charge a significant premium for such configurations.

Apple partly solves this by simply not even selling the low-specced "base model" that other vendors would advertise. But still, buying low/mid-range now and upgrading later can be significantly more tolerable on the wallet than going for broke every major upgrade cycle.


> The problem is that not everyone has the budget to buy a fully-specced machine at the time of initial purchase

Yes. In some markets Apple offers financing options right on their website. In case of businesses, they can deal directly with their banks.


My HP laptop died; the integrated GPU failed. And HP support said they do not manufacture that motherboard anymore. So what's the point of having it upgradable? Sometimes it works, other times not.


It's always been the case with laptops that if the mainboard or some part ON the mainboard died, well, that's that. But being able to upgrade RAM, HDD or wifi was always great so you could eke a couple more years out of it.

People who want upgrade-ability aren't wrong for wanting to buck the throw away society we've become.


I chose upgrade-ability too; it's just that I was shorted by HP's bad quality and lack of spares. I flipped the coin and lost, it seems.


"HP support said they do not manufacture that motherboard anymore"

You can probably still buy one of those mobos and swap it out. Just look on ebay.


I think the gain is in footprint size and noise levels. I have a Sonnet eGPU box, and it is pretty massive, nearly the size of a microATX case. It also has a fan on the box and the fan of the GPU cooler itself.

This design looks a lot smaller, and I'm sure it is a lot quieter as well. If you don't want those things and do want upgradability there are plenty of other options available already.


> I'd bet 99.9% of people buying PC Laptops never upgrade a single chip...

You're right, they don't. They ask someone else to do it. So, I'd bet your number is way off.


Even with the chunky laptops, I only ever really upgraded the wifi card, ram, and storage...


>Sounds like a perfect fit for the Apple ecosystem.

I've read statistics that most users, business or not, don't upgrade anything, even when the hardware gives them the opportunity.

For businesses it doesn't even make much sense to upgrade: they can just lease or buy new computers and have them as a tax writeoff.

As for most people, they just buy a new PC every 5-6 years (can be 10 years for the less interested in tech), and that's it.

It's a small niche of tinkerers and "power users" (actually power tinkerers -- there are users that do much more important stuff with the computers than them that still don't tinker at all) that ever does upgrade stuff.

Of course that includes many people in our tech bubble, but you'd be surprised how many programmers don't care for upgrading and tinkering, and just buy a new laptop every so often.

As a programmer myself (CS degree and everything) I know I hate tinkering with hardware with a passion, and I know lots of others who do too. Of course I also know many dev friends and colleagues who do like building custom rigs.


They should charge double for something that's 20% thinner but 30% slower.


The reason Apple makes upgrades impossible is so they can charge ridiculous prices for their built-in upgrades.

The new Mac has a 4TB SSD option for £2800. You can buy a 4TB top-of-the-range Samsung SSD for £800.

£2000 profit is a good motivation for sticking stuff down.


SATA SSDs are not top of the line; a 4TB NVMe card is more like $3k:

https://www.amazon.com/HighPoint-rSSD7101A-4TB-NVMe-Drive/dp...

The Samsung SSDs have about 15% of the read speed Apple is advertising.


Compare the HP EX920 (1TB for $300 on sale) to the SSD in the... 15" MBP 2017, or something.

HP EX920: http://www.legitreviews.com/wp-content/uploads/2018/04/hp-ex...

Apple: https://www.notebookcheck.net/fileadmin/Notebooks/Apple/MacB...

Same AS SSD benchmark.

Now you can argue that since SSD performance increases with size, maybe that's why. Then prove it.


I think it's not the best deal, but it's so much better than the other jokers in this space. The alternative I've found is the Razer Core series (https://www.razer.com/sg-en/gaming-laptops/razer-core-v2), which is a joke: $500+ ($750 where I live) and it doesn't even include a GPU!

I'm seriously interested if there's some sort of DIY out there wherein you could just buy a bunch of chips from eBay, put together some enclosure yourself and use your own graphics card. That'd be the dream. Any money not spent on enclosures is money well spent on a good GPU.


Even before the more polished products existed, people have hacked together external GPU enclosures, often interfaced to the host's PCIe bus via ExpressCard slots or ribbon cables hanging off of Mini PCIe slots. And more recently, with the interest in cramming many dual-slot GPUs onto a single motherboard for cryptomining rigs, you can find cheap, "barebones stand-alone PCIe slots" (for lack of a better term) for hanging GPUs off of. A quick search turned up a couple of examples:

https://www.aliexpress.com/item/Mini-PCI-E-to-PCI-E-1X-Expan...

https://www.aliexpress.com/item/60CM-PCI-Express-1X-To-16X-P... (Ingeniously, on this one, they've re-purposed USB 3.0 cables to carry the high-speed PCIe lanes).


The problem with all of those has always been that they only worked with an external display - there never was any way to send the image back to the laptop's own display.


LookingGlass on Linux allows displaying from other GPUs' framebuffers.

We use Resolve in a VM with hardware passthrough and can monitor its display output from within another VM this way.

Apple nails the prosumer, but I really don't see many using their hardware in industry.


It actually is possible with NVIDIA GPUs, but you'll be taking a huge performance hit, as some of the tiny bandwidth is now used to send the rendered image back to the laptop and then to the screen.


No matter what, sending it back to the internal display is going to mean a significant performance hit. There's only so much bandwidth in a TB3 connection.


These are for mining. USB 3.0 cables are the standard there: they have the correct number of pins for a single lane and decent shielding.


The best resource for DIY projects in this space seems to be egpu.io. I found their guide kind of helpful: https://egpu.io/build-guides/


Here is an overview I think is helpful [1]. The Sonnet eGPU case looks like it's the best value for the price. And maybe the Gigabyte box at #1 if you'd like to buy the GPU with the box.

[1] https://egpu.io/external-gpu-buyers-guide-2018/


http://barefeats.com/ which in the Apple community is known for graphics testing of Apple products and other solutions has done a lot of work with external GPU enclosures.

One vendor (Sonnet) [1] has a 580 solution at $499. B&H has the no-GPU unit at $299. So there are options.

[1]https://www.sonnetstore.com/collections/egpu-expansion-syste...



One thing to note about this: it only provides 15W USB-PD. This is grossly inadequate for pretty much any laptop you'd want to connect to it, so one-plug operation isn't practical.


Awesome, thanks! Does that work with a 1080 Ti? I know Apple hasn't got official NVIDIA drivers, but NVIDIA has provided their own for a while now.


Razer has a $300 alternative that is explicitly supported by the Apple ecosystem.


There are tons of other eGPU enclosures on the market that take normal PCIe GPUs. This one is interesting because you can use it with the UltraFine 5k Display which connects via Thunderbolt.


Or buy a Dell UP2715K or Philips 275P4VYKEB, both of which connect via dual DisplayPort... Heck, even the Iiyama XB2779QQS can be a good candidate for most eGPUs; it requires DisplayPort 1.4, but that is now a feature of many video cards, and it's cheap because it's only 6-bit + A-FRC.


The Dell and Phillips parts seem to have been discontinued, and were in any case more expensive than the LG one. The Iiyama might be okay if you were okay with its limitations.


They're targeting DaVinci Resolve users. It's not a great deal for gamers and related nerds but there are lots of Apple-compatible enclosures out now. https://egpu.io/ is a good resource.


I'm wondering how much work in DaVinci you can do with the AMD GPU in the new 15" MBP... enough to grade 4K raw footage? Or do you need an eGPU?


I don't know anything about DaVinci Resolve, but if you scroll through the horrible unlinkable site you'll get to a performance graph, and that graph has a tab for the new 15" MBP: it claims the eGPU is about twice as fast for various operations in Resolve.

If you look up the parts themselves, the Pro 580 (in the eGPU box) is roughly 2-3 times faster than the RX 560X that you can get in the new 15" MBP.


There's probably a market for an eGPU enclosure which can sit in a shared office without sounding like a small jet engine, though. Some of them are quite noisy.


"the design has been optimized for quiet operation"

This is important to me; glad to hear it!


I don't really get this... I have a stock air-cooled Asus GTX 970, and for almost all use cases on my PC it is completely silent. The fans only turn on when the card temp goes over a certain value. The only thing that causes this is a modern, taxing AAA-type game (e.g. The Witcher 3). And when I'm playing a game like that I've got my headphones on.

I can be running Unity, Visual Studio, Adobe Premiere, Illustrator, Photoshop, Discord, Trello, GitHub + a million browser tabs and whatnot, and it maketh not a peep.

So, unless you're AAA gaming, an average off-the-shelf GPU these days is passively cooled already.


You say you don't get it, but then you boast how your gear is basically silent. Great, that's what I want too!

But I'm not talking about your custom-built gaming PC, we're talking about an eGPU case for a Macbook.

My experience with these types of peripherals (e.g. external hard drives) is they often say they have a 'quiet fan' but it's not really quiet. And I think it's something I read about being a problem with other eGPU cases.

I have this stuff in my living room. The Macbook is basically silent. I don't want a bunch of noisy fans to come on (they all add up...) every time I dock the laptop and try to do some work.

I won't be gaming. I might have headphones on to do some video editing, if my partner is trying to do her own work in the same room. But I don't want to be obliged to wear them just because of noisy hardware.


If you have to wear headphones when the GPU is at max, it could clearly be quieter...


I don't have to. In any situation where the fans have come on, I already am, 'cause I'm playing a game.


Ok, I'll elaborate.

It's very nice that you wear headphones when you're playing a game. Me, and probably a lot of other people, don't.

You not being able to hear the GPU when using headphones does not mean it's quiet. It means you can't hear it. Incidentally, any other people in the room or maybe even in the adjacent rooms would be able to hear it very well.


It's likely targeted at video editing or VFX work, where you're gonna need some rendering power.


OTOH $699 sounds like a reasonable price for a GPU in that class? You're just trading an internal interface for an external one.


Except that the GPU they've got in there should have a retail price of $200-300. So there's a $400 markup for a PSU and USB hub.


Hmmm, you're right. I was under the impression the Pro had higher specs than the RX 580 but the numbers they published really don't seem to differ from the RX 580. Damn.


"Creative customers" sounds like some kind of patronizing euphemism.


In what way? Plenty of designers & photographers I know frequently refer to themselves and peers as "creatives." Creative is being used as a noun in the phrase.


No ability to upgrade? Sounds very familiar for every Apple user out there.


The upgrade is to swap out the entire unit and plug a new one into your laptop.


> ... plug a new one into your laptop.

... plug a new one into your new laptop.


It would have been nice if the GPU and active cooling were an upgradable module even if proprietary. There's a $400+ markup for what's effectively a USB hub and PSU.


Huh. They're targeting ... people who don't know any better?

I want to see rendering benchmarks (both 1080p/2160p) of this vs external Thunderbolt GTX 1060/1070/1080 enclosures (and AMD equivalents). I mean, if it works better and you do a lot of video, then yeah, go for it. If it's only marginally better, buy something with a real PCIe slot that's upgradable.

Even business customers should not buy this if the benchmarks don't align. Seriously, you're hurting everyone, even if you can afford it. Stop that. Planned obsolescence is terrible. Pay more and buy something you can upgrade, fix and repair.

Throwing out electronics after four years hurts the planet.


The value in the ability to upgrade is often misperceived as being worth more than it really is. If you buy something mid-to-top range you are really only going to be able to get a single upgrade cycle out of it (where there is some substantial marginal benefit for the cost) before the rest of the system starts to obsolete itself. If you watch things closely and the tick-tock cycles are aligned then maybe you get two.

It’s a fun hobby. And if you have the cash to spend on always having the best GPU then you probably don’t care that for $1,000 you only get an extra 10 FPS or so.

These days I can’t even remember how old a machine is let alone what GPU it even has. “Why is this box so slow? It’s four years old? Holy shit, when did that happen?”

Even when I had a rig and was reading Tom’s and Anand daily I don’t think I actually ever upgraded a video card twice. Maybe it happened once? The cycles might have been longer back then, but by the time cards had advanced two generations we were moving from AGP 4x to 8x, or to PCI-E, or to multi-core whatever. Looking back, I could easily have just bought whatever middle-of-the-road Dell lunchbox was available, replaced the whole thing every year or two, and probably saved money for the same performance.

These days I’ll take a non upgradable whatever it is as long as it has a good warranty.


> The value in the ability to upgrade is often misperceived as being worth more than it really is. If you buy something mid-to-top range you are really only going to be able to get a single upgrade cycle out of it (where there is some substantial marginal benefit for the cost) before the rest of the system starts to obsolete itself. If you watch things closely and the tick-tock cycles are aligned then maybe you get two.

I don't think that's true in this case.

The upgradeable part would just be a PCIe enclosure. PCIe 2.0 is over ten years old and would still be sufficient for many GPU applications with little performance loss. PCIe 3.0 is eight years old and fast enough that we will likely not see 4.0 for many years in desktops (or mobiles) [also 4.0 needs more expensive materials for all involved planars, so... yeah... not doing that for +-0 %].


It's interesting seeing arguments about how valuable upgrade-ability is, and yet comments like this further outline how upgrade-ability is overrated.

Clearly technology isn't advancing like it used to. I've been running a machine since 2011 when I built it. 15 years ago I maybe would've upgraded every 4 years. Now I'm almost 8 years strong.

Now everyone has different needs, and I think Apple and similar products haven't really been aligned with those needs perfectly, but how much, from now, can graphics REALLY improve, for most use-cases?


It’s never “fast enough.” PCI-E 4.0 motherboards are actually out now. In three years we will have either PCI-E 4.0 or 5.0 as a standard.

And you’re right: PCIe 3.0 cards out today have plenty of performance for today’s applications. That’s why you would get a PCIe 3.0 card today. But it doesn’t matter if it’s fixed, because you aren’t going to upgrade it.


> It’s never “fast enough.”

Actually that's not true. When PCIe 3.0 was introduced, it took quite some time before the (still marginal) performance improvements materialized. We're talking about a few percent at most in games and benchmarks (many are just neutral) between PCIe 2.0 and 3.0, simply because it's not a bottleneck. For not overly data-intensive GPGPU applications (e.g. video things) I reckon that's the case as well. Yes, sure, there are applications you can run on a GPU where PCIe is a bottleneck. PCIe bandwidth benchmarks, for example.

Don't expect PCIe 4.0 consumer-ish (i.e. what this is about) GPUs soon (2019 earliest). It's wholly unclear whether you'll see any performance improvements at all from that — just look at the bus utilization in games and benchmarks, you'll be hard pressed to see more than 10-20 % (which is precisely why running a GPU on a x8 interface (or 2.0) is not an issue).
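
For a rough back-of-the-envelope on why the link rarely matters (per-lane figures after encoding overhead; the usable Thunderbolt figure is the commonly quoted approximation, not a spec value):

    \text{PCIe 2.0: } 5~\mathrm{GT/s} \times \tfrac{8}{10} \approx 0.5~\mathrm{GB/s\ per\ lane} \Rightarrow \mathrm{x16} \approx 8~\mathrm{GB/s}
    \text{PCIe 3.0: } 8~\mathrm{GT/s} \times \tfrac{128}{130} \approx 0.985~\mathrm{GB/s\ per\ lane} \Rightarrow \mathrm{x8} \approx 7.9,\ \mathrm{x16} \approx 15.8~\mathrm{GB/s}
    \text{TB3 eGPU link: } \mathrm{x4\ PCIe\ 3.0} \approx 3.9~\mathrm{GB/s\ peak,\ reportedly} \approx 2.7~\mathrm{GB/s\ usable}

So even a Thunderbolt eGPU only gives up a factor of roughly four in bandwidth versus a desktop x16 slot, which mostly shows up when streaming textures or sending frames back to an internal display, not in compute-bound rendering or grading.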


If it’s not true then you are actually removing the only major benefit of upgradability altogether. Just get what’s available today, since it’s fast enough and will be for some time.


I think you misread my original comment.

I was saying that the Blackmagic eGPU in an upgradeable form factor would mean that you get a $200-300 (or something like that) GPU enclosure with some electronics in it (Thunderbolt hub, power supply etc.). Then you stick a current GPU into it. This would make sense because GPU replacement cycles are much shorter than the interconnect upgrade cycle. That's why I said that most applications are still fine today with the over-ten-year-old PCIe 2.0.

So just to reiterate: PCIe bus speed is simply not a limiting factor for the overwhelming majority of GPU applications.


Four years ago I bought a desktop gaming PC (I'm normally a Mac user) with the idea that I could upgrade it over the years to keep it current but avoid replacing the whole thing.

Turns out in practice technology changes enough over that time to only leave the case, and perhaps the PSU. The motherboard, CPU, RAM and graphics card are all obsolete now. The SSD can be reused but newer technology is much faster.

I don't know what gets saved by upgrading parts rather than the machine. The same amount of electronics are disposed of (or sold on the second hand market)


> all obsolete now

What is it used for?

If you buy Intel, you know beforehand you won't be able to upgrade the CPU, since the socket is changed (intentionally, of course) every generation. But the CPU isn't the issue for e.g. gaming, and upgrading every generation or every other generation largely doesn't make any sense at all with Intel parts for the last eight years or so.

Since you're saying obsolete, you probably bought into 1150/Haswell. It was known this was the last pure DDR3 platform, but you can still get DDR3 memory new, if you need more. So that's not a problem, really. (And it's actually cheaper than DDR4 memory)

Graphics cards can keep up for years nowadays. I've used a midrange-highend card for five years before upgrading; I could still play new games at mid-to-high settings. Graphics cards can always be upgraded in any computer (if the PSU has enough power, which is rarely an issue).


> If you buy Intel, you know beforehand you won't be able to upgrade the CPU, since the socket is changed (intentionally, of course) every generation. But the CPU isn't the issue for e.g. gaming, and upgrading every generation or every other generation largely doesn't make any sense at all with Intel parts for the last eight years or so.

Is it really true that the socket changes every generation? My impression was that sockets changed every 2-3 generations, not every one. Not that that entirely defeats the point, but it expands the window.


https://en.wikipedia.org/wiki/CPU_socket

Intel seems to change desktop sockets every 2 years now. That includes multiple generations, but 2 years of CPU improvements is not significant in almost all cases. A new motherboard with each new CPU is the likely situation. AMD has supported sockets for longer, and that seems to be continuing.


Yes. You get one generation (one tick-tock, so for example, Nehalem+Westmere, Sandy+Ivy Bridge, Haswell+Broadwell, though differences between tick and tock are generally not worth it). 1151/H4 deviates a bit here, because Intel fubar'd their roadmap and weren't able to deliver even single-digit advances per revision any more. So H4 now has three generations of chipsets, two of which don't support CFL (Coffee lake).


I got the top-end i7 about 5 years ago, it's still fast. I have tried to justify upgrading it many times but I just can't see the benefit in replacing it with something that's just a bit quicker.

I can at least upgrade the SSD and GPU without throwing the whole damn thing out, but even then it's hard to justify.


Well any GPU you upgrade can be used in your next system. That said there hasn't been much movement there in the last 2 years or so since the rx480 came out.

I agree with the SSD, though. The newer PCIe-based ones in the M.2 form factor are so much faster it feels a bit like a waste to buy a SATA SSD.


> That said there hasn't been much movement there in the last 2 years or so since the rx480 came out.

Aren't the Vegas quicker, and hasn't the NVIDIA offering been improving quite a bit over the last 2 years? I might not be paying enough attention, but I'm pretty sure the 1080 Ti and 1070 Ti came out only recently.


Not really. I use my 4 year old CPU, power supply, memory and motherboard with a brand new geforce 1080. I could also upgrade the memory if I wanted to.

If you upgrade the CPU, you usually have to upgrade the motherboard. That's about it.


Agreed. Built a PC almost 5 years ago and the only modifications I've made have been adding more disk drives. GPU's starting to drag but I hardly game anymore so still works like a charm.


> The motherboard, CPU, RAM and graphics card are all obsolete now

How is that at all possible unless it was underpowered in the first place? Maybe it can't play some of the latest titles at max settings, but a 4-year-old machine should be able to run just about everything decently.

What's stopping you from upgrading the graphics card? Looks like plenty of high-end cards have PCIe 3 support. RAM can always be upgraded, though few games will make use of the 8GB that was pretty standard in gaming PCs nearly a decade ago. A new CPU won't gain you much gaming-wise.

Most games target PS4 level specs anyway, which is now a 5 year old machine.


> The motherboard, CPU, RAM and graphics card are all obsolete now. The SSD can be reused but newer technology is much faster.

Unless you got a bulldozer CPU, I don't believe you. I bet that upgrading just the graphics card would give you 90% of the effect of upgrading everything.


I have a Vishera CPU and it is definitely NOT obsolete. If anything it ages much better than Intels due to better multicore support in modern apps.


It is not easy now, but you can plan for upgrading a PC and then do it successfully. E.g. if you bought the AM4 platform in 2017 you would be able to upgrade. If you bought the AM3 platform in the Phenom era you could upgrade to Vishera/Zambezi.

You can upgrade your storage independently, you can upgrade your GPU out of step with other components, and you can upgrade the GPU more often than the CPU without compromising too much. And so on. You can, for example, keep your sound card from 10 years ago with some fiddling with drivers. One of my SSDs is 6 years old and, while not ideal, it is still much faster than a modern HDD and I use it constantly.

What you also mention, the case and PSU: those alone cost ~$100-200 each, so worth saving for an upgrade. My Seasonic PSU will work for a decade at least, or more.

It is not easy, yes. But possible.


Just upgrade the GPU. Most of your problems are solved.

Upgrade the RAM at the start of a new spec and you'll be fine. CPU/mobo go hand in hand, but you're rarely going to need to upgrade that anyway.


Ever been to a place with a fancy receptionist exclusively running Google Calendar in one browser tab taking up less than 35% of a HUGE iMac screen?

A pretty enclosure can be worth more than its contents, just accept it as a piece of furniture.

I'm sure you wouldn't scoff at PC gaming rigs with pretty flashing LED lights after all.


"Yeah boss I need that $3000.00 laptop"

"and I absolutely need extra graphical horsepower to perform my job"

"but let's get the Razer core and fuck around for days getting it to work instead of buying the one advertised on the apple site because raisins".


Razer Core X is plug and play with MacBooks.

https://www.razer.com/gaming-laptops/razer-core-x

Now if you use an NVIDIA GPU instead of an AMD one, then you need to do extra configuration.


Can you run CUDA with eGPUs? How about cuDNN, etc.?


Their site describes who this is for at great length. I'm not sure where the GP got the 'business customer' idea, it's not accurate.


If it really is quieter than the others, there's a market for that. Quite a lot of eGPU boxes are _very_ noisy.


I kind of like where this is going. Plug and play performance and less noise.

I'm not really interested in dropping thousands of euros on the latest gaming hardware. I'm not in the pointlessly-blinking-LEDs PC market. And I'm done building my own PC after having to replace the power supply in my last effort three times.

I do tend to have up-to-date Mac hardware. For me the whole point of that is buying one and then using it for 4-6 years. I actually have the MBP 15" from last year with 16GB. It's not great for gaming though. My four-year-old iMac 5K still gets better framerates in things like X-Plane. Both have the issue that they run quite hot when doing anything 3D and CPU intensive. My iMac needs to emulate a vacuum cleaner just to keep it from throttling the CPU and GPU.

My iMac 5K is coming up for replacement somewhere in the next two years. I'm probably going to drop around 3-4K on whatever replaces it. I'd totally consider something with an external GPU. I actually like the form factor but would spend money on an external GPU for casual gaming and VR.

I'd prefer something that focuses on just CPU/memory/SSD and then lets you plug in storage and GPU via some cables. So, this is going the right way.

Apple should focus the next Mac Pro around this concept. They tried with the previous one but then messed up by soldering the GPU in, which turned out to be a deal breaker, since the whole point of Mac Pros is 3D graphics. So, simply ship something with plenty of ports and CPU sockets and sell GPUs, storage, screens, etc. separately. CPUs are barely evolving anyway. I got about a 35% build-speed improvement by replacing a late 2011 MBP with a late 2017 one. Both were quad-core i7s. Seriously underwhelming. And yes, I max out 4 cores with my builds.


> I'm not really interested in dropping thousands of euros on the latest gaming hardware.

> I'm probably going to drop around 3-4K on whatever replaces it.

...


I'm not doing this for gaming though and what I buy is not optimized for games but for comfort, looks, and I just like running OS X.

So spending $600 to actually be able to play games and plug in some VR kit sounds very reasonable to me.


Sounds like you bought the cheapest underpowered PSU on the market. Buy EVGA, and buy more than your system will ever consume. You don't want to cheap out on the thing powering everything else.

Just copy a build off of Reddit or pcper. You don't need to think, just click buy!


I'm wondering why they didn't include a 10Gbps Ethernet port in this thing. They advertise it to professional graphics and video artists; don't these groups usually have to move around quite a lot of huge 3D data, image and video files?

It seems hard to imagine those people either placing all of that stuff exclusively on their new $8k MacBook Pros with 4TB of SSD space (because even 4TB quickly runs out in that profession, and it makes it hard to work on the same project with other people) or transferring it over 0.2 Gbps (at best) WiFi connections.


Under heavy load an eGPU completely saturates the PCIe bandwidth. Razer couldn't even get a USB mouse to not lag in their first eGPU product.


Two reasons: they have 40 Gbps thunderbolts and they love dongles.


I like that it has four USB Type A ports. This would essentially take the place of a Thunderbolt 3 dock for me, which is already $200-$300. I'm considering getting one.


You should wait and see what the performance/reliability of these are. In almost every case, the GPU completely uses up the Thunderbolt 3 bandwidth, causing lag / reduced performance of the GPU and peripherals.


Let's say I plug a bunch of external HDDs into this dock, and I want to unplug it all. Do I have to unmount them all one by one?


The Elgato Thunderbolt Dock comes with a software utility that gives you a "Disconnect Dock" menu bar option, which unmounts and unloads everything in one click.


You can unmount them all at once. Select the disks that you want to unmount on your desktop, then press Cmd+E.
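
If you'd rather not click around at all, a rough scripted sketch (assuming a recent macOS where "diskutil list" accepts the external/physical filters; the output parsing is deliberately naive) looks something like this:

    import Foundation

    // Rough sketch: unmount every external physical disk in one go by shelling
    // out to diskutil (the command-line counterpart of Disk Utility).
    func diskutil(_ args: [String]) -> String {
        let p = Process()
        p.executableURL = URL(fileURLWithPath: "/usr/sbin/diskutil")
        p.arguments = args
        let pipe = Pipe()
        p.standardOutput = pipe
        try? p.run()
        p.waitUntilExit()
        return String(data: pipe.fileHandleForReading.readDataToEndOfFile(), encoding: .utf8) ?? ""
    }

    // "diskutil list external physical" prints each external disk and its partitions;
    // whole-disk identifiers look like "disk2", partitions like "disk2s1".
    let identifiers = diskutil(["list", "external", "physical"])
        .split(separator: "\n")
        .compactMap { $0.split(separator: " ").last.map(String.init) }
        .filter { $0.hasPrefix("disk") && !$0.dropFirst(4).contains("s") }

    for disk in Set(identifiers) {
        print(diskutil(["unmountDisk", "/dev/\(disk)"]))  // unmounts every volume on that disk
    }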


Yes, I believe so. However, if you have "a bunch" of disks to attach, maybe consider a NAS?


You can unmount it from OS X.


Apple disabled eGPU support for Thunderbolt 1 and 2 with macOS 10.13.4 [0]. Of course someone wrote a nifty script to unblock it and enable support for NVIDIA GPUs while at it [1], but it's too bad a great upgrade path from slightly older MBPs (up to 2015) is not officially supported.

[0]: https://egpu.io/external-gpu-macos-10-13-4-update/

[1]: https://egpu.io/forums/mac-setup/script-enable-egpu-on-tb1-2...


I tried the "purge wrangler" script to support a VR DevKit on my 2014 MBPr 15". I immediately started getting kernel panics. On, like, the third day I had it installed, I had 3 panics, and finally just removed it. At least they made it easy to remove; my machine hasn't crashed since. As always, YMMV.


- Radeon Pro 580 - 300 USD

- USB Hub / Thunderbolt 3 Hub - 100 USD

- eGPU - 300 USD (Akitio Node, others)

It's reasonably priced given the components. If it only had upgradability, it'd be very nice, but it's not a dealbreaker.


If you ever see a Thunderbolt 3 hub for $100 let me know!


You're right. It's a large distinction. I was thinking of USB-C hub (this one specifically: https://www.kickstarter.com/projects/hypershop/hyperdrive-us...) and it only supports 5Gb/s vs. 40Gb/s


Pardon my ignorance, but do I need an external monitor to use this, or can I just plug this into my MacBook Pro and have improved graphics performance on the built-in display?


eGPU support is opt-in on macOS. Games and apps must enable support for eGPU to use it, regardless of external or internal display.

Here's a video of it in action with one of the first applications that supports eGPU acceleration, DaVinci Resolve: https://9to5mac.com/2018/04/19/macos-egpu-performance-test-d...
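
To make the opt-in part concrete: on macOS an app chooses its GPU explicitly, so using an eGPU mostly comes down to enumerating Metal devices and preferring the external one instead of the system default. A minimal sketch (real apps would also watch for the device being hot-plugged or removed):

    import Metal

    // List every GPU macOS exposes; an eGPU reports isRemovable == true.
    let gpus = MTLCopyAllDevices()
    for gpu in gpus {
        print(gpu.name, "removable:", gpu.isRemovable, "low power:", gpu.isLowPower)
    }

    // Opt in by preferring an external GPU when one is attached,
    // falling back to the built-in default device otherwise.
    let device = gpus.first(where: { $0.isRemovable }) ?? MTLCreateSystemDefaultDevice()

Apps that never do this and just take the default device keep rendering on the internal GPU, which seems consistent with the "only certain apps get accelerated" behaviour people describe further down the thread.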


This is the #1 question that has been bothering me since I heard about eGPUs.


It works without an external display as well.


Does it? Because I remember it worked during the betas but then Apple disabled support for the internal display... which made the eGPU somewhat useless. Did they fix that lately?

Note: "it works if you install Windows" is also useless to me. Does it work with the internal display with OS X?


This is false. You can hack support for internal display acceleration, but it requires using a "Phantom" display dongle adapter and results in significantly worse performance because of the limited bandwidth available.


This is false as well.

The problem is, games and apps have to opt-in for eGPU support. That's been clear since the original eGPU support in macOS 10.13.4.

There is no universal eGPU support in macOS, it doesn't start acceleration on its own.

I believe Tomb Raider is one of the first games that support eGPU: https://www.tombraiderchronicles.com/headlines4083.html

DaVinci Resolve supports it as well, as mentioned in this video: https://9to5mac.com/2018/04/19/macos-egpu-performance-test-d...


Thanks, I investigated a bit more since this doesn't seem obvious (to me) reading the egpu product pages or reviews. This is what I found: https://github.com/mayankk2308/set-egpu so it doesn't look like things currently just work out of the box?


I was basing my response on a friend who has a setup like this working. He’s probably using a tool like this one; I’ll ask him.


Does anyone here currently run a eGPU on their MacBook Pro and can share their experience (including what card you are using)?


Akitio Node + Apple TB2->3 dongle + GTX 1080 + external LG 4K monitor on a rMBP 15" 2015, used only w/ Bootcamp.

Works pretty well, and my latest addition is a DisplayPort switch to use the iGPU directly with the monitor (when I'm in macOS) or the external GPU (when I'm in Bootcamp). Performance is great, around ~95% of desktop GPU I'd say.

The only issue is that it needs a particular boot sequence to enable it, when switching from macOS to Windows with the GPU enabled: reboot from macOS, turn on the GPU, the default partition is Windows so Windows will boot with the GPU enabled.

To reboot from Windows to macOS, disabling the GPU: reboot from Windows, hold Option at boot to show the boot menu, then turn off the GPU and boot into macOS.

Not that bad once you get used to it, and having a desktop GPU on my laptop makes it all worth it.

My post on /r/eGPU: https://www.reddit.com/r/eGPU/comments/77e86g/success_macboo...


Great info, thank you! Glad to see there's a sub for this too. I'm about to try the same setup.

If it performs well enough, I might just do away with owning a gaming desktop entirely.

Have you tried GSync with your setup?


My monitor only supports FreeSync, so I only have enabled Fast Sync in the NVIDIA control panel, which is monitor-independent.


Excellent info, thank you. So the Akitio Node works OK with the GTX 1080/Ti? You mention you only use it with Bootcamp, but do you know if the macOS drivers NVIDIA provides work OK?


My understanding is that since High Sierra, Mac OS supports eGPUs natively, but only with AMD GPUs and only using thunderbolt 3:

https://support.apple.com/en-au/HT208544

There are some hacks to get other configurations working:

https://egpu.io/external-gpu-macos-10-13-4-update/


What DisplayPort switch are you using?



I have been for a couple of years. Mine is a DIY version with a PC power supply. I made it about 3.5 years ago. Last time I checked, I couldn’t use my particular setup for accelerated graphics in macOS (I could in bootcamped Windows), but I think that works now. I just haven’t done it. I use it mostly for 3D rendering.

It’s an Akitio enclosure, which was the first popular one. I use it with a GTX 980Ti with a little 400 watt power supply on a late 2012 MacBook Pro and it works great. I travel with it and everything.

The performance hit is a lot less than people make it out to be. Mine runs on TB2 and I don’t remember numbers but I have done graphics benchmarks using it in boot camp Windows against the same card in my PC. It was a few FPS different on most tests. I don’t remember exact numbers. For 3D rendering/data processing, there is no performance hit.


I’m using an Akitio Node 3 with a 1070 on a 2016 15” MacBook Pro. I’m using my Mac and macOS to work during the day. When I go home, I start Windows and plug in my eGPU to play.

It’s really easy to use on Windows, mostly plug & play. It works so well that I stopped using my old gaming machine.

The main issue though is that you need to plug it into the left Thunderbolt 3 port on your Mac or the eGPU doesn’t work. It also doesn’t work if you plug any other USB-C devices into the right ports. My solution for that is to use a USB-C <-> USB-A hub, plugged into one of the two left USB-C ports, to plug in my external SSD and gaming keyboard.


If you look up gaming benchmarks, the performance losses make it more economical to build an entire separate computer, at least for that use case.


I have a 2014 MBPr 15". I bought the VR DevKit the day it was released. Running it under the beta of macOS at the time was... wonky. Then they upgraded macOS to support it natively, and it worked exactly like I wanted it to. For about a month.

THEN they upgraded macOS AGAIN, and forced it to stop working on pre-TB3 laptops. It was working GREAT, and they just killed it. Thanks, Apple!

I tried using some dirty hack that someone on some board claimed to work, and got it working again, but after about the 5th kernel panic in 2 days, I ripped out the hack, and sold the damn thing on Ebay. What an enormous waste of time, energy, and money.

It's clear that eGPU's are the future of gaming. I'm eagerly awaiting consensus to align on what that looks like on a MacBook. I just want to play Civ V again, without it looking like a slideshow.


I just tried it out and unfortunately it didn't work for my use case. Only certain applications are currently allowed to be accelerated by the eGPU, like Photoshop. I wanted to accelerate WebGL, but it was impossible to make it work. Chrome(ium)/Firefox/Safari, nothing worked. I tried a lot, and the browsers showed the eGPU as the card to be used, but it was still the Intel Iris. If someone has any pointers on what I may have done wrong, I would be super grateful, but my research suggests that it just doesn't work yet.


I would like to know as well. And if anyone knows whether they work well with Thunderbolt 2.


I built a simple, no-hacks-needed TB2 eGPU for Bootcamp use. The caveat is that an external monitor is required, but other than that it's basically 100% reliable: http://archagon.net/blog/2016/12/31/cheap-and-painless-egpu-...

This thing must have extended the life of my laptop by 2-3 years.


I’m playing PUBG on an MBP Mid-2015 (integrated graphics only) with an Akitio Node TB3 + NVIDIA 1060. It worked out of the box on Windows 10.

Yes I’m losing some bandwidth stepping down to TB2 but it’s enough to satisfy some casual gaming.


You take a performance hit even with TB3. TB2 (for eGPU) is not directly supported by Apple and not really worth it.


The promo photos of the eGPU include the LG UltraFine 5K, which fully saturates the TB3 port it uses.

Are eGPUs compatable with the UltraFine 5K? IIRC they were explicitly excluded from Apple's compatibility in the early days of High Sierra.

(EDIT: Yes, "There’s even a second Thunderbolt 3 connection for connecting high resolution displays such as the LG 5K display which gives you incredible image resolution, contrast and color depth!")


eGPUs sound great on paper. A graphics card that I can use easily with either my MacBook or Windows desktop. I really think that could be an exciting way to prolong the life of old hardware or to get decent gaming performance out of non-gaming hardware.

At the moment, unless you're Windows-only, the reality is a bit rougher. Although there are some pretty compelling enclosures out there (Node Pro, Mantiz Venus, Sonnet Breakaway), with Apple not supporting Nvidia (the only GPUs worth getting at the top end), it's probably not worth it. Sure, you can get it working with some hacks, but I personally wouldn't want to spend that much money to be reliant on 3rd-party scripts.

Eventually the picture may change: AMD might release a competitive GPU, or the support in macOS might open up. As soon as the software support is there, I think this becomes a really compelling idea, and it'll only become more so as the years go by and Thunderbolt 3 becomes more widespread.


The video being worked on in the main header image is the recent sci-fi short film "Hyperlight": https://www.youtube.com/watch?v=Od49AfIS2-U

Seems to be everywhere recently, and judging by the visuals it looks like it was used as a tech demonstration in many cases.


Since this is from Blackmagic, what might be more interesting is a more expensive version of this with a DeckLink built in. Then you could plug your output monitor straight into it. For a lot of people working in film/video, that little extra would turn this into a Mac Pro without a CPU in it.


Why not just use an UltraStudio via Thunderbolt for that? Especially since both this and a DeckLink/UltraStudio could saturate a single Thunderbolt port alone so you’d want them on separate ports anyway.


See also the Gigabyte Aorus GTX 1080 Gaming Box: https://amzn.to/2maZl0a. It's the same price, way more powerful than a Radeon 580, and also much smaller than the Blackmagic eGPU.


It also makes a lot more noise and looks ridiculous imo.


It’s so small you can just hide it somewhere.

That mac eGPU looks bigger than a Mac Pro.


How do these people have such huge, clean desks? Anyone I know would have it buried in paper and junk.


Did they use another company to build this? The design looks very different from Apple's basic line.

EDIT: oh, blackmagic IS a separate company...


Considering that a Thunderbolt hub alone is already very expensive and other enclosures are in the ~$500 range, this is not too bad for a nice-looking, well-engineered solution. IMO it lacks performance for the price, though.


Does this work for any Thunderbolt supporting notebook like the XPS 13?


I just ordered an eGFX Breakaway Box 550 Bundle (with a Radeon RX Vega 56 card) for $819.

https://www.sonnetstore.com/collections/egpu-expansion-syste...

How does the Radeon Pro 580 compare to the Vega 56?

Edit: I just ordered an OWC Thunderbolt 3 dock, and it looks like this eGPU has all the ports I need on the back. The dock was like $300 alone.


I understand why the page is so obsessed with DaVinci Resolve, but does anyone have any insight on how this would work with After Effects or Premiere Pro?


Why is hellomrjack's reply below marked dead? Anyone know? It seems like a useful reply and an innocuous account.

Although, in reply to him, this late 2013 iMac with an NVIDIA GeForce GT 750M 1024 MB certainly struggles to play back edits in real time.


After Effects and Premiere do not scale very well with GPU performance ([0], [1]); having one at all does make a huge difference, but the internal GPU in MacBook Pros is good enough.

[0] https://www.pugetsystems.com/labs/articles/Adobe-After-Effec... [1] https://www.pugetsystems.com/labs/articles/GTX-1070-and-GTX-...


So, this Thunderbolt enclosure from Gigabyte, https://www.gigabyte.com/Graphics-Card/GV-RX580IXEB-8GD#kf, has an RX 580 in it and costs $500 (Google B07CCK527Y). It is not yet tested, but it is possible/likely the Vega 56 Nano will work in it once it's more widely available.


Let’s see: a MacBook Pro 15 with a dedicated video card is $2,500, plus $700 for the Blackmagic, plus other dongles, plus a new bag to carry all that (search for the Apple dongles video on YouTube just for a laugh). I think we get the point: more is better, and I don’t mean more material stuff, I mean spending more money. Not to point out the obvious, but come on Apple, make a damn gaming laptop already.


You're no more expected to carry around the eGPU than the two external monitors that many developers have.


It's hardly a mid-range GPU. If you have a desktop PC, you can spend that money and get a much higher end GPU for the same price...


This is for laptops.


With the elusive Thunderbolt connector.


Does anyone know if I can use an Nvidia card as an eGPU with a MacBook Pro for machine learning? Any blog post or link would be appreciated.


Yep! Using Windows 10 via Boot Camp on a Late 2016 MacBook Pro, Sonnet eGFX Breakaway Box with 550W Power Supply, and Nvidia Titan X Pascal, everything just works.

In macOS, only tears and kernel panics though.
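
For anyone wondering how to verify that the card is actually visible for ML work under Boot Camp, a minimal sanity check looks something like this (just a sketch, assuming a CUDA-enabled PyTorch build is installed; not specific to any particular enclosure):

    # Quick check that CUDA sees the eGPU and can actually run a kernel.
    import torch

    if torch.cuda.is_available():
        print("CUDA device:", torch.cuda.get_device_name(0))
        x = torch.randn(1024, 1024, device="cuda")
        print("Matmul on GPU OK:", (x @ x).shape)
    else:
        print("No CUDA device visible - check drivers and which port the eGPU is on")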


The latest script from egpu.io helped me get pretty stable performance with a GTX 1080:

https://egpu.io/forums/mac-setup/script-enable-egpu-on-tb1-2...


Is that because of Titan X drivers?


An Nvidia GPU with CUDA would've been nice...


Would definitely have been nice.

But if you're thinking about scientific programming, it's not so much effort to use AWS GPUs. I do that, and don't think I would buy an eGPU even if it was CUDA capable.


Isn't AWS expensive in the long term? I want to have a 1080 Ti but I don't even have a desktop to begin with.


AWS is expensive if you are an extremely heavy user.

Checking out whether the pricing works for you is an easy exercise. A p3 instance costs $3.06 per hour at the time of writing. It runs an Nvidia V100, which costs around $10k. So you've got yourself around 3000 hours of use before it makes sense to buy your own.

If you want something cheaper, you can go with the p2 instances at $0.90 an hour. These cost around $2k, so you're looking at around 2200 hours of use before it might become economical to buy your own.

I don't want to sound like an AWS fanboy, but I do believe it's a good democratising catalyst for deep learning and scientific computing.

EDIT: p2 instances run on K80 GPUs, which I neglected to mention above.
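
For anyone who wants to plug in their own numbers, the break-even arithmetic above is just the card price divided by the hourly rate. A rough sketch using the figures quoted here (mid-2018 prices, which will drift):

    # Hours of on-demand use at which renting costs as much as buying the card.
    def break_even_hours(card_price_usd, hourly_rate_usd):
        return card_price_usd / hourly_rate_usd

    # p3 (V100, ~$10k card) at $3.06/hour
    print(round(break_even_hours(10_000, 3.06)))  # ~3268 hours
    # p2 (K80, ~$2k card) at $0.90/hour
    print(round(break_even_hours(2_000, 0.90)))   # ~2222 hours

Note this ignores electricity, the rest of the machine, resale value, spot pricing, and so on.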


It depends. Small instances are a few bucks an hour, so if you just need a GPU for like a hundred hours then it’s great. If you are running the instance 24/7 for months on end, the equation changes. The big instances are cool though because they give you access to hardware that would be pretty unobtainable otherwise

I really wish they had a Lambda for GPUs.


Seems similar to the Gigabyte Aorus Gaming Box:

https://www.gigabyte.com/Graphics-Card/GV-N1080IXEB-8GD#kf

Which you can get in the $500 range, and it has a GTX 1080.

This eGPU is $699 and has a less powerful AMD 580.


Nvidia and Mac is not the best combo. https://www.amazon.com/GIGABYTE-Gaming-Graphic-Card-GV-RX580... is only $500, with almost the same video card, an RX 580.

For Windows, the 1070/1080 boxes are better deals, sure, but the 1080 box is not $500, it's $700: https://www.newegg.com/Product/Product.aspx?Item=N82E1681493... Even the 1070 is more than $500: https://www.newegg.com/Product/Product.aspx?Item=N82E1681412...


A PCIe card with an AMD Radeon RX 580 8GB is $280 from Amazon.

You are paying a lot for the cool-looking case, USB hub, and PSU.

I would have hoped that they would have put in a premium graphics card like an AMD Vega 64 or a GTX 1080 to justify the premium price.


You are also paying for a Thunderbolt 3 dock, which is $250-300 standalone.


How does all of this behave when you have an eGPU and a Thunderbolt-attached storage device? In what order do you install those peripherals? Who loses when the CPU wants access to GPU and storage at the same time?


Just thought I'd mention: the game in the "gaming" picture looks like The Last of Us, a PS3/PS4 exclusive. Not sure how they're playing that on a Mac, lol.


Looks like Tomb Raider to me


It is; it's Rise of the Tomb Raider, and the screenshot comes from here, I guess: https://tombraiders.net/stella/images/TR10/large/preview/pre...


I'm not sure if it is that game, but if it was, they might very well be using the "Remote Play" functionality, which streams the game from the console to a PC or Mac. You'd definitely need a beefy eGPU to decode all those pre-rendered images ;-)


How does a Radeon Pro 580 compare to an Nvidia 1070 or 1080?


Its performance is closer to the 1060 (6 GB version), from what I understand.


It depends on the acceleration method and apps used. With Nvidia there's a lot of driver magic, plus apps that are optimized for their proprietary architecture. Hardware-wise, when it comes to raw power or compute cores, they're very similar; AMD might even have an edge in raw power (or in what the hardware would be capable of if used optimally).

That's how you can think about Apple and their Metal architecture: they're basically doing the same thing as Nvidia, but kind of decoupled from the GPU vendor. They simply go with AMD hardware because it gives them more bang for the buck (they don't need the vendor optimizations because they do them themselves and pass that down to their customers).

For work in the real-world creative industries, like CAD or graphic design, many rely on Adobe or Autodesk products, and that's an Nvidia stronghold: most Adobe and Autodesk apps are optimized for Nvidia cards.

Gaming is coupled with the CAD/VFX ecosystem, and Nvidia historically just has a stronger relationship with game studios and VFX companies; that's why studios and devs tend to optimize for them first.


Interesting technology, shame that it looks ridiculous.


How much latency does it add?


Yeah, would this be feasible for first person shooters where every millisecond counts?


If this added even a single millisecond I'd be extremely surprised. I can guarantee any latency this adds is imperceptible.


Most laptops with discrete GPUs don’t have a direct connection to the display. They copy the frame buffer back to the integrated GPU’s frame buffer and then send it to the display.

When that display is a VR headset the latency is very real and it’s a big problem. Only in the last year or two has the hardware started to change.

I would be shocked if this didn’t add latency. It’s just a matter of how much.


Beware of anything from that company. Their hardware build quality is notoriously shit. So much so that we've blacklisted their stuff from production, and I know we aren't the only ones.


Can you back these claims up at all? Without some evidence, why should people take your words at face value?


Because I say so; it is anecdotal, of course! Take from it what you want. Over the years, almost every product of theirs that got into my hands (not by choice) broke in one way or another, or was unusable in production over any longer period of time due to build quality, QC issues, or abysmal software. That's how they can hit a lower price than their competition (AJA or Bluefish, for example), which don't suffer from the same issues. It's a 'known' issue with them around the TV and film industry, which is why no one serious uses their stuff unless they are on a strict budgetary diet. Quality is extremely random, and support is next to non-existent apart from what they give you in the form of Q&A and software downloads on their site. It's so bad that even their higher-priced hardware gets you almost no support. It's a crap company that is interested in volume.


This is quite amusing: in order to make your sleek MacBook suffice for the purposes of real work, you need to hook up this massive brick.


Oh look, the trolls came out to play


It cost me roughly $600 to build a DIY eGPU. Happy to see this finally hit the mainstream.


Can I use two of these at the same time?


Acoustic Performance: ~18dB

I did expect a fanless design by Apple :-/


Fanless dissipation of 150W TDP? Good luck.


Well, my off-the-shelf RX460 has just half that TDP but has no problems with its fanless cooler while being stuck in the same case as my Core i7 (which runs with a fanless cooler too).

So I don't think it is madness to expect a new product to be a bit better than what you've been able to buy off the shelf for nearly two years.



A kickstarter making ambitious claims? Good luck.



It is not by Apple.


Thanks for the hint, marketing got me :-/

I saw all the nice Apple products and didn't notice that the eGPU is from a completely different company.


I don't know why the title says "Apple's". It works with MBPs but the company seems to have nothing to do with Apple Inc.


Thanks. We've updated the title from “Blackmagic eGPU: Apple's external graphics processor”, which was editorialized and misleading.


Except every comparison on the page uses an MBP, the bottom of the page says “Buy at Apple”, and when you click on the link to Apple.com, Apple’s page says “only at Apple”.


Using "Apple's" is still extremely misleading, and I'm pretty certain Apple's lawyers would not approve.


Apple worked with them on it. That being said, I think calling it "Apple's" GPU is quite misleading.


Would you say the same about the Logitech "Crayon", which was obviously designed with Apple's help, only works with the iPad, and is Apple's solution for pen support on the 2018 iPad?


Yeah, I'd say that's "Logitech's crayon", not Apple's. Unless Apple claims ownership.

Mods might want to update the title.


Apple worked closely with them according to this article: https://www.theverge.com/circuitbreaker/2018/7/12/17563646/a...


Designed 'in collaboration with Apple' and sold exclusively by Apple. It's in their FAQ, pop-up right above the 'buy now' button and the price.


Probably the poster meant the Apple partner featured in the new MacBook Pro promo materials: https://www.apple.com/v/macbook-pro/o/images/overview/perfor...


It's a bit like the LG Ultrafine monitors; not actually an Apple product, but sold by, and "blessed by", Apple.

By the way, it's worth noting that that blessing doesn't mean as much as you'd think. The LG Ultrafines have significant quirks that you wouldn't really expect from an Apple thing.


Marketing.

It was either Linus or Dave2D that did some videos on external GPUs, and IIRC all AMD cards work on a Mac, but only a subset of Nvidia's do.

Since we are talking about Resolve, which is cross-platform, you could just save a bunch of money by buying a PC/Linux box. Several gaming laptops have really good GPUs built in, although Apple did finally refresh their laptop offerings.

Also keep in mind that if you want to watch video on your laptop screen, you lose some bandwidth for the image to come back over USB-C/Thunderbolt (vs. an external display). If you do a lot of video editing and portability doesn't make you more productive, you can save a decent amount by building or buying a desktop.


Running a desktop would be cheaper, but a lot of people have to do stuff remotely from a laptop. This thing seems a bit big to be really portable, but some version of an eGPU would be great at preventing Resolve from melting your laptop.


It's understandable: everyone wants their phone to complete the journey of the desktop, but the battery is just too bad. Even with a dedicated chip for graphics, open PUBG and the battery level falls like a rock.

What is a game, other than a light show with a built-in no-sleep bug?

In my opinion, as long as demand and physics remain unaligned, "build it and they will come" will not achieve anything.

So why not instead build a mini server with graphics cards attached to WiFi, do the computations for AR, VR and games there (mostly downstream data), and catch the result with a passive WiFi receiver? It is still displaying video, but it does not come with the usual computational burden.

I know that is a bad architectural choice, a departure from the expected model which, should batteries get good, would triumph any day now. But if you plan for that eventuality, you can still program in an architecturally neutral way and walk ahead into the future, not just glimpse it while tied to a charger.

https://passivewifi.cs.washington.edu/files/passive_wifi.pdf

https://bib.irb.hr/datoteka/575153.Energy_Consumption_in_And...

https://pdfs.semanticscholar.org/9934/9551d3ef0e0efc3c54aaab...


I lost all respect for the product when I saw Beats headphones in the promo photo. It says a lot about the focus for the product.


I hate to agree, but yeah. Given how people elsewhere in the comments have pointed out how overpriced it is, maybe they know their target market pretty well?


I think not only the Beats but also the book selection on the shelf was sad.



