
It's surprising to me Macs aren't a more popular target for games. They're extremely capable machines, and they're console-like in that there isn't much variation in hardware, as opposed to traditional PC gaming. I would think it's easier to develop a game for a MacBook than for a Windows machine, where you never know what hardware setup the user will have.


The main roadblock for porting games to the Mac has never been the hardware, but Apple themselves. Their entire attitude is that they can do whatever they please with their platforms and expect developers to adjust to the changes, no matter how breaking. It’s a constant support treadmill, fixing the stuff that Apple broke in your previously perfectly functioning product after every update. If said fixing is even possible, like when Apple removed support for 32-bit binaries altogether, rendering 3/4 of macOS Steam libraries non-functional. This works for apps, but it’s completely antithetical to the way game development is structured on every other platform: you finish a project, release it, do a patch cycle, and move on.

And that’s not even talking about porting the game to either Metal or an absolutely ancient OpenGL version that could be removed in any upcoming OS version. A significant effort just to address a tiny market.


> an absolutely ancient OpenGL version

I still don't get this. Apple is a trillion-dollar company. How much would it cost to pay a couple of engineers to maintain an up-to-date version on top of Metal? Their current implementation is 4.1; it wouldn't cost them much to provide 4.6. Even Microsoft collaborated with Mesa to build a translation layer on top of D3D12, and Apple could do the same.
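
For the curious, you can see the ceiling for yourself. A minimal sketch using the long-deprecated NSOpenGL API (build as a macOS command-line tool; expect deprecation warnings, and the exact version string will vary by machine):

    import AppKit
    import OpenGL.GL3

    // Ask for the newest core profile macOS will hand out. 4.1 is the
    // ceiling: no higher profile constant exists in the SDK.
    let attrs: [NSOpenGLPixelFormatAttribute] = [
        NSOpenGLPixelFormatAttribute(NSOpenGLPFAOpenGLProfile),
        NSOpenGLPixelFormatAttribute(NSOpenGLProfileVersion4_1Core),
        0
    ]

    guard let format = NSOpenGLPixelFormat(attributes: attrs),
          let context = NSOpenGLContext(format: format, share: nil) else {
        fatalError("could not create a GL 4.1 core context")
    }
    context.makeCurrentContext()

    // On Apple Silicon this prints something like "4.1 Metal - 88.1":
    // the decade-old version cap, implemented on top of Metal.
    if let version = glGetString(GLenum(GL_VERSION)) {
        print(String(cString: version))
    }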


It's because of Khronos' licensing of their IP: it apparently isn't compatible with what Apple's legal team has decided they need.


They can't do Khronos things because they don't get along with Khronos. Same reason they stopped shipping Nvidia GPUs forever ago.


> They can't do Khronos things because they don't get along with Khronos.

Has anyone figured out what exactly the crux of their beef is? OpenGL 4.1 came out in 2010, so surely whatever happened is settled by now.



Their current OpenGL 4.1 actually does run on top of Metal, making it even more blatantly obvious that they just don't want to.


> If said fixing is even possible, like when Apple removed support for 32-bit binaries altogether, rendering 3/4 of macOS Steam libraries non-functional.

IIRC developers literally got 15 years of warning about that one.


Apple's mistake was allowing 32-bit stuff on Intel in the first place -- if they had delayed the migration ~6 months and passed on the Core Duo for Core 2 Duo, it would've negated the need to ever allow 32-bit code on x86.


IIRC that didn't convince many developers to revisit their software. I still have hard drives full of Pro Tools projects that open on Mojave but error on Catalina. Not to mention all the Steam games that launch fine on Windows/Linux but error on macOS...


Yes, game developers can't revisit old games because they throw out the dev environments when they're done, or their middleware can't get updated, etc.

But it's not possible to keep maintaining 32-bit forever. That's twice the code and it can't support a bunch of important security features, modern ABIs, etc. It would be better to run old programs in a VM of an old OS with no network access.


> But it's not possible to keep maintaining 32-bit forever.

Apple had the money to support it, we both know that. They just didn't respect their Mac owners enough; Apple saw more value in making them dogfood iOS changes since that's where all the iOS devs are held captive. Security was never a realistic excuse considering how much real zombie code still exists in macOS.

Speaking personally, I just wanted Apple to wait for WoW64 support to hit upstream. Their careless interruption of my Mac experience is why I ditched the ecosystem as a whole. If Apple cannot invest in making it a premium experience, I'll take my money elsewhere.


> Apple had the money to support it, we both know that.

Not possible without forking the OS, and no amount of money can make software development faster forever.

https://en.wikipedia.org/wiki/The_Mythical_Man-Month

Especially because Apple has a functional org structure, which means there is nearly no redundancy; there's only one expert in any given field and that expert doesn't want to be stuck with old broken stuff. Nor does anyone want software updates to be twice as big as they otherwise would be, etc.

> Security was never a realistic excuse considering how much real zombie code still exists in macOS.

Code doesn't have security problems if nobody uses it. But nothing that's left behind is as bad as, say, QuickTime was.

N.b., some old parts were replaced over time as the people maintaining them retired. In my experience, all of those people were named Jim.


> there's only one expert in any given field and that expert doesn't want to be stuck with old broken stuff.

Oh, my apologies to their expert. I had no idea that my workload was making their job harder, how inconsiderate of me. Anyone could make the mistake of assuming that the Mac supported these workloads when they use their Mac to run 32-bit plugins and games.


Another big, non-technical reason: most games make most of their money around their release date, so there is no financial benefit to updating a game to keep it working. Especially not on macOS, where the market share is small.


The company in general never really seemed that interested in games, and that came straight from Steve Jobs. John Carmack made a Facebook post[1] several years ago with some interesting insider insight into his advocacy for gaming to Steve Jobs and the lukewarm response he received. Games just never seemed to be a priority at Apple.

1: https://www.facebook.com/permalink.php?story_fbid=2146412825...


It's impossible to care about video games if you live in SV because the weather is too nice. You can feel the desire to do any indoor activity just fade away when you move there. This is somehow true even though there's absolutely nothing to do outside except take walks (or "go hiking" as locals call it) and go to that Egyptian museum run by a cult.

Somehow Atari, EA and PlayStation are here despite this. I don't know how they did it.

Meanwhile, Nintendo is successful because they're in Seattle where it's dark and rains all the time.


Gamedevs have not forgotten that Apple attempted to get Unreal Engine banned from all their platforms, thus rug-pulling every game built on top of it.

It was only the intervention of Microsoft that managed to save Apple from their own tantrum.


As far as I’ve seen, Apple is to blame here as they usually make it harder to target their platform and don’t really try to cooperate with the rest of the industry.

As a game developer, I have to literally purchase Apple hardware to test on, rather than being able to conveniently download a VM.


For games, how would you test in a VM, when games so explicitly want direct hardware access?

I'm obviously misunderstanding something here.


I run Linux and test my Windows releases on a VM. It works great.

Sure, I'm not doing performance benchmarking, just smoke tests and basic user stories, but that's all that 98% of indie developers do for cross-platform support.

Apple has been intensely stupid as a platform to launch on, though I did do it eventually. I didn't like Apple before and now I like it even less.


I develop a game that easily runs on much weaker hardware and runs fine in a VM; I'd say most simple 3D and 2D games would work fine in a VM on modern hardware.

However, these days it's possible to pass hardware through to your VM, so I would be able to pass a second GPU through to macOS... if it would let me run it as a guest.


On Linux, KVM provides passthrough for GPUs and other hardware, so the VM "steals" the passed-through resources from the host and provides near-native performance.


I'm not a subject matter expert, but I do find it a little odd to read the second half of that. I'd expect that, beyond development/debugging, there's certainly a phase of testing that requires hardware matching your target system?

Like, I get if you develop for consoles, you probably use some kind of emulation on your development workstation, which is probably running Windows. Especially for consoles like the Xbox One or newer and the PS4 or newer, which are essentially PCs. And then builds get passed off to a team that has the hardware.

Is anyone developing games for Windows on Apple hardware? Do they run Parallels and call it a day? How is the gaming performance? If the answers to those 3 questions are "yes, yes, great", then Apple supports PC game development better than they support Apple game development?


> Like, I get if you develop for consoles, you probably use some kind of emulation on your development workstation

I don’t think anybody does this. I haven’t heard of official emulators for any of the mainstream consoles; emulation would be prohibitively slow.

Developers usually test on dedicated devkits, which are a version of the target console (often with slightly better specs, as dev builds need more memory and run more slowly). This is annoying, slow, and difficult, but at least you can get these devkits, usually for a decent price, and there’s a point to trying to ship on those platforms. Meanwhile, nobody plays games on Macs, and Apple is making zero effort to bring in the developers or the gamers. It’s a no-chicken-and-no-egg situation, really.


Basically you are correct: macOS has to be treated like a console in that way, except you get all the downsides of that development workflow with none of the upsides. The consoles provide excellent debugging and other tools for targeting their platforms; I can't say the same for macOS.

For testing, I can do a large amount of testing in a VM for my game. Maybe not 100%, and not full user testing, but nothing beats running on the native hardware and doing alpha/beta testing with real users.

Also, since I can pass hardware through to my VM, I can get quite good performance by passing through a physical GPU, for example. This is possible and quite straightforward to do on a Linux host; I'm not sure if it's possible using Parallels.


You do it for Xbox and PlayStation and Nintendo.


I'm sure you literally purchased Nvidia hardware for game development.


A component is much cheaper than an entire dedicated system (which would of course contain a similar component).


I don't know; a 5090 costs about $3k, a 5070 about $500. For the same money you could buy a MacBook Pro or a Mac Mini, respectively. Seems reasonable.


Mac dev sucks. You're forced to use macOS and Xcode (for the final build anyway). You're not able to virtualize the build machines.

Apple is actively hostile to how you would build for Linux or PC or console.


> You're not able to virtualize the build machines.

Sure you can. And officially, too. Apple still ships a bunch of virtualization drivers in macOS itself. Have a look:

/System/Library/Extensions/IONetworkingFamily.kext/Contents/PlugIns/AppleVmxnet3Ethernet.kext

Whether or not you're using ESXi, or want to, is an entirely different question. But "you're not able to" is simply incorrect. I virtualize several build agents and have for years with no issues.

macOS 26 is the last major version to support Intel, so once macOS 28 is the latest this will probably become impossible (macOS 26 should be able to use Xcode 27, but the Intel removal may break the usual pattern where the previous year's OS stays supported).


> Apple still ships a bunch of virtualization drivers in macOS itself.

I think OP means virtualizing on something that isn't Apple.


Interesting. The last time I looked into it, you could only officially do this on Mac hardware (defeating the purpose).

You can get Xcode building for ARM Macs on PC hardware with this?


- Windows: Windows and Linux VMs.

- Linux: Windows and Linux VMs.

- Apple: Windows, Linux, and Apple VMs.

Seems pretty straightforward.

I am being facetious. You'll have a PC for gamedev because that's your biggest platform, unless you are primarily Switch or PS5, in which case you'll have a devkit as well as a PC. But the cost of an Apple device is insignificant compared to the cost of developing the software for it.

So it really comes down to the market size and _where they are_. The games I play are either on my PS5, or on my Mac, never both. For any specific game, they are on one or the other. Ghost of Tsushima is on the PS5. Factorio is on my Mac. If I were an indie game developer, I'd likely be developing the kind of game that has a good market on the Mac.


> Mac dev sucks. You're forced to use macOS and Xcode (for the final build anyway)

Having to use Xcode "for the final build" is irrelevant to the game development experience.


If you're an indie with just PC hardware it sure as hell matters.


This is simply not the case. Every major game framework/engine targets Mac natively.

If you are building your engine/game from scratch, you absolutely do not need to use Xcode.


Why don't you look through the Unreal and Unity docs and see if you can make a build without a Mac and Xcode.


I think I misunderstood your point as “developing a game on a Mac sucks” vs. “developing for the Mac without a Mac sucks”, which I absolutely can’t disagree with.


Yeah, you’re right, I skipped over the part where you said the final build required it.

Nonetheless that’s a small fraction of the time spent actually developing the game.


Ideally, it's a continuous part of development because you're making daily (or more) builds and testing them.

That makes it a continuous headache to keep your Mac builders up.

It means you need to double your dev hardware costs or more, since you need a gaming PC to target your core audience plus Macs to handle the Mac bugs.

It means your mac build machines are special snowflakes because you can't just use VMs.

The list goes on and on of Mac being actively hostile to the process.

Sure, Rider running on a Mac is pleasant, but that's not the issue.


I was very surprised, and pleasantly so, that Cyberpunk 2077 can maintain 60 FPS (14", M4 Pro, 24 GB RAM) with only occasional dips. Not at full resolution (actually around Full HD), but at least without "frame generation". With frame generation turned on, it can output 90-100 FPS depending on the environment, but VSync is disabled, so dips become much more noticeable.

It even has a "for this Mac" preset, which is good enough that you don't need to tinker with settings to have a decent experience.

The game pauses, almost becomes "frozen", when it's not visible on screen, which helps with battery (it can sit in the background without any noticeable impact on battery or temperature). Overall, a much better experience than I expected.


I play a lot of World of Warcraft on my M3 MacBook Pro, which has a native macOS build. It's a CPU-bottlenecked game, with most users recommending the AMD X3D CPUs to achieve decent framerates in high-end content. I'm able to run said content at high (7/10) graphics settings at 120 fps with no audible fan noise for hours at a time on battery. It's been night and day compared to previous Windows machines.


Multiple solid reasons have been mentioned, from ones created by Apple to ones enforced in software by Apple. One that hasn't been mentioned is the lack of market share: the macOS market is just tiny and very limited, and it's not growing. PC gaming isn't blowing up either, but the number of players is simply much higher.

Ports to macOS have not done well, from what I've heard. Meanwhile, you can see ports on PC do really well; they have encouraged studios like Sony and Square Enix to invest more in PC ports, even long after the console versions sold well. There are just not a lot of reasons to add the tech debt and complexity of supporting the Mac as well.

Even big publishers like Blizzard, who had been Mac devs for a long time, axed the dedicated Mac team and client and moved to a unified client. This has downsides, like Mac-specific issues: if those aren't critical, they get put in the pile with the rest of the bugs.


It's easier to develop a game for a Mac in some ways, but you reach a tiny fraction of gamers that way.


I wonder how that math might look once you factor in Apple TV devices. They're pretty weak devices now, but future ones could come with M-class CPUs. That's a huge source of potential revenue for Apple.


The current Apple TV is, in many respects, unbelievably bad, and it has nothing to do with the CPU.

Open up the YouTube app and try to navigate the UI. It’s okay, but not really up to the Apple standard. Now try to enter text in the search bar. A nearby iPhone will helpfully offer to let you use it as a keyboard. You get a text field, and you can type, and keystrokes are slowly and not entirely reliably propagated to the TV, but the text does not stay in sync. And after a few seconds, in the middle of typing, the TV will decide you’re done typing and move focus to a search result, the phone won’t notice, and the whole thing gets completely desynchronized.


The YouTube app has never been good and never felt like a native app -- it's a wrapper around web tech.

More important for games, though, is the awful storage architecture of the TV boxes. Games have to slice themselves up into 2 GB storage chunks, which can be purged from the system whenever the game isn't actively running. The game has to be aware of missing chunks and download them on demand.

It makes open-world games nearly impossible, and it makes anything with significant storage requirements effectively impossible. As much as Apple likes to push the iOS port of Death Stranding, that game cannot run on tvOS as currently architected for that reason.
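For the curious, the chunking described above maps onto the On-Demand Resources API (NSBundleResourceRequest), and the shape of the problem shows up directly in the code a game has to write. A rough sketch for a tvOS app; the tag name is made up:

    import Foundation

    // Hypothetical tag; in a real project, tags are assigned to asset
    // packs in Xcode, and each pack is subject to the size limits.
    let region = NSBundleResourceRequest(tags: ["open_world_region_07"])

    // First ask whether the chunk is still on disk. tvOS may have purged
    // it at any point while the game wasn't running.
    region.conditionallyBeginAccessingResources { alreadyLocal in
        if alreadyLocal { return } // still resident, safe to load now

        // Purged: re-download before the player can enter the region.
        // In an open-world game, this is the mid-session stall.
        region.beginAccessingResources { error in
            if let error = error {
                print("asset pack fetch failed: \(error)")
                return
            }
            // Tagged resources are now reachable via Bundle.main, until
            // endAccessingResources() runs or the system reclaims them.
        }
    }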


There's a cost/value calculation that just doesn't work well... I have a Ryzen 9 / RTX 3070 PC ($2k over time), and my M4 Mini ($450) holds its own for most normal user stuff, sprinting ahead for specific tasks (video codecs)... but the 6-year-old dedicated GPU on the PC annihilates the Mini at pushing pixels. You can spec an Apple machine that does better for gaming, but man, are you gonna pay for it, and it still won't keep up with current PC GPUs.

Now... something like Minecraft or Subnautica? The M4 is fine, especially if you're not pushing 4K at 240 Hz.

Apple has been pushing the gaming experience for years (iPhone 4s?), but it never REALLY seems to land, and when someone has a great gaming experience in a modern AAA game, they always seem to be using a $4500 Studio or similar.


I wrote a post (rant)[1] about my experience releasing a game on macOS as an indie dev. tl;dr: Apple goes a long way to make the process as painful as possible, with tons of paper cuts.

[1] https://ruoyusun.com/2023/10/12/one-game-six-platforms.html#...


- have to build using Xcode on macOS

- have to pay Apple to have your executable signed

- poor Vulkan support

The hardware has never been an issue; it's Apple's walled-garden ecosystem.


Apple's is not the only platform where you effectively pay to have your executable signed. At some point people need to let this go and accept that the wider industry has started to move this way.


Metal is a very recent API compared to DirectX and OpenGL. Also, there are very, very few people on Macs, and even fewer who also play video games. There are almost no libraries and tooling built around Metal and the Mac SDKs, and only a very small audience, so it doesn’t make financial sense.


You have to release major titles for Windows and consoles, because that's where the bulk of the customers are.

So a Mac port, even a simple one, is an additional cost. There you have the classic chicken-and-egg problem: the cost doesn't seem justified by the number of potential sales, so major studios ignore the platform, and as long as they do, gamers ignore the platform.

I've seen it suggested that Apple could solve this standoff by funding the ports, and maybe they have done this a few times, but Apple doesn't seem to care much about it.


Until a few years ago, it was common for gamers to assemble their own PCs, something you can't do with a Mac. Not sure if this is still common among gamers, though.


IMO, the advent of silicon interposer technology means modular memory and separate CPUs/GPUs will soon be obsolete.

The communication bandwidth you can achieve by putting the CPU, GPU, and memory together at the factory is much higher than what you get with these components separate.

Sad for enthusiasts, but practically inevitable
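
Some back-of-envelope numbers make the bandwidth gap concrete (a sketch; both figures are public spec/marketing numbers, so treat them as approximate):

    // Dual-channel DDR5-6000: 2 channels x 8 bytes/transfer x 6000 MT/s
    let ddr5 = 2.0 * 8.0 * 6000.0 / 1000.0   // ~96 GB/s
    // Apple's quoted unified-memory bandwidth for the M1 Max package
    let m1Max = 400.0
    print("DDR5 dual-channel: \(ddr5) GB/s, M1 Max: \(m1Max) GB/s")
    // Roughly 4x, and the CPU and GPU share it directly, with no copy
    // over a PCIe link in between.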


It's kind of a myth though; the Mac has many flagship games and everything in between.

If you identify as a "gamer" and are in those communities, you'll see people talking about the things you can't natively play,

but once you leave those niches, you already have everything.

And with microtransactions, Apple ecosystem users are the whales. Again, not something that people who identify as "gamers" want to admit they're okay with, but those people are not where game production revenue comes from.

So I would say it is a missed opportunity for developers operating on antiquated calculations of macOS deployment.


> It's kind of a myth though

It's kinda not. Here's a rough list of the 10 most-played games currently on PC: https://steamdb.info/charts/

macOS is supported by one title (DOTA 2). Windows supports all 10; Linux (the free OS, just so we're clear) runs 7 of the games and has native ports of 5 of them. If you want to go argue with them about missed revenue opportunities, then be my guest, but something tells me that DOTA 2 isn't being bankrolled by Mac owners.

If you have any hard figures that demonstrate "antiquated calculations" then now is the time to fetch them for us. I'm somewhat skeptical.


Doesn’t macOS favor a 60 Hz output? Gamers prefer much higher rates.

And don’t forget they made a VR headset without controllers.

Apple doesn’t care about games.


> Doesn’t macOS favor a 60 Hz output?

Kind of? It does support higher refresh rates, but the emphasis on "Retina" resolutions imposes a soft limit, because monitors that dense rarely support much more than 60 Hz, due to the sheer bandwidth requirements.
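
Rough math, counting raw pixel rate only (a sketch; blanking and encoding overhead pushes the real numbers higher):

    import Foundation

    // A 5K "Retina" panel at 10 bits per channel (30 bits per pixel).
    let (w, h, bpp) = (5120.0, 2880.0, 30.0)
    for hz in [60.0, 120.0] {
        let gbps = w * h * bpp * hz / 1e9
        print(String(format: "%3.0f Hz: %.1f Gbit/s raw", hz, gbps))
    }
    // 60 Hz  -> ~26.5 Gbit/s, already crowding DisplayPort 1.4's
    //           32.4 Gbit/s link rate (the first 5K displays needed
    //           two DP streams for exactly this reason).
    // 120 Hz -> ~53.1 Gbit/s, DP 2.x / HDMI 2.1 territory, or DSC.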


The MacBook Pro has had a 120 Hz screen for nearly half a decade. And of course, external displays can support whatever resolution/refresh rate, regardless of the OS driving them.


Porting is not straightforward: you must switch to Metal, and you should adapt your rendering pipeline to tiled deferred shading.
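
The first step of a port is usually just probing what you're dealing with. A minimal sketch using real Metal calls; the family check is the usual way to detect an Apple tile-based (TBDR) GPU:

    import Metal

    guard let device = MTLCreateSystemDefaultDevice() else {
        fatalError("no Metal device available")
    }

    // Apple-family GPUs are tile-based deferred renderers. Pipelines
    // written for immediate-mode desktop GPUs tend to waste bandwidth
    // until they are reworked around tile memory.
    let isAppleTBDR = device.supportsFamily(.apple7) // .apple7 ~ A14/M1 era
    print("\(device.name): Apple-family TBDR = \(isAppleTBDR)")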


I think it depends on how easy it is for a dev to deploy to Apple platforms. The M1 was great at running Call of Duty in a Windows emulator; the iPhone can run the newest Resident Evil. Apple needs to do more to convince developers to deploy to the Mac.



