I wonder how Apple's latest iPhone gaming reveal affects these plans. If the iPhone can play ray-traced games, why would someone want to stream the game in the first place?
It's probably still too little, too late, but Apple's latest Metal conversion tool and iPhone event have me hopeful about gaming on Apple devices.
Streaming is fantastic. It means there is an indirect reference between the display of the game and the running of the game. It means you can play max-quality games on a $200 Chromebook. The game client can be running in the same rack as the game server. Visual/input lag is a different type of lag than game client/server lag; I'm not sure which one will win out in the long haul.
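To put rough numbers on those two lag types (entirely made-up figures; the split is the point, not the values), a quick Python sketch:

    rtt_ms = 30                 # network round trip
    encode_decode_ms = 10       # video encode + decode, on top of rendering
    frame_ms = 16               # one 60 Hz frame

    # Traditional online game: only game state crosses the network; inputs and
    # rendering stay local, so aiming still feels instant even at 30 ms ping.
    netcode_lag_ms = rtt_ms

    # Streamed game: every input goes up and every rendered frame comes back.
    streaming_lag_ms = rtt_ms + encode_decode_ms + frame_ms

    print(netcode_lag_ms, streaming_lag_ms)  # 30 vs 56 on these numbers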
As fiber is installed in more residences, streaming becomes an even better prospect. I would much rather be vendor-locked into Steam than into the Xbox, PlayStation, or iOS stores. Steam still has execs who think about more than next quarter's profits; they aren't trying to bleed their cash cow dry like most other companies seem to be doing. From a monetary point of view, $10 a month with no contract is more appealing than $500 or even $3000 every several years. It definitely seems like there are many economies-of-scale wins to be had via centralization.
Apple also failed to create a marketplace culture. iOS gaming is synonymous with "pay to win". Good games just aren't put on those devices, with the exception, maybe, of Riot's games and PC ports. I would be worried about Apple's "let's make money off kids gambling" culture, which is absolutely pervasive in mobile gaming, becoming even more pervasive due to cross-pollination. Apple is generally a benevolent dictator, but when it comes to mobile games they are a slumlord.
It perpetuates the idea that you pay for a thing but don't own it. It creates more incentives for ongoing monetization (to support the streaming infra), like in-game ads and loot boxes with paid keys. It introduces horrendous input lag that fluctuates based on network conditions, and it means there's yet another probably buggy layer of abstraction between the game and the player. (edit: and you can't play on LAN!)
Unless you're playing Yahtzee, streaming delivers a suboptimal gaming experience.
> It perpetuates the idea that you pay for a thing but don't own it.
That is probably true in the case of the streaming that you're discussing, but I want to bring up something that I think is pretty cool, Steam Remote Play: https://store.steampowered.com/streaming/
I actually had a case where I wanted to play a game on my netbook, but it just wouldn't run on the device (Transport Fever 2, which was demanding with both its OpenGL and Vulkan renderers). So instead, I set up the game on my PC and just streamed it to my netbook, over wireless in the house.
I'm mentioning this because it was actually playable and honestly it was cool to see the tech getting better for this (not a bunch of input bugs, or performance as bad as what VNC/RDP would historically give you) and being able to do something like that without a hassle.
Of course, that's a bit different from not owning the game and just renting someone else's hardware/service to play it on.
> It perpetuates the idea that you pay for a thing but don't own it.
Renting is an old concept and it's not going away any time soon. Maybe the technology is not there yet, but I don't see what's inherently wrong with the concept itself. Owning is wasteful unless you have close to 100% utilisation. Why pay full price for an Xbox that sits idle most of the time? If every idle Xbox could be utilized by anyone in need, we would either see a significant rise in the number of people able to enjoy gaming, or a decrease in the number of consoles that need to be produced.
> Owning is wasteful unless you have close to 100% utilisation.
If you're going to veer into Economics 101, then you'd better reckon with the fact that this "renting" only causes more surplus to be captured by the publisher, and not by the consumer.
A. You pay for an Xbox and you own it. You can mod it, develop games on it, play games on it, sell it, whatever. Economists tell you this is "inefficient" because the Xbox is not 100% utilized, and so in theory you have incurred an opportunity cost.
B. You pay $amount for the privilege of "renting" a game that is streamed to your television over the internet. You own nothing. Your access to the game and/or the game console and/or the service as a whole can be terminated at any time, for any reason, at the whims of a gigantic faceless transnational corporation. You have no means to contact an actual human at customer support because there aren't any humans working in customer support. The corporation gives less thought to you than you do to a mosquito that splats on your bumper as you drive home in your Elonmobile (which, incidentally, suffers many of the same problems).
I'm on your side of the argument, but I take issue with the black-and-white "You pay for an Xbox and own it". For any computer hardware that I pay for, I don't feel like I truly own it unless I can run my own code on it.
Technically, I can rent an EC2 instance and do that, but MSFT won't let me run a hello world on an x86 computer :/
Computers are complex enough that there is more than one dimension to ownership (in fact, if you consider how many computers are inside your computer, it'll quickly get fractal), but the argument still applies to whichever ownership dimensions you care about.
> "renting" only causes more surplus to be captured by the publisher, and not by the consumer
This assertion is demonstrably false. If this were the case, it would raise the question of why many consumers and businesses willingly opt for rental models. Cloud hosting and music streaming are prime examples of the popularity of such models.
You have a personal preference for ownership, but if we look at consumer trends over the past decade, I doubt your preference is shared by a majority of consumers. My bet is that most consumers will prefer the flexible, less costly options over ownership (assuming the technology is competitive).
Selective quoting to misconstrue what I'm arguing won't win you any favors. I'm suggesting that in this particular case (you carefully avoided quoting the "this" that I used) the surplus is being captured by the producer.
Anyways, not interested in discussing further with someone who quotes selectively to argue with a strawman. See ya.
It was not deliberate, but my point stands with or without the "this". I'm not exactly sure what you mean by the surplus being captured by the producer in this case, but typically, for a voluntary transaction to occur in a free market, both the buyer and the seller must perceive some kind of surplus or benefit. If consumers willingly opt into the streaming service, they must see the transaction as beneficial to themselves, right?
What I am saying is that, looking at past trends and assuming the technology becomes competitive, I believe a majority of consumers will opt for option B (streaming) rather than stick with option A (owning).
Because it's a smaller upfront cost, and a "pay per usage" cost ongoing. Many people don't want to (for example) dedicate the space to a large DVD collection for something they're only going to watch once, or buy a media cupboard to hold two extra DVDs. People don't want the hassle of reselling/searching the second-hand market (where companies like eBay provide value, but also friction and risk, as anyone who has ever had an "item not received" case opened against them will tell you).
That's a common theory among people who hold a minority preference in a market, but it's almost invariably wrong. In aggregate, people tend to be rational. And there are plenty of valid and rational reasons for preferring renting over owning. Some reasons I can think of: lower cost, convenience (no need to go to a store or wait for a shipment, doesn't take space in your apartment), not having to worry about hardware breaking, not having to worry about upgrading when a new model comes out, not having to worry about backups, portability, access to a huge catalog, a personalized experience, the ability to easily switch service providers, etc. It's not hard for me to understand why many people would prefer to rent.
If only that were true. Economics would be simpler, and maybe actually a science.
Modern economics is nearly entirely about how to deal with the fact that individual actors are indeed not rational according to the traditional definition of the term. Which is why advertising works so well.
It's a model that mostly reflects reality. I find the alternative much harder to believe (e.g. that this trend is the result of general ignorance and dumbness). It's more likely that your personal preference simply isn't shared by the majority.
> If this were the case, it would raise the question of why many consumers and businesses willingly opt for rental models.
Consumers: because "as-a-Service" can have a cheaper sticker price (often free), people have a hard time seeing long-term costs, and the social immune system hasn't caught up with this particular type of abusive business practice.
Businesses: because they're much more focused and high-cadence economic units than individuals living a life, and most importantly, they're abstract. They don't have lives that are valued. They're an entirely different kind of economic agent, and it makes much more sense for them to be "cash flow minded" instead of ownership-minded.
Then there's also the angle that, in B2B, typically both sides of the relationship are much closer in terms of relative power than in B2C, which is why there's less blatant abuse happening there.
> Consumers: because "as-a-Service" can have a cheaper sticker price (often free), people have a hard time seeing long-term costs, and the social immune system hasn't caught up with this particular type of abusive business practice.
That seems a bit elitist to me. "Ah, if only the consumers knew what you know!" Maybe you're right, but I find it more likely that they simply like the deal being offered.
Nitpicking goes both ways. Don't postulate unless you're going to give both sides equal treatment.
> You own nothing. Your access to the game and/or the game console and/or the service as a whole can be terminated at any time, for any reason, at the whims of a gigantic faceless transnational corporation.
Your access to an Xbox is, and has been, gated behind a Microsoft account for a decade (same with PlayStation). That "faceless transnational corporation" can ban you from using your _purchased_ Xbox and physical games just as easily as they can revoke your streaming access.
> You have no means to contact an actual human at customer support because there aren't any humans working in customer support.
You're speaking about Google here, but Microsoft, Apple, Sony (can't speak for Nintendo), and Valve all have open, semi-reasonable support channels.
> The corporation gives less thought to you than you do to a mosquito that splats on your bumper as you drive home in your Elonmobile (which, incidentally, suffers many of the same problems).
I hope you had fun writing this drivel. It reeks of immaturity, pomposity, and navel-gazing without any real substance, and your post would be better without it.
> If you're going to veer into Economics 101, then you'd better reckon with the fact that this "renting" only causes more surplus to be captured by the publisher, and not by the consumer.
Why should the savvy renters care? They aren't generating the surplus captured by the publisher.
As someone who routinely plays on and mods older hardware, I really hate renting culture. It's short-sighted, and in 20 years' time, when your kids are adults, they'll be wishing there was a way to play games from their childhood.
We are already seeing this problem with specific games that force online verification. Let’s not break the gaming platform itself as well.
I might sound like a communist, but I would like to figure out a way to a sustainable future where everyone owns everything and is happy. I'm not talking about everything being shared equally, but at least not 99% owned by 1%.
Is there anything inherently wrong with 99% being nominally "owned" by 1%? Is 95/1 or 75/1 somehow systemically different, or is it just extracting an emotional reaction to numbers folks have no accurate mental representation of in this context? If not, it does strike me as a slippery slope to communism.
Renting costs are not scalable or sustainable. The people who rent everything they can afford and own nothing will have to be in perfect health all the time, have no debts to pay, and basically work until death. Logan didn't run for no reason.
It's been years since I've watched Logan's Run, but wasn't it the girl he fancied who went on the run, and he just got caught up in it, rather than him explicitly choosing to run?
Societies willingly pool taxes so that the elderly, when retired, can maintain a decent standard of living.
Often the decision is to move somewhere warm and sunny or to travel, but it can be to have all the streaming services and a nice little cottage in the country.
Evidently someone downvoted me for the suggestion that such societies exist - gee, I wonder what their problem is.
> Why pay full price for an Xbox that sits idle most of the time?
Because it's available at a moment's notice. You don't have to actively schedule around its availability, or otherwise worry about it. It gives you consistent play experience, freedom to use it in nonstandard ways, and if anything goes wrong, it's likely you can do something about it now.
In general, ownership removes worry and gives you more options. Those have utility that more than offsets reduced utilization.
As an example of what I mean by worry, let's take video streaming. Between Netflix's failure rate and unexpected catalog removals, my ISP's failure rate, the OS (both Linux and Windows) and browser (both Firefox and Chrome) being what they are, and anything else that happens between my machine and the CDN hosting the video I want to watch, there's a non-trivial chance it will be unavailable at the time I want to watch it, or it will be unwatchable (e.g. because it's pausing every few seconds to buffer, due to some failure somewhere). I experience some sort of failure like this roughly once a week. That doesn't seem like much, but it generates some anxiety - a worry that at the exact moment I want to stream, it won't work, e.g. ruining an at-home movie date with my wife, or my personal "sit down with my favorite supper and put on a show" unwinding ritual.
Contrast that with the same video, acquired on the high seas in the form of a local file. This. Just. Works. If the same or a similar video file worked last month, it will work today, 100% of the time. Even if the Internet goes down, even if Netflix goes belly up, even if my ISP gets hit by a fire, even if a war starts somewhere and a billion people are rapidly F5-ing news videos - my video file Will. Just. Work. I can set dinner and sit down and double-click and not, for a second, worry about the hidden weather of markets and networks ruining my evening.
That's just one small part of what makes ownership good despite its seeming economic drawbacks (well, drawbacks to whom? it really seems to depend on one's accounting methods...).
With traditional products there were three kinds of parties involved: producer, dealer, and consumer. Renting was a concept between the latter two. At an increased logistic effort between them (returning the product, more but smaller payments), the number of products bought from the producer is reduced.
With dealers included, exhaustion occurred between producer and dealer. With dealers removed and products rented, exhaustion no longer occurs at all.
That's a significant change, much more so than removing dealers or switching to renting alone.
Producers retain full control, and individual consumers lack the negotiation power dealers had, which is an issue if there is no equivalent alternative product from a different producer.
Agreed - including many who should really know better. https://hpbn.com is an excellent resource to refresh or learn practical networking fundamentals.
Back in the day, you could rent your own games to other people by handing them your cartridge and accepting some money for the loan. The goods were transferable. Renting is good when it coexists with real ownership.
Having played both Cyberpunk 2077 and Destiny 2 exclusively as Stadia games, I respectfully disagree with your third point. And your second point is actually a flaw in the current video game industry's economics; the lack of any kind of subscription relationship and perpetual-payment model for most games breeds most of the ills we see in the industry (buggy AAA launch titles, crunch time, studios releasing brilliant products then going under because they just timed their release badly and were the second-best thing against something else that took all the attention that cycle).
But Stadia is definitely not a counter-argument to your first point. ;)
I agree it's awful, but anyone using an iPhone that is locked down doesn't care about ownership. The games can go right in their iTunes library with the other media and entertainment they don't own.
I've had good experiences streaming, but I also don't like competitive FPSs. So we'll have to agree to disagree. Maybe you live farther from datacenters than I do.
Rather than buying a new computer to play Cyberpunk, I streamed it and was quite happy. It cost me $10 for one month, and my relationship with the company ended afterwards. I've since used streaming a bit more; I've tried several different services and have generally been happy with all of them. Nvidia's GeForce Now is great these days; I just finished BG3 on it and it ran well.
> It perpetuates the idea that you pay for a thing but don't own it.
I think that's myopic. A new computer every 8 years for $2000, or $10 a month for 8 years? From a purely monetary point of view, there are clear advantages to subscriptions. From an economies-of-scale perspective, there are also advantages to subscriptions/centralization. There are disadvantages, too. So it's most correct to talk about trade-offs.
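For concreteness, the arithmetic behind that comparison (a quick sketch; it ignores resale value, price changes, and the cheap client device you'd still need):

    pc_cost = 2000            # new gaming PC, bought once every 8 years
    sub_cost = 10 * 12 * 8    # $10/month over the same 8 years
    print(pc_cost, sub_cost)  # 2000 vs 960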
Where I think you are right is that it gives vendors the power to alter the rules and leave you praying they alter them no further. We are seeing the downside of that now, with the enshittification/shrinkflation/brand looting of nearly everything in the American economy. So I agree that in a might-makes-right world, a world in which there are no consequences for our aristocracy, centralization/streaming/etc. puts you in a weaker negotiating position and could very likely be good in the short term, but bad in the long term.
> It creates more incentives for ongoing monetization (to support the streaming infra)
I think that depends on whether these streaming companies are subsidizing the streaming the way Uber subsidized rides: at a loss. Allocation for peak capacity is the direct cost to the company, and any unused capacity could potentially be re-tasked for other purposes. Any user who doesn't contribute to peak capacity plus growth projections is free from the perspective of hardware acquisition. Hardware is an economies-of-scale world; there may even be positive tax implications.
I'm not sold that there is any special incentive to monetize streaming unless the companies are streaming at a loss. Letting kids gamble is the culture of mobile gaming, and I think it's culture that perpetuates it more than structure. I think the PC market salivates at the idea of loot boxes (just look at EA), but the culture is quite against them. PC gamers have provided consequences to predatory publishers while Apple and mobile gamers have not.
> It introduces horrendous input lag and it means there's yet another probably buggy layer of abstraction between the game and the player.
I haven't personally experienced this. I have had almost entirely good streaming experiences on broadband. I think that, from a purely theoretical point of view, round-tripping both input and video doubles the potential latency, and therefore it should be worse.
I think from a practical point of view it's probably easier for one company to "solve" input/graphics lag for all games while games run in ideal conditions, than it is for every game's netcode to be fantastic.
While we're comparing and contrasting:
Have you read the EULAs of the anti-cheat software on some of these games? Literal rootkits, some from Chinese companies, some that allow direct raw memory dumping from the computer, if not direct RCE/remote access for investigation.
From an environmental point of view, computing in a datacenter, as well as time-sharing devices, is probably a win.
So while you come in strong against it, I think you should be somewhat respectful of other people's positive experiences.
"Streaming is awful" is poor criticism. "Streaming quality is greatly dependent on where you live and your internet quality" is probably an accurate criticism. Streaming from SJC in San Francisco is probably going to be a pretty good experience.
Streaming is horrible. All of the game streaming services so far that offer a half-serviceable GPU cost $40+ per month, because that's the minimum cost of offering a GPU to a user over the Internet. The remote end of a game stream is orders of magnitude more resource- and power-hungry than the remote end of a video stream. For it to be profitable without tying some Kafkaesque monetization scheme into even single-player games, it would have to cost at least $60 per month.
I've paid for and used multiple remote GPU gaming services, and all of them suffered from horrible jitter in input lag. If you randomly suffer in a game (even a single-player one) due to network latency even once every 20 minutes, you're going to hate the service. This was on residential fiber as well. I think this is why Google exited the market. It's just a bad gaming experience.
When you think about the type of player that would need to champion this service for it to reach critical mass among hardcore gamers, think of a speedrunner or top-level competitive MOBA or FPS player. In the offline case, they are honing their skills at a time resolution of 16ms or less (in the case of an FPS player, 8ms or less). This type of gameplay just isn't possible over the Internet in general. Live more than 400 miles from a datacenter? Whoops, you can't get good at games.
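Back-of-envelope on that 400-mile figure (my own rough numbers; propagation delay only, before routing, encoding, decoding, and jitter):

    SPEED_IN_FIBER_KM_S = 200_000      # light in fiber travels at roughly 2/3 of c
    distance_km = 400 * 1.609          # "400 miles from a datacenter"
    rtt_ms = 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000
    print(f"{rtt_ms:.1f} ms")          # ~6.4 ms round trip, best case

That alone eats most of an 8ms (120Hz) input budget before anything else in the pipeline is counted.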
Game streaming, not to mention the "own nothing" aspect of it, is just horrible from the perspective of anyone who cares about the games they play and about getting good at them.
I cannot stand latency. I can tell when my ping jumps from 20 to 70ms, and it always throws me off. I don't think I'm particularly sensitive to it, either. The thought of every game being subject to that sounds horrible.
> I wonder how Apple's latest iPhone gaming reveal affects these plans.
Likely nothing. Forget about ray tracing the technology; think of ray tracing as simply meaning better graphics. The iPhone has been able to output very decent graphics for games for quite some time - arguably since the first Unreal Engine game ran on iPhone, which, if I remember correctly, was Infinity Blade. And yet the market for hardcore games on smartphones is just not there, to the point where there is very little overlap between smartphone gaming and PC/console.
And if anything, the small market of console games wanted on smartphones may be much better served by cloud streaming.
Edit: Also worth mentioning that the constant iOS changes mean developers have very little incentive to keep updating their games over a long period of time.
Except the storage choices don't make sense. Storage is extremely cheap right now - the base should be at least 512GB - yet we have to pay a premium to get a reasonable amount of space.
How many people actually use their space? Really use it, not just use a quarter of it and keep the rest in reserve.
My phone has 256GB right now; 150GB are free (cloud storage doing its part). That's two large games and 40GB+ to spare. So as much as a larger capacity would allow for more, this is still… fine.
They'd use it a lot more if they were regularly downloading 40GB+ games to it. One of the biggest criticisms of the $300 Xbox Series S is that it only has ~350GB free to the user. And while RE4 Remake is 40GB, it is nowhere near the biggest: Mortal Kombat 1 is 100GB, Starfield is 100+ GB, Baldur's Gate 3 is 100GB+.
A lot of these discussions seem to be ignoring control inputs. Many, many games only work well when you have an analogue stick and ideally some physical buttons. On-screen controls just aren't the same.
The technology has a lot of potential, but I don't know how you get past that hurdle to broaden iPhone gaming.
Honestly, what jumps out at me reading this is how unclear his communication is, and I don't mean from the perspective of a native/non-native English speaker.
It’s hard to understand specifically what he means here, which just may be due to a lack of context, even though I generally get his point.
I think mostly I’m just being a harsh grader here because I expected someone like him to communicate very clearly and concisely.
This is how most emails at the exec/c-suite level read. Admittedly kind of annoying, but a lingo develops and shorthand is used all over. There have been many in-person conversations with the people included in this email so they have all this built up context and shared language. To you, it sounds cryptic, but the recipients understood exactly what Satya meant here.
I don’t think it’s the lingo/shorthand but rather the fact that some of the sentences don’t parse at all.
It's possible to figure out what he meant, but it requires significantly more time than if the sentences were constructed "correctly".
Missing pronouns are easy to fill in, but consider this sentence:
> But [I] want to use every opportunity to make Cloud Streaming more mainstream the better it is in the long run […]
As soon as I reach “the better it is” I feel a mental jolt because I expect a construction along the lines of “the more <X> the better it is” (apparently this is called a “correlative comparison”); even though I just read the word “more”, it didn’t fit into the expected construction, so I go back over the sentence to see if I misread it.
“…want to basically use every opportunity to make Cloud Streaming more mainstream. The better it [will be] for us in the long run [and] for all the strategic reasons we talk about”
I think the case re: lesson/learning is less clear, because in a "learning" there's no teacher - you've all learned something - whereas "lesson" implies a teacher and the taught.
A lot of the time, a metaphorical lesson has no teacher, either. "Learning" was imported to the tech scene from India, where "a verb-ing" can serve as the noun form of one instance of the verb (in certain contexts).
I find that limiting the number of things that drive you up a wall is good.
"I have something to ask of you" is not a novel term for request. And ask is softer than request, giving the other party more opening to decline. I don't hate it.
I’m not putting myself anywhere on his level - want to make that clear - but I was specifically referring to clarity.
I work with execs at our company a lot and am often in their email chains - agree on the lingo and shorthand for stuff, but it’s usually clear. You’d be surprised at how candid people are and how much cussing there can be lol. But I was specifically referring to the clarity of the message.
Either way it’s just an opinion and as I said, I think I was just idealizing too much and expected something else.
Yeah, I got you. I understood you were just expressing shock at how unclear this was and I totally hear you. I was just commenting that this is surprisingly more common than you might think. It could also very well be that Satya is a worse emailer than most. Or it could be that he fired this off while jetsetting or running late to a meeting. Who knows.
I thought the exact same thing. Though I realize for me this might just be that I don't know what he means by "socket" - must be some product jargon that I haven't picked up yet.
There are sentences in this email that don’t parse at all; it’s still possible to guess what he probably meant, since humans are highly adept at error recovery for natural languages, but it takes significantly more time.
I still have no idea what's going on, despite reading the ELI5 thread just up-thread and learning from a different one that they have a cloud gaming streaming service.
It's basically analogous to what Valve is doing with Steam. Windows has a monopoly on PC gaming, and Valve does not want to be subject to any restrictions or requirements that may be enforced in the future, so they funded and facilitated gaming on Linux-based PCs.
Analogously, Microsoft doesn't want to be subject to the Google/Apple duopoly in mobile (gaming), so they want their cloud streaming platform available on competitors' platforms.
First: I'm happy that Steam has put effort into Linux!
Then, as a bit of a tangent:
Windows only has a "monopoly" on PC gaming because Apple refuses to licence their operating system for non-Apple hardware - and because Linux just isn't mainstream enough.
If you could buy OS X just like you can buy Windows and install it on your home-built PC, I'm confident the landscape would at least be somewhat different.
(It is certainly possible to install OS X yourself on your own PC hardware, if you put your pirate hat on and are happy with massively outdated GPU drivers etc. - or at least it was half a decade ago, when I last checked.)
Of course Apple would also have to develop some care for gamers.
“Windows has a lock on PC gaming because macOS isn’t available for PCs. Of course, macOS would have to radically change to be even remotely usable as a gaming competitor to Windows.”
If macOS is that bad for gaming, no gamer is going to use it even if it was available for PCs.
Disagree. My argument is that Apple have been ignoring it pretty much completely.
If a significant base of PC users had OS X, then with a bit of TLC/development from Apple, I'm sure it'd be a very viable platform for gaming as well.
I.e. Microsoft are doing a crapload of work behind the scenes to provide the best gaming experience, so Apple would have to step up their (sorry) game to compete.
> Disagree. My argument is that Apple have been ignoring it pretty much completely.
Worse than ignoring, really. Games for Windows from years ago largely still work. Games for Mac OS X don't, because Apple regularly breaks compatibility. This means the long tail of late sales won't happen without more developer effort, which isn't usually available.
I second toast0's point. Apple didn't ignore desktop gaming, they actively gave the middle finger to the whole community, repeatedly, and iOS was the only bridge that was thrown to restore some kind of decent relationship.
See Nvidia being basically banned from the macOS ecosystem, or the whole fight with Epic, where they couldn't come to any semblance of a compromise.
Imo the bigger point that you're missing is that macOS wouldn't be macOS without limiting its hardware compatibility. Its stability and usability are directly tied to having to support so few hardware profiles, compared to the nightmare Microsoft has had to deal with for decades.
Also, as others have already pointed out, Apple has historically hated games. The only reason they tolerate them now is the massive revenue they bring in on iOS.
I think that Apple is going to break into gaming in a big way in the next 5-10 years. A small target range of carefully curated hardware is a HUGE advantage in game development.
Devs will be able to optimize for Apple machines in much the same way that they currently optimize for consoles, and you'll be able to know exactly how a game will perform on your system before buying it.
There is only so much optimization you can do before you run into the limits of the hardware.
I wouldn't mind a 30% markup for an Apple gaming PC, but based on the markups they currently charge for compute… I'd expect the top-of-the-line $2k PC I built this year to cost $4k-$6k if it came from Apple. I'm extremely skeptical of the Vision for the same reason. You need raw power to drive high-end displays.
They will probably compete in the console market, though. Kinda like how their headphones compete in the "rich but not knowledgeable about audio" demographic. Actually, I'm kinda selling myself on this. I should buy more Apple stock.
They're not top-end AAA games by any means, but both Factorio and World of Warcraft have highly performant native Apple Silicon builds that drive my high-refresh-rate display at 1440p quite happily, without even making the fans spin up noticeably.
Microsoft has poured money into game tools since the '90s, published a large library of in-house games themselves, litigated and demolished its competition (both Apple and Linux), and courted every single hardware manufacturer in existence. The Xbox console came into being from this massive investment.
> because Apple refuses to licence their operating system for non-Apple hardware
Reminder that Microsoft poached Bungie, who developed Marathon and others exclusively for the Macintosh, to develop Halo for PC- and Xbox-exclusive releases.
> Windows only has a "monopoly" on PC gaming because Apple refuses to licence their operating system for non-Apple hardware - and because Linux just isn't mainstream enough.
I mean, defend it or don't, but it doesn't make it a not-monopoly that its competitors are not able to compete effectively for various reasons.
AAPL can, but doesn't. A lot of the hand-waving over hardware is misdirection. AAPL could curate limited support for consumer hardware the same as their custom hardware, but they don't want to negotiate AND stop relying on ridiculous markups. The issue is momentum. AAPL has demonstrated that this is the way, and gaming - with new random features becoming popular based on consumer hardware breakthroughs - doesn't feed into their existing, stable pipelines of profit.
TL;DR: Gaming is partially fed by innovation in hardware, and AAPL hates that.
I have no idea what you are saying. I don't think we are having an exchange about the same things, or at least not the same context.
I.e. the reason AAPL's board doesn't want to support third-party hardware, which is usually built around the newest gaming technology from potential competitors, is some opposite reason?
> it doesn't make it (MSFT) a not-monopoly that its competitors are not able to compete effectively for various reasons.
My assertion is that AAPL could, but won't. The monopoly of MSFT in the gaming sphere comes from a sort of happy coincidence between a massive company willing to support third-party hardware (MSFT), another massive company that depends on not supporting it to maintain a stranglehold on its market (AAPL), and all the minor players who can't afford to support hardware at scale for a prolonged period of time to compete with MSFT... even though a few have for a short time, and fell behind or were acquired.
I mean, Apple's refusal to license their OS for non-Apple hardware is unambiguously the correct decision. From Apple's perspective there are countless downsides and zero upsides to doing otherwise.
Apple is going to be huge in gaming in the mid-term future. If you have a limited, controlled hardware range, developers can tune Apple-targeted games the same way they tune console games. They can guarantee that everything works exactly as intended, which has been the Achilles' heel of PC gaming since time immemorial.
I grew up a hardcore gamer and vehement Apple hater, but over the past decade Apple has become the most competent consumer hardware company on earth, and I'm super excited for the future here.
I just can't imagine Apple building and selling a replacement for a 4080 GPU and a top-of-the-line AMD/Intel CPU. And if they do, the markup would be insane. Look at what they currently charge for compute as an example. The price-to-performance gap between Apple and PC is too wide to be viable.
So if Apple does get into gaming, it's going to be incredibly gimped and ten years behind the tech curve (or at the level of consoles), which might be competitive against that market.
But I am extremely skeptical that Apple will compete with top-of-the-line PC gaming in a meaningful way. Pushing 100+ fps at 4K is not easy or cheap, and if Apple wants to win enthusiasts (or even have decent-looking VR for the Vision), they'll need to offer significant compute at a competitive price. So basically, they'll need to completely change their economic model… and I don't think they'll do that.
Even with Apple's "limited" HW, just on the iPhone it's like Sony releasing a new PlayStation every year. Add Mac in and it's like Sony's past decade in consoles every 12-18 months. Plus you add in the fact iOS users balk at paying $5 for a game, Mac gamers are a blip and Apple's penchant disregard for backwards compatibility and love to overcharge for storage and you really don't get a great recipe for gaming.
I think the key to their strategy is the line "more fragmented set of device stores/platform is better for us."
Nadella is acknowledging that being an app means the platform hosting the app has a lot of leverage. Apple could potentially prevent Microsoft from releasing an update to any of the apps they publish on the App Store. Apple and Google essentially play a gatekeeping role between many Microsoft users and Microsoft itself. This is big for Microsoft because they have a huge install base on Android/iPhone/iPad, and some exposure to macOS. Teams wouldn't work without being on the iPhone. This is a topic Microsoft is especially sensitive to because of its history.
Nadella is saying it's worth investing in less popular devices in the hope that they become bigger competition against iPhone and Android. In fact, Nadella is hoping Meta takes a small but meaningful portion of the market.
Satya says that the more different competing platforms exist, the better it is for MS and their cloud services - i.e., MS services will eventually be available on all those platforms, like Meta's Quest, charging subscription fees.
Imo it's in line with Satya's Microsoft Everywhere strategy. Office and Xbox are everywhere now, via the cloud.
I think it's being posted in light of the USG's pending antitrust lawsuit against Google[1].
(However, that case concerns digital advertising, and this document seems to have been produced by discovery in a different case. So I'm not sure if it's relevant beyond the general theme of "Google considers other companies competitors.")
The significance I saw was Microsoft trying to position itself as the good guy ahead of upcoming regulation of digital marketplaces. I thought they had a strategy to separate themselves from Apple and Google, and this confirms it. They keep getting game console stores exempted from consideration.
It's OK. Good value when combined with Game Pass. It means you can stream hundreds of games to any device with one monthly charge, like Netflix for games. Starfield's on there too, along with a bunch of other ones. You have to pay for the "Ultimate" tier to stream, though, and the quality isn't great because they stream from actual Xbox consoles.
If you want better graphics fidelity and performance, you can also now play certain Game Pass games on Nvidia's Geforce Now. Whereas xCloud games stream from underpowered consoles, GFN runs on RTX 4080s and offers MUCH better graphics and performance. It also lets you use a mouse/keyboard for all the supported games, while xCloud streaming games normally limit you to the Xbox version and don't let you use a mouse or keyboard unless the game itself supports it on the Xbox (like Gears).
To do this, you would need both a GFN subscription and a PC Game Pass subscription (though not ultimate).
You can also just use GFN with Steam, which offers a superior experience overall.
Yes. It's arguably one of the better attempts at cloud gaming. I've tried it, still prefer locally hosted games, but the days of cloud gaming are definitely coming soon.
You should try Geforce Now if you haven't already... it performs a lot better than xCloud or Stadia or Shadow. Nvidia has that first-party advantage of being able to stream from RTX 4080s.
Are any of those, or a similar well-polished service, available for use with your own workstation, e.g. for 3D/engineering/video editing workloads? (Basically remote desktop on steroids)
My wife and I use it regularly enough for me not to question the $15 per month Game Pass subscription. Was a very pleasant surprise when we discovered that we could play Xbox games on our Samsung TV without ever owning a physical console. Just needed to buy a controller.
There are some weird rough edges, though. I was very excited to play Fallout 76 with my brother and very disappointed to discover that the cloud gaming system has some limitations around headsets and wireless controllers.
I was making fun of game streaming until recently; I didn't understand the use case because I figured latency had to be an issue.
But now I'm a digital nomad with just two laptops, the best GPU I have is a Radeon 680M, and I really wanted to play Baldur's Gate 3. So I signed up for a streaming service and it works great: 9 euros/month and I can play more than I need to, over some sort of VDI setup, I guess.
I think it can be a real hit for a lot of games, to a lot of people, but some games obviously are best played locally.
I thought GeForce Now was a "bring your own library" service? I think they can auth with Steam directly, and you can play most games available on the Steam store.
It is. You can play a subset of games from Steam, Epic, Ubi Plus, and Game Pass -- if the publisher opted them in.
For Game Pass specifically, you don't own the games... it's a subscription service, like EA Play or Ubi Plus, that lets you play from a library of games as long as you're subscribed. Of those, several (but not all) are also playable on GeForce Now. You still have to have a PC Game Pass (or Ultimate) subscription.
Without GeForce Now, if you have PC Game Pass (the cheaper plan), you have to install and play those games on your own computer. If you have Game Pass Ultimate, you can also stream some of those games from Microsoft's cloud of Xboxes. They are inferior to Nvidia's offering, but the selection might be better as long as you don't mind lower resolution and having to use a controller for most games. You also get access to the EA Play subscription library that way.
Confusing, right? It's a pity. When GeForce Now first launched, you could play any game on it -- all of Steam was available -- but publishers started complaining I guess (who knows why? maybe they wanted to one day launch their own services, heh, good luck with that).
Notably, there were a few games that were Stadia streaming exclusives (like Red Dead Redemption and Elder Scrolls Online). Once Stadia died, there was no way to stream those anymore (unless you rent a whole virtual PC like Shadow.tech). Can't play those now, sadly.
There are a lot of streaming services now: first-party ones like PlayStation Plus, Amazon Luna, Xbox Cloud, and Ubi Cloud (via Luna).
There are third-party stream-a-game-directly ones that just launch you into Steam and/or the game directly, no OS maintenance required: GeForce Now and Boosteroid.
There are also "rent a PC" style streaming services like Shadow, AirGPU, probably others (including renting them directly from cloud providers, usually at an incredibly high price).
Of all those, GeForce Now offers by far the best performance and quality, because Nvidia both controls the supply of RTX cards and also has written some incredible software to support the streaming, enabling play at 4k/ultrawide/120 Hz/low-latency via Reflex, etc. It is also the best value in terms of monthly costs. Its primary downside is the limited library available; only some games are available on there (https://www.nvidia.com/en-us/geforce-now/games/).
> but publishers started complaining I guess (who knows why? maybe they wanted to one day launch their own services, heh, good luck with that).
They could simply want Nvidia to purchase the rights to use their game. If you made a movie, you wouldn't want random streaming services to be able to stream it without paying you for the rights to do so.
GeForce Now doesn't directly offer games. You need to buy those via Steam or another store; you effectively log in to a remote Windows desktop and get that streamed.
So I don't understand the publishers' complaints: people could only play games they had paid for, and Nvidia provided the streaming service. It's probably technically correct on the publishers' side, but I feel Nvidia did nothing wrong there.
I don't understand this line of thought. What difference to the publisher does it make if I buy a PC or rent a Nvidia VM to play their game? They get paid through a sale, not based on what computer I play it on.
And those games are still playable on Shadow, AirGPU, etc., just with bigger headaches.
There are many titles I've purchased because they were available on GFN. (And many I didn't because I don't have a machine to play them on otherwise).
The publisher could, theoretically, double dip: require a user to buy the game on Steam, and also require Nvidia to cough up cash to list the game for streaming.
I am cynical enough about the games industry to say this is possible, even probable.
This is probably the “technically correct” part, but it still makes no sense. The publisher is taking away my rights as somebody who bought the game. How is that model impacting a publisher’s bottom line?
What’s next? Should I pay 2x a game for a SLI config with two cards on my PC?
It's not 1:1. If you stream a movie, you don't own it and you are probably not gonna go back and watch it again in most cases. With GFN you first have to buy the game yourself, so you're already paying the publisher. But these publishers are greedy...
I think you actually have to pay for the Ultimate tier. Their branding is SUPER confusing.
If you play on PC, PC Game Pass gets you access to hundreds of games but no streaming.
If you play on Xbox, Game Pass Core gets you multiplayer and a few free games but no streaming.
If you want streaming or play on both PC and Xbox, the Ultimate membership gets you everything.
If you want just the PC games and streaming, you can sign up for PC Game Pass and separately for Nvidia Geforce Now, which streams a subset of Game Pass games but at much higher quality and performance.
To be fair, Microsoft has changed the name of that service several times over the last few years. Even as a member I couldn't figure out what they were talking about because it would get a new name every few months.
To be clear, my answer is wrong, or at least misleading. 30 million is the approximate number of Game Pass subscribers (not including Core). When I posted this I missed the word "streaming." Apparently the game streaming service is now a higher tier of the regular Game Pass service, but that's kind of a technicality. 30 million people are not using their streaming service, afaik.
In my experience, if your executives are communicating in corporate PR speak internally, it means you’re surrounded by grifters and people who have failed upward.
Speaking directly and clearly is a core skill of successful executives. Corporate PR speak only interferes with conveying direct and clear internal messages. You aren’t worried about accidentally offending the people you work with, you’re more worried about not getting your point across as directly as possible.
> Corporate PR speak only interferes with conveying direct and clear internal messages.
I'd argue corporate PR speak directly and purposefully interferes with conveying any kind of message clearly. It often skirts the line between misleading and lying. And obviously you don't want to poison your own well with lies.
I'd expect there's some corporate-speak maximum that happens in middle management, and upper and lower people would use it less, with the curve being flatter for more bureaucratic companies.
Corporate-speak is usually centered around covering your ass with ambiguity.
CEOs don't feel compelled to do that in "private" emails, but they sure as hell do it in public statements. They dial it up to 11 during layoffs or when being interviewed about unethical behavior.
I liked it for its simplicity a few years ago (and I always hated Outlook... The Bat was great, and Thunderbird had a few okay years but basically sucked at the core thing of syncing and reading mail), but I haven't really used it for actual emailing.
Except it's an absolute cargo cult of a signature.
"Sent from iPhone", and later "Sent from Mobile" were a thing because virtual keyboards were new, and at first, everyone was slow and imprecise on them.
So the signature was basically a way to say "Hey, sorry if this is abrupt or has typos, but I wrote it from my phone".
It makes zero sense to include an automatic "Sent from <Desktop App>" signature in every email, except for sad, pathetic marketing.
I'm in the accelerated testing group and I was moved from Mail to the new Outlook a couple of months ago. It's better than Mail for Windows in my subjective opinion, but it was a bit weird to switch just when Mail had gotten the last missing basic features, like signatures etc.
New Outlook is just atrocious from a design perspective, IMO. Everything is flat, and the panels all blend into each other. Each email subject runs right into the next.
On top of that, it wastes so much space. It's offensive how they manage to both waste space and still have things right up against each other, blurring the boundaries between emails.
Between Edge, the ads everywhere in Windows, and the complete mishmash of design styles and general "unpolished" feel of their apps, I'm really starting to wonder why I even use Windows. I don't want to fight my software just to use it.
No. It is terrible: a webview around the shitty outlook.com web UI, which is cluttered, inconvenient, and often buggy, especially when rich text editing is needed. Also, the Bcc field is hidden and needs some weird incantations to display. A "steaming" platform, as can be read in the emails.
Outlook Express was almost peak Outlook usability, and it has been on the decline ever since: more useless features and an ever worse GUI.
To me this is the #1 reason this looks fake: nobody uses Mail for Windows, it probably doesn't support the security features available in O365, and it feels totally wrong for someone at MS.
The Mail app's UI has deteriorated over the years, but it is still the best email client around. Actually, there is no real competition in the field anymore.
To each their own. To me "Thanks" looks boilerplate-y and so will likely fade out as meaningless noise when reading the mail, whereas "Thx" looks like it has at least a (tiny) chance of being genuine.
When watching American TV shows, it is already disturbing that people don't seem to say goodbye to each other - they just walk away. Guess they found it boilerplate-y too? It would be considered immensely rude here, yet American tourists always complain that cashiers, or whatever service people, don't have ever-fake-smiling faces...
> When watching American TV shows, it is already disturbing that people don't seem to say goodbye to each other - they just walk away. Guess they found it boilerplate-y too?
I don’t know, I understand why you’d rush to his defense, but in this case I don’t view him as an especially great ceo.
He performs well, as you’d expect from an extremely well compensated CEO of one of the preeminent technology companies of our time, but he’s by no means been exceptional.
Not really accurate; he's a far better CEO compared to Tim Cook, Jassy, and especially Sundar Pichai, who for all intents and purposes is running Alphabet into the ground. He made Microsoft relevant again using strategy (as opposed to autopilot, like Apple).
It's funny that you chose to compare him to those CEOs in particular, seeing as how they all inherited great businesses that were seemingly on autopilot.
All of these companies were poised to explode in value as they have in the last few years. Somehow you’re singling out Satya as being responsible for Microsoft’s growth yet dismissing the other CEOs - not sure why you’re so passionate about defending him, though of course you’re entitled to your opinion.
If anything, Tim Cook has done the best job out of all of them.
Satya, since taking over, pivoted Microsoft's strategy in a fundamental way that no other CEO did. I mean, Tim Cook did to some extent, firing Ive and all, but they still sell iPhones to people.
Microsoft embraced open source and the cloud when its original profits came from the opposite of those two.
Corporations are required to hang on to communications for N years (depending on their type of business and whether or not they are under a consent decree).
IMO rules like this are why American corporations are investable. Rule of law is and must be respected.
I'm no lawyer, but this is context-dependent. I have worked in companies where all emails not legally required to be retained longer were auto-deleted after 90 days, Zoom was auto-deleted after 24 hours, and getting any kind of persistent chat was an utter ass-pain, all because Legal was paranoid about liability.
This is interesting to learn. Are other countries required to do this? I’m always searching for reasons US stocks have outperformed global for quite some time now.
In the US, the Federal Rules of Civil Procedure require corporations to keep emails for 3-7 years depending on their content, with some emails potentially kept forever.
Banks are subject to far more stringent record keeping requirements than most other companies. Microsoft is not bound by rules similar to the article you reference.
That said, there are some categories of information that Microsoft would have to retain. In particular, once you reasonably anticipate imminent litigation you have a duty to retain related information.
Banks are far more stringent... for the rank and file, because even low-level employees at banks can access private information that would allow them to insider-trade.
The CEOs were already under intense scrutiny for every single trade, so it makes less of a difference.
In addition to the sibling comment: FOIA only applies to the government (and probably to work that government contractors do that's related to their contract, which this isn't).
The contents of any evidence brought forward as part of the case by either side are public information that becomes part of the record of the court case.
This includes information that is submitted voluntarily or extracted via a discovery request. (Not everything you find is public - only what you submit as evidence.)
Note you can avoid things becoming public, but you need a decent reason; "I don't want to share that" isn't enough. It looks like this document was partially redacted, for instance.
To your original question: courts make the proceedings "public", although there is a process for getting a copy. It's not so much "can you get this" as "we don't upload it to a server and let web crawlers index everything".
We use Signal at work, and it is not made for this. When someone joins or leaves the company, ensuring they are added to or removed from the right groups is ripe for mistakes.
The better option is to kill the document retention policy for executives, or keep it as short as legally possible. Then, for reasons of corporate security, pre-emptively delete all traces of communications well before anyone subpoenas them.