The Macintosh Endgame (mondaynote.com)
60 points by MaysonL on Nov 28, 2016 | hide | past | favorite | 93 comments



Much of the shrinking market is due to the fact that laptops used to need to be replaced every three or four years, and now only need to be replaced every six years or so. That's about how long it takes for the hardware to wear out, and for the next generation of hardware to be significantly better.

In 2022 Intel plans to have 7nm CPUs available. Those laptops will be considerably more powerful, and laptops purchased today will be starting to fall apart. So it's unlikely that folks are going to go ten years between new laptops any time soon, meaning the size of the market should level off somewhat.
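The effect of a longer replacement cycle on annual sales can be sketched with a back-of-the-envelope steady-state model. All numbers below are hypothetical, purely for illustration:

```python
# Steady state: with a fixed installed base, annual replacement
# sales are inversely proportional to the replacement cycle length.
# The installed-base figure is invented for illustration.
INSTALLED_BASE = 100_000_000  # laptops in active use (hypothetical)

def annual_replacements(cycle_years: float) -> float:
    """Laptops replaced per year at a given cycle length."""
    return INSTALLED_BASE / cycle_years

old_rate = annual_replacements(3.5)  # 3-4 year cycle
new_rate = annual_replacements(6.0)  # ~6 year cycle

decline = 1 - new_rate / old_rate
print(f"annual sales drop by about {decline:.0%}")  # about 42%
```

On this toy model, stretching the cycle from ~3.5 to ~6 years cuts annual unit sales by roughly 40%, even with zero lost users, which is consistent with the market later leveling off rather than vanishing.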

There is plenty of room for Apple to continue to make improvements within this general form factor, and to reduce the price. And if tablets or phones really do somehow make laptops obsolete, then Apple will just license or open source OS X. That way it wouldn't have to deal with making laptops, but developers would still have a good platform for making iOS apps.

The idea that OS X or MBPs are going away makes no sense, at least for the next decade or so.


I agree with the first portion of your post, but I'm not entirely convinced on the second. I do think it is entirely within the realm of possibility that Apple will kill off their desktop and laptop lines. I think it's a ways off, but Apple's refusal to evolve in any meaningful way in the space might make their efforts a legacy line at best.


I think it's a ways off

Want to weigh in on a time line? To misquote Fight Club, “On a long enough time line, the survival rate for every product drops to zero.”

Apple's refusal to evolve in any meaningful way in the space

Given the complaints about the changes Apple made in the most recent MBP update, what evolution are you looking for? Touchbar, larger trackpad, and USB-C aren't meaningful? Continuing to reduce the size of the machine? Those all seem evolutionary (rather than revolutionary), and quite typical of the iterative updates Apple is known for.

Are only increased memory and CPU speeds meaningful? Or the lack of a touch screen? Maybe no equivalent to the MS Surface Book? What would constitute a meaningful change for you?

Feeling my oats a little bit, so feel free to push back. I can understand people not being happy with the new Macbook Pros for a variety of reasons. However, refusal to evolve in any meaningful way sounds unjustified.


> I do think it is entirely within the realm of possibility that Apple will kill off their desktop and laptop lines.

How would anyone develop apps for their iOS devices then?


Probably using Visual Studio on a Windows machine.


I don't think Xcode for iPad is that far off.


Or do something radical to kill the need for desktops and laptops. A dock with a monitor and a keyboard you slot your iPhone into? I would say it sounds absurd, but given how many people claim, even on HN, to find it satisfactory to program, SSH, and do everything else on their iPads...


He's ignoring a rather large elephant in the room: iOS is the most locked down OS in history.

That's going to keep two kinds of people from ever moving to it as their primary computing device: people who care about freedom, and power users who care about versatility and complex work flows. The latter includes most developers. You just can't do that stuff on iOS. You can barely even run your own apps on it.

I suppose you could push a model where all the interesting stuff lives in the cloud or headless metal servers and the iOS device is a portable terminal with apps and the Web as sophisticated terminal emulations more or less. Think of an app or a big JS site as a giant analog of a VT macro.

I see PC sales declining and then leveling off after having lost most casual and low end users to tablets and mobile.


Productive work - unlike consumption - is based on the file metaphor. Not well supported on iOS. The workplace runs on written conversations and that means keyboards are required.

The high road leads to no good place. Apple has been there in the past. SGI and many other bodies are lying rotting left and right.

Prices for PCs are declining, and in any case there is already a significant and probably unsustainable gap. Macs have 4 cost drivers that Apple can tackle to ensure they keep growing their market share:

- Manufacturing

- Intel

- Apple margins

- Mac features

Compromising manufacturing would damage the brand. Intel is somewhat kept in check by the threat of Ax (and is fighting back by providing more function/lock-in around the CPU), but ultimately an Ax processor may be attractive. Apple margins are something to consider, both for appropriateness to the market and as a stop-gap measure. Providing a low-feature Mac may be another option, and imho the most promising one: a 12-13'' laptop aimed at basic educational needs and the simple business use case market, priced around $1000. Not much CPU/GPU, but enough battery to last a day.


This is a great point. Keep an eye on the "Raspberry Pi clamshell case" type laptops. These are truly disruptive. A much higher end version of this could be interesting. I'd consider an octa-core A64 with 8-16GB RAM and a 4K or better display.


Agree on all points.

iOS is limited by the inability to develop iOS apps on iOS devices, because of software, not hardware. (Same for Android, of course.) Swift Playgrounds suggests they're moving in that direction, but it looks like a very far cry from Xcode.

Hardware-wise, the iPad Pro, new MacBook, and new MBP seem to be closing in on convergence to an "iPadBook" (too bad the iBooks app means no resurrection of the iBook brand...), i.e. dual screen iPad. When I first typed on a new MB, I got the feeling the flat keys were partly preparing people to transition to touchscreen typing. The touchbar is another step in that direction.

Something a bit like this feels inevitable:

https://bencollier.net/wp-content/uploads/2010/10/20101021-0...

Software-wise, macOS and iOS still seem far apart though. But I'm no Mac or iOS developer. Hell, at this point I basically only use iOS for testing my company's apps before a release.

(Funny that the reasons I went to, and have stayed on, Android this year have zero to do with software, i.e. USB-C, fingerprint sensor on the back, and Project Fi. All the apps I want are on both platforms, and I would never develop for either.)


He's missing the big picture, in my mind. PC sales may be in decline, but aside from a few quarters, Apple has sold more than six times as many Macs as it did when sales plateaued ten years ago. Sales have been on a steady incline because Apple was always selling minuscule amounts compared to the other cheap brands on the market. There was almost nowhere to go but up. It's not milking anything.


My brother, a comparatively well-paid teacher in the Midwest, desperately wants but cannot afford a new MacBook Pro. At the same time, his school and many others are awash in perfectly functional Chromebooks.

In my own experience, then, we've got Apple purposely cutting off vast swaths of potential buyers due to lackluster/irregular updates and high prices on one end, and seemingly unaware of a vast market of education and lower-price users on the other.

A declining market is one thing but this, so far as it looks to me, is Apple simply throwing potential customers away. That's not riding a wave, it's just plain bad business. What am I missing here?


I think two things are happening. First, Apple does not pay attention to its market share, instead thinking of itself as a Mercedes with 10% share at the high end. Apple wants to avoid commoditization at all costs. Second, I think Apple believes strongly in protecting its margins. Despite the current cash piles, Apple execs still vividly remember a time when Steve Jobs resurrected the company from near death[0], and they religiously follow what worked for them before. One lesson was to always prefer pouring money into R&D over decreasing pricing.

[0] I mean, this wired cover was cited often during my time there as a perfect illustration of the mood then https://cdn2.macworld.co.uk/cmsdata/features/3520866/Wired-p...


Apple was pretty clearly shaped by their near-death experience, and it showed on the balance sheet. It took them a very long time to start returning cash to shareholders, but Tim Cook finally did it, and that was the right capital allocation decision. They've returned $186 billion via buybacks and dividends since then, and cash (and equivalents) has grown in spite of it.

They would rather stay relatively small (115k employees, but over half of that is probably retail) and operate like a start-up than grow headcount to support more products. Frankly, that's the right decision. There's been a lot of whining from Mac fans about being neglected, and today people have been sharing this Vox piece [1] saying they should change to support more products, which would be fine if being resource-constrained weren't part of Apple's organizational culture, and if it wouldn't kill the magic that led to their position in the first place.

[1] http://www.vox.com/new-money/2016/11/27/13706776/apple-funct...


I've wondered for a while what would have happened if Intel had bought, say, Compaq back in 1991 instead of Rod Canion and Jim Harris leaving.


Chromebooks and notebooks with Linux on them do tend to be cheaper than Macbooks.

When Apple first started, the Apple I was $666.66; it was a breakthrough in using cheaper parts to make a cheaper microcomputer for a market that was just growing and hadn't yet discovered what these microcomputers could do.

Over time the Apple /// and Apple Lisa flopped. Steve Jobs knew he had to innovate to give Apple an advantage over other microcomputer makers. Taking the GUI designed for the Apple Lisa, Jobs took over the Macintosh project to make it into a GUI-based desktop computer using a mouse. The problem was that it cost a few thousand dollars and did not run Apple II software. There were not many apps available for it, so Apple let Jobs go and replaced him with John Sculley.

Apple in this post-Jobs era sold expensive Macintoshes and had a hard time turning a profit. They brought Steve Jobs back from NeXT to fix the problem. His solution was a cheaper Mac named the iMac, then later the Mac Mini as an even cheaper Mac.

The MacBooks have always been targeted at developers, web designers, and programmers: people with high-paying jobs who could afford a few thousand dollars for a MacBook Pro. Teachers are not paid enough to afford MacBook Pros and have to settle for cheaper things like a $200-$300 Chromebook instead of a $2,000 MacBook Pro, even with the educational discount.

Apple has sort of stopped innovating now that the Jobs 2.0 era is over and Tim Cook runs Apple. Just look at the Mac Pro: it used to be ATX based, which meant gamers would buy it to upgrade the video card to play more games. Tim Cook changed the Mac Pro to a trash-can design that can't upgrade the video card, forcing Mac gamers into making Hackintosh systems and eating into Apple's sales.

The Amazon Echo and Google Chromecast have sort of beaten Apple at AI. Apple is getting out of Wi-Fi routers, and Google now has Google Wifi and is taking over Apple's sales.

In the post-Jobs 2.0 era, Apple is still profitable via iPhone and iPad sales, but it has no endgame for the Macintosh, and it has left a void that Google, Amazon, and Microsoft are looking to fill.


Just look at the Mac Pro, it used to be ATX based which means gamers would buy it to upgrade the video card to play more games.

Did any significant number of users actually do that, though? I do remember in the Power Mac G5 era that NVIDIA would occasionally, vaguely support Mac gaming by releasing high-end consumer-level graphics cards for the Mac, but I truthfully don't remember ever hearing a single account of a user riding the expandable Mac Pro wave for high-end gaming.


Some video games need a better GPU than the stock one that comes with Macs. The Mac Pro, since it was based on ATX standards, would take a PCIe NVIDIA/AMD/ATI video card that could be replaced.

Tell me about a modern Mac made today that can exchange the GPU for a faster one.

If Macs can't exchange the video card for a better one, they're losing the gaming market. There are more Windows games than Mac games, and Boot Camp can install Windows on an Intel Mac to run them, but if the GPU is too slow for top-of-the-line games, you might as well buy a cheaper PC with an upgradeable video card instead.

I've found that Linux Mint or Debian on an ATX PC gaming system runs video games faster than OS X on an Intel Mac.

Apple is losing their customer base by making a classic blunder that Atari, Commodore, etc. made: they don't know their target market of users. Apple is losing the Mac gaming market; losing the educational market due to price; and losing the artist and creative market due to not having a pressure-sensitive pen. And while Apple has the developer, web designer, and programmer market with MacBook Pros for startups, since the MacBook can boot OS X, Windows, and Linux with Boot Camp, they've forgotten their other markets, leaving a void that others like Google, Amazon, and Microsoft are filling.


Your brother wants a Mac in no small part because he can't afford one. It's a status symbol. Ferrari makes ridiculously impractical cars that they refuse to sell for cheap; same thing with Apple, especially in the latest iterations. Advertising and society tell one that wealthy and hip people use Macs, and one "decides" he wants a Mac.

If your brother really needs OS X, he can go second-hand (I use a 2012 retina model; it still works great); if he needs general computing, there are tons of equivalent alternatives at cheaper prices. Apple is never going to sell him anything cheap because it would damage their brand.


You're missing the fact that he doesn't need a new MacBook Pro. It's a luxury item. If Chromebooks are perfectly functional for school use cases, clearly a new MBP is entirely unnecessary. If he wants OS X and a retina screen, the baseline MB 12 is more suitable for his needs. Or he could get an older, used MBP.


You might be missing the point here. I think what the parent was saying is that Apple are not making a sale because they're (needlessly?) keeping their margin at X% whereas someone who currently states they can't afford that would've bought a MBP for Y% less margin.

But then again, who needs an MBP? Maybe if you have a use case that is only covered by an Apple laptop and not a similarly specced model with Windows or Linux - but most people who buy an MBP want one, not need one. (Which is perfectly fine, I'm not judging.)


Apple has nearly always been happy to throw potential customers away, if those potential customers aren't potential high-margin customers. This preference goes all the way back to the original Macintosh. The Apple Way is to sell products that can command premium prices to whatever segment of the market is willing to pay those prices, and leave price-sensitive customers for their competitors to fight over.


What am I missing here?

Apple is confident that in an iPhone/iPad-centric ecosystem consumers will be strongly drawn to Apple computers. It's the same reason OSX is becoming more like iOS, they're not interested in catering to power users - average iPhone/iPad users are a much bigger market.


Do Apple's educational pricing discounts still not make it worthwhile? I have no idea what the discount percentage is now, but I remember years ago some teacher (and tertiary student) friends of mine being delighted to be able to afford an Apple MacBook due to this incentive.


Last time I checked, the educational discount was around $100 on a $2K+ laptop. Here is the relevant link if someone is interested: http://www.apple.com/us-hed/shop/buy-mac/macbook-pro/13-inch


Hmm, I thought I could remember my friends saying the educational discount was around 20%. Perhaps I am not remembering correctly, or the discount has changed a lot lately.


I remember larger discounts as well. That said, summer vacation seemed longer, too.


You can check yourself: http://www.apple.com/us-hed/shop/buy-mac/macbook-pro/13-inch There is indeed a $200 discount for the high-end 15" version. The low-end 13" has a $0 discount. The remaining 13"/15" models all have a $100 discount.


Looking at my old receipts the discount for college students in the UK at least was approx 14% on Macs.


It's hard to create a good menu. The lower-priced options will often be good enough that folks who would have bought the higher-priced item will buy the lower one instead. This lost revenue will often be greater than the total new revenue from folks who would otherwise have bought nothing from you.

The free market finds an equilibrium price such that some demand is not met. To meet all demand, the price would need to be zero or even negative.
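The cannibalization trade-off above can be made concrete with a toy calculation; all quantities are invented for illustration:

```python
# Toy menu-pricing model: a cheaper option adds buyers but also
# cannibalizes high-end sales. All numbers are hypothetical.
HIGH_PRICE, LOW_PRICE = 2000, 1000

# Before: 100 buyers, all at the high price.
revenue_before = 100 * HIGH_PRICE  # $200,000

# After adding the cheap model: 40 existing buyers trade down
# and 30 entirely new buyers appear.
traded_down, new_buyers = 40, 30
revenue_after = ((100 - traded_down) * HIGH_PRICE
                 + (traded_down + new_buyers) * LOW_PRICE)

print(revenue_before, revenue_after)  # 200000 190000
```

Even though unit sales rose from 100 to 130 in this sketch, revenue fell, which is why a lower-priced option isn't an automatic win for the seller.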


> seemingly unaware of a vast market for education and lower price users on the other.

Does anybody make money off of Chromebooks?


Google does if you're paying attention to the long game. Does anyone think that Chrome, and by extension ChromeOS, is pure altruism?


It's probably more accurate to say that Google hopes to. I don't know that I've ever seen an analysis that directly connects the dots between Chromebooks and profit growth at Google.


I meant more in terms of the hardware manufacturers; obviously Google does (or intends to).


Chromebooks use a Linux-based OS with the Chrome web browser as the GUI. They are sold at $200-$300 because there is no $100-$200 Windows tax from Microsoft on them. Google makes money with AdSense and by collecting user data, like any other dotcom might. Google also has Chromebooks that can run Android apps from the Google Play store, as well as ebooks, music, videos, etc. from it.

Google has a different way to make money off Chromebooks than Apple does with MacBooks. Apple claims not to collect and sell data to make money, which is why Apple products cost more than normal PC or Chromebook products.


What data does Google sell and to whom?


Here, this may explain it better; you have to read the TOS and EULA from each company: http://www.pcworld.com/article/2986988/privacy/the-price-of-...


Please, I have read the TOS of the companies and the article you linked, and I can't find where it says that the data is sold.

Can you point to a source for your claim? Thanks


I am copying this from someone else who replied to you, and you didn't say anything.

https://en.wikipedia.org/wiki/PRISM_(surveillance_program) And a giant backdoor with purposefully vague language in their privacy policy to let them sell data. https://www.google.com/policies/privacy/ "We provide personal information to our affiliates or other trusted businesses or persons to process it for us, based on our instructions and in compliance with our Privacy Policy and any other appropriate confidentiality and security measures."

I really wish you'd read articles and remember what others wrote to you in the comments. The only reason why I have trouble with that is I have a mental illness that put me on disability. I try to learn from my mistakes and the mistakes of others.

If you forgot because you have some sort of mental illness, I understand; if not, and you have a perfect memory, you might just be trying to troll or put up a test for others like college professors do.


> And a giant backdoor with purposefully vague language in their privacy policy to let them sell data. https://www.google.com/policies/privacy/ "We provide personal information to our affiliates or other trusted businesses or persons to process it for us, based on our instructions and in compliance with our Privacy Policy and any other appropriate confidentiality and security measures."

I suppose that English is not your first language, because what you have quoted has nothing to do with your claim.

And now you bring up PRISM? Do you know what the heck PRISM is?

> If you forgot because you have some sort of mental illness, I understand, if not and you have a perfect memory, you might just be trying to troll or put up a test for others like college professors do.

So not only do you not know what you're talking about, you insult people when you're wrong and it's shown to you?



No, still no word of Google or others selling your data.

But if you think that I missed it, can you point to the line where it says that Google sells the data?


I am not going to do your homework. If you refuse to read it, it is not my problem.


I have read them; there is nothing in the links you provided that says Google sells your data.

If you don't know Google's business model and you can't read your own links, that is not my problem.

But don't let facts break your imagination.


One more thing: Google open-sourced their AI routines but didn't release the data. Collecting data and selling it is how a lot of dotcoms earn money on free web services. They don't give data away for free. How do you think Google earns money on free web services if they don't sell user data and advertising based on user data?


> One more thing Google open sourced their AI routine but didn't release the data.

Now you must be joking, or you just don't know anything about what you're talking about.

> They don't give data away for free. How do you think Google earns money on free web services if they don't sell user data and advertising based on user data?

I know how they make money; the one who clearly doesn't is you.


No you didn't, because the headline of the last article had Tim Cook, CEO of Apple, saying that Google sells user data. You'd know that if you read it.


Oh, yes, because if Tim Cook said it, it's the Bible.

Time to stop reading your poor trollish attempts. Have a nice day, and when you really start to comprehend how advertising works, perhaps you won't be so wrong.


Since you are too lazy to do it:

"Google

Google has become the de facto name in search (although I’ve since switched to Bing) and Gmail, Google Maps, and its other services now rank among the leaders in those categories. But all that “free” adds up to a huge amount of your personal information being traded away to create personalized, targeted ad experiences.

The latest? Google has launched a program by which your profile is now keyed to your email address. Dubbed Customer Match, the program ensures that an advertiser’s “brand is right there, with the right message, at the moment your customer is most receptive,” Google promises. So if you’ve previously asked a travel site to send information to your Gmail address, that site can sign up for Customer Match. Then when you’re watching YouTube, that site “can show ads that inspire them to plan their next trip.”

User data is sold to target Google users with advertising.

From the second link:

"Cook has repeatedly looked to differentiate Apple from its competitors like Google and Facebook — where, he has said, users don’t pay for products and therefore become the products themselves. Apple, he said, does “not share information with advertisers that personally identifies you, unless you gave us permission”.

Companies like Google use the data generated from the people on their services to generate revenues and fund their maintenance. By collecting information about what people are searching for, for instance, the company is able to display more relevant ads.

The policy has allowed Google to make almost all of its biggest consumer services free. Its recently announced Photos app, for instance, gives people as much space as they want to store their images in the cloud — whereas Apple’s is limited and users must pay for access.

Tim Cook seemed partly to be responding directly to Google’s recent announcement. “You might like these so-called free services, but we don’t think they’re worth having your email or your search history or now even your family photos data-mined and sold off for God knows what advertising purpose,” he said.

Cook also argued that American citizens have a right to encrypt their data, and in so doing keep it from inspection by law enforcement officials. Apple and over 140 other tech firms including Google signed a letter to Barack Obama last month, asking him not to force them to hand over user data.

“Now, we have a deep respect for law enforcement, and we work together with them in many areas, but on this issue we disagree," he said. "So let me be crystal clear — weakening encryption, or taking it away, harms good people that are using it for the right reasons. And ultimately, I believe it has a chilling effect on our First Amendment rights and undermines our country’s founding principles.”

In order for you to be correct, Google would have to not sell user data in any way to any advertiser or third party.

I know Google means to not be evil, but almost every dotcom business model provides free web services to collect user data and then sells part of that data to advertisers.

You'd know that if you read the articles I cited. I had hoped you would be honest and read them and tell me, but instead you turned into an Internet troll and didn't read the articles, which I am now proving by citing the articles - which I wouldn't need to cite if you had actually read them.


> User data is sold to target Google users with advertising.

This is false; please provide a link where it is confirmed.

> In order for you to be correct, Google would not sell the user data in any way to any advertiser or third party.

No, in order for YOU to be correct, you have to provide just a single proof that data is sold, and you have not provided any.

> You'd know that if you read the articles I cited. I had hoped you would be honest and read them and tell me, but instead you turned into an Internet Troll and didn't read the articles, which I am now proving by citing the articles, which I am not supposed to cite if you actually read them.

If you were honest and had read and COMPREHENDED the articles you linked, you would have seen that no fricking data is sold. What is sold is ad space.

The one being an internet troll is you; your own links prove you wrong and you don't want to acknowledge it.


I meant the hardware manufacturers.


>Microsoft contends that the Surface hybrid, now in its fourth iteration, is the best-of-both-worlds that their customers want.

Microsoft contends this because the market forced their hand. They have no presence in smartphones or tablets so having a separate operating system for them is untenable. They tried very hard to get developers on board and it didn't work.


Surface didn't work? Its current generation is making a profit and is well-reviewed by users and tech journalists alike.

For what it's worth, Surface products are exactly what I want in a portable computer. I find both traditional tablets (such as Android tablets and iPads) and legacy-style laptops (including the new Macbook Pro) unexciting and ultimately not worth my money.


I think the poster meant that Windows Phone didn't work, so they doubled down on Surface instead, which is all Windows 10 for all form factors (so there's no need to get developers to make special apps anymore).


They had to go back to hardware that runs Win10 x64 apps, because nobody wanted to ship WinRT ARM apps.


Microsoft had to go back to x86 apps for a few reasons.

1. WinRT devices couldn't run x86 (Win32) apps, so customers were frustrated and returned the products in droves.

2. Product name - if Microsoft had not named the WinRT OS Windows, customer expectations would have been clear.

3. Lack of an emulator - Apple shipped an emulator whenever they changed processor platforms, but Microsoft did no such thing with WinRT, believing that security was more important than usability for their customer base. They were wrong and have still not learned the lesson. Apparently, the company is now working on a Win32 emulator for ARM.

* One reason Microsoft didn't do well in countries with poor internet penetration is that all apps MUST be downloaded from the store on the device in question. Heck, even iOS allows an alternate download means through iTunes.

Even after putting an option to sideload apps on both Windows 10 desktop and mobile, it's impossible for normal users to download and store the apps offline for later use.

Stability on Windows 10 Mobile is far worse than on WP8.1. E.g., I've reset my own Windows 10 phone more times in three months than I ever did on WP8.1.


Oh is that what we're talking about? Yeah, WinRT was needlessly limited.

But it's also a distant memory. Speaking about it today is about as relevant as discussing iOS 6 or Android 4.1.


Possibly. Still, when you think about the death of Windows RT, remember that there's still a version of Windows running on ARM, and the ecosystem of applications that run on it is now called "Windows Universal apps."

Windows Phone may be moribund in the market, but that ARM capability, and the admittedly not-ready-for-prime-time Continuum (phone + monitor/keyboard/mouse) desktop, could lead to a better supported and accepted descendant returning to the market currently dominated by Chromebooks.


Good article, and as the ex-CEO of Apple, he thinks more deeply about Apple's business than most of us.

Even though I have several Linux laptops and a Chromebook, the device I use the most is my 12" iPad Pro. It is great for lightweight writing, SSHing into a server to troubleshoot a problem, running updates, etc. I just bought a new MacBook, which I am starting to like a lot after a period of getting used to the keyboard, but I just use it for work. If my wife sees me using the MacBook, she assumes I am working (programming). With the iPad, I could be having fun or working. I look at the iPad as the future for most people.

That said, what I really want is convergence. I want one seamless environment that seems the same no matter what device I am on. I actually more or less have that, in a kludgy way, but I want something more polished and effortless. I think Apple, and perhaps Microsoft, have a good shot at doing this near-term. With some trade-offs, you can get this now with a Chromebook, Android phone, Chromecast, and Google Home, but as much as I like Google's garden, I don't want to live in it all the time.


I want one seamless environment that seems the same no matter what device I am on.

Microsoft tried going in that direction with Windows 8, and it still appears to be somewhat of a trend, but all they did was dumb-down the desktop interface to mobile touchscreen levels of usefulness.


He was never the CEO of Apple. He was Head of Macintosh Development.


He was an Apple exec, but never CEO.


He was also a founder of Be, which may contribute to people misremembering. There was talk of Apple buying Be back in the day, but they ended up buying NeXT instead. In some parallel universe, Gassée ended up playing the role of Steve Jobs.


When you SSH from your iPad, do you use the touch keys or do you have a physical keyboard?


The virtual keyboard on Prompt has special keys, so that is usually more convenient.

I bought the physical keyboard from Apple, but I don't use it very often.


> Even though I have several Linux laptops and a Chromebook

Honest question: What is the use-case for having several notebooks?


I just bought an HP, which was half the price of the new MacBook Pro, has better specs, and looks as good if not better. And instead of a fancy touch-friendly strip, my entire screen is touch-enabled. I've invested tons of money into little but expensive apps that let me maximize windows instead of making them full-screen (Moom), or fit a large number of icons in the menu bar (Bartender), but this is getting ridiculous! I think Windows 10 Pro is a better OS and more developer-friendly than macOS nowadays, too.


"In a fantasy world, Apple produces an Ax-based Mac"

Combining the iPad and Macbook is equivalent to rolling the iPod into the iPhone.

I also have a MacBook and it's the perfect computer for normal laptop tasks. I am surprised by how many people don't know this exists.

"But what would an Ax Mac mean in the real world, to software developers?"

There are lots of companies producing iOS and Mac apps. It is going to be easier than porting programs from Windows to Mac.


> I also have a MacBook and it's the perfect computer for normal laptop tasks. I am surprised by how many people don't know this exists.

They know it exists; they also know it's $1,300 and up.


The MacBook Air is $999, with plenty of discounts to be found. Also a nearly perfect computer for the vast majority of use cases. Any cheaper, and you're into disposable shittop range.


The list of Kaby Lake laptops of all sizes is starting to get pretty long[1], with more and more of them having better performance, battery life, weight and screen resolution than the MacBook Air for less than $1000 if you're willing to deal with Windows[2].

I think Apple will have something new in the same space as the Air or non-Pro MacBook soon, but the options aren't great right now.

[1] http://www.ultrabookreview.com/11638-intel-kabylake/

[2] Driver support probably isn't great for Linux yet.


Will these laptops last 3+ years? Anything I've bought sub $1k fell apart or failed in some expensive way within 18 months.


No idea. I occasionally use a ten year old ThinkPad T60 that runs Linux Mint 18 (2016) MATE okay.

I'm pretty sure modern ThinkPads are not as durable. The Dell XPS might be a lot better than Dell laptops of the past, though.


> Combining the iPad and Macbook is equivalent to rolling the iPod into the iPhone.

It really isn't, Apple is right that the interaction models are too different.

The argument Microsoft makes, that you can have your cake and eat it too with a device that's both an ideal tablet and an ideal laptop, has failed. The compromises are too great. Desktop software has interaction targets that are too small for a touch interface, and the mouse interaction model doesn't map across to touch. And supporting both touch apps and desktop apps on the same device is a non-starter. You can't require all your app developers to implement two different UIs every time.

The iPad Pro optional keyboard doesn't prove anything in this regard. It's just a dedicated text input device, not an alternate primary interaction mode.

Note that MS and Apple approach this from opposite directions. The MS solution is primarily a keyboard interface with touch as an ancillary mode. The iPad Pro is primarily a touch device with an ancillary keyboard mode.

But desktop software on the Surface is next to unusable in tablet mode. Meanwhile iPad software is perfectly usable with the keyboard attached. But it achieves this by decidedly not taking full advantage of the horizontal interface. There is no trackpad on the iPad Pro keyboard and no mouse support. For iPad software, they serve no purpose. That's why even with keyboard attached the iPad is decidedly not a converged device. The keyboard is not a primary interface, it's really just a text input peripheral with some convenience UI interaction options. There will never be iPad software that requires the physical keyboard.


> The argument Microsoft make that you can have your cake and eat it, have a device that's an ideal tablet and an ideal laptop, has failed.

Wait, it has? It has evolved, and continues to evolve, but I see no evidence that it has "failed." The Surface line has swelled to over $1 billion, and hybrid devices are the fastest growing segment of the OEM market (the only growing segment, really). The design process of apps is changing, in the same way that it changed for the web. Apps will become responsive, changing with user input while sharing the same codebase. That's Microsoft's vision, and I think it makes the most sense. Even Google is starting to bend to this. Apple will be next.

And in the same context, some apps don't need to evolve with input. Some apps just aren't meant to work with touch or vice versa, and that's ok. No one is going to run their IDE in touch, and no one is going to run their drawing program with a mouse. But the ability to dock and switch with the same device is appealing to customers and businesses. It saves money on device costs, and it has the possibility to save money on the development of software that wants to live in multiple worlds with multiple inputs.


Hybrid devices are selling well, but they're overwhelmingly not being used as hybrid devices. Only a tiny minority of users ever actually use the devices in tablet mode. See my reply to spronkey below on why this is a terrible sign.

I get why MS and Google are doing this. Their tablet strategies have largely failed, so they're trying to leverage desktop/laptop success to prop up the tablet side. I contend that it will have the opposite effect, because it will cause an accelerated sapping of attention and resources from the struggling platform to the more successful one.


Do you have any data to back up the idea that touch input is not used commonly on hybrid devices running Windows? I'd love to see that data, because I don't believe it exists. There's a difference between running Windows in "tablet mode" and just using touch casually as an input. I never put my Surface in tablet mode, but I have set my parents' machine to always be in tablet mode because it's the only mode of operation they use. I dock my Surface and use it as my workstation, but when I'm at home I use it strictly as a tablet. I have a strong feeling that others do similar.

I also don't buy your notion that this is a "toasterfridge" solution. That's an Apple talking point.


The point isn't that you can use all of your desktop apps on a touchscreen, it's that the device can adapt to context. I'll never use Affinity Designer on my Surface without it either plugged in to my 34" display or, at a stretch, the Type Cover. The UI doesn't work for touch input. Doesn't matter. I likewise will probably never use Drawboard PDF with a keyboard and mouse. Again, doesn't matter. It's so quick to transition from touch/pen to keyboard/mouse that it's a nonissue. You can always use the input method most suited to the task at hand.


The only world in which the 'device adapts to me' model works is where the software adapts as well. If I have software that only works well in tablet mode and other software that only works well in laptop mode, I'm the one who has to do all the adapting to switch between them.

In all honesty, I'd rather carry two different devices completely optimised for the different interaction models than deal with all the compromises of a toasterfridge solution.


Let's reset the expectations, and answer the question again.

If you could buy an iPad Pro that runs all the iPad apps, has a touchscreen with the awesome pen digitiser, that was also as fast as a Surface Pro, and could run all the desktop apps within the same OS as well (even though the desktop interactions might by and large require a keyboard and mouse/trackpad), would you do this in preference to either the current iPad Pro or the current Surface Pro?

Yes, yes I think you would.

Developers shouldn't be asked to build hybrid apps. They should be given the option to build hybrid apps, desktop apps, or tablet apps, and have all of those solutions deploy on the same hardware.


If it didn't compromise the device design, didn't compromise the OS design in either mode and also provided strong and clear incentives to app developers on both the desktop and tablet mode apps without tradeoffs, and also was crystal clear about app discovery and acquisition for each mode, sure maybe.

I just don't think those are achievable goals in practice. So trying to achieve them is a fool's errand. On the hardware side compromise is inevitable. On the software side look at the problems Apple has with pushing forward both the Mac and the iOS platforms. They're actually doing a fairly decent job pushing forward the Mac with new hardware features, radical designs and annual OS updates. Even so there are cracks. The App Store model is a sore point, the pro desktops are woefully out of date and there are widely-voiced concerns over the general quality of their desktop software and tardiness with iWork and other app development.

How would strapping iOS and MacOS together in one device help this? When one side of the system is lagging behind and suffering from issues, you still have the other side to prop it up, right?

Actually that's a horrible pit trap. The more successful side of the system will suck user focus, developer attention and platform owner resources away from the side that's struggling. Why put resources there when the other side of the system is doing so well? Rather than prop up the weak side, it pretty much guarantees condemning it to a death spiral.

So no, I would not like to both have my cake and eat it, because I know that is not possible without compromises and risks that I am not prepared to accept.


I really disagree that it isn't possible without compromise. The Surface Pro, Surface Book, and iPad Pro all have hardware capable enough of providing both experiences for many users.

If you, right now, built an iOS virtual machine into a touchscreen "hybrid" Mac/iPad, it would be a somewhat usable system. Focus some engineering efforts on combining system settings, some sort of filesystem/local iCloud integration, and it would be a very usable system.

What compromises would have to be made that aren't already made worse by the fact that you have to have two separate devices?

It could be argued that Apple has dropped the ball on software precisely because its platforms have diverged so much, even more so with Swift and iOS moving away from older APIs.

The argument that I'm trying to make is that if you treat "the system" as a continuum, there is no "side" that gets more attention. Some things will focus a little more on the touch side, other things will focus on the desktop side, and many will take advantage of the inbetween.

There are little glimpses of this in Windows 10, and a few in Windows 8. Microsoft lacked the mobile ecosystem to pull it off, but I don't for one moment believe that it isn't possible.


I don't have an iPad Pro, but I occasionally use an iPad Air with a Bluetooth keyboard. Command-tab for switching between apps is much, much more convenient than the touch interface. Using command-space then typing for launching apps is also more convenient than swiping then typing.

Compared to general content consumption, I think that a large portion of work-like interactions with computers or tablets involves keyboards rather than either pointing devices or touch interfaces.

I don't entirely disagree that there remains a large difference between the types of device for now. However, I do think the keyboard is a more significant differentiator than the type of pointing. For example, whenever I am doing much with Excel I use Windows in VirtualBox, because even though it probably cuts the CPU performance in half, I can use the Excel-for-Windows keyboard shortcuts and combinations rather than having to constantly shift to a mouse or trackpad.


iPads will also need to add external monitor support, as buying a MacBook Pro and some monitors is worth the extra cost versus developing on the small screen real estate of an iPad.

In addition, iOS app development is only supported on the Mac, and the toolchain will not be trivial to port to iOS. As @simonh said, touch and mouse UX are quite different, and I don't see them being combined optimally in the near future.


Idea extension:

VR to let users maximize their visual field for productivity, plus Magic Leap-style interaction that could bridge the gap between touch and mouse (pointing at or "touching" objects in a user's visual field would require less effort than actually trying to tap the screens of multiple monitors), might be the synergy that bridges those gaps*.

* keyboards are still probably going to be used for a long time though.


I was watching an episode of Star Trek: Enterprise where Trip was staring at four PADDs in the mess hall while trying to rewrite warp-drive equations.

Sometimes I wonder if multiple simultaneous iPad use is in the future of computing.


> There are lots of companies producing iOS and Mac apps. It is going to be easier than porting programs from Windows to Mac.

Most applications have server-side components too, and developers work on both sides. Currently that means you can set up a developer with an MBP and they're all set. With an Ax MBP, that's no longer the case.

Will they switch to another architecture that keeps the same weaknesses as the current one? What would be the point? Apple is very opinionated with their stuff, so can you be sure you will be able to run software like Java? That does not sound very likely.

So yes, there are plenty of companies that will port the software. However, developers themselves will need to have their needs fulfilled by another company.

That removes basically the last set of free professional endorsements for the platform and sets it on equal footing with Google and Microsoft. I guess that's good for the competition.


What makes you think that the server-side components are going to be incompatible with Ax MBPs? Most open-source software runs just fine on ARM, and I would expect proprietary developers to quickly follow suit if Apple pushes Ax MBPs.


There is an enormous amount of assumption about what an Ax MBP would look like. People assume the current MBP with a different chipset.

That's not, however, the trend that Apple is currently following. For example, it does not seem likely you would be able to run the software you want on it, especially not VMs, Java, or server-side components. Think of it: if you prevent that, you solve the virus and malware problem, and that's something people are actually quite OK with on their iPads and iPhones. It is just such a win for regular people that it must be what Apple would be going for.

Maybe Apple will do a special "Developer" application that comes with a developer-friendly package, tied to your AppDeveloper account. It will probably allow you to do most of your development, but without the complete freedom you currently have on the MBP. (And it will probably be nice, as long as you develop in the Apple ecosystem for the Apple ecosystem.) That's IMO the best-case scenario.

Most likely they will simply drop the developer entirely. Sure, they will keep some way of coding in a restricted fashion, but a professional who does not have the same requirements as a hobbyist home iOS developer will find nothing in Apple's product line. (You can see hints of that: you would need plenty of different MBP configurations just to please a handful of people, while the market reality is that the new MBP is selling like hotcakes. Developers are just too difficult to please to be worthwhile.)

That's the meaning of the post-PC world. People using a PC will be specialists, and there will be specialist brands to cater to their needs. That's not going to come from a consumer brand like Apple.


JLG seems deliberately stuck in the past. Just for starters, they retired the Macintosh brand 18 years ago.


Or it might be intentional tongue-in-cheek nostalgia or something along those lines.

The guy was once head of the Macintosh division back when PCs mattered a lot more. Give him some cred. :P


Perhaps but I'm not sure it's healthy to be nostalgic for the Macintosh line of 20 years ago. It was a train wreck, really.



