The 2016 MacBook Pro wasn't leadership; it was Jony Ive without Steve Jobs to bring him back to reality.
Touch Bar? This was nothing more than adding expense to raise the ASP (Average Selling Price) of MacBooks, which had fallen precipitously low from a shareholder perspective because of the superb value-for-money proposition that was the 13" MacBook Air.
The butterfly keyboard was Ive shaving off 0.5mm of the width for a worse user experience with a higher production cost and less reliability.
Going USB-C only was a philosophical move rather than a practical one, and it forced people everywhere to carry dongles. The USB-C cable situation was and continues to be a nightmare, as different cables support different subsets of data, power and video and, worse yet, different versions of each of those. Worst of all was the loss of the much-beloved MagSafe. Also, the ports weren't all the same: you were better off charging from the right (IIRC) rather than the left.
The loss of replaceable RAM and SSDs is still painful. Personally, I don't believe this was primarily about forcing users to pay for upgrades; it was about shaving off a small amount of volume.
Ive is gone and every one of those decisions has been reversed or at least significantly amended. This is no accident.
People are commenting on this post and saying it is speculation, and until someone who was directly involved in these discussions shows up to comment, I suppose it is.
I have been in design meetings with Jony, and Scott Forstall, and many others whose decisions were micromanaged by Steve at every step. You can argue that a lot of Steve's design decisions were questionable (rich Corinthian leather skeuomorphism, lickable Aqua widgets, brushed aluminum window title bars), but he owned them.
Steve and Jony would sit for hours outside of Caffe Macs going over designs. Steve would spend even more time inside the industrial design area going over prototypes. He would spend a couple of hours every week meeting with every software team that had user facing features. He had input on almost every pixel on the screen and every button/port/display/etc on hardware.
Once he was gone, the drift began. It was inevitable that focus would shift. Scott no longer had protection by Steve. Jony fixated on the new campus and things like watch bands. No one had Steve to rein in whatever impulse they had. Sure, people would ask "What would Steve do?" but we also had Tim Cook pushing to optimize production, lower cost of goods and increase margins.
Apple still has Steve DNA, but it continues to be diluted. You may disagree with Steve's vision and opinions, but they were strongly held and enforced. I feel almost everything about the last generation of MacBook Pros went against what Steve would have wanted, and I am glad I wasn't there when those decisions were made.
Pretty much sums up my opinion from 20+ years of following Apple.
It is sad. No one knew what to do with Apple Retail. That was the most neglected part of the business.
Ron Johnson left. Scott Forstall was forced out. Katie Cotton retired. (I felt both Scott and Katie had a bit of Steve Jobs in them.) Mansfield retired. It sometimes feels like Apple is now largely run by Tim Cook and Eddy Cue.
Although the new MacBook Pro does seem to show there are people at Apple who still give a damn, and that their voices may previously have been drowned out. A quote from Steve:
>It turns out the same thing can happen in technology companies that get monopolies, like IBM or Xerox. If you were a product person at IBM or Xerox, so you make a better copier or computer. So what? When you have monopoly market share, the company's not any more successful.
>So the people that can make the company more successful are sales and marketing people, and they end up running the companies. And the product people get driven out of the decision making forums, and the companies forget what it means to make great products. The product sensibility and the product genius that brought them to that monopolistic position gets rotted out by people running these companies that have no conception of a good product versus a bad product.
> They have no conception of the craftsmanship that's required to take a good idea and turn it into a good product. And they really have no feeling in their hearts, usually, about wanting to really help the customers.
Johny Srouji seems to be in a very important position these days. I bet the positive current you say you are still feeling comes largely from his direction.
If he were alive today, I do not think he could be running the company.
His working style and some of his personal history seem like they would not have worked in the 2020s.
I could see him as a board member or an occasional informal consultant, but not as head of the company. And I don't know if he would have liked that much.
I disagree. It's true that the management style he had during the 1980's definitely would have had a stronger and faster backlash in the 2020s - he would have been forced out of the company even sooner than he really was. But this style also didn't work all that well in the '80s and led directly to him being ousted and Apple entering a slow death spiral they had to be rescued from a decade later. His second act really was different and he was much more mature as a leader; certainly he still worked people hard and was unusually blunt compared to some other current FANG CEOs, but you didn't see the sort of petulant and intensely personal abuse that you would sometimes see from him in his first act.
I think if he was CEO of Apple in the current environment, certainly there would be less hero worship outside of Apple, and a lot more people would be writing pieces critical of his leadership style and effect on the world, but there wouldn't be anything serious enough to force him from the company given how well things would be going. In other words, he'd be much closer to Musk than Kalanick both in terms of the severity of the criticism and his ability to weather it.
When Apple's focus was on education, the philosophy followed that. Welcoming newcomers who Think Different. Consistent UI design. Good documentation for developers and users. Building an ecosystem. Optimising for connectedness. Community-centered.
In the past decade, Apple's focus has been self-centered: optimising for profit. That has come at the cost of community. Will the new hardware solve that? It's a welcome improvement! I use a laptop for data storage, and am worried about being unable to quickly swap my SSD over to a spare laptop (restoring 8TB will take a long time). I'm very pleased about the return of MagSafe, though, and battery replacement. Will Apple's software improve? Will Apple listen to developers? Or will the Linux community act first to welcome more newcomers? We're going to find out this decade, and I'm excited to see what changes it brings.
A lot of these comments are missing the context. When Steve introduced the App Store, he said, and I quote:
>"Maybe it'll be a billion-dollar marketplace at some point in time. This doesn't happen very often. A whole new billion-dollar market opens up: 360 million yearly run rate in the first 30 days, I've never seen anything like this in my career for software,"
>"Music is a two and a half billion-dollar business a year for us. I'm thrilled at $360 million a year run rate. We'll be dancing on the ceiling if we cross a half a billion. Maybe someday we'll get to a billion."
He also said he didn't believe the App Store would overtake iTunes in revenue. But that was at the launch of the App Store in 2008.
Fast-forward to 2010: the App Store was nearly $2B. Still not as much as iTunes, but it was growing fast, and projected to overtake iTunes in 2011. He didn't understand the App Store market, so it made him uneasy. He spoke about it in multiple interviews since 2008. But he was sick, and well aware of his health. On his last trip to Japan in 2010, he wrote "All Good Things"; the last part, "must come to an end", wasn't written out. But he must have known his days might be numbered.
Remember, from 2010 to 2015 Apple was about to repeat, or at least was perceived to be about to repeat, the same Macintosh-versus-Windows mistake again. The current iPhone 13, in its launch quarter, would have sold more units than every iPhone from the 2G to the iPhone 4 combined. The iPhone revolution had barely started, even though most nerds realised how big a change it would be.
By 2015 the war had pretty much settled. Android was never going to destroy the iPhone, and the iPhone growth projections had lost zero momentum. I wrote on AI in 2014 about how Apple would reach 1B iPhone users in 2020 at those projected rates. At least Phil Schiller was aware of it in 2014 and floated the idea of a lower commission rate from a position of "Strength". I am sure that if Steve were alive and Scott Forstall were still on the team, that idea would at least have had some support.
The problem is that right now no one has the conviction to say, "No, this isn't a business we should be in." Certainly not the numbers guys like Tim Cook or the CFO. iTunes was used to sell the iPod; the App Store should have been used to sell the iPhone. Instead it is now used to extract value from iPhone users, aka Services Revenue. Scott stood up for developers, and Steve did too, as shown in emails released in court. And I am pretty sure Katie Cotton would have smelled the PR disaster before it even started. But all three are gone.
But let’s not pretend that most of the App Store revenue comes from poor indie developers. It comes from pay to win games, loot boxes, and other in app purchases with zero marginal costs.
Oh, so it's not that big of a loss since most of the people don't deserve that revenue anyways! So Apple on the other hand is totally entitled to that money?
I was hoping that Apple Arcade would kill all pay-to-win games or at least take a bite out of their revenue.
But again, you're not standing up for the poor starving indie developer. You're standing up for companies that are doing far more sinister things going after "whales" than anything Facebook has ever done.
> You’re standing up for companies that are doing far more sinister things going after “whales” than anything Facebook has ever done.
There are several things wrong with that statement.
1. If these companies engaged in truly destructive business practices (and it could be proven), then Apple has a duty to remove those apps. Every developer has to pay them $99/year to be registered, so that money should be funding the removal of these supposedly exploitative and 'sinister' apps.
2. You cannot prove that the majority of these businesses are all big.
2a. Even if you could prove that, it's common knowledge that Apple is the largest company in the world, which renders that entire argument moot.
3. How do you delineate between indie developers and 'whales'?
3a. When you do differentiate the two, how is it ethical for the largest company in the world to ask for more of their profits?
4. Apple isn't standing up for the poor indie developer either, which is why it's perfectly reasonable to ask them to do better. There is not a company on this planet with more liquid cash than Apple, so there's nothing wrong with asking them to just improve their treatment of developers when consensus is that they're one of the most exploitative and destructive companies in the field of consumer electronics.
1. I agree. While Apple hasn't gotten rid of the scammy apps, they have introduced an alternative - Apple Arcade - where they do fund indie developers so they don't have to make scammy pay-to-win games.
3. Indie developers are selling a product that has value. The "whales" are the 5% of consumers buying loot boxes and Candy Crush coins (?).
3a. Most of their profits are not from the App Store. Most come from a simple, ethical business model - I give them money, they give me stuff. Unlike Google and Facebook or the aforementioned games.
4. Who are all of these poor developers Apple should be standing up for? In the link above (and the numbers were confirmed during the Epic trial), most money is coming from games with in-app purchases. If you look at the top selling apps on the App Store now, the last one I can remember that came from indie developers is Widgetsmith. How many indie developers would be successful if Apple took a 15% cut instead of a 30% cut?
> Who are all of these poor developers Apple should be standing up for?
Well, there's the FlickType guy[0] who got kicked off the App Store only to have his product completely cloned by Apple. Then there's the Hey! email people who went to hell and back just to get an innocuous update approved. Not even a year ago there was a class-action lawsuit against Apple by developers[2], and Apple's "compromise" was to charge less for a service that was provably garbage. They know their 30% cut is illegitimate; that's why they backed off so quickly. Even still, they charge certain people 30%, others only 15%, and then the big companies like Netflix get away with 0%[3] because of insider deals that other apps cannot benefit from.
So fixing that would be a good start. Then they need to allow alternative payment processors (as pressure mounts from countries like South Korea and France), and hopefully get rid of their asinine restrictions on sideloading, which do nothing for the user except make it harder to get the functionality they want. These feel like non-negotiables to me, and as a developer I have no intention of supporting their software or using their hardware until it's fixed.
Your point was that Apple was taking money from developers - Hey isn’t going through in app purchases. Apple makes no money when people subscribe to Hey.
Netflix hasn’t allowed in app subscriptions for years.
Every developer with in app subscriptions only pays 15% after the first year.
If Apple's 30% is garbage, so is Google's and every console maker's.
Do you support Android? Have you bought any game consoles?
As far as not allowing side loading - that’s a feature not a bug. Are you really unaware of all of the malware that is on computers because of no sandboxing?
> If Apple's 30% is garbage, so is Google's and every console maker's.
I agree. We have to hold Apple accountable first though, because they're abusing it hardest.
> Do you support Android? Have you bought any game consoles?
I do support Android, through F-Droid, and while my app wouldn't benefit from existing on a console I've sideloaded several apps to my Xbox One without any problem. I played through Castlevania last week and it ran flawlessly, so it's mostly Sony and Nintendo who are holdouts at this point (and I say that as someone with a hacked Switch).
> Are you really unaware of all of the malware that is on computers because of no sandboxing?
I'm fully aware. I just don't think it matters when the iPhone has far more dangerous attack vectors, like zero-click iMessage exploits that cut straight through BlastDoor like it didn't exist. Maybe once Apple fixes their more egregious security vulnerabilities and embraces transparency they'd have an argument; but right now it's a poorly-disguised and obvious excuse for lock-in.
So Apple is "abusing the hardest" even though, to develop a first-class console game, you have to pay much more than $99 up front to develop on it, and you have to pay a license fee for each game sold, either physical or digital? All of the console makers make it much harder to develop than Apple or Android - yet you bought an Xbox One from Microsoft.
In the eyes of the Supreme Court, games consoles are not general-purpose computers. Even if they were, Apple still drives wider margins than any of these console manufacturers do. The cost to manufacture an iPhone is about 40% of its retail price. The cost to manufacture a game console is ~90-105% of its MSRP. Without the ability to drive hardware margins, they stand a lot better chance in court than Apple does.
There has already been a court case: Epic vs. Apple. Epic lost on every point trying to show Apple to be a "monopoly".
But it wasn’t about court. It’s about you selectively choosing who to have moral outrage against. So Apple makes too much money as the most valuable company in the world. But Microsoft and Google are the good guys even though they are also making obscene profits and are worth a trillion+ dollars?
I have a contrarian view of Steve that completely goes against the current zeitgeist, i.e. that he was evil, squeezed employees, drove them to exhaustion, believed in voodoo magic health potions and was a total asshole. But then no one dares to ask: despite all this, how did he inspire so many to follow him, to look up to him and to worship him? Usually that's explained away with "He had that magic aura". This is totally unfair. He loved many people, had an excellent taste in design, changed his mind often, was sympathetic to people that he trusted, pushed back hard on things he knew sucked, and generally kept Apple away from the riffraff endeavors of HP/Compaq/Dell/IBM and took risks.
I think Walter Isaacson did a massive disservice by not focusing on the things he was uniquely good at, instead building a largely negative narrative around him; he squandered an opportunity to show his work ethic, his approach and how he inspired people. I recommend reading "Becoming Steve Jobs" by Schlender and Tetzeli instead. One of the best moments in the book is when Steve wanted the I-beams to be absolutely perfect in the new office building.
It would be better if we pick out good things about any accomplished personalities and try to benefit from them, instead of dishing out vile hatred that is oh-so-common at discussion boards like this.
Second this, Becoming Steve Jobs is the superior book. It helped a lot that the authors had interacted with Jobs over decades and also had an understanding of business and technology that Isaacson was missing.
There’s also an old documentary about the founding of NeXT that does such a great job of showing what it was like to be in the room with him as a member of a small team, and this was before he fully “became” Steve Jobs. People underestimate how much meaning can be found in being pushed hard by someone with a clear and inspiring vision.
I agree with most of what you say, but based on personal experience, I have a hard time really knowing who he loved, and I didn't often see sympathy expressed for individuals who worked for him.
Well, Steve is being martyred into a figurehead for people who disagree with Apple's direction, and is often invoked in contradictory situations. For example, one person can claim that "too thin" goes against what Steve would have wanted, and someone else can claim the opposite.
Come on, the guy has been dead for 10 years. His opinion on anything he is cited for is *UNKNOWABLE*. Stop saying he would approve or disapprove of some idea... that's a literally meaningless statement...
Of course you are right. I have a snapshot of Steve in my head that I apply, but his opinion changed frequently, as evidenced by the various permutations of the OSX interface designs.
That being said, I just can't believe he would have been happy about the various issues with the old MacBooks. So many things feel so wrong.
I think it’s fair to say that Apple was more responsive, faster, with someone like Jobs. There was just a bit more push through the company to fix X, Y or Z. It’s hard to say that any features in particular were delayed for iOS, but I think it’s possible macOS would have seen a bit more churn, arrived at the macOS 11 design sooner, and maybe already have a redesign in the works to handle the new “notch” at the top.
That said, pure speculation on my part, but I think the notch would not have launched on the laptops without some other benefit - e.g. Face ID - or it would have been on pause until it was small enough to match the current menu bar’s height. There was sometimes more of a push to get things “just so,” I think. Either way, I miss the old showy product introductions. I like the polish of the videos under lockdown, but it feels like the format drains the enthusiasm a bit.
And it’s hard to point to anything recent, except maybe AirPods Pro and recent software releases, where Apple really knocked it out of the park. Most Apple hardware seems like incremental improvements rather than flashy impulse buys. Maybe I’m just more impatient than I used to be.
The notch does have a benefit. The area below the notch is the same 16:10 display you would have gotten without the notch. Now the menu bar that has been at the top of Macs since 1984 doesn’t take away from that main area.
The Apple Watch has been much more profitable and will have a longer lifespan than the iPods.
>I think it’s fair to say that Apple was more responsive, faster, with someone like Jobs.
Only partly. Apple under Jobs sold a completely unusable mouse (the infamous "hockey puck") for years simply because it looked cool. And it took them ages to move away from the butt-ugly skeuomorphic design in iOS. Only after first MS went with flat design and later Google got the design language of Android right did Apple throw that out.
>And it’s hard to point to anything recent, except maybe AirPods Pro and recent software releases, where Apple really knocked it out of the park.
Uh, IDK. Considering how first the M1 and then the derivatives rocked the world of CPUs, I'd really disagree. The ripples of that really rocked intel and might cause quite drastic changes there.
Me, personally, I was looking at buying either a Dell XPS or a Framework laptop next to run Ubuntu, but considering the latest MBPs, I will buy one of those as I really love quiet machines with great displays.
> And it’s hard to point to anything recent, except maybe AirPods Pro and recent software releases, where Apple really knocked it out of the park.
I mean… the evolution of Apple’s in house chips are absolutely park-knocker-outters, and the thing that makes me bullish about the company’s future. It alone may be enough to secure a front runner position when it comes time to transition to AR.
They would benefit again from an iconoclastic head of product, but the new laptops and stability improvements introduced in Monterey suggest that bench is stronger than a lot of people think.
True I love the new chips, but the new screen tech is a bit of a battery hog and isn’t perfect (which now that I write this reminds me of when the Retina displays were first introduced…)
Yeah, I’m not knocking Apple’s tech - but I was hoping for “the latest specs” and what I got was an HDMI 2.0 port, UHS II SD card slot, and so on. It reminds me of when they removed Thunderbolt Display input from the iMac, so you couldn’t use an old iMac as a secondary monitor. Or when they removed the optical sound out from the headphone jack. I won’t even ask why the new Macs don’t have cellular or why Apple hasn’t thrown money at game companies to make better Mac ports or otherwise entice PC gamers consider Mac hardware instead of Windows. Speaking of gaming, the lack of a “console-like” Apple TV is also a bit of a letdown. As is the lack of an Echo Show competitor.
I don’t doubt that this will be like the iPhone all over again - Apple’s late to the party but gets it right. But I also kind of worry it will be Siri all over again - fantastic at first, but ultimately a cancelled HomePod and still a work in progress…
I'm very curious about a few things: how does someone like Steve gain so much respect from so many different types of people? Was it "we've won before with him, so I must believe" (a Nick Saban-like persona)? Or was it that he was unbelievably empathetic? That doesn't make sense, because not all empaths are able to rally people to a cause, due to bleeding-heart syndrome.
I ask because it is almost as if you see the bricks change shape at Apple trying to fill the missing piece... they know they need that influence, it’s just not there, and honestly, I want to be a part of an organization that operates in the post-kicked out Steve aura.
This is such a difficult and interesting question to try and answer. Of course, I can only offer my viewpoint as someone who worked for (with?) him in four different contexts.
He was most definitely not empathetic to me. He could be very empathetic to an abstract construct of a person. He would act as an advocate of the "user" but I felt that the user was always him. How did this align with reality? I guess it did, to a certain cross-section of people who appreciated whatever guise of a user Steve represented. This user construct changed over time and I could see it in early Apple Steve, lost in the wilderness Steve, NeXT Steve and return to Apple Steve.
Steve had charisma. Younger Steve charisma was different to me and it left an imprint. I felt that he was attractive physically and mentally, could engage with you and make you feel like you and he were doing something that could really make a difference. We knew we weren't curing cancer, but that somehow the pursuit was noble in a similar way; enabling human potential that was being lost. I still want to feel this way about technology.
Once you got to know Steve (if one could) he was oddly two-dimensional. His lack of real personal connection or concern about you as an individual was troubling. In some ways, the more abusive he was to you, the more it showed his interest in you. It was dysfunctional. He was never mean to random people (in my observation); although you hear stories of him being abusive to people he didn't know, I never saw it. I saw him open doors for people, let people cut in line in front of him in the salad bar, normal sorts of politeness. The higher his expectation of you, the harsher the abuse you could expect to receive. You could never really become numb to it, but after a while you just began to adjust your calibration.
As you point out, Steve did win quite a bit. Of course he lost and sometimes he lost big. I really didn't care about winning the way a lot of Silicon Valley people care and I saw some of this in how Steve lived his life. Yes, he drove a nice car and had some property. But he wore well-worn clothes, drove himself to work and generally seemed like one of us. Later in life, odd things like private jets, yachts and Central Park condos showed up.
Everyone that I worked with at Apple who is still there knows it is not the same company. It can't be. The scale, management structure, market, political climate and more is different. And there is no Steve. I left Apple when Steve finally left, but I knew that "my" Apple was gone around 2006.
All I can say is thank you for pouring this out into the ether, these attributes and recollections are beautiful, or dysfunctionally beautiful you could say.
> how does someone like Steve gain so much respect from so many different types of people?
Simple. Steve is undeniably one of the most influential people in the last 50 years. You don't have to agree with him to recognize this. To dismiss his impact because you don't like him personally is shallow and reductive. No one is saying he was a nice guy.
First, he and Woz brought us the Mac. Sure, Woz was the tech guy but Steve's product influence cannot be overstated. Macs really lost out to the IBM PC and then Steve was forced out of his own company by his own hire (John Sculley from Pepsi).
Apple languished for a decade while Steve started NeXT, and was on the brink of bankruptcy when it had to be rescued by a $150 million injection of funds from Microsoft.
In the next decade, Steve took the NeXT OS that became the foundation for OS X and iOS, and released the iPod, iTunes and then the iPhone that ultimately turned Apple into what it is today: a trillion-dollar company that literally prints money.
Steve was basically the ultimate product person and really a visionary. It was often stated that he generated a reality distortion field as he literally changed industries around him. The sea change that was the iPhone took control over code distribution on mobile phones from the wireless carriers. The price of this was several years of AT&T iPhone exclusivity in the US. The popularity of the iPhone bent carriers to his will. He unrelentingly refused to ship bloatware on the iPhone (unlike what you get on basically every Android phone other than the Pixel).
Apple developed a track record for taking terrible technologies and making them great. One of my personal favourite examples is connecting to Wifi. Many here probably aren't old enough to remember this but in the early 2000s that involved going to a settings window in Windows and entering the Wifi type (802.11b and then 11g), the encryption type (eg WEP, WPA, WPA-PSK, WPA2 or WPA2-PSK) and an encryption key, which may or may not be a password.
On OSX you simply selected a network and entered a password. Why ask the user for a bunch of stuff they don't care about, probably don't know and you, the computer, can figure out?
To me this is classic Steve influence.
Fun fact: Steve is the reason we don't have DRM on downloaded music. iTunes launched with a DRMed format and all songs were $0.99. DRM was demanded by the RIAA. The RIAA didn't like the pricing model. Steve ultimately made the bargain that gave them pricing tiers, but his demand was no DRM. That's how iTunes ended up distributing DRM-free AAC files instead.
Steve was by far the most user-focused of any of the tech titans of the last half-century.
Some people don't like Apple's walled gardens and that's fine but again, to dismiss the impact of the iPhone (for example) because you personally prefer the "freedom" of Android is a shallow judgement.
We stand on the shoulders of giants. And Steve was a giant. That's why he was and continues to be respected.
I'm late to the discussion. And I don't have near the personal insight to offer that @diskzero does. But I think there's another aspect as well.
Apple came back.
I can think of no other company in recent history that has come back from such a long run on death row to periodically become the company with the world's highest market valuation. And we tend to associate that (for many good reasons) with Steve Jobs. Steve's persona and the admiration that follows I think derives in large part because it has been the ultimate come back story.
This "come back story" is one of the most classic and inspiring stories that has resonated through myth and fable throughout history. It's what makes us stand up and cheer in a theater when Daniel Laruso delivers the take down kick. It is Miracle when the US beats USSR in hockey. It's why we like to interpret David and Goliath as a little guy takes down big guy story (vs the kid with the rock gun kills the big lout https://www.ted.com/talks/malcolm_gladwell_the_unheard_story...).
It's interesting to note that with Steve gone, Elon Musk has risen to the new Crazy Successful Celebrity Leader figure. Like Steve, you can see Elon from a variety of facets, some very flattering, and some very damning, what we would love about him is that he has had some success at defeating the status quo.
The comeback was amazing, wasn't it? It was cool to be there and I think about the dynamics a lot. What I really keep trying to figure out is how we were so effective with such small team sizes and why this can't scale. When I was at Amazon, the mobile application team was three times larger than the OS X engineering team circa 2001.
> That being said, I just can't believe he would have been happy about the various issues with the old MacBooks. So many things feel so wrong.
It's an interesting question for sure.
During his life, he certainly did champion a lot of form-over-function decisions: the "cube" G4, the hockey puck mouse on the iMacs, etc.
And then he also championed some similar decisions that most people regard as roaring successes: the removal of legacy ports on the MacBooks felt an awful lot like the decision to ditch legacy ports on the original iMacs.
> the removal of legacy ports on the MacBooks felt an awful lot like the decision to ditch legacy ports on the original iMacs
Sort of, but this disregards some important product context. Having a multi-port dongle or adapters on a desktop machine is a very different experience than a portable.
Totally agree - my understanding is Steve Jobs just was 100% committed to an opinion, until someone convinced him to go 100% in on a different opinion.
I'd also add that with Jony Ive on the way out for years, there are a lot of decisions attributed to him that he likely did no more than sign off on.
Whenever someone working at Apple has a Big Idea(TM), invoking "This is what Steve would have done" is now pretty much a mandatory tactic in arguing your position.
With this knowledge about Steve Jobs, it's fair to assume that he knew that too, right? He probably knew he was part of the glue that made Apple Apple, and took measures to keep the dilution from happening too fast.
Also says a thing about leadership and human groups :)
Thanks for sharing this. From an outsider's perspective I always assumed it was a yin-and-yang relationship where the sum was superior to the parts and in balance. Sounds like that was fairly accurate.
This is pure speculation, ungrounded from any evidence.
The touch bar is a very flexible (effectively) analog input + rich display device. If adequately supported by software it can be an amazing input, affording a range of useful functions not replicable with discrete buttons. In general, I really wish modern computers had more analog inputs available. Analog knobs, jog wheels, sliders, trackballs, etc. are tragically missing.
I have seen no evidence that Jony Ive was its patron, and no evidence that including it had anything to do with making laptops expensive as a goal.
The problem with the touch bar is that (a) it only shipped on a limited subset of devices so software authors could not depend on it, (b) after its initial functions, Apple made limited effort to adopt it in all of their own software, improve its integration into the system, or push boundaries of what it could do as an input device.
> The butterfly keyboard was Ive shaving off 0.5mm of the width for a worse user experience with a higher production cost and less reliability.
No, this was some Apple-internal mechanical engineering group trying to design the best extremely thin keyboard they could, but getting bitten hard by a mismatch between reliability in a prototype vs. full-scale factory production + poor estimation of reliability in a wide variety of contexts over a longer period of time. Nobody ever set out to make a “worse experience” or higher cost.
It goes the other way, too. When Apple put cameras in all of their laptops, the press relentlessly bashed them for wasting BOM on something so useless and expensive. Then the industry realized it was a good idea and followed suit. Similar for retina displays -- the term "High Definition" had become synonymous with "good enough" and ground PC monitor advancements to a halt for a decade. Phones were coming out with higher resolutions (not pixel densities, resolutions) than full-size monitors. Then Apple figured out how to market higher resolutions, the press mocked them for wasting money, but word got around that HD might not be the end-all of display technology and consumer panel resolutions started to climb again.
Here's a counterexample, a niche that could really use the Apple Bump but hasn't gotten it and probably won't get it: 10 gigabit ethernet. 1GbE became synonymous with "good enough" and got so thoroughly stuck in a rut that now it's very typical to see 1GbE deployed alongside a handful of 10 gigabit USB ports and a NVMe drive that could saturate the sad, old 1GbE port many times over.
Sometimes taking risks results in a Touch Bar or Butterfly Keys. That's just the nature of risks. The only way to have a 100% feature win rate is to limit yourself to copying features that someone else has proven out, but if everyone does that then the industry gets stuck in a rut.
I'm glad Apple exists, even if I don't personally feel the need to fund their experiments.
> 1GbE became synonymous with "good enough" and got so thoroughly stuck in a rut that now it's very typical to see 1GbE deployed alongside a handful of 10 gigabit USB ports and a NVMe drive that could saturate the sad, old 1GbE port many times over.
This has a few reasons:
- 10 GbE was, until quite recently, pretty power intensive and it still is more expensive and hot than gigabit
- Devices on the LAN, especially those with high bandwidth usage, have become far rarer. A lot has moved to the cloud, and most people's internet connections can't saturate 100 Mbit, let alone gigabit.
- LAN as a whole has become rarer. A lot of people now only use WiFi with their phones or laptops, up to the point that most people now have (theoretically) faster WiFi than LAN.
Combined, there are few reasons to take the expense of putting a high-speed ethernet port on a device. Luckily, the introduction of 2.5GbE and 5GbE has decreased the jump a bit and you see those ports on a few consumer devices now.
10 GbE over copper is still iffy even with CAT6 cabling, which complicates deployments and user experience. As a result, prosumer devices like recent AMD X570 motherboards and the upcoming Intel Z690-based ones are including 2.5 GbE ports instead. Those are rated to work over CAT5E, provide enough bandwidth for a few hundred MB/s, and draw a lot less power on the switch side (something like < 4W per port seems common), which makes it easier to build low-cost, passively cooled switches around a switching SoC that doesn't need to be terribly sophisticated to hit the latency requirements of 2.5 GbE.
10GBASE-T is power hungry and unreliable, but dirt-cheap 10GBASE-LR and 25GBASE-LR transceivers work great up to 10km. If only they could figure out how to fit the transceivers into mobile-friendly packaging. But for a workstation they're great.
That's true, I actually run fiber in my home for that reason. I think the problem with fiber is, though, that the technology is pretty unknown to consumers and working with fibers is a lot harder than working with cables; they take a lot less abuse before breaking, for example. But if someone is going for 10Gbit+ in their home network, I can highly recommend fiber.
I really wish LAN would make a comeback. There hasn't been a week without a video conference where someone had internet issues due to WiFi problems. In fact, from my experience, most times people talk about issues with their internet it's really WiFi issues. But few non-technical (and even technical) people consider connecting devices like TVs or laptops by LAN, even if they hardly ever move and the router is close by.
All this talk about how fast WiFi can be has made people think it's all they need. But in reality, building a WiFi network with fast speeds across a whole house while avoiding too much interference from networks around you (esp in inner cities where you can easily have 40+ networks in reach) is more work and more expensive than pulling LAN cables to the right places.
But LAN isn't sexy and no one advertises how fast a copper cable can be, it just doesn't sell products as well as talking about Wifi 6.
I am partly with you. But considering you have to actually put wires through walls and install sockets vs. just setting up a WiFi 6 access point, I don't see Ethernet making a huge comeback.
> Combined, there are few reasons to take the expense of putting a high-speed ethernet port on a device. Luckily, the introduction of 2.5GbE and 5GbE has decreased the jump a bit and you see those ports on a few consumer devices now.
I think the only thing driving 2.5/5/10GbE at all is that WiFi Access Points need it.
Compare the cooler for a 2.5GbE card [0] to that of a 10 GbE card. The fact that WiFi (which is what most consumers use) now supports those speeds surely helps, but 2.5GbE is also simply far easier to integrate and power.
I agree that 2.5GbE is easier, and I still think AP backhauls are the primary driver for it. AP makers cannot sell multi-gigabit WiFi APs without a back haul that can support them.
> 10 GbE was, until quite recently, pretty power intensive and it still is more expensive and hot than gigabit
PCIe 3.0 transceivers were 8Gb/s and supported preemphasis and equalization, closing the sophistication gap with their off-backplane counterparts. How many PCIe3+ transceivers has the average person been running (or leaving idle) for the last decade? These days a typical processor has 16Gb transceivers by the dozens and 10Gb hardened transceivers by the handful. I just counted my 10Gb+ transceivers -- I have 36 and am using... 10 (EDIT: 8/4 more, HDMI is 4x12Gb/s these days).
The reason why 10GbE is expensive has nothing to do with technology, nothing to do with marginal expense, nothing to do with power, and everything to do with market structure. Computer manufacturers don't want to move until modem/router/ap/nas manufacturers move and modem/router/ap/nas manufacturers don't want to move until computer manufacturers move.
These snags don't take much to develop, just "A needs B, B needs A," and bang, the horizontally segmented marketplace is completely immobilized. That's why the market needs vertical players like Apple who can push out A and B at the same time and cut through these snags, or high-margin players like Apple who can deploy A without B and wait for B to catch up. Otherwise these market snags can murder entire product segments, like we've seen happen to LAN.
No, it isn't because of reduced demand. People are recording and editing video more than ever, taking more pictures than ever, streaming more than ever, downloading hard-drive busting games more than ever, and so on. LAN appliances would have eaten a much healthier chunk of this pie if LAN didn't suck so hard, but it does, so here we are.
> Luckily, the introduction of 2.5GbE and 5GbE has decreased the jump a bit
Yaay, PCIe 2.0 speeds. 2003 called, it wants its transceivers back :P
Power is a big differentiator. You need to send 10GbE over 100m (some break the standard and only offer 30m). Have you ever touched a 10GbE SFP+ module or the heat sink of a card? They're quite hot, and you need to provide that energy, which is not a problem on a desktop, but a big one on a laptop. If the laptop has RJ45, that is.
> modem/router/ap/nas manufacturers don't want to move until computer manufacturers move
Modems and routers only make sense once they serve a link that is actually beyond 1Gbit, which is rare even today. Also, these devices are minimal, and the hardware required to actually route 10Gbit is a lot more expensive. Even MikroTik's cheaper offerings today can't do so with many routes or a lot of small packets (no offense to them, their stuff is great and I'm a happy customer; it's still true, though).
APs are a bit different, as WiFi recently "breached" the Gbit wall (under perfect conditions). But there are already quite a few with 2.5Gbit ports to actually use that.
NAS, on the other hand, are a bit held back by the market. Still, high-end models have offered either 10Gbit directly or a PCIe slot for a long time now.
> People are recording and editing video more than ever, taking more pictures than ever, streaming more than ever, downloading hard-drive busting games more than ever, and so on. LAN appliances would have eaten a much healthier chunk of this pie if LAN didn't suck so hard, but it does, so here we are.
The professional video editing studios with shared server are already on 10 Gbit LAN, the stuff has been available for years. Pretty cheap even, if you buy used SFP+ cards. Switching was expensive until recently, but I'd say that the number of people which need a 10G link to a lot of computers are even less.
And LAN competes with flaky, data-limited, expensive 100 MBit lines (if you're lucky). 1GbE is beyond awesome compared to that and yet it lost, anyway.
> Yaay, PCIe 2.0 speeds. 2003 called, it wants its transceivers back :P
I'm not happy, either, but it's better to at least go beyond gigabit speed rather than stay stagnant even longer.
> ... a niche that could really use the Apple Bump but hasn't gotten it and probably won't get it: 10 gigabit ethernet
10GbE was a bit of a mistake on several fronts.
We had become used to these 10x iterations with Ethernet, from 10Mb to 100Mb to 1Gb, such that 10Gb seemed like a natural extension. But running that bandwidth over copper remains a significant technical challenge. For a while I was using a Thunderbolt 10GbE controller and it was huge (basically the size of an old 3.5" external HD), and most of it was just a giant heatsink.
In commercial situations, the issues with copper often result in using fiber instead. At that point there are fewer barriers to even higher speeds (eg 25Gb, 40Gb, 100Gb), which make a lot of sense in data centers.
Added to this, there's not a lot of reason to run 10GbE in a home setting or even in many small corporate settings. Even in larger corporate settings, you can go really far with 1GbE using switches, bridges and routers, possibly using higher speed backhaul connection technologies.
What should've happened is what has started to happen in the last few years: interim speeds (eg 2.5Gb and 5Gb). Hopefully these become more widespread and become relatively cheap such that someday they just displace 1GbE naturally.
On top of all of this, Ethernet is an old standard that uses 1500-byte frames. This actually starts to become an issue at 10+ GbE, such that various extensions exist for very large "jumbo" frames (eg 9000 bytes), but these run into issues with various hardware and software.
Probably largely because of the 1500 byte frames of Ethernet, the de facto standard for TCP/IP MTU is pretty much 1500/1536 bytes and this has become a self-fulfilling prophecy as more and more infrastructure is deployed that makes this max MTU assumption.
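To put rough numbers on that, here's a back-of-the-envelope sketch (plain Swift, nothing platform-specific; the header sizes are the usual textbook Ethernet/IP/TCP figures from memory, so treat the output as illustrative rather than exact):

    import Foundation

    // Rough per-frame overhead on the wire: 14 B Ethernet header + 4 B FCS
    // + 20 B preamble/inter-packet gap, plus 20 B IPv4 + 20 B TCP inside the payload.
    let wireOverhead = 14.0 + 4.0 + 20.0
    let l3l4Overhead = 20.0 + 20.0
    let linkBitsPerSecond = 10_000_000_000.0  // 10GbE

    for mtu in [1500.0, 9000.0] {
        let frameOnWire = mtu + wireOverhead
        let goodput = (mtu - l3l4Overhead) / frameOnWire           // usable fraction of the link
        let framesPerSecond = linkBitsPerSecond / (frameOnWire * 8)
        print(String(format: "MTU %5.0f: ~%.1f%% goodput, ~%.0fk frames/s to fill 10GbE",
                     mtu, goodput * 100, framesPerSecond / 1000))
    }

The goodput difference is small (~95% vs ~99%), but the frame rate isn't: roughly 800k frames per second at a 1500-byte MTU versus ~140k with jumbo frames. That per-packet rate is what NICs, drivers and switches have to keep up with, which is a big part of why jumbo frames keep coming back up despite the compatibility headaches.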
The scary part? 1GbE is older than I thought. A couple weeks ago I replaced a 1GbE switch (gs524t) at my work and got curious. Said model came out in 2001 or 2002.
The touch bar was fingers on glass. It's not appropriate for a professional device since it requires you to look down and doesn't lend itself to the "mechanical" use of devices that high-paced work requires.
Also, you can actually add the analog inputs yourself. The DIY keyboard community -- which is flourishing with new vendors and products -- has lots of options available. I myself have two analog knobs and one trackpoint on my keyboard. It's absolutely amazing.
I was really hoping Apple had a big leap forward in fingers-on-glass interaction planned. Imagine if the glass could kind of raise or move down so you could "feel" where the buttons were. Heck, even providing a few notches in the chassis, above the Touch Bar, for a finger to "feel" roughly where it was, and requiring a harder "press" to activate the Touch Bar, would likely have been a game changer.
But they didn’t. And I was always confused that the Touch Bar never got more love from the hardware developers.
That definitely makes me wonder if it was pushed by Ive or someone at Apple as a pet project, but abandoned once the initial development was done. Seemed very Un-Apple to do something like that these days though.
It also wasn't easy to build software for the Touch Bar, from what I could gather. I had lots of ideas for little tools (think iStat-like gauges, but perhaps for things like the mic input level), but when I tried, building one turned out to be harder than I expected.
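For what it's worth, the basic AppKit plumbing looked roughly like this as I remember it: you override makeTouchBar() on a responder and hand items back from an NSTouchBarDelegate. The identifier string, the class name and the placeholder label below are made up for illustration, and a real mic-level gauge would still need all the audio plumbing on top of this, so treat it as a sketch:

    import Cocoa

    // Hypothetical identifier for our one custom item.
    extension NSTouchBarItem.Identifier {
        static let micLevel = NSTouchBarItem.Identifier("com.example.micLevel")
    }

    class GaugeViewController: NSViewController, NSTouchBarDelegate {

        // AppKit asks the responder chain for a touch bar; describe our items here.
        override func makeTouchBar() -> NSTouchBar? {
            let touchBar = NSTouchBar()
            touchBar.delegate = self
            touchBar.defaultItemIdentifiers = [.micLevel, .otherItemsProxy]
            return touchBar
        }

        // Build the actual item on demand.
        func touchBar(_ touchBar: NSTouchBar,
                      makeItemForIdentifier identifier: NSTouchBarItem.Identifier) -> NSTouchBarItem? {
            guard identifier == .micLevel else { return nil }
            let item = NSCustomTouchBarItem(identifier: identifier)
            // A real gauge would swap this label for a custom NSView fed by live mic data.
            item.view = NSTextField(labelWithString: "Mic: --- dB")
            return item
        }
    }

This gets you a per-app item while your view controller is in the responder chain; anything fancier (a persistent gauge visible regardless of the frontmost app) was, as far as I could tell, much murkier, which is probably where my attempts fizzled out.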
RIP Touch Bar. You might not be missed too much, but I bet something like you will come up again in a decade or two.
> That definitely makes me wonder if it was pushed by Ive or someone at Apple as a pet project, but abandoned once the initial development was done. Seemed very Un-Apple to do something like that these days though.
Yes. It wasn't Jony. It came from the software side. I won't name who to protect the guilty.
> I was really hoping Apple had a big leap forward in fingers-on-glass interaction planned.
Me too, but even if they solved that challenge -- I think there's an even bigger and insurmountable challenge there: a sizable percentage of users, particularly "power users", frequently use their Macbooks "docked" and hooked up to an external monitor and keyboard.
While many people happily type on their laptops all day long with no external monitor or keyboard, there are also many people (me included) who think that's an absolute ergonomic disaster and greatly value the extra screen real estate of an external monitor.
So, honestly, there's no way the Touch Bar could have been good enough for me to use it.
The whole idea was just misguided.
What Apple should have done IMO was allow an iPad or iPhone to fill that role. I still do not understand why my iPad can't be an amazing accessory to my Mac. With the right software magic and integration, it could be everything the Touch Bar was while simultaneously doing a lot more, and is of course extremely compatible with external keyboard use.
Well, I can guess why, actually -- it would require a lot of coordination between the Mac and iDevice teams and that is challenging.
(However I'm encouraged -- Sidecar is a good first step. Let's see more...)
> That definitely makes me wonder if it was pushed by Ive or someone at Apple as a pet project, but abandoned once the initial development was done. Seemed very Un-Apple to do something like that these days though.
I think it was more like they decided to add the equivalent of an Apple Watch to Macs to support TouchID and then asked "what else can we do with it?".
If the analog add ons are DIY or even extra money, then software developers cannot rely on them being present and won't develop good software and use cases for it. At least not most of them. The best you can hope for is niche software support.
So adding stuff yourself is nice (I do it myself!) but not a way to move the industry or even the Apple ecosystem forward.
Analogue input is pretty much a solved problem. Not only do we have standards for game controllers, but also MIDI control surfaces give you a wide variety of analogue physical controls. MIDI even comes with incredibly rich input automation.
Sadly, the only company I'm aware of producing that sort of hardware for use outside the music industry seems to be Loupedeck.
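To give a sense of how approachable the protocol side is: a knob or fader on a control surface typically arrives as a three-byte MIDI Control Change message, which you can decode in a few lines. (This sketch skips CoreMIDI device setup entirely and the type names are made up; it only shows the decoding.)

    // A decoded knob/slider movement from a MIDI control surface.
    struct ControlChange {
        let channel: UInt8     // 0-15
        let controller: UInt8  // which knob/slider (0-127)
        let value: Double      // normalized 0.0 ... 1.0
    }

    // Parse a raw Control Change message: status byte 0xBn, controller number, value.
    func parseControlChange(_ bytes: [UInt8]) -> ControlChange? {
        guard bytes.count == 3, bytes[0] & 0xF0 == 0xB0 else { return nil }
        return ControlChange(channel: bytes[0] & 0x0F,
                             controller: bytes[1] & 0x7F,
                             value: Double(bytes[2] & 0x7F) / 127.0)
    }

    // Example: channel nibble 0 (MIDI channel 1), controller 7 (often volume), value 100 of 127.
    if let cc = parseControlChange([0xB0, 7, 100]) {
        print("ch \(cc.channel), controller \(cc.controller) -> \(cc.value)")
    }

Mapping that normalized value onto whatever you want to control (a brush size, a scrub position, a fader in an editing app) is the easy part; the hardware and the protocol have been solved for decades.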
My touch bar almost always has "Display Connected: [Mirror Displays] [Extend Desktop]" on it. I can fiddle around and get it to show app-specific things, or hold Fn to see the F keys, but most of the time I'm using it it shows those useless multi-monitor buttons.
I'm sure there's some setting somewhere that defaults to showing whatever the layout for the in-focus app is, but it's failed to make me care enough about it to try to figure it out.
> It's not appropriate for a professional device since it requires you to look down and doesn't lend itself to the "mechanical" use of devices that high-paced work requires.
Sure it could; it just has to beat the cost of the lookup. If you could do some complex operation trivially with it that couldn't really be done with a keyboard shortcut, a dynamic visual field would be fine.
Of course, volume sliders don’t fit that bill, and I don’t think anyone really found something that did… but it’s not some fundamental guarantee that it would be useless.
> It's not appropriate for a professional device since it requires you to look down
Apart from developers, many professionals do look down all the time because they typically have other devices connected, e.g. synths or photo/video editing rigs.
And the Touch Bar was designed much more for that audience.
I can see how you might make that assumption based on how the Touch Bar has exposed functionality, but this was not the goal of the Touch Bar as it was sold internally. It was sold as one of the next great UI affordances. It came from some of the same people that brought us Mission Control, the Dock, Exposé, etc. I worked on a lot of these features and I never use them. Shame on me.
On a laptop, that "look down" means looking at the bottom pixels of the screen.
The looking was never the problem, IMHO; the problem was execution and utility. It was actually distracting when adaptive completion results were continually flashing. And the rest of the buttons were never great.
If every single dialog box put its buttons on the Touch Bar, that would be a win, as it is easier and faster to tap the Touch Bar than to move the cursor and click. But this obvious use case never really materialized.
And if, like most "professional" users, the laptop is operated via an external keyboard, the muscle memory never develops.
> This is pure speculation, ungrounded from any evidence.
It's not ungrounded from the anecdotal evidence that these changes are coming after Ive's departure.
> I have seen no evidence that Jony Ive was its patron, and no evidence that including it had anything to do with making laptops expensive as a goal.
Holy evidence, Batman! Leadership 101: When your title is "Chief Design Officer", the design buck stops with you. When your company releases an updated design to an existing product, you had some kind of say in that design. Period. Even if your "say" was just that you were aware of it, and didn't veto it.
> When your title is "Chief Design Officer", the design buck stops with you.
Agreed with this. When you're coming to the CDO position after 20 years of being a hands-on designer at that company, most recently as the head of both human interface and industrial design across the entire organization, and having been described as being the person with the most operational power at Apple, after Steve Jobs himself, even before being promoted, my guess is that these design changes did not sneak under his radar. It is most likely that he set the goals that produced these designs, and that he was aware of and approved of them from the beginning. And I suspect that as a new C-level, he was probably even more hands on than that.
But since in this thread we are being asked to hold ourselves to a very high standard of rigor, I should note that I have not submitted this comment to peer review, or made my data available for replication at this time. I'm just basing this on, you know, how jobs work.
There's a big difference between "Jony Ive, as CDO, must have signed off on this, and thus bears responsibility for it" and "Jony Ive was pushing for this, for these specific reasons".
> Holy evidence, Batman! Leadership 101: When your title is "Chief Design Officer", the design buck stops with you. When your company releases an updated design to an existing product, you had some kind of say in that design. Period. Even if your "say" was just that you were aware of it, and didn't veto it.
This is just shifting goalposts because you got called out.
You didn't word your comment as "these things happened on Ive's watch"; you consistently worded it as if Ive was personally pushing for something.
It's a common refrain on HN and it's never backed with proof.
And speaking of your first comment:
> Ive is gone and every one of those decisions has been reversed or at least significantly amended. This is no accident.
... you realize that this is a new generation of MBP landing on the exact same cadence they've come out on in the last few decades?
So it makes perfect sense to have drastic changes land now regardless of who's in charge?
Not to mention the fact that it hasn't even been two full years since Ive left. And the fact that the HDMI port was coming back leaked at the start of the year.
So unless you seriously think Apple designs a laptop in the course of a single year, it's highly unlikely he had no input on the current machine.
> You didn't word your comment as "these things happened on Ive's watch"; you consistently worded it as if Ive was personally pushing for something.
This is not a meaningful difference when he's in charge and it's a flagship product.
You're naming people that are managing entire companies.
The person in charge of design, for a company that has a handful of physical products, is a completely different situation. It's reasonable to blame them for top level product design decisions. What happens in that specific realm is what they want. The top priority of their job is those few dozen decisions. The opposite of a CEO that's overseeing ten thousand different things.
Be a little less stuck on the word 'pushing'. The fact is, when it's one of the main things you're in charge of choosing, and you allow a decision and then stand by it for a long time, you are now pushing it.
Also, wait, you're the one that inserted the word 'pushing' into the conversation! If you're upset with that wording, you're upset at a strawman.
Kevin Scott is a CTO. He's in charge of top level product technical decisions
The top priority of his job is those few dozen decisions.
Be a little less stuck on the word "Director". The fact is, when you're one of the main people in charge of allowing decisions, it's not the same as personally championing them.
-
> Also, wait, you're the one that inserted the word 'pushing' into the conversation! If you're upset with that wording, you're upset at a strawman.
You know you can just go back and read the comment I referred to if you've already forgotten it, right?
> The butterfly keyboard was Ive shaving off 0.5mm of the width for a worse user experience with a higher production cost and less reliability.
Does that sound like personally assigning blame to Jony Ive for something? It'd be one thing if it said "Ive's team" or something, but it's the common refrain parroted on this site.
-
If John manages Joe and Joe deletes a database in prod, do you say "John's subordinate deleted a database in prod" or do you say "John deleted a database in prod".
You see how there's a difference there even though both acknowledge that John has a part in what happened?
It's not that complicated to see the difference if you've ever interacted with any sort of situation where the buck actually stopped with leadership, but I guess that's not universal.
> Kevin Scott is a CTO. He's in charge of top level product technical decisions
Then it's probably fair to blame him for some high-level decisions. But technical decisions go well beyond design, and Microsoft has so many products, so it's harder to say how much you can point at him.
> The top priority of his job is those few dozen decisions.
I honestly have no idea which few dozen you mean. Across all of Microsoft? I could list a bunch for "Apple product design", like the way AirPods fit, the decision to have no holes in AirTags, the keyboard and touch bar choices in MacBooks, etc.
Maybe the start menu location? You could probably blame him for the choice of xbox models too. I'm not singling out Apple in saying that executives should be considered responsible for certain high-level decisions.
> Be a little less stuck on the word "Director".
I'm stuck on the word "design". He's the design guy.
> The fact is, when you're one of the main people in charge of allowing decisions, it's not the same as personally championing them.
If it's one of the top few most important decisions under your job purview, the difference is so minor as to not matter outside the company.
> Does that sound like personally assigning blame to Johnny Ive for something? It'd be one thing if it said Ive's team or something, but it's the common refrain parroted on this site
Assigning him blame is not the same as saying he 'pushed' it. The buck stops here for design. He gets the blame because he strongly approved it and he could have easily spent entire days on the decision because that's the core of his job, and spending enough time on the decision is also his job.
> If John manages Joe and Joe deletes a database in prod, do you say "John's subordinate deleted a database in prod" or do you say "John deleted a database in prod".
John decided to delete a database in prod. Ive decided to go with this keyboard.
Assuming the delete wasn't accidental, because the keyboard definitely wasn't accidental! If it was an accident this analogy isn't relevant.
I think it's fair to say that Jony was ultimately the DRI ("directly responsible individual" in Apple-speak) for all industrial design, so he "owns" it, which is a bit above "signing off" or "accepting", whether or not he was personally pushing for something.
This is a bit of a quirk of how Apple structures responsibility, and makes it a bit more fair to say that "Jony made a disliked change" in a way that doesn't quite apply at Google or Microsoft, where responsibility tends to be a bit more diffuse.
DRI has expanded throughout the tech industry, I can't remember the last time I was on a team that didn't use the concept.
But I provided a simple analogy above.
Say John manages Joe and is the DRI for data storage. If Joe goes and deletes the production database, John has some blame even though he didn't personally delete the database.
Do you not see the difference between saying "John deleted production?" and "John's subordinate deleted production?"
Both are assigning some blame to John, but only one is factually true.
This entire conversation almost feels like the typical HN inability to realize the world is not black and white.
It's like people need Jony Ive to have personally opened up a CAD drawing and shrunk the MBP, because it's utterly impossible that a larger team decided on the vision for an entire flagship product.
-
Lol the replies. What a weird way to dodge a simple question lol.
"John's subordinate deleted production" implies that John is partially responsible, but accurately reflects he did not personally delete it.
You're not even mentioning Joe, you're accurately reflecting John was in charge, but you're also not lying and saying John did it.
I work at Apple, I'm a senior engineer - been there for almost 2 decades. I'm DRI on a few things here and there.
Not a single decision is made on things that I am DRI on without me being a part of that decision. I may not get my way if I'm over-ruled for corporate reasons, but I know about it, and being the DRI, I get a slightly-larger-than-average say in what happens. Generally it takes a director or VP to over-rule what I want, and then the radar is very clearly marked as such.
Apple takes the concept of the DRI very seriously. You don't give responsibility without also giving power.
My opinion: There is zero chance (not "a small chance", zero chance) that Jony Ive didn't sign off on, and explicitly endorse, the Touch Bar. Something that obvious, in such a commanding position in the user interface, would never have escaped his personal input and attention.
Thank you for saying this, your personal experience here is just about the best insight we could ask for. Subjectively, there's an odd lack of current Apple engineers weighing in on threads here at HN relative to other FAANG companies. I've often wondered if the company's rules were stricter.
"John was responsible for production having been deleted" because of the systems and processes he did or did not put in place. At a high enough level of abstraction, this is all that matters.
Jony was responsible for the Touch Bar.
Anyway, some evidence: "For years, Apple Chief Design Officer Jony Ive has expressed a desire for the iPhone to appear like a single sheet of glass", suggesting that this could have been part of a larger overall design direction. (https://www.wsj.com/articles/apple-unlikely-to-make-big-chan...)
I'd be willing to bet that they mocked up MacBooks with full touchscreen keyboards.
Further, I don't think it's a coincidence that I don't mind typing an email (core C-level activity) on an iPad on-screen keyboard, but I'd find it infuriating to try to code on.
> When your title is "Chief Design Officer", the design buck stops with you.
That's only because in your ignorant reality you have made it so.
The actual reality is that what constitutes a product is so much more than just the design. For example, it includes what features should and shouldn't be there, and that is a decision largely coming from the Product team. Or how it works, which comes from the Hardware Engineering team.
ESC itself doesn't cut it for me. With my resting hand position on the keyboard, my fingers touch the touch bar, and it always causes something either catastrophic or very frustrating. On a similar note, I'd gotten used to pressing Fn keys without looking at the keyboard. With touch bar, I have to carefully analyze the touch bar before doing anything with Fn keys. It's a very problematic experience overall. If it was a separate bar, I wouldn't have any of these issues.
I thought the touchbar was great idea but I hated that the function keys (and especially esc for a while) were sacrificed for it. They could have taken that 1cm of vertical space from the ridiculously huge touchpad instead and given us a ridiculously huge touchpad along with function keys and a touchbar.
> The touch bar is a very flexible (effectively) analog input + rich display device.
It just can't work with people like me that never look down at their keyboard. I'm not trying to be elitist, it's the honest truth. I wanted to love the Touch Bar, tried plugins like Pock, but in the end no matter how hard I tried I can't help and force myself and interrupt what I'm doing to look down, it just doesn't make sense.
It was tried. Many, many people spent many, many hours inside of Apple trying to make the Touchbar more useful. The simple fact was that looking down at it was a context shift and, in general, no one wanted to do it. It exposed functionality that you would eventually learn to drive from the keyboard.
> Analog knobs, jog wheels, sliders, trackballs, etc. are tragically missing.
Fwiw they are readily available by way of USB (e.g. MIDI) controllers. There are loads of dedicated knobs, faders, pads, etc. with a large amount of software to customize and translate those inputs (in addition to the array of software supporting them natively)
Obviously that would be external to the computer, but I think given the highly specific nature of analog controls it makes sense for these to be external. I'm having difficulty imagining a set of analog controls that would be at the same time universally useful and efficient in terms of weight and space utilization.
I actually really like the TouchBar except for the dramatic input lag. The input lag is so damn high that I never ever use it. If you could swipe left/right on it without holding down first, as on an iPhone, and if touch events generally had the same responsiveness as on an iPhone, I think everyone would have loved it much more. RIP TouchBar.
I love BetterTouchTool, and it does help, but the input lag is still too high. And as far as I know, even with BTT, there’s no way to get a proper swipe interface that removes the “touch and hold” delay.
What I don't get about the touch bar is: Why didn't they put it above the F-keys as an additional input? There's room there. Why remove the F-keys (and, even more insanely, the ESC key)? People were more upset about that than any limitation in the touch bar's usability. It obviously adds some possibilities. It's just that some programs actually use the F-keys and have for decades. They provided a no-look keypress feedback for many tasks. What ever was the genius idea of taking that away?
The touch bar was just awful. Not only did it lack software support, but occasionally it would just become non-responsive. And that's really bad when you are trying to mute or turn the volume down quickly, and you have to wait a couple seconds for it to recognize touch input again, or fall back to the mouse.
Also it's on the keyboard where you're not looking. If you want analog controls, either make them tactile - like a physical dial or slider - or just make the display a touch screen.
> mechanical engineering group trying to design the best extremely thin keyboard they could, but getting bitten hard by a mismatch between reliability in a prototype
And this is why I believe the Ive story more than this one.
If you're a mechanical engineering group, a mismatch between prototype and production is MechEng 201, pretty much.
You wanna put out a new keyboard? Fine, but test the heck out of it. Let people have a go at it, put force onto it, spread some food on it, etc.
Unless that mechanical group really thinks keyboards are a toy and don't need reliability, in which case they don't belong there.
> No, this was some Apple-internal mechanical engineering group trying to design the best extremely thin keyboard they could, but getting bitten hard by a mismatch between reliability in a prototype vs. full-scale factory production + poor estimation of reliability in a wide variety of contexts over a longer period of time
Okay, but...what caused them to try to make a keyboard that thin in the first place? GP is suggesting that it was driven by Ive, which you dispute, but you only give an alternative explanation for the "what", not the "why".
>and no evidence that including it had anything to with making laptops expensive as a goal.
It is product differentiation. The MacBook Pro 2016 redesign was delayed by a year due to Intel's CPU problems. The Touch Bar MacBook also had a higher ASP from the start. It was the Post-PC era: everyone was supposed to leave the PC platform (including the Mac) for tablets. It doesn't get any clearer than that. They even made an iPad ad asking "What's a computer?". Raising ASP is a typical move in a market you want to milk. Did I mention they completely neglected the Mac Pro for years?
>Jony Ive was its patron
Despite what the media wanted to claim at the time, with shills covering for it, he spent most of his time on the Apple Retail redesign and Apple Park. But the iPhone X and the MacBook / MacBook Pro were his vision of how the ultimate MacBook Pro and iPhone would be, as he said so himself. He was named CDO in 2015, along with some design team restructuring. The "Designed by Apple in California" photo book chronicling 20 years of Apple design came out in 2016. When he finally left in 2019, the media and shills were suggesting he hadn't actually been on product design for quite a few years. His earlier work at Apple from 2011 on was the iOS 7 redesign (after Scott Forstall was out), and we all know how that went, as they spent the next 3 years iterating out of it, to the point their old UX design head had to retire. And if you look at the changes to the Apple Retail Store redesign, they were the same: form over function. Partly Jony's fault, partly Angela's.
>Nobody ever set out to make a “worse experience” or higher cost.
Apple filed many patents where they were looking at a keyboard on a flat piece of glass with Force Touch and 3D Touch. These patents were specific to computers. A higher BOM cost is often used as a moat in luxury items.
>to design the best extremely thin keyboard they could...
If it weren't for the butterfly keyboard, the internet would not now have a group of people and product reviewers talking about key travel distance. The thin keyboard has a typing experience similar to typing on glass...
>There are many suboptimal features...
No one faults them for trying. But the first reports of keyboard problems came from MacBook users in 2016, less than one year after its launch and even before the MacBook Pro with Touch Bar. Apple constantly deleted reports of the problem on its support forum. The whole thing only got attention when an online outlet decided to blog about it and the story went viral. That was 2018. They stopped reporting Mac user satisfaction in 2018, both in keynotes and in investor meetings. It took nearly 3 years of ranting before Tim Cook even made a Keyboard Service Program.
Basically, without Steve Jobs, no one had the guts to say: fuck this. This isn't working. Close it down. Work on an alternative, or go back to where it was and we make a service program. Instead they dragged it on for years, without product sensibility and direction.
>Replaceable RAM and SSD being lost is still painful. Personally I don't believe this was about forcing users to pay for upgrades primarily. It was about shaving off a small amount of volume.
Louis Rossmann gets a lot of things wrong because he does not have a computer engineering background. For example, he does not understand why Apple used SPI on the MacBook Air instead of USB despite it having USB capability. I had to correct him and explain that when your design goal is extreme power saving, you have to cut everything, including running your data over SPI instead of a more power-hungry USB bus.
Furthermore, one reason they ship soldered-on RAM is technical. It has been explained here from time to time that they are achieving much higher memory bandwidth with the memory modules they are using, and that necessitates being soldered on. If the design goal is to build the most responsive laptop while maintaining excellent power savings, then this is the right approach to take.
I can understand the soldered RAM on M1 — yes, speed of light and other laws of physics get in the way. But why solder the SSD? What's the technical benefit of that over putting an M.2 slot in there or something? How do you recover your data if you spill coffee on your laptop? What used to involve simply yoinking the SSD out of the slot now requires a fully working motherboard.
As things currently stand, the flash controller is internal to the M1 SoC. This results in significant cost and power savings, as well as some rather impressive performance.
Using an external SSD format like M.2 means the added monetary and power costs of 1) an external flash controller, 2) the mechanical components to interface with it, and 3) multiple PCIe lanes which can never be fully powered off.
Okay, it was part of the T2 chip before. And they were able to put replaceable SSDs into the Mac Pro. Those only contained the NAND chips and were proprietary, but still.
The problem I have with soldered SSDs is that flash memory has a limited lifespan in terms of writes. Once you're out of writes, then what? This is aggravated by the fact that M1 can't boot even from external media if the internal SSD isn't working.
It's worth noting that Apple's SSDs tend to be on the high side of the industry for write endurance. I think their calculus is that no reasonable user would exhaust the lifespan before the rest of the computer is well and truly obsolete.
(This was briefly not the case with that Big Sur SSD thrashing bug, but that's been fixed now)
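To put rough numbers on that claim (these figures are illustrative guesses, not Apple's published endurance ratings), a quick back-of-the-envelope sketch:

    # Illustrative SSD endurance estimate -- the TBW figure is an assumed
    # value for a 1 TB-class drive, not a published Apple spec.
    tbw_rating_tb = 600        # assumed terabytes-written rating before wear-out
    daily_writes_gb = 50       # a fairly heavy daily write workload

    years_to_wear_out = (tbw_rating_tb * 1000) / (daily_writes_gb * 365)
    print(f"~{years_to_wear_out:.0f} years at {daily_writes_gb} GB/day")  # ~33 years
    # Even at 200 GB/day it works out to ~8 years, comfortably past the
    # useful life of the rest of the machine.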
It's actually split. The NVMe controller is in the M1 SoC, but the flash modules contain their own dumber controller and connect via PCIe in the back end (using a custom protocol, not NVMe).
Apple could and have put their SSDs in modules - that's how the T2 Mac Pro does it. They're still proprietary though, of course, due to the split design.
You can't take on-package RAM and stick it on a separate module through a connector (what connector? You need a 512-bit interface for the M1 Max...) without causing a problem or massively increasing your power consumption, for the same reason you can't string together 30 meters of USB3 extension cables and expect it to work. This is why USB 3.2 chipsets are little bare chips and run at 10Gbps, while 10Gb Ethernet chipsets need a big heatsink. More distance means a higher power consumption for the interface.
Interfaces are designed for specific characteristics like maximum capacitance and signal loss, and going beyond the design parameters means you need to change something. In this case, it would involve significantly increasing the power consumption of the interface.
Even if the SSDs were modular, you still wouldn't be able to do offline data recovery on them, because they're encrypted with system-tied keys. So this is tangential to them being replaceable or not.
It absolutely does. It adds capacitance, which means you need stronger drivers and higher voltage levels, which means you end up with a quadratic power increase to maintain the same performance.
You cannot push a 512-bit wide memory bus like on the M1 Max through a connector and longer PCB traces without massively increasing the power consumption of the interface. Remember, those laptops have a memory bus equivalent to 8-channel traditional DDRx RAM. You'd need 8 SO-DIMM slots just to get the bandwidth, at a huge increase in power. LPDDR RAM isn't even available in modules for this reason; it's designed for low power, and using connectors goes against that design goal.
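To put some rough numbers on that (assuming LPDDR5 at 6400 MT/s, which is a ballpark figure rather than an exact Apple spec):

    # Rough peak-bandwidth comparison, ignoring real-world efficiency losses.
    def peak_gb_s(bus_width_bits, transfer_rate_mt_s):
        # bytes per transfer * transfers per second
        return bus_width_bits / 8 * transfer_rate_mt_s / 1000

    print(peak_gb_s(512, 6400))   # ~409.6 GB/s for a 512-bit unified bus
    print(peak_gb_s(64, 6400))    # ~51.2 GB/s for one 64-bit DIMM channel
    # -> you would need 8 such channels (8 SO-DIMM slots at the same
    #    transfer rate) just to match the wide on-package bus.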
> Touch Bar? This was nothing more than adding expense to raise the ASP
I thought it was a great idea, and I still do, but am so glad they removed it. It sounds great on paper, but practically, I used it for nothing other than adjusting brightness and volume.
> The butterfly keyboard was Ive shaving off 0.5mm of the width for a worse user experience with a higher production cost and less reliability.
Agreed. It really wasn't great. From a design perspective, it's quite clever, but from a usability perspective it was horrible.
> USB-C only was a philosophical move rather than a practical one that forced people everywhere to carry dongles.
Apple has a long, long history of doing this. Headphone jack? Gone. Ethernet port? Gone. VGA port? Gone. Floppy disk drive? Gone.
I'm fine with it in moderation, frankly. USB-C is so clearly the future that where I take offense is that the rest of Apple's lineup doesn't work with it (iPhones, AirPods, some iPads, etc.).
> Replaceable RAM and SSD being lost is still painful.
Perhaps, but now that RAM is not only part of the SoC but a significant reason that the SoC is so good (high bandwidth shared memory between CPU and GPU), it's a change I'm more than fine with.
> Ive is gone and every one of those decisions has been reversed or at least significantly amended. This is no accident.
Agreed. This finally, truly, feels like a Pro machine: "design first" is an approach for consumer products and, to Apple's credit, works very nicely on the iPad and iPhone and consumer MacBooks (generally). "Design first" for pro machines is great for the 3 minutes after opening the box, but when trying to do real work, you'd sacrifice all the bezels in the world to shave 30% off compile times.
> Apple has a long, long history of doing this. Headphone jack? Gone. Ethernet port? Gone. VGA port? Gone. Floppy disk drive? Gone.
There's an important distinction here you're glossing over: unreliable, wireless, software-controlled Rube Goldberg-esque connections (bluetooth, wifi) can't possibly supersede reliable wired ones. Wired connections "just work" 99.999999% of the time, and when they don't, you can actually see and inspect the thing that connects your devices to troubleshoot it. Wireless works only when it feels like it.
VGA, on the other hand, was fully superseded by various digital video interfaces, and floppies were fully superseded by optical media and then various forms of cheap flash memory.
And I mean it. People do still miss headphone jacks, and people do still buy ethernet dongles for their laptops. People don't really miss floppies and CDs.
> Apple has a long, long history of doing this. Headphone jack? Gone. Ethernet port? Gone. VGA port? Gone. Floppy disk drive? Gone.
The MBP designers still bravely include the 3.5mm headphone jack [0], though it is certainly true that the iPhone designers courageously jettisoned the jack.
Indeed, I was lampooning their terrible decision to remove the headphone jack and their gall to refer to it as "courage." I'm still salty about the whole ordeal.
That they decided to remove it in the face of many people saying it was a terrible decision was exactly why they referred to it as "courage". A lot of phones have removed it since for most of the same reasons - the modern 3.5mm is a pretend spec.
"Pretend spec" misses the forest for the trees: amongst the vast, complex ecosystem of audio-visual ports, 1/8" audio stands out: we actually managed to converge on a connector format that is universal, omni-present, excellent quality, DRM-free, and which boasts an intuitive "no-code" UI affordance that Just Works.
I love my AirPods. But the instant I have to start digging through screens to do a pairing dance with some third-party speakers, car stereo, etc, I pine for the simpler times when you could just plug in the AUX, and I mourn for what has been lost.
Everyone says "bluetooth is great, just use that". Only, there's TONS of bad bluetooth implementations out there. There's very few bad headphone jack implementations.
You can literally hear how much worse the bluetooth audio sounds in my wife's car compared to mine. She uses the aux port because it's so bad.
In particular there is a wide variety of required gain/impedance as well as multiple different proprietary options as people attempted to adapt it to add microphones, controls/signals, and/or video. This is why Android, Apple, Microsoft and Playstation all have incompatible accessories.
The number of contacts can vary between 2 and 5. The length of the primary contact is not consistent, thus the mechanism to retain the plug often does not engage. Likewise, there is no specification on overall plug size to guarantee a jack will fit into the device case.
That's excluding other extensions such as TOSLINK. It is also worth noting that making the wrong connection, e.g. attaching an AV output to an audio input, can physically damage equipment.
Generally what people see is that the headset they got for whatever device works and assume there is broad compatibility - but to work best the jack on the headset is being designed for and tested against a particular subset of supported phones/controllers/music players. And broader support is typically not possible without separate connectors (e.g. a headset cord sold for apple, for xbox, for playstation, for android).
I've never had any consequence from this except that now my headphones don't have the option of being wired headphones, which means I have to ask the flight attendant for headphones to watch a movie on their screens. Everyone else is in the same situation though, so it's commonplace. Don't know if you've flown anywhere in the last year, but a lot of people have.
I don't think apple decided to "undo Ive" immediately, though. The 2019 models attempted an incremental fix approach - fix the keyboard layout, add a physical escape key - while trying to preserve the 2016 features (usb-c only, thinness, touchbar). It seems clear that it wasn't until 2021 that apple has decided to throw all the crap out entirely.
Pure speculation: the forcing function for the big changes in the latest models was the M1. That forced a redesign of probably the entire unit anyway (eg different thermals, chipsets, power requirements, etc). Prior to that the path of least resistance was incremental changes and fixes.
Both laptops (Air and 13" Pro) that the M1 launched in kept their previous designs. So the switch from Intel to Apple Silicon itself wasn't the cause. The switch from M1 to M1 Pro and Max, maybe. But even the previous Intel machines had some serious TDP (and thermal issues), and even that wasn't enough to justify a redesign.
Sorry, still confused. That happened in Nov 2020 (so it's not a 2021 model), and it's an M1 (so not an Intel model)...so what 2021 Intel MBP is being referred to?
It's the 2021 model. They were working on this in 2020 maybe even 2019, but it takes time to deploy. Not that they didn't think about it until 2021. </pedantic_mode>
i have the 2019 mbp, bought reluctantly earlier this year after my 2015 went wonky. based on what i’d heard, i wasn’t expecting much, but it was clearly an improvement, even if it was incremental (touch id, bigger/brighter screen, better sound). the biggest obvious lack in the upgrade was the performance-battery life tradeoff, which is entirely on intel stagnating for over a decade. apple addressed these most glaring issues via the combo of m1 and more battery in the 2021. usb-c or touch bar were minor issues in comparison (that notch tho…).
This was the most shocking to me. I learned that the left and right side ports run through different buses, but each side does not have enough capacity to supply both ports at full speed. This meant I had to buy a long USB cable to run to the other side of my MacBook in order to supply 3 monitors. I have a port just sitting unused.
Also, there seems to be a problem with left-side USB ports when charging. They cause the system to overheat (or at least think it's overheating).
Ive was probably cursing at the announcement: "WTF are you doing! You're giving people what they want? That's not the Apple I left, mate. By this time, in 2021, there should be NO ports on the Macbooks, and it should be so thin you could shave with it. What the hell is all this?"
If Ive had stayed for one more year they probably would have been selling 1mm thick rectangles of anodized aluminium. There's probably an Onion report for that.
It's ironic the extremes Apple design has ended up at, because there was a point in time (when both Ive and Jobs were working together) when Dieter Rams claimed Apple was the only company that really followed his ethos, minimalism with a critical qualification of honesty, where form follows function... but for the last decade it's been more like the aesthetic of minimalism at all cost... it's funny because that sounds quite a lot like skeuomorphism, it's pretentious.
Sometimes I wonder if folks use rose colored glasses when thinking about MagSafe. I don't miss MagSafe and I enjoy the interchangeability of chargers.
I could not keep my 13" MacBook plugged in while using it on my lap. The MagSafe cord repeatedly fell off when my leg, or a pillow bumped into it. My MacBook MagSafe port had black marks on it from arcing. At least three times I couldn't get it to charge because a metal fragment (once a folded paper staple) became magnetically stuck to the receiving side. On one of those occasions I couldn't fix it till I got home and grabbed some tweezers.
It was a smart solution, but IMO to a tripping problem that wasn't widespread.
No, it's universally loved because it has universally saved macbooks from nasty falls. I upgraded to a sans-MagSafe 2020 16" MBP when my 2015 MBP had a $600 screen failure. I broke the 2020's screen in the first week. Sold it, repaired my 2015 and have been happy ever since.
Through all its faults and problems, I can't shake how good my 2015 MacBook Pro has been: great keyboard, MagSafe, good enough everything, great keyboard. Did I mention how great the keyboard is? It's on my lap right now, purring and tethered to power as usual.
And the cables were fraying constantly because the connector didn't have strain relief and it was using some weird rubber compound that crumbled. I had the misfortune of having the L-shaped one which was really bad (and they knew it because the next iterations returned to T-shaped). It didn't disconnect to prevent my macbook from falling from my desk and made my relatively new macbook all dented and beat up. When that fragile L-connector inevitably fractured and failed, and the cable not being detachable, I had to spend around hundred dollars to buy a whole new power brick. And that one was ruined within a year.
That was a really nice touch and not too bright. I stumbled on a Dell USB-C charger with a white LED. It doesn't change colors and it's bright enough to light a room.
> I don't miss MagSafe and I enjoy the interchangeability of chargers.
The new magsafe uses a normal usb-c power brick with a special charging cable. The cable has a usb-c plug at one end and a magsafe plug at the other end. So it should be intercompatible between chargers now, and there shouldn't be any more need to buy a new charger when the cable frays. (And apparently you can also charge with a normal usb-c cable in any of the regular usb-c ports).
I'm quietly hoping they might be able to run data through that magsafe port too. Being able to have a small usb dock which connects over magsafe would be sweet.
I am appreciative they did that. I hope the USB-C ports on the laptop can handle the highest charging available today (~100 watts, not the new spec 200+ watts).
> The MagSafe cord repeatedly fell off when my leg, or a pillow bumped into it.
This sounds better than having your laptop's power connector internally damaged rendering your laptop unchargable just because your charging outlet was on the wrong side or your dog wanted to step on the cable.
The shape of the MagSafe 1 cord end was better for lap/lounging use because it didn’t stick directly out as far the way MagSafe 2 did. It came out into a fairly low profile cylindrical shape that made the cord turn an immediate right angle so things didn’t lever it over and make it fall off.
That’s actually the second iteration of the Magsafe 1 charger, FYI — the original was just a chunkier version of Magsafe 2/3. I liked the cylindrical version as well and I’ve always wondered why they did away with it.
I know what you mean, although I thought the disconnection ease was calibrated correctly. The good thing is the modern battery life means an accidental disconnection is less of a problem.
I have worked at Apple and Ive does not unilaterally make the product decisions in the company. It is a combination of Product Marketing, Hardware Engineering, Design, Procurement etc and they are all discussed and signed off by the Senior Leadership Team.
When you're building products at the scale Apple does, decisions are years in the making. And so they need to make them based on what they think the future will be. Mostly they are right and sometimes they are wrong, e.g. USB-C being the standard connector for everything.
> they are all discussed and signed off by the Senior Leadership Team.
I don't know how Apple works, but sometimes if there is a guy at the top who has to be kept happy, you sign off on only those things which you know will make the guy at the top happy.
I'm quite happy about the move to USB-C and changed all my stuff to it as soon as possible. 5 years ago I had several micro USB and mini USB chargers, some of them broken. On a regular basis I had to buy new chargers and cables. The MagSafe power supply cables broke easily (but yes, the port was nice). Now there's just one cable for everything; I still have 2 phone fast chargers but both actually work. Also I can just charge my phone without searching for the charger, and the laptop can be connected to screen/keyboard with just 1 USB-C cable.
After all, Apple were also the first to sell Desktops without Floppy or Optical drive.
> Worst of all, it was the loss of the much-beloved MagSafe.
I wish all of my cables were magnetic. The amount of things I have broken in my life by tripping is downright embarrassing. I do like being able to charge my MacBook from my external monitor, and keeping my Apple power supply in my bag in case I need to go somewhere. It would just be nice if I didn't have to label my cables.
I use magnetic tear away cables for a lot of my gear. They have magnetic tips for lightning, USB-micro and USB-C.
For stuff like charging headphones, LED lights and other random gadgets with mixed plug types, I use charge-only cables for that stuff, and it's been super convenient.
There are also magnetic cables that support limited fast charging and data, but only at USB 2.0 speeds, so that could still be a deal breaker for some people.
I'm often tempted by magnetic adapters but when I look on Amazon at the options, it seems like I always see reviews from people who said it nearly caught their stuff on fire.
You have any recommendations for high quality magnetic gear?
> it seems like I always see reviews from people who said it nearly caught their stuff on fire.
This is part of a larger problem with a lack of regulations on high-current accessories. In the US, the FTC should probably be doing stringent inspections of imported cables, chargers, etc similar to how the FCC currently inspects communication devices so the substandard/dangerous ones get turned away at the border.
My experience with magnetic usb-c connectors is that the magnet can’t be very strong, because then it just pulls the adapter out of the port when you try to disconnect it.
Is the cable going from your laptop to your monitor in a position that you can trip on it?
Not at my desk, but sometimes when I temporarily plug into another monitor or a television it can be a problem.
For my desk, I wish I just had a dock like my thinkpad from 10 years ago, or at least if the connectors were on the back so I didn't have wires sticking out on both sides.
Some kinds of cable really benefit from easily detaching, and some don't.
It's not like other connectors are difficult to disconnect. My previous macbook had an hdmi port that would disconnect if i breathed on the cable too hard. USB-C does seem a bit more snug so far, but who knows how well it will last. Magnets done well just have a better chance of surviving if you trip on something, or if you drop your laptop.
USB-C is supposed to be rated for many times more insertion cycles than USB-A, which is a big bonus in its favour; that said, they do tend to get perceivably looser over time.
I have one laptop whose usb-c port has loosened up to the point where I have to wedge something under the cable on the table to create upward pressure in the port or it won’t charge reliably.
I’ve been way more impressed with the durability of lightning ports. They get dirty and need to be cleaned out but their mechanical strength is amazing (apple’s cables on the other hand…). I like that Apple is confident enough in the strength of the lightning port that in the Apple stores, the standard display is to have the phone being physically supported by the port alone, even in an environment where hundreds of careless people are going to be messing with it.
Another advantage is that USB-C has all the springs on the cable, while USB-A has springs on the port. When the springs get loose, with USB-C you can just replace the cable, while with USB-A, you'd have to replace the port.
It would be nice if there was some sort of standard like USB for magnetic charge cables! I do like the standard USB c charger that I can use for my laptop, phone, and switch, but there are certainly clear advantages to magnetic disconnect
Touch Bar I think was a legit attempt at modernization.
Every time I 'see' a touchbar, I want one.
They look cool and useful.
The only reason I don't have one, is because everyone seems to indicate they are useless.
It's also possibly a platform issue - maybe they just didn't get enough participation etc..
Also - 'thinner' is a rational and generally positive thing, it's just that we've reached a threshold where the diminishing marginal returns are starting to weigh on other things.
Even ports - it's not an aesthetic issue only - they're trying to get everyone onto a standard. Frankly, I support the notion - I'd love it if everyone just used the same damn connector. The reason I don't like my 'USB C only Mac' is only because the ecosystem isn't there yet. If the ecosystem were there, I'd be fine with it.
I think that the touchbar could have been fine if it had been spaced one row above a full sized set of function keys.
As it is, when I use the keyboard on my work MBP, my fingers will brush the touchbar and do things. I have turned off the functionality in most apps so that I don't get that. (terminal especially)
For my actual usage, it's a step backwards from the pre-2016 function keys: an always-available pause/play/stop, volume and brightness. I can't be doing something in emacs and then hit pause or louder; it's switch to Music and then I can pause.
The lack of a physical escape key also dooms it in my usage. (no, escape is not going on caps-lock, that's where control goes)
So the work MBP is 99% used with an external keyboard/monitor. The personal 2015 MBP is generally used on the lap.
USB-c is ok, and the real magic is when you've got a Thunderbolt/USB-c power delivery monitor with a built in usb3 hub. One wire to the laptop and you're done. (and even better when the monitor has a built-in kvm so that the other computer is just a switch away, without mucking with cables).
Why not the best of both worlds? Bind capslock to "control when chorded, escape when pressed alone."
I use Karabiner-Elements for this, it's very easy to set up. https://karabiner-elements.pqrs.org/
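For reference, the complex-modification rule for this looks roughly like the following (field names per Karabiner-Elements' JSON rule format; worth double-checking against the current docs):

    {
      "description": "Caps Lock: Control when held with another key, Escape when tapped alone",
      "manipulators": [
        {
          "type": "basic",
          "from": { "key_code": "caps_lock", "modifiers": { "optional": ["any"] } },
          "to": [ { "key_code": "left_control" } ],
          "to_if_alone": [ { "key_code": "escape" } ]
        }
      ]
    }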
> Touch Bar I think was a legit attempt at modernization.
I agree, I think they're really interesting. The way we interact with computers hasn't changed much, and having a set of buttons you can configure sounds amazing.
I wish they would have just put it above the physical function keys, and deployed it to all MacBooks and wireless keyboards. Being limited to just some MacBook Pros probably hurt the number of apps that adopted it.
Not everyone says the touchbar is useless. I absolutely love it! I will miss it. And the new Macbook may be a beast but it looks terribly old - like a yesterday-machine. For the first time Apple released a product i will buy because it will be more capable - but not because i want it!
I'm kinda sad to see the Touch Bar go so fast because I have one and actually like it a lot. The variant where it has a physical escape key is perfect.
> Touch Bar? This was nothing more than adding expense to raise the ASP (Average Selling Price) of Macbooks, that had fallen precipitously low from a shareholder perspective because of the superb value-for-money proposition that was the 13" Macbook Air.
If customers don't like the Touch Bar, how does this make any sense? If pro users will pay (made-up number) $2000 for a MacBook Pro regardless of whether or not it has a Touch Bar because it comes with the CPU/GPU they want, adding a Touch Bar just decreases the margin.
If the MacBook Air is a better value-for-money proposition than the MacBook Pro to begin with, and customers do not actually like the Touch Bar, then why would they start switching to the MacBook Pro?
> If the MacBook Air is a better value-for-money proposition than the MacBook Pro to begin with, and customers do not actually like the Touch Bar, then why would they start switching to the MacBook Pro?
Yeah it was a bit of a head-scratch moment when those 2016 machines came out. Until then it seemed like you could just trust a macbook pro to be a great machine that could do everything you need, and that each edition would get better than the last.
Around that period it felt like Apple made a pivot away from trying to increase market share through having the best product, and toward maximizing their revenue by leveraging their amazing brand image.
One of the early things was the change from the L-shaped MagSafe back to the T-shaped one. The L-shaped one was such an improvement - those things would last forever. But the T ones would fray in a matter of months. It almost felt like someone at Apple looked at the money they were losing from selling fewer chargers and decided they had to go back.
And I liked mag-safe but honestly I don't miss it. USB-C charging is fine and having one standard charger you can get anywhere work for everything more than makes up for any relative drawbacks.
What I always wondered is why they didn't start with a $999 base model (like the original iBook) that was cheap, but big and slow. If you wanted premium performance/expandability/ports/screen, you could pay $2K for the Pro model in the same form factor. If you wanted portability, you could pay $2K for the Air in a smaller form factor with the same performance as the iBook. The cheapest model being the most portable is bizarre.
Then again, the iPad Mini is more expensive than the larger iPad, so obviously there is something going on I don't understand. Perhaps the cost of engineering the motherboard and battery in an integrated package are so high that they can't afford to split the line any further.
The 2021 14" Pro is the first truly pro model in a while. I hope they keep it up. The keyboard is actually usable for extended periods, it has ports, the screen is great (to be fair, all Apple retina screens are great to varying degrees). Did I need it? No. But I wanted it. The last Mac laptop I bought for myself was the 2015 13" MacBook "Pro", so they're getting more money out of me this time around.
>the iPad Mini [$499+] is more expensive than the larger iPad
Sort of. The $329+ "iPad" has internals that are a few generations old, kind of like the iPhone SE. The iPad Air ($599+) and iPad Pro ($799+) are the "real" current larger iPads.
Each year Apple has been shipping thicker and heavier products. I get that some people are happy, but boy do I feel the heft when holding an iPhone 13 Pro, especially coming from a slim iPhone 6s Plus.
If Steve Jobs were alive he would never have let such products be released. You can feel he left his mark in the MacBook 12 and iPhone 6, as those products were lightweight and thin - read: pushing boundaries.
All the new ports are pretty useless to me. I think that adding them back in shows a lack of vision and just caters to the lowest common denominator of complaints instead of making a better product from first principles.
HDMI: I guess it's good if you have an old monitor? My monitor from like 2015 has USB C and charges my computer while I use it.
MagSafe: I've never had a computer fall off a table. Seems like a weird overoptimization for an unlikely scenario. If you use it you've got to carry a different cable that you can't use for anything else, unlike the USB C power cable that i also use to connect peripherals if needed.
SD card: I've never used one of these. I guess it's good for professional photographers? Why don't expensive cameras just have 256gb of onboard storage and connect over thunderbolt?
MagSafe isn't just about the convenience of it. It may not help you or have helped you but it's helped many people (including me) and even if it didn't, it provided peace of mind.
But the key reason for it is that it's a dedicated power port. That means that port is designed for that. You don't waste money allowing your laptop to be charged from any of the four USB-C ports. Like, who needs that?
Worse, those ports weren't identical leading to the advice to always charge from one side to avoid overheating.
I do find HDMI to be a bit of a strange choice however. I don't mind USB-C to HDMI/DP cables for this. Like you say, more monitors support DP passthrough over USB-C if not full TB.
The HDMI port is also 2.0 not 2.1. The difference? 2.0 can run 4K @ 60Hz. 2.1 can run 4K @ 120Hz. 120Hz continues to have poor support under OSX but it's clearly the future.
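A rough back-of-the-envelope makes the gap clear (8 bits per colour channel, ignoring blanking intervals and link-encoding overhead, so real requirements run somewhat higher):

    # Uncompressed RGB video bandwidth in Gbps.
    def gbps(width, height, hz, bits_per_pixel=24):
        return width * height * hz * bits_per_pixel / 1e9

    print(gbps(3840, 2160, 60))    # ~11.9 Gbps -- fits within HDMI 2.0's 18 Gbps link
    print(gbps(3840, 2160, 120))   # ~23.9 Gbps -- needs HDMI 2.1's 48 Gbps link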
As for the SD card, I don't really use this either but this is aimed at photo and video professionals. Why not just connect the camera? Easy. Because you need to keep using the camera so it's far quicker just to swap out the cards and start copying.
A modern digital photo or video setup will have a camera with 2 SD card slots. The camera will write the same to both cards. When swapped out, one will be kept separately as backup. The other will be copied onto another device, which then may also copy that offsite. This way you immediately get 3-4 backups in 2-3 locations, which is a lot of redundancy.
> But the key reason for it is that it's a dedicated power port. That means that port is designed for that. You don't waste money allowing your laptop to be charged from any of the four USB-C ports. Like, who needs that?
USB ports have to be connected to the power system anyway because they have to power peripherals. Maybe it's got to be a slightly heavier connection to charge the computer, but I'm guessing the extra cost is minimal compared to a whole different fancy port with magnets in it and a whole different cable. Who needs it? People who don't want to carry around an extra cable everywhere.
> Why don't expensive cameras just have 256gb of onboard storage and connect over thunderbolt?
People don't want to throw away their camera every time the storage runs out. They want mirroring between cards so the wedding shots don't disappear on a storage failure. So they can swap storage on the go.
Your strong assertions based on your limited understanding of other people's needs says more about you than Apple.
Apple products are excellent, well built, and they last. If they didn't add new things regularly far fewer people would buy. I think a lot of the new features on Apple products have simply been attempts to make things look different enough to be worth buying.
This includes removing the things they've added...
I'm in the minority, but I really enjoy the butterfly keyboard on my Macbook. I'd love to have a fullsized butterfly style keyboard for my PC. Unfortunately the high-end market for keyboards is dominated by mechanical keyboards which aren't my kind of thing at all.
Just plug in the Apple external keyboard and remap some keys. Unpopular opinion but I find the Apple keyboards feel “better” than most mechanical switches (certainly mx brown and clear), they have good tactile bump better than most and a nice lightness to them so they’re not fatiguing.
Innovative design doesn't work without an internal champion who can rally the company around unconventional ideas. Jobs played that role, but now Apple is led by the operations team. The word "design" does not appear anywhere on their executive leadership page.
Unconventional ideas are inherently risky. They're just not worth pursuing if buy-in can't be secured and leadership is more focused on compromising to increase profit margins, etc. For that reason, it's great (in the short term!) that Apple is rehashing known-good designs from a decade ago. However, I don't see that strategy working in the long term.
I suspect the M1X's excellent memory bandwidth is related to the lack of replaceable memory. x86 CPUs can, as far as I know, max out the bandwidth of their DIMMs, and they don't get anywhere near the bandwidth of the M1X. An integrated design can do better.
Complaining about the price of Apple’s RAM seems reasonable, but complaining that they didn’t choose DDR5 DIMMs when they appear to have chosen something better seems silly.
> The USB-C cable situation was and continues to be a nightmare as different cables support different subsets of data, power and video and, worse yet, different versions of each of those.
I’m about to upgrade from a 2015 MBP and am wondering - is there a usb-c cable I can buy which works with everything guaranteed?
I find it funny to appeal to Steve Jobs for a reality check :)
btw my pet theory about the touch bar is: let's use Apple's tech-skills synergy again; we've made tactile swipe devices since the iPod, we made that into phones, now let's blend it into our laptops.
I want this to be true but to be honest, I care less about the Ive angle than them simply doing it! It seems like someone with power said, "hey, what are these letters P, R and O standing for, again?".
I don't agree with the USB c story. To me it's extremely handy to have just usb c. All (most) devices work, can use the same cable for charging my phone. super convenient for me.