Hacker News

My problem with the TouchBar is that I'm using my laptop with an external screen 90% of the time, so I never really get used to it.

If the external keyboard had a TouchBar, I would start incorporating it into my workflow and could see its usefulness.



It seems like Apple associates the TouchBar more with “touching” than with keyboarding. As in, when you connect an iPad to a Mac using the Sidecar feature, the iPad will display a Touch Bar emulation, because that’s the accessory they think of a Touch Bar as belonging on: an external touch surface, not an external keyboard.

Which is honestly interesting; it seems almost like they’re suggesting that this could grow into a larger feature, where Sidecar is really a kind of “Remote Touch Bar.app”, and you could add larger Touch Bar controls to your app that are only visible through Sidecar. (So you could have OS-level support for e.g. DAWs to display their VSTs onto your iPad for direct manipulation, without needing their own iPad OS app.)
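For context, apps already declare their Touch Bar controls declaratively through AppKit's NSTouchBar API, which is what would make a "remote" or larger rendering surface plausible. A minimal sketch of the existing mechanism (the identifier string and the "Gain" slider are hypothetical examples, not anything from Apple's docs):

```swift
import AppKit

// Sketch: a view controller vending its own Touch Bar items via NSTouchBar.
// A hypothetical "Remote Touch Bar" feature would presumably reuse this
// same item model and just render it on a different surface (e.g. Sidecar).
class EditorViewController: NSViewController, NSTouchBarDelegate {

    // Hypothetical identifier for a custom app-defined control.
    static let gainSliderItem = NSTouchBarItem.Identifier("com.example.gain-slider")

    override func makeTouchBar() -> NSTouchBar? {
        let bar = NSTouchBar()
        bar.delegate = self
        bar.defaultItemIdentifiers = [Self.gainSliderItem, .flexibleSpace]
        return bar
    }

    func touchBar(_ touchBar: NSTouchBar,
                  makeItemForIdentifier identifier: NSTouchBarItem.Identifier) -> NSTouchBarItem? {
        guard identifier == Self.gainSliderItem else { return nil }
        // NSSliderTouchBarItem gives a continuous control -- the kind of
        // thing a DAW might map to a plug-in parameter.
        let item = NSSliderTouchBarItem(identifier: identifier)
        item.label = "Gain"
        item.slider.minValue = 0
        item.slider.maxValue = 1
        return item
    }
}
```

Because items are identified and built lazily by the delegate rather than drawn into a fixed strip, the same item tree could in principle be laid out at a larger size on an iPad without app changes.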

———

I should note, as an aside, that Sidecar lets you type on the Mac host through an iPad’s attached keyboard-case, but doesn’t really treat finger-gestures done on the iPad screen in the Sidecar app as being equivalent to mouse or trackpad gestures on the host.

I’m wondering if that’s a conscious design choice, and whether someone at Apple is thinking that the “new HCI paradigm” for desktops will involve still having an external Bluetooth trackpad, but no external Bluetooth keyboard, with that role instead being served by an iPad with a keyboard-case attached to it.

That’d kind of fit—it enables all five(!) interaction methodologies Apple currently has: mouse gestures, keyboard commands, touch inputs, pencil inputs, and touch-bar controls.

But, importantly, it does so while entirely avoiding “gorilla arm” (unlike the huge Microsoft Surface Studio), because your touch surface is small and on the table in front of you, rather than “being” the display. (For most Sidecar iPad gestures, even with a full-sized iPad Pro, you never really have to lift your arm off the table.)

In Apple’s envisioned desktop paradigm, touch is seemingly an input method that you get from a separate input device—one that happens to have a screen—rather than touch being just a “way to do” mousing.


It's ridiculous; touching, imo, is a vastly less efficient way to exchange information than typing.


Maybe now that it's called a "Magic Keyboard" they will sell it as a stand-alone product.

I usually put my laptop in front of me with an external screen above it. That way I get used to the laptop keyboard and can feel right at home in a conference room or wherever.


It's very unlikely, but I would like Apple to release the touchbar as a standalone piece of hardware (not attached to a keyboard at all). That way we could have the best of both worlds: physical function keys and a nifty touch device. Would be really useful for music/video editing


They already sell the Magic Keyboard, and I have one on my desk: https://www.apple.com/shop/product/MLA22LL/A/magic-keyboard-...


What is magic about it? It looks very basic.


The keys actually work.



