When the Touch Bar was announced for the MacBook Pro, I was torn. On one hand, it felt gimmicky and not truly useful (suspicions I mostly confirmed after using one for a couple of months); most of the joy came from using Touch ID. On the other hand, it showed that Apple was thinking about how to do input differently on the standard form factor for work computing. It also shows where they might be headed: the Touch Bar is not the end goal, but the first step toward something better.
Rumors have swirled for years about how Apple wants to merge iOS and macOS, with the result probably taking on more of the character of the former. This goes hand in hand with the rumors about ARM Macs and, more recently, about Apple once again developing its own processors.
A touch-driven macOS sounds nice in theory, but probably won’t work in practice, as evidenced by Microsoft’s attempts with the Surface and Windows 8. The Surface wasn’t all bad (that’s not the point I’m trying to make), but it showed that certain classes of application are still better suited to the old form factor of a physical keyboard and pointing device. That’s not to say all applications are: plenty of apps are best used with multitouch input, and many others still benefit from it somewhat, which suggests they’re really best served by all three: multitouch, keyboard, and pointing device.
The latest MacBook Pro has all three of these. Of course, the multitouch component, the Touch Bar, is still very much in its infancy. My assumption is that it will be expanded upon and will mature into something different. What would that look like?
I anticipate a future MacBook Pro with the same non-touch, high-definition display it has today, but with a full-size multitouch input display on the bottom half: basically, today’s rMBP with an iPad Pro covering the entire bottom enclosure. It might even (gasp) have a place to store a compatible Apple Pencil inside the enclosure.
This wouldn’t be the first time a dual-screen clamshell has been attempted. Notable entrants in this form-factor segment include the Toshiba Libretto W100, One Laptop per Child’s XO, and the Acer Iconia. Similar approaches have been tried with phones as well, like the Kyocera Echo, the Imerj, and the ZTE Axon M.
What started as the Touch Bar will eventually swallow up first the trackpad and then the keyboard. It will take over the trackpad first, as the concepts of display and touch input are combined much the way they are on iOS today. Like the Touch Bar does today, it would be able to display custom input views, but for controls more complex than volume and brightness sliders. You’d essentially have an iOS device dedicated to input for the macOS system.
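To make “custom input views” concrete, here’s a minimal sketch using the NSTouchBar API that exists in macOS today. The identifier, the view controller, and the slider control are illustrative choices of mine, not anything Apple ships:

```swift
import AppKit

// Illustrative identifier for our custom control; the name is made up.
extension NSTouchBarItem.Identifier {
    static let customControl = NSTouchBarItem.Identifier("com.example.customControl")
}

class EditorViewController: NSViewController, NSTouchBarDelegate {
    // AppKit asks the responder chain for a Touch Bar to display.
    override func makeTouchBar() -> NSTouchBar? {
        let touchBar = NSTouchBar()
        touchBar.delegate = self
        touchBar.defaultItemIdentifiers = [.customControl]
        return touchBar
    }

    // Vend the actual item: any NSView can live in the Touch Bar,
    // which is what makes controls richer than sliders plausible.
    func touchBar(_ touchBar: NSTouchBar,
                  makeItemForIdentifier identifier: NSTouchBarItem.Identifier) -> NSTouchBarItem? {
        guard identifier == .customControl else { return nil }
        let item = NSCustomTouchBarItem(identifier: identifier)
        item.view = NSSlider(value: 0.5, minValue: 0, maxValue: 1,
                             target: self, action: #selector(controlChanged(_:)))
        return item
    }

    @objc private func controlChanged(_ sender: NSSlider) {
        print("Touch Bar input: \(sender.doubleValue)")
    }
}
```

Scale that delegate pattern up from a thin strip above the keyboard to the entire bottom enclosure, and you get the dedicated input device I’m describing.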
This leaves the venerable keyboard competing for space. It’s a hard tool to get rid of: for many professions, like programming, publishing, or anything else requiring substantial amounts of writing, it still can’t be beat. Its position of power is validated by the fact that it was emulated in the touchscreen interface of iOS. The downside there is that the on-screen virtual keyboard covers half of the screen, blocking the underlying content or even just the form you’re trying to fill in. And of course, every now and then it’ll just lock up or crash; it is software, after all.
With an external keyboard, the iPad suddenly feels much more productive, more spacious. In the case of our futuristic rMBP, we wouldn’t have to make this tradeoff: if the physical keyboard were totally replaced by the touchscreen input surface, the virtual keyboard wouldn’t need to cover the main display at all. But as a touch typist, I’d still be much slower than I am with a physical keyboard. And as mentioned earlier, moving from hardware to software introduces reliability issues; this is the impetus behind many people remapping Caps Lock to Esc, which I happily did on day one (the closer reach has since proved wonderful).
The easy, and most realistic, answer is to just keep using an external keyboard when you need it. I’d personally have no problem with this; I already use one anyway (my 2014 rMBP’s internal keyboard has stopped working, I currently live 3+ hours from the closest Apple Store, and I can’t mail it in and wait 3-4 weeks), and having the external keyboard plus the whole new touchscreen input would probably be nice.
But that’s no fun. What if there were a way to have the touchscreen input be the physical keyboard I want? With the Taptic Engine, Apple could simulate the tactile feel of a keyboard: keys clicking, the little bumps on the home keys, and the edges and spaces between keys as you drag your finger across the surface. They could probably accomplish everything they need for trackpad and keyboard tactile feedback with two engines, each centered where a hand rests for typing.
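As a rough sketch of how this might feel in software, consider NSHapticFeedbackManager, the API that already drives the Force Touch trackpad’s haptics. Everything below is hypothetical: the key pitch, the flat key row, and the idea of doing this at the app level at all (a real virtual keyboard would live in the hardware and the OS):

```swift
import AppKit

// Hypothetical sketch: tick the Taptic Engine whenever a dragging
// finger crosses the boundary between two virtual keys, so key edges
// can be felt rather than seen.
struct VirtualKeyRow {
    let keyPitch: CGFloat = 60.0  // assumed key width in points; made up

    private var lastKeyIndex: Int?

    // Feed this finger positions as a drag progresses.
    mutating func fingerMoved(toX x: CGFloat) {
        let keyIndex = Int(x / keyPitch)
        if let previous = lastKeyIndex, previous != keyIndex {
            // .alignment is one of the real feedback patterns AppKit
            // exposes for the Force Touch trackpad's haptics.
            NSHapticFeedbackManager.defaultPerformer
                .perform(.alignment, performanceTime: .now)
        }
        lastKeyIndex = keyIndex
    }
}
```

Today this API only fires the single engine under the trackpad; the two-engine layout above would need new hardware, but the software model could plausibly stay this simple.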
As a developer (I promise this isn’t the start of an Agile story), I am of two minds about physical keyboards. Using an iPad Pro with external keyboard and Pencil is a wonderful experience, and I get a lot done with that setup. Indeed, I got it with the intent to replace all pencil/pen work, except personal journaling, and it has performed beautifully. Even with the virtual keyboard, I’m faster and more accurate than hunt-and-peckers or even the fastest thumb-typers on iPhone (yes, even the rare person employing two thumbs plus an index finger). I assume the average iPad user does not need the external keyboard, and maybe even the average macOS user could get by with just an iPad-like one on their futuristic clamshell.
For programming, my bread and butter, I still need a physical keyboard, so I’ll carry one around until I can program without it. While Swift Playgrounds on iPad is nice for toy programs and demonstrations, it has a long way to go before it can completely replace Xcode on macOS. Transitions like that should also be long and gradual, and having an iOS device embedded in a macOS device provides a nice avenue for incrementally shuttling functionality over. Today, it shows debug controls; tomorrow, it could be a Wakandan wonder that makes programming more like painting. And why not? Code today is poetry, after all!