Midway through Apple’s demonstration of the new iPhone 6S, Craig Federighi took an emergency selfie. A what now? How can a selfie be an “emergency,” you ask? It can’t, at least not really. But Apple’s senior vice president of software engineering was on stage in front of millions to show us otherwise.

With only the slightest effort, Federighi thumbed over to the camera icon on his home screen and pressed down firmly. Typically, this is the moment when the camera app would open and he’d tap the small icon that activates the iPhone’s front-facing camera. Instead, a translucent box popped up with an option reading “Selfie Camera.” Just like that, Federighi was ready to take a photo. The shortcut saved him one tap and two seconds, but it made the way everyone had been taking photos seem cumbersome.

Of course, the point of the demonstration had nothing to do with Federighi’s goofy selfie. It’s that he took that goofy selfie without ever lifting his thumb from the screen. Federighi was revealing the usefulness of 3D Touch, a new feature of the iPhone 6S and 6S Plus that senses force and distinguishes between short and long presses.

Our glassy screens have always limited users to two dimensions. We’ve swiped, tapped and pinched on our phones, and the physics programmed into the devices have always reacted in a reliably two-dimensional way. 3D Touch gives the iPhone a z-axis, which allows developers to build in complicated contextual information and new functionality without cluttering the user interface. Apple is introducing a new gestural paradigm that it believes will eventually be deeply ingrained in the way we use touchscreens. And indeed, it could be the beginning of what Chris Harrison, a professor at Carnegie Mellon University’s Future Interfaces Group, calls a “rich touch” world where every touch we make has a deeper layer of functionality built in.


You could think of 3D Touch as a right-click for a touchscreen. It’s a gesture that unearths a vast amount of extra information and functionality with very little effort. To make sense of this new form of interaction, Apple has given short and long presses playful nicknames—peek and pop—that fit neatly into the vocabulary we already understand with swipe, tap and pinch. Peek and pop have essentially turned the iPhone operating system into nesting dolls of information. Press on the screen a little harder than usual and you’ll experience peek—a preview of information like emails, directions, or photos. Press harder still and you’ll “pop” deeper into that information, navigating directly to the app itself. “It isn’t really a new gesture, just an extension of one you already know very well,” explains Tobias van Schneider, lead designer at Spotify.
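Stripped to its logic, the nesting-doll behavior amounts to mapping a continuous force reading onto discrete gesture tiers. Here’s a minimal sketch of that idea in Python; the threshold values are purely illustrative stand-ins, not Apple’s actual calibration:

```python
def classify_press(force: float, max_force: float) -> str:
    """Map a normalized press force to a gesture tier.

    The 0.3 / 0.7 cutoffs are hypothetical, chosen for illustration.
    """
    normalized = force / max_force  # 0.0 (barely touching) .. 1.0 (hardest press)
    if normalized < 0.3:
        return "tap"   # ordinary touch: open the item as usual
    elif normalized < 0.7:
        return "peek"  # firm press: preview the item in place
    else:
        return "pop"   # deepest press: commit, navigate into the item
```

A light touch classifies as a plain tap, a firm press as a peek, and the deepest press as a pop—the same one-dimension-deeper hierarchy the gesture exposes on screen.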

This enables all sorts of things, but the easiest use-case to grasp is found on the home screen. Usually, a long press on an icon makes it jiggle like Jell-O. Now it opens a new layer of information and options. Take Apple Maps. From the home screen, a long press brings up a translucent menu with the option to navigate home, mark your current location, or search for nearby restaurants. A long press on the contacts icon and you see a similar list: You can call your favorites without navigating away from your home screen or even fully opening the app. These half-app experiences are a way of achieving whatever it is you want to do—dropping a pin, calling mom—with greater speed and fewer taps. In this sense, 3D Touch is basically a vehicle for shortcuts on a smartphone. And it’s yet another example of how the company is pushing users away from the home screen and shallow app experiences altogether.

3D Touch works within apps, too. In Messages, short pressing a text lets you see more detail while a longer press takes you to the conversation thread. That same logic follows you even deeper into the app. Let’s say a friend sends the address of the bar where you’re meeting tonight. A long press on the address will bring up a preview in Apple Maps. Same goes for calendar invites. If you’re curious about an email, but don’t have the energy to read it, simply short press to “peek” at it or long press to “pop” into it. “Navigating within an app will feel lighter because there is a level in between,” says van Schneider. “I can peek before I fully commit.” This is true of swiping between apps too. Instead of double tapping the home button to pull up the multi-tasking tray, you simply long press on the left side of your screen while you’re still in an app and you can scroll through the various other apps you’ve recently used like tangible cards.

Apple is hardly the first company to explore force touch on computer screens. Harrison points out that this sort of research has been going on for decades. You can see something roughly similar in a 1976 prototype from MIT that shows a touchscreen reacting to pressure and the direction of the pressure. Similarly, Chinese phone manufacturer Huawei recently launched a phone with force touch capabilities. Android offers somewhat similar functionality with its long-press feature. And yet, Apple has always had a way of taking technology that’s languished with academics and less design-centric companies and giving it purpose.

All this is possible through some clever hardware engineering. Capacitive sensors embedded in the backlight of the Retina display sense how hard you’re pressing. It might feel like you’re pushing against an impenetrable slab of glass, but the screen depresses ever so slightly. The sensors read that deflection and software interprets it as a short or long press. Using Apple’s Taptic Engine, these presses are communicated as vibrations you feel under your finger. A short press gets 10 milliseconds of haptic feedback; a long press, 15 milliseconds. There’s hardly any latency between your touch and the haptic feedback, which makes it easy to know when something’s changed on screen without being distracting.
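The feedback loop that paragraph describes is simple: read the force, decide what kind of press it was, fire a brief haptic pulse. A toy model in Python—only the 10 ms and 15 ms pulse durations come from the article; the function and its names are hypothetical stand-ins for whatever Apple’s firmware actually does:

```python
# Haptic pulse length per recognized deep press, in milliseconds.
# These two durations are the figures reported for 3D Touch;
# the surrounding code is an illustrative stand-in.
HAPTIC_MS = {"peek": 10, "pop": 15}

def haptic_pulse_ms(press_type: str) -> int:
    """Return the haptic pulse duration for a press; plain taps get none."""
    return HAPTIC_MS.get(press_type, 0)
```

The pulses are short enough that the confirmation registers almost instantly, which is why the feedback reads as a texture rather than a buzz.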

Apple’s done a great job of introducing the feature in a simple, restrained way. So much so that it almost feels as if these interactions are a solution in search of a problem. “It’s not like Facebook will be twice as good with peek and pop,” says Harrison. “I think it will be 2 percent better.”

Right now, 3D Touch doesn’t feel essential in the way that the first generation of multi-touch interactions does today. It might not for a while. It’s fun to press on a Live Photo and watch it move like a GIF, and it’s marginally helpful to preview an email without actually opening it. But are these things necessary to actually use the iPhone? Hardly. And you can bet that plenty of people won’t use the functionality right away.

As it was with multi-touch, the gaming industry will play a big role in showing what’s possible with these new interactions. And Apple has hinted with its new touchscreen TV remote that there are other applications and interfaces where 3D Touch might show up. In many ways, Apple is simply training us to get comfortable with deeper experiences on our mobile devices, and it’s doing so by stripping away the excess UI elements—the reliance on the home screen, the clunky act of switching apps, the “back” buttons—that have bogged us down in the past. Peek and pop are just our training wheels.


Original article: The Smart UI Design Behind Apple’s Frictionless 3D Touch