Exclusive interview with Apple Senior Vice President Craig Federighi and Vice President Alan Dye: 16 years after the birth of the iPhone, where is interaction design going?

16 years ago, a device with a 3.5-inch touchscreen, its home screen lined with neatly designed rounded-rectangle apps, changed the direction of the smartphone industry. Since then, the internet has lived in everyone's pocket.

After decades in which graphical interfaces dominated human-computer interaction on PCs and smartphones, software is still eating the world, but the iPhone's human interface is quietly brewing a revolution.

In an exclusive interview with ifanr, Craig Federighi, Apple's senior vice president of software engineering, said that when the App Store launched in 2008, Apple built a simple, intuitive paradigm that encapsulated information inside each app. Each app was a self-contained world, independent of every other.

Today, Apple is transcending these information silos.

How does iPhone interaction evolve as people are surrounded by information?

Starting with iOS 14, widgets gradually became a set of interface components independent of apps: real-time weather, stock movements, and other information can be seen on the home screen without opening an app. In iOS 17, launched this year, widgets were upgraded to interactive widgets, putting to-do lists, music playback, ride-hailing, and note-taking all within reach.
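Interactive widgets of this kind are built with WidgetKit plus the App Intents framework: instead of a tap that merely deep-links into the app, a button inside the widget's SwiftUI view runs an app intent directly. A minimal sketch, assuming hypothetical names (`CompleteTodoIntent`, `TodoStore`, and `TodoEntry` are illustrative, not Apple sample code):

```swift
import AppIntents
import SwiftUI
import WidgetKit

// Hypothetical intent: mark a to-do item done straight from the widget,
// without opening the app.
struct CompleteTodoIntent: AppIntent {
    static var title: LocalizedStringResource = "Complete To-Do"

    @Parameter(title: "Item ID")
    var itemID: String

    init() {}
    init(itemID: String) { self.itemID = itemID }

    func perform() async throws -> some IntentResult {
        // TodoStore is an assumed app-side helper, not a real API.
        TodoStore.shared.complete(id: itemID)
        return .result()
    }
}

struct TodoWidgetView: View {
    var entry: TodoEntry  // assumed TimelineEntry carrying one to-do item

    var body: some View {
        HStack {
            Text(entry.title)
            Spacer()
            // iOS 17 interactive widgets: a Button driven by an AppIntent
            Button(intent: CompleteTodoIntent(itemID: entry.id)) {
                Image(systemName: "checkmark.circle")
            }
        }
    }
}
```

The key shift is that the system executes the intent in the app's background process, so the widget updates in place rather than bouncing the user into the app.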

We hope developers can transcend the limitations of the app and let users obtain information in system space, so that multiple pieces of information can be brought together in one place.

Widgets, which originated in early versions of Android, have become Apple's new interface to help users combat information overload. In fact, the essence of iPhone human-computer interaction is to help people obtain and process information efficiently.

When the iPhone was born, the only way to get a specific piece of information was to open the app. Not until the notification system arrived in iOS 3 did information shift from active retrieval to passive reception. Many years on, however, driven by the business incentives and competition of internet companies, the notification system has become a traffic portal fighting for users' attention.

I asked Craig if this was a problem facing interaction design, and he agreed:

It feels like a billboard tugging at me, and that makes me uncomfortable.

Unlike the era when the phone was the all-in-one device, in today's Internet of Everything people have grown used to crudely slapping a screen onto everything. Countless devices, countless screens, countless reminders and notifications tear at users' attention, while algorithms feed us comfort and shallow happiness at our fingertips, pulling us into endless time sinks and information anxiety.

▲Image from: Wallpaper by Jason Schmidt

Alan Dye is Apple's vice president in charge of human interface design. He worked under Jonathan Ive, served as design director of a fashion brand, and held senior positions at advertising agencies. Having gone from information producer to the other side, he feels this more keenly than most:

In the past, people got real-time information as one notification after another, so we thought there should be a better way.

The better way Alan refers to is Live Activities, created to handle information that changes in real time. On the lock screen, a timeline shows the progress of your ride-hailing or food-delivery order. While you chat in WeChat, there is no need to keep switching apps: the license plate number, the car's distance, or the delivery time is always displayed in the Dynamic Island.

Alan said Live Activities actually build on the basic capabilities of widgets. A Live Activity appears only for the short time you need it, neither occupying a slot on the home screen nor repeatedly disturbing users with push notifications.
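Developers reach this pattern through the ActivityKit framework: an activity has fixed attributes plus a small dynamic state the app updates as the real-world event progresses, and the system renders it on the Lock Screen and in the Dynamic Island. A minimal sketch, assuming hypothetical names (`DeliveryAttributes` and its fields are illustrative):

```swift
import ActivityKit

// Hypothetical attributes for a food-delivery Live Activity.
struct DeliveryAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var minutesRemaining: Int   // changes in real time
    }
    var restaurantName: String       // fixed for the activity's lifetime
}

func startDeliveryActivity() throws {
    let attributes = DeliveryAttributes(restaurantName: "Noodle House")
    let state = DeliveryAttributes.ContentState(minutesRemaining: 30)
    // Shown on the Lock Screen and in the Dynamic Island until ended:
    // no home-screen slot, no stream of push notifications.
    _ = try Activity.request(
        attributes: attributes,
        content: .init(state: state, staleDate: nil),
        pushType: nil
    )
}
```

The app then calls `update(_:)` on the activity as the state changes and `end(_:dismissalPolicy:)` when the delivery completes, matching the "appears only while needed" behavior Alan describes.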

Craig emphasized that the user's attention should be respected. He said the Apple Watch was born to relieve users' anxiety about their phones: when a notification arrives, the wrist vibrates slightly instead of the phone buzzing in a pocket.

iPhone and our other products are here to support you, not to make demands of you. A humane solution presents information quietly and lets users decide what they need.

What Craig calls "quiet" closely matches the concept of Calm Technology proposed by Mark Weiser at the end of the last century. Whether it is information surfaced in widgets or the dynamic updates of Live Activities, AI recognition or context awareness infers user intent and proactively adapts to user needs with minimal input.

Making interaction invisible may be the future of interaction design.

How was the Dynamic Island born?

The human interface design team led by Alan Dye studies how users interact with products. The work spans hardware and software, vision, hearing, and touch: from the jiggle animation when moving icons in early iOS, to the sci-fi eye-and-hand interaction of Vision Pro, from the delicate detent-like vibration of the Apple Watch Digital Crown, to the Double Tap air gesture of tapping index finger and thumb together, all of it the work of this team.

But tracing an idea back to a single origin within the team is extremely difficult. Alan said that at the center of the Apple Park design studio sits a huge picnic table where many discussions take place.

▲Image from: Wallpaper by Jason Schmidt

Alan told me that Apple's design process "usually starts with an idea, a goal, or even a problem we want to solve." The team often asks itself: "How do you interact with our product? Why do you want to interact with it?"

The birth of the Dynamic Island actually stems from a common internal question:

If the sensor area on the screen can be made smaller, what can be done with the remaining space?

Because the iPhone's OLED screen can render true black, designers can hide the dark sensors inside rendered black areas, blurring the line between sensor and screen, hardware and software, and creating a fluid, multifunctional space.

Starting the discussion early helps gather better resources: the Dynamic Island brought together the display team, the industrial design team, and the human interface team from the very beginning. Apple's goal is to blur the boundary between hardware and software. In Craig's words:

Let users not see where the hardware ends and the software begins.

Achieving such a goal requires experts from different disciplines working together. Alan revealed that industrial designers, interaction designers, color experts, sound designers, typography experts, and motion designers all took part in developing the Dynamic Island.

Our first consideration is how a design meets the user's needs. But at the same time, we think about how to make it pleasant and engaging, so that users want to interact with it.

Coordinating visuals, haptics, and sound is Apple's usual practice. When a device's interaction logic matches experience in the physical world, the cost of understanding falls.

The fluidity of the Dynamic Island shows in its use of inertia, elasticity, gravity, and damping. When two apps enter the Dynamic Island, it splits in two; the split is slow and viscous, and the time and signal icons on either side are squeezed and rebound, just like the physics between objects in the real world.
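The squeeze-and-rebound feel described here corresponds to the spring parameters SwiftUI exposes to any developer; a low damping fraction produces visible overshoot. A rough sketch, purely illustrative (the view and the numeric values are assumptions, not Apple's implementation):

```swift
import SwiftUI

// Illustrative only: a pill that splits in two with a springy,
// viscous feel, loosely mimicking the Dynamic Island's physics.
struct SplitPill: View {
    @State private var split = false

    var body: some View {
        HStack(spacing: split ? 12 : 0) {
            Capsule().frame(width: 60, height: 36)
            Capsule().frame(width: split ? 36 : 0, height: 36)
        }
        // A dampingFraction below 1 gives the squeeze-and-rebound
        // overshoot; higher values feel stiffer and more "muddy".
        .animation(.spring(response: 0.5, dampingFraction: 0.6), value: split)
        .onTapGesture { split.toggle() }
    }
}
```

Tuning `response` (how slowly the spring settles) against `dampingFraction` (how much it overshoots) is exactly the inertia-versus-damping trade-off the paragraph above describes.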

In addition, Apple pays special attention to the layering of visual elements, which Alan said expresses the hierarchical structure of information:

Perhaps the best example is Vision Pro. The materials we use look like materials from the real world, and they feel almost exactly like the real thing.

Designed for all users

Products are often mirrors of a culture. Look closely at the features Apple has shipped in recent years and you will find that lively visuals and animation are only the surface and the means of Apple's interaction design; humanity is the source of its design philosophy.

The iPhone introduced interactive widgets and Live Activities to ease the pressure of app information overload; AirPods Pro, not wanting noise cancellation to wall users off from the world, launched Adaptive Audio and Conversation Awareness; and because Apple firmly believes people should not feel isolated, it spent years developing Vision Pro's EyeSight, just so others can see the wearer's eyes…

If in the past thirty years Apple wanted you to revolve around the device, today, not disturbing you has become Apple's form of gentleness.

Users see a design only when they need it; otherwise, we hope the design is invisible.

Alan emphasized that Apple's design philosophy is not only about appearance; more important is functionality, and whether a function is good depends on whether users know how to use it the moment they pick it up.

This could easily be read as "lowering the threshold of design," but for Alan it is a deep-rooted Apple principle: design should start from all users.

Apple's design documentation for developers describes the principle of "designing for everyone" in detail and mentions an exercise in empathy:

To design intuitive experiences, you need to understand people’s needs and expectations so you can create content that resonates.

Dimensions of empathy include age, gender and gender identity, race and ethnicity, sexual orientation, physical characteristics, cognitive attributes, disabilities, language and culture, religion, education, political or philosophical views, and social and economic background…

Interestingly, the definition of "disability" covers three kinds: permanent, temporary, and situational.

For example, losing the use of a hand to an injury is a temporary disability, while a person with normal hearing who cannot hear a phone call clearly in a noisy environment has a situational disability.

Perhaps it is precisely because of this inclusive principle that accessibility features have moved from serving a minority to benefiting the general public.

In many cases, we find that the features we develop for people with disabilities are actually of great benefit to ordinary users. Double Tap on Apple Watch is a good example.

I think this is the best footnote to Apple's humanism: design for all users, and do not change it because of how small the proportion is.


