"You received a new message"
Have you ever counted how many taps it takes to reply to a WeChat message?
Swipe up from the bottom to switch apps, open WeChat, and tap the conversation to reply: three operations. Pull down the status bar and tap the WeChat notification: two operations. Or pull down the notification banner itself and reply in the floating window: just one…
Phone makers keep streamlining high-frequency operations like replying to messages, but either way, users still have to hold the phone and tap the screen.
Is it possible to complete these operations without swiping or tapping at all?
A recently updated Google feature lets you control your phone with facial expressions: put your hands down, give the screen a glance, and the phone understands.
Look left, look right
In most cases, tapping the screen directly is indeed an efficient way to interact. But for users with severe motor and speech impairments, touch interaction is not always convenient.
If a finger cannot touch the screen, the user needs a new set of interaction methods, such as the eyes, the eyebrows, or a smile.
Google has introduced a new capability in its latest accessibility suite. Using the phone's front camera to observe the user's face in real time, it can recognize six expressions and map them to different operations: look left, look right, look up, raise eyebrows, smile, and open mouth.
You can find this new feature, called "Camera Switches", under Switch Access in the latest update of the Android Accessibility Suite.
When the feature is on, a smiling-face icon appears in the center of the status bar to remind you that the front camera stays active to detect your facial movements.
Google says all computation happens on-device; the feature does not save image data, nor does it perform identity-verifying facial recognition of the kind used for face unlock.
Note that this is an entirely new set of interactions, with operating logic completely different from touch.
So when using it for the first time, you had best sit upright with the phone held vertically in front of you, make sure the front camera can capture your whole face, and then patiently follow the setup guide to the end (this matters).
In the setup guide, you can map each of the six expressions to an operation: for example, "look right" to "Next", "raise eyebrows" to "Select", and "open mouth" to "Pause or resume recognition"…
These are the three most basic and most important operations. Once you can use them proficiently, you can go on to set up more complex mappings.
After setup, Android automatically identifies all the actionable elements on the screen and highlights one with a small blue box. That box is the heart of Switch Access.
With the "Next" operation (look right) you move the blue box to the element you want, then use "Select" (raise eyebrows) to tap it.
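The scanning interaction described above can be modeled as a small state machine: one gesture advances a highlight through the screen's actionable elements, another activates the highlighted one. The sketch below is a hypothetical illustration; the class, gesture names, and mapping are invented for this article and are not Google's actual Switch Access code.

```python
# Hypothetical gesture-to-action mapping, mirroring the article's examples.
GESTURE_ACTIONS = {
    "look_right": "next",        # move the blue box to the next element
    "raise_eyebrows": "select",  # "tap" the highlighted element
    "open_mouth": "pause",       # pause/resume gesture recognition
}

class SwitchScanner:
    """Toy model of a scanning highlight moving over screen elements."""

    def __init__(self, elements):
        self.elements = elements  # actionable on-screen elements
        self.index = 0            # which element the blue box is on
        self.paused = False

    def on_gesture(self, gesture):
        action = GESTURE_ACTIONS.get(gesture)
        if action == "pause":
            self.paused = not self.paused   # toggle recognition
            return None
        if self.paused or action is None:
            return None                      # ignore while paused / unmapped
        if action == "next":
            self.index = (self.index + 1) % len(self.elements)
            return None
        if action == "select":
            return self.elements[self.index]  # element to activate

scanner = SwitchScanner(["Back", "Search", "Play"])
scanner.on_gesture("look_right")                      # box moves to "Search"
assert scanner.on_gesture("raise_eyebrows") == "Search"
```

The round-robin `next` plus a single `select` is what makes the scheme workable with so few gestures: any element can be reached, just not in one step.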
Of course, the actual experience is messier than the description. During my test, face detection occasionally failed or misfired, interrupting the operation.
If you cannot control your expressions precisely, misrecognition can also make the phone do things at random. Operating a phone with your face is not easy.
But when, after five minutes in a video app, I finally managed to start a movie, I felt a genuine sense of accomplishment, and I was moved: not only because I had pulled off this feat with a nearly cramping face, but because it means many users with limited mobility can finally use their phones to do more things of their own volition.
Google says Camera Switches is not exclusive to the newest system version: any Android device running Android 6.0 or above can install the Android Accessibility Suite and use it. The convenience of the mobile internet will thus reach more users.
In Google's demo video, users operate Camera Switches to play music, order food, and even navigate on their phones. These tasks were nearly impossible for them before; now they can finally become part of daily life.
The significance of Camera Switches goes beyond an interaction breakthrough. It may bring a whole new way of life to this group.
Reading expressions, understanding you better
Besides Camera Switches, Google simultaneously launched another expression-controlled application: Project Activate.
Through extensive research, Google found that communication barriers are common among people with motor and speech impairments. They often have to press specific letters one by one and have a computer speak for them in order to communicate with a caregiver, which makes conversation slow and difficult.
To make communication easier for both sides, Google built the Project Activate app, which lets users convey more information with simple facial expressions.
Like Camera Switches, Project Activate recognizes six facial movements: look left, look right, look up, raise eyebrows, open mouth, and smile.
Users can assign a shortcut command to each movement, such as sending a text message, making a call, or playing a specific audio clip.
For example, "look right" can play the audio "yes", "look left" can play "no", "smile" can text "please come over" to a caregiver's phone, and "raise eyebrows" can play a cheerful sound.
Compared with physical buttons, conveying information through facial expressions is faster, but it is also easier to trigger by accident: an inadvertent glance to the side, and the phone may send the wrong instruction to the caregiver.
Google's accessibility team anticipated this. To avoid such slips, users can turn on two-step confirmation or adjust the recognition sensitivity.
The latter is straightforward: users can lengthen how long the phone must see an expression before recognizing it, so that brief, unintended expression changes do not trigger anything.
Two-step confirmation requires the user to make the same movement twice in quick succession before the preset shortcut fires.
If a user triggers a movement by accident, say "look left", they can simply hold still and wait; once the confirmation window times out, the command is cancelled, avoiding an unnecessary misunderstanding.
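The two safeguards just described, a minimum hold duration (the sensitivity setting) and a repeat-within-a-window confirmation with timeout cancellation, can be sketched as a simple filter. Everything here is a hypothetical illustration: the class name, parameters, and thresholds are invented for this article, not taken from Project Activate.

```python
class GestureFilter:
    """Toy debounce/confirmation filter for facial-gesture commands."""

    def __init__(self, hold_s=1.0, confirm=False, confirm_window_s=2.0):
        self.hold_s = hold_s                    # minimum hold duration (sensitivity)
        self.confirm = confirm                  # require the gesture twice?
        self.confirm_window_s = confirm_window_s
        self._pending = {}                      # gesture -> time of first trigger

    def accept(self, gesture, held_for, now):
        """Return True only when the gesture should fire its command."""
        if held_for < self.hold_s:
            return False                        # too brief: likely unintentional
        if not self.confirm:
            return True
        first = self._pending.get(gesture)
        if first is not None and now - first <= self.confirm_window_s:
            del self._pending[gesture]
            return True                         # second occurrence in time: fire
        # First occurrence (or the window expired, cancelling the old one):
        # record it and wait for the repeat.
        self._pending[gesture] = now
        return False

f = GestureFilter(hold_s=1.0, confirm=True, confirm_window_s=2.0)
assert f.accept("look_left", held_for=0.3, now=0.0) is False  # glance ignored
assert f.accept("look_left", held_for=1.2, now=1.0) is False  # first of two
assert f.accept("look_left", held_for=1.2, now=2.5) is True   # confirmed
```

Note that an accidental first trigger simply goes stale: if the user holds still past the window, the pending entry is overwritten on the next attempt, which models the timeout cancellation described above.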
With Project Activate, users can not only express themselves more easily, but also take part in social life: when the team scores, they can "raise eyebrows" and cheer along; when someone shows kindness, they can "open mouth" to express thanks; when a holiday comes, they can send greetings with a "smile"…
Google says Project Activate ships as a standalone app to help more people overcome communication barriers; it is already available on the Play Store.
Build a more accessible world
For a long time, people living with ALS, muscular dystrophy, cerebral palsy, or multiple sclerosis have been overlooked by the public. Few pay attention to how they live, entertain themselves, or go online with a motor impairment. Yet embracing the internet like everyone else should be one of their rights.
Today, mobile devices have become the gateway to the digital world. With a phone's help, the lives of these groups gain many more possibilities: they can see a wider world as viewers, or be seen by more people as creators.
One of Google's missions is to organize the world's information and make it universally accessible and useful.
"There are approximately 1 billion people with disabilities in the world. Achieving this mission means we need to use as many devices as possible, like the smartphone in your pocket, to make both the digital world and the real world more accessible."
So said Casey Burkhardt, a software engineer on Google's accessibility team, in an interview with Forbes.
Burkhardt said that when the Google team works on accessibility, it considers not only how many people a disability affects, but also how much a feature can improve their lives.
The arrival of Camera Switches and Project Activate turns these "special groups" back into "ordinary users": they can browse, communicate, consume, and even create like anyone else. That return to equality is the greatest significance of accessibility features.
For now, both features remain rough around the edges: front-camera positions vary across Android phones, the accuracy of expression recognition needs work, the features are buried too deep for ordinary users to find, and setup and debugging still depend on someone else's help.
These problems will take time to solve. Fortunately, this group has not been left behind by fast-moving technology; they deserve this attention.