Android has a number of accessibility features and APIs, some of which have been used, misused, or even abused to open doors to users and power-user features that Google would not normally have allowed. Many of these accessibility features focus on voice control or screen reading, but a new Camera Switches feature in the Android Accessibility Suite extends accessibility to facial expressions, for those who may not be able to speak or operate an external switch device to control their smartphones.
Google had toyed with sensor-based phone control before on the Pixel 4, but that was limited to hand gesture detection powered by Project Soli's radar system. It didn't take off the way Google expected, and it wasn't well suited to accessibility either. The beta version of the Android Accessibility Suite app, however, takes a slightly different route, one that could prove more interesting and more useful for people with motor disabilities.
Called simply "Camera Switches", the feature lets users who cannot speak or use their hands control their phones. It relies on a fixed set of facial expressions that can be mapped to different actions. Rather than requiring a specialized sensor, Camera Switches uses the phone's front-facing camera, which means that any phone could theoretically support the feature.
Camera Switches supports only a limited number of facial expressions: open mouth, smile, raise eyebrows, look left, look right, and look up. These can, however, be used to trigger a number of actions, such as going back, going to the home screen, scrolling forward or backward, or even being mapped to a tap-and-hold gesture.
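Conceptually, the feature boils down to a user-configurable lookup from a detected facial expression to a switch action. The sketch below illustrates that idea in plain Java; all of the names and the particular bindings are illustrative assumptions, not Google's actual implementation.

```java
import java.util.Map;

public class CameraSwitchSketch {
    // Hypothetical names for the expressions Camera Switches recognizes.
    enum Expression { OPEN_MOUTH, SMILE, RAISE_EYEBROWS, LOOK_LEFT, LOOK_RIGHT, LOOK_UP }

    // Hypothetical names for the actions an expression can trigger.
    enum Action { BACK, HOME, SCROLL_FORWARD, SCROLL_BACKWARD, TAP_AND_HOLD }

    // An example user configuration; the real app lets the user pick these.
    static final Map<Expression, Action> BINDINGS = Map.of(
        Expression.OPEN_MOUTH, Action.TAP_AND_HOLD,
        Expression.SMILE, Action.HOME,
        Expression.RAISE_EYEBROWS, Action.BACK,
        Expression.LOOK_LEFT, Action.SCROLL_BACKWARD,
        Expression.LOOK_RIGHT, Action.SCROLL_FORWARD
    );

    // Returns the bound action, or null if the expression is unassigned.
    static Action dispatch(Expression e) {
        return BINDINGS.get(e);
    }

    public static void main(String[] args) {
        System.out.println(dispatch(Expression.SMILE)); // prints HOME
    }
}
```

On a real device this dispatch would live inside an accessibility service, which is the mechanism Android provides for performing system actions such as Back and Home on the user's behalf.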
XDA reports that this Accessibility Suite feature could be headed to Android 12, but it appears to be compatible with at least Android 11 as well. In either case, using Camera Switches will produce a notification in the status bar indicating that the camera is in use.