Google Patent Guesses Facial Expressions Via Eye Tracking

A new patent from Google details a system that uses eye-tracking cameras and AI to ascertain a user’s facial expression from eye imagery alone. It works by capturing what a user’s eyes look like while the user holds a given facial expression, then comparing that image data against a baseline of the user’s neutral eyes to build a profile of that expression. With those profiles in place, the system can determine which expression a given eye image belongs to and output relevant data based on it. The patent envisions the technology in a head-mounted display. It could theoretically be applied to other devices, such as smartphones and laptops, but the utility there would be decidedly limited; those devices’ cameras can see the user’s full face most of the time, so simply observing the expression directly would be easier and cheaper than inferring it from the eyes.
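The patent describes this at the system level rather than prescribing an algorithm, but the core compare-to-neutral idea can be sketched in a few lines. Everything here, from the toy eye_embedding feature extractor to the cosine-similarity match, is an illustrative assumption, not Google’s actual implementation:

```python
import numpy as np

def eye_embedding(eye_image: np.ndarray) -> np.ndarray:
    """Toy stand-in for a feature extractor: flatten and normalize pixels.
    A real system would use a trained model over the eye region."""
    return eye_image.astype(np.float32).ravel() / 255.0

def classify_expression(eye_image, neutral_image, profiles):
    """Match the observed delta-from-neutral against each registered
    expression's learned delta; the closest (by cosine similarity) wins."""
    delta = eye_embedding(eye_image) - eye_embedding(neutral_image)

    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

    return max(profiles, key=lambda name: cosine(delta, profiles[name]))
```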

Background: The training process is somewhat similar to the voice enrollment used for Google Assistant, but with a visual component and repeated over multiple exposures. First, the user’s eyes are photographed while holding a neutral expression. The system then prompts the user to make a given expression and takes another photograph. The two are compared, and machine learning is applied to note the differences. In this way, the program builds up knowledge of both what the user’s neutral face looks like and what a given expression looks like, purely from the perspective of the user’s eyes. According to the patent, this process is repeated for every facial expression the user wishes to have recognized. Even after an expression has been registered, the system continues to learn the nuances of how the user emotes by observing subtle differences across each occurrence of that expression, cross-referencing against other expressions and the neutral baseline to continually improve performance over time.
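A rough sketch of that enrollment loop follows, reusing the hypothetical eye_embedding helper from the sketch above. The capture_eye_image and prompt_user callables stand in for HMD camera and UI hooks the patent does not name, and the averaged delta-from-neutral plus the online refinement step are assumptions about how the described learning might be realized:

```python
import numpy as np

def enroll_user(expressions, capture_eye_image, prompt_user, samples=3):
    """Capture a neutral baseline, then build a per-expression delta profile
    from several prompted exposures (the patent's repeated comparisons)."""
    prompt_user("Hold a neutral expression")
    neutral = eye_embedding(capture_eye_image())
    profiles = {}
    for name in expressions:
        deltas = []
        for _ in range(samples):
            prompt_user(f"Make a {name} expression")
            deltas.append(eye_embedding(capture_eye_image()) - neutral)
        profiles[name] = np.mean(deltas, axis=0)
    return neutral, profiles

def refine_profile(profiles, name, observed_delta, alpha=0.05):
    """Online update mirroring the patent's continual learning: nudge the
    stored profile toward each new confident observation of the expression."""
    profiles[name] = (1 - alpha) * profiles[name] + alpha * observed_delta
```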

Impact: The obvious takeaway here is that Google wants people to be able to use their facial expressions in VR. A VR avatar could reflect the user’s expression, in-app elements like text boxes could change based on the user’s current expression, or games could respond to it, say, making an angry face to intimidate an NPC or smiling at a certain puzzle element to activate it. There are plenty of less obvious use cases as well. Stepping away from the HMD angle, this tech could be used to enhance machines’ understanding of a given face and how it moves, allowing AI programs and even robots to better mimic human facial expressions and mannerisms. It could also be used in consumer-facing applications that adapt based on a user’s mood, as inferred from their expression. For example, when a sad or angry user logs into a Chromebook, Assistant could activate and show them something funny, or tell them something good to lift their mood.
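None of the application hooks below are real Google APIs; this is just an illustration of how thin the glue between a detected expression and a mood-aware response, like the Chromebook example above, could be:

```python
# Hypothetical application layer: map a classified expression to a
# mood-aware action name. All actions here are made up for illustration.
MOOD_RESPONSES = {
    "sad": "queue_funny_video",
    "angry": "offer_good_news",
    "happy": "no_action",
}

def on_expression_detected(expression: str) -> str:
    """Return the (invented) action an app might take for a detected mood."""
    return MOOD_RESPONSES.get(expression, "no_action")
```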
