Siri of the future may be able to read your expression

Ever since intelligent voice assistants appeared, most of their functionality has relied on "understanding" spoken language. For example, when I ask Siri, "What's the weather like today?", Siri performs two steps to answer: first it converts the speech into text (displayed on the screen), and then it uses natural language processing to "understand" the key information in the sentence before finally answering.

If either of these two steps goes wrong, the final output suffers. If you use Siri, you have probably had this experience: when you give an ambiguous voice command, Siri is not sure whether it has "understood" you correctly and will generally ask you to confirm. You may need to tap the screen or answer once more to get the result you actually wanted.
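The two-step flow above can be sketched as a toy pipeline. Everything here is a hypothetical illustration for this article, not Apple's actual implementation: the function names, the canned transcription, and the confidence threshold are all assumptions.

```python
# Toy sketch of a two-stage voice-assistant pipeline:
# stage 1 converts speech to text, stage 2 extracts the intent.
# If either stage is uncertain, the assistant asks the user to
# confirm -- the extra tap/answer described in the article.
# All names and values here are hypothetical, not Apple's code.

def transcribe(audio: str) -> tuple[str, float]:
    """Stage 1: speech-to-text. Returns (text, confidence).
    Stubbed with a canned result for illustration."""
    return "what's the weather like today", 0.95

def parse_intent(text: str) -> tuple[str, float]:
    """Stage 2: NLP. Maps key words in the text to an intent."""
    if "weather" in text:
        return "get_weather", 0.9
    return "unknown", 0.2

def handle(audio: str, threshold: float = 0.5) -> str:
    text, asr_conf = transcribe(audio)
    intent, nlu_conf = parse_intent(text)
    # Low confidence at either stage triggers a confirmation prompt.
    if asr_conf < threshold or nlu_conf < threshold:
        return f'Did you mean: "{text}"?'
    return f"Running intent: {intent}"

print(handle("<audio bytes>"))  # Running intent: get_weather
```

The key design point for this article is the confirmation branch: an error in either stage propagates to the final answer, so a real assistant checks confidence before committing to a response.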


In its latest patent, Apple plans to help Siri better understand users’ requests by introducing “facial analysis” features.

Apple recently filed a patent called "Smart Software Agent" that would allow a smart assistant to perform different operations based on the user's facial expression or mood, according to U.S. Patent and Trademark Office documents.

To achieve this, when a user talks to Siri, the front-facing camera automatically activates to act as an "eye," pairing the captured facial images with FACS (the Facial Action Coding System). This helps Siri read the user's expressions and emotions and ultimately give a more accurate answer.

FACS (the Facial Action Coding System) divides the human face, according to its anatomy, into a number of action units that are independent yet interrelated. It analyzes the movement characteristics of these action units, the main facial areas they control, and the expressions associated with them, producing a large catalogue of photographic descriptions. By categorizing a great many real-life human expressions, the system has become the authoritative reference for the facial muscle movements behind expressions and is widely used by psychologists and animators.
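The mapping from action units to expressions can be sketched in a few lines. The action-unit combinations below follow common FACS references (for example, AU6 + AU12 for happiness); the function names and the detection stub are hypothetical, and a real system would of course detect the active units from camera frames rather than receive them as a set.

```python
# Minimal sketch of FACS-style classification: a few well-known
# action units (AUs) and the combinations commonly associated
# with basic expressions. Illustrative only -- not Apple's system.

AU_NAMES = {
    1: "inner brow raiser",
    4: "brow lowerer",
    6: "cheek raiser",
    12: "lip corner puller",
    15: "lip corner depressor",
}

# Common AU combinations for two basic expressions
EXPRESSIONS = {
    frozenset({6, 12}): "happiness",
    frozenset({1, 4, 15}): "sadness",
}

def classify(active_aus: set[int]) -> str:
    """Return the expression whose AU pattern is contained in the
    detected set of active units, or 'neutral' if none matches."""
    for pattern, label in EXPRESSIONS.items():
        if pattern <= active_aus:
            return label
    return "neutral"

print(classify({6, 12}))     # happiness
print(classify({2}))         # neutral
```

In this framing, Siri's hypothetical "facial analysis" step would feed the detected action units into a lookup like this one, and the resulting emotion label would then bias how the assistant interprets or answers the request.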


Apple has been pushing Siri forward for nearly two years, and according to previously leaked Siri files, it plans to bring several new features to Siri in the fall of 2021. The report says Apple is working on future Siri updates, starting with an as-yet-unnamed new device in two years' time, to implement question-and-answer sessions on health issues and a machine-translation feature.


Earlier, there were reports that Apple might launch SiriOS at next year's WWDC. It would be a stand-alone project, equivalent to the development environment of Amazon Alexa's Alexa Skills Kit or Google Assistant's Actions, rather than part of an existing operating system (iOS, macOS, iPadOS). On that basis, the report predicts that SiriOS will let developers build Siri directly into their apps to provide a more powerful custom experience.
