
No other choice: Microsoft stops selling AI emotion recognition and related technologies, admitting frankly that "the law cannot keep up with the development of AI"

2022-06-23 10:10:00 CSDN News


Compiled by | Zheng Liyuan

Produced by | CSDN (ID: CSDNnews)

Last week, the rumor that a Google AI had become sentient sparked heated discussion in the industry. Although AI experts at the major companies later came forward to say this was impossible, the story still drew countless eyes to the AI field.

Now it is Microsoft making AI headlines on foreign tech sites: in order to build AI systems more responsibly, Microsoft announced that it will gradually retire public access to AI facial analysis tools that infer emotion, gender, age, and other attributes from facial images.


"The law lags behind the development of AI"

In an official blog post published by Microsoft yesterday, Microsoft's Chief Responsible AI Officer Natasha Crampton announced a 27-page Microsoft Responsible AI Standard: "AI is becoming more and more a part of our lives, and yet our laws are lagging behind. They have not caught up with AI's unique risks or society's needs. So we recognized the need to act, and to try to be responsible for AI systems by design."

According to Crampton, this Responsible AI Standard primarily breaks principles such as "accountability" down into their key enablers, so that there is closer human oversight of who is using the services and where these tools are being applied.

With this goal in mind, Microsoft has introduced two safeguards:

  • From now on, new customers must apply for access to facial recognition.

As of June 21, all new customers must apply for access to the facial recognition features in the Azure Face API, Computer Vision, and Video Indexer, and existing customers have one year to apply for and receive approval, based on their use cases, to continue using the facial recognition services. Note, however, that if an existing customer's application has not been approved by the June 30, 2023 deadline, facial recognition will no longer be available to them.

"By introducing Limited Access, we add an additional layer of scrutiny to the use and deployment of facial recognition, ensuring that these services are used in accordance with Microsoft's Responsible AI Standard and contribute to high-value end-user and societal benefit."

  • Retiring facial analysis capabilities used to infer emotional states and identity attributes.

Microsoft said in the blog post that face detection capabilities (for example, detecting blur, exposure, whether glasses are worn, head pose, and so on) will remain generally available, but the facial analysis capabilities used to infer emotional states and identity attributes will be retired.

"We collaborated with internal and external researchers to understand the limitations and potential benefits of this technology and to work through the tradeoffs." Microsoft pointed out that emotion recognition raises not only privacy concerns but also reliability problems, and that API access to capabilities that predict sensitive attributes opens up many potential avenues for abuse.

To reduce these risks, Microsoft decided to drop support for a generally available Face API system used to infer emotion, gender, age, hairstyle, smile, makeup, and similar attributes: starting June 21, 2022, new customers can no longer detect these attributes, and existing customers retain access only until June 30, 2023.
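To make the policy change concrete, here is a minimal, hypothetical sketch of how a client application might pre-screen its requested face attributes against the retired capabilities before calling the service. The attribute names are taken from the article's lists of retired and retained capabilities; they are illustrative and may not match the Face API's exact attribute identifiers.

```python
# Hypothetical pre-flight check: split requested face attributes into those
# that remain generally available (detection-quality attributes) and those
# retired on June 21, 2022 (emotion and identity inference), per the article.
# Names are illustrative, not the Face API's exact identifiers.

RETIRED_ATTRIBUTES = {"emotion", "gender", "age", "hair", "smile", "makeup"}
RETAINED_ATTRIBUTES = {"blur", "exposure", "glasses", "headPose"}

def filter_face_attributes(requested):
    """Return (allowed, rejected) lists, preserving request order."""
    allowed = [a for a in requested if a in RETAINED_ATTRIBUTES]
    rejected = [a for a in requested if a in RETIRED_ATTRIBUTES]
    return allowed, rejected

allowed, rejected = filter_face_attributes(["headPose", "emotion", "age", "blur"])
print(allowed)   # ['headPose', 'blur']
print(rejected)  # ['emotion', 'age']
```

A real client would then pass only the allowed attributes to the detection endpoint and surface the rejected ones as a policy error rather than a request failure.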

Microsoft added, however, that although these capabilities will no longer be offered to the general public, one of its own products will continue to use them: Seeing AI, an app that uses machine vision to assist people with visual impairments.

Beyond this, Microsoft has placed similar restrictions and reviews on its Custom Neural Voice feature. In essence, the feature lets users create an AI voice from live recordings, but "it is easy to imagine how it could be used to impersonate speakers and deceive listeners."


Is using AI to recognize emotions unreliable?

In fact, long before Microsoft made this decision, many industry experts had been criticizing this kind of AI technology, especially over the past few years as tech companies such as Microsoft, Amazon, IBM, and Google rolled out "emotion recognition algorithms."


Specifically, such algorithms infer how a subject feels from an analysis of the face: frowning or pouting signals anger, wide eyes or raised eyebrows signal fear, and so on. Put that way, doesn't the logic sound a little far-fetched?

To settle the question, five prominent experts in the science of emotion, commissioned by the Association for Psychological Science, spent two years reviewing more than 1,000 studies and concluded in 2019 that emotions are expressed in many different ways, and that one cannot reliably infer how a person feels from a simple set of facial movements.

For example, according to the data published at the time, people frown when angry less than 30% of the time, so frowning is just one of many ways of expressing anger. That also means that more than 70% of the time people do not frown when angry, and conversely, they sometimes frown when they are not angry at all.
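The study's point can be stated in one line of arithmetic: if a frown accompanies anger only about 30% of the time, then a detector that equates "frown" with "anger" necessarily misses the remaining angry moments. The 30% figure below is the one cited in the article.

```python
# Arithmetic behind the study's claim: a frown accompanies anger in only
# ~30% of cases (figure cited in the article), so the complement is the
# share of angry moments a frown-based detector would miss entirely.
p_frown_given_angry = 0.30
p_no_frown_given_angry = 1 - p_frown_given_angry

print(f"{p_no_frown_given_angry:.0%} of angry moments show no frown")
# → 70% of angry moments show no frown
```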

Lisa Feldman Barrett, professor of psychology at Northeastern University, commented on the result: "It means that companies using AI to assess people's emotions this way are misleading consumers. Would you really want an algorithm that is accurate less than 30% of the time used in court, in recruiting, in medical diagnosis, at airports, or in any other such scenario?"

Although Microsoft made no clear response at the time, two years later Natasha Crampton has finally conceded the point in her blog post: "Experts inside and outside the company have highlighted the lack of scientific consensus on the definition of 'emotions,' the challenges in how inferences generalize across use cases, regions, and demographics, and the growing privacy concerns around this type of capability. So we decided to retire all AI systems that claim to infer people's emotional states."

So, what do you think of using AI to recognize facial emotions and other identity attributes?

Reference links:

  • https://www.theverge.com/2022/6/21/23177016/microsoft-retires-emotion-recognition-azure-ai-tool-api

  • https://blogs.microsoft.com/on-the-issues/2022/06/21/microsofts-framework-for-building-ai-systems-responsibly/


Original site

Copyright notice

This article was created by [CSDN information]; please include a link to the original when reposting. Thank you.

https://yzsam.com/2022/174/202206230951366062.html