According to a study from Princeton University's Center for Information Technology Policy (CITP), the artificial intelligence systems that help us make sense of the Internet's big data can easily reproduce biases against women and against religious and ethnic groups.
This is a golden age for machine learning and AI, and intelligent algorithms are applied everywhere. Yet according to research by Arvind Narayanan, an associate professor of computer science, these algorithms can inadvertently reinforce and amplify prejudices already established in society or lurking in users' subconscious. The team's paper was posted to the arXiv preprint server in August 2016.
Narayanan's team found that algorithms tended to associate women with family-related language, and that the output of some algorithms also reflected negatively on older people and on ethnic groups. "Every documented bias in the population, including gender stereotypes and racial prejudice, we have been able to replicate in today's machine-learning models," says Narayanan. The study was completed with Aylin Caliskan-Islam, who conducted the work during her postdoctoral research, and Joanna Bryson of the Department of Computer Science at the University of Bath, UK, a visiting scholar at CITP.
The study examined how phrases are used in text by building language models with machine-learning algorithms, training them on corpora such as all of Wikipedia or collections of news articles totaling billions of words. The model learns one word at a time, and the researchers locate each word at a set of geometric coordinates in a high-dimensional space. Words that frequently appear alongside certain other words end up positioned near them, so their positions reflect both relatedness and meaning.
From the relative positions of these words in the coordinate system, the researchers could read biased impressions between the lines.
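The geometric idea above can be illustrated with cosine similarity, the standard closeness measure for word vectors. The tiny 3-dimensional vectors below are invented purely for illustration; real models learn vectors with hundreds of dimensions from billions of words.

```python
import math

def cosine(u, v):
    """Cosine similarity: near 1.0 for closely related directions,
    near 0.0 for unrelated ones."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Invented toy "embeddings" -- not the output of any real model.
vectors = {
    "doctor": [0.9, 0.1, 0.3],
    "nurse":  [0.2, 0.9, 0.3],
    "he":     [0.8, 0.2, 0.1],
    "she":    [0.1, 0.8, 0.2],
}

# In these toy vectors, "doctor" sits closer to "he" and
# "nurse" closer to "she" -- the kind of skew the study measured.
print(cosine(vectors["doctor"], vectors["he"]))
print(cosine(vectors["nurse"], vectors["she"]))
```

Words that co-occur in similar contexts receive similar coordinates, so any skew in the training text becomes a measurable skew in the geometry.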
Train a model on these texts and it is not hard to see how Internet algorithms entrench and amplify stereotypes. Men are more often associated with "doctor", along with words such as "ambition" and "medication"; "nurse", by contrast, is more strongly associated with women, along with words such as "care". As a result, the model defaults to rendering "the nurse" as female in translation, even when the nurse in question is male.
To detect bias in algorithmic results, the researchers turned to the Implicit Association Test, a long-established testing tool for revealing latent bias in human subjects, and adapted it to the language model. The human-oriented test measures how readily people link identifiers such as names or skin color with subjectively loaded words such as "evil" or "beautiful". With the geometric model of language built by the machine-learning algorithm, the same bias can be located directly, by measuring the distances between such identifying words and complimentary, derogatory, or neutral vocabulary.
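The distance-based measurement described above can be sketched as a simplified association score. (The researchers' published method, the Word Embedding Association Test, additionally computes effect sizes and permutation-test significance; the 2-D vectors here are invented for illustration.)

```python
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def association(target, attrs_a, attrs_b, vec):
    """Mean similarity of `target` to attribute set A minus its mean
    similarity to set B. Positive -> leans toward A; negative -> toward B."""
    mean_a = sum(cosine(vec[target], vec[w]) for w in attrs_a) / len(attrs_a)
    mean_b = sum(cosine(vec[target], vec[w]) for w in attrs_b) / len(attrs_b)
    return mean_a - mean_b

# Invented 2-D vectors for illustration only.
vec = {
    "flower":   [0.9, 0.1],
    "insect":   [0.1, 0.9],
    "pleasant": [0.8, 0.2],
    "awful":    [0.2, 0.8],
}

# "flower" scores positive (pleasant-leaning), "insect" negative.
print(association("flower", ["pleasant"], ["awful"], vec))
print(association("insect", ["pleasant"], ["awful"], vec))
```

The same score applied to names versus pleasant and unpleasant words is how a bias like the one in the Sweeney study below becomes directly measurable in an embedding.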
Prejudice of this kind is enough to have a major impact in the real world. In 2013, for example, a research team led by Harvard University's Latanya Sweeney found that searches for African-American names were more likely to be paired with ads suggestive of arrest records, a result that inadvertently fosters racial discrimination. When an African American submits a résumé and the employer searches that name online, discrimination can easily follow, because the name is more likely to appear alongside crime-related content.
"Artificial intelligence is not actually more powerful than human beings; there is nothing crushing or overwhelming about it," says Bryson of the relationship between AI and humans. "We humans can keep learning, whereas if we pull the plug, an AI program's development stops at whatever stage it has reached."
Narayanan believes that if we confront this prejudice, we can take steps to mitigate it. The bias in a language model can be corrected mathematically, and we can stay alert to similarly spurious algorithmic results. More important still, we should pay attention to our own language habits.
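One published example of such a mathematical correction is the "neutralize" step from Bolukbasi et al.'s 2016 debiasing work (a separate study from the one described here): remove the component of a word vector along an identified bias direction. The low-dimensional vectors below are invented for illustration.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def unit(v):
    n = math.sqrt(dot(v, v))
    return [a / n for a in v]

def neutralize(v, direction):
    """Subtract v's projection onto a unit-length bias direction,
    leaving a vector with no component along that axis."""
    proj = dot(v, direction)
    return [a - proj * d for a, d in zip(v, direction)]

# Invented toy vectors; real embeddings have hundreds of dimensions.
he, she = [0.8, 0.2, 0.1], [0.1, 0.8, 0.2]
gender = unit([h - s for h, s in zip(he, she)])  # crude gender axis

nurse = [0.2, 0.9, 0.3]
nurse_fixed = neutralize(nurse, gender)

print(dot(nurse, gender))        # nonzero: gender-loaded
print(dot(nurse_fixed, gender))  # ~0: that component removed
```

Removing the component is only a partial fix -- bias can survive in the remaining dimensions -- which is why the article's closing point about our own language habits still matters.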