Artificially intelligent robots and devices can pick up racist, sexist and otherwise prejudiced associations by learning from human language, according to new research.
A massive study of millions of words online looked at how closely different terms were associated with one another in the text – the same “machine learning” approach that automatic translators use to establish what language means.
Some of the results were stunning.
The researchers found male names were more closely associated with career-related terms than female ones, which were more closely associated with words related to the family.
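In studies like this, “closeness” between words is typically measured as cosine similarity between word vectors learned from the text, and a bias shows up when one group of names is, on average, nearer to one set of words than another. The sketch below is a minimal illustration of that idea only; the names, words and tiny made-up vectors are purely hypothetical, not data from the study.

```python
import numpy as np

def cosine(u, v):
    # Cosine similarity: values near 1.0 mean the vectors point in a
    # similar direction in the embedding space, near 0.0 means unrelated.
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Toy 3-dimensional "embeddings" invented for illustration only.
# Real studies use vectors with hundreds of dimensions trained on
# billions of words of online text.
vectors = {
    "john":      np.array([0.9, 0.1, 0.2]),
    "amy":       np.array([0.2, 0.9, 0.1]),
    "executive": np.array([0.8, 0.2, 0.3]),
    "salary":    np.array([0.7, 0.1, 0.4]),
    "home":      np.array([0.1, 0.8, 0.2]),
    "parents":   np.array([0.2, 0.7, 0.1]),
}

career = ["executive", "salary"]
family = ["home", "parents"]

def association(name):
    # Average similarity to career words minus average similarity to
    # family words: a positive score means the name sits closer to
    # career-related terms in this toy space.
    career_sim = np.mean([cosine(vectors[name], vectors[w]) for w in career])
    family_sim = np.mean([cosine(vectors[name], vectors[w]) for w in family])
    return career_sim - family_sim

for name in ["john", "amy"]:
    print(name, round(association(name), 3))
```

With these invented vectors, “john” scores positive (closer to career terms) and “amy” scores negative (closer to family terms), which is the kind of differential association the researchers report finding at scale in real text.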
Continue reading at the Independent