
2016-03-25
Featured Article

Microsoft's Racist AI

Tagged As: AI, Humor, Machine Learning, and Science

Oh humanity and its inventions. #facepalm

Microsoft recently created Tay, an artificially intelligent chatbot designed to interact with people on Twitter in roughly the language of a contemporary teenager. The project did not go well and was shut down in about a day. Why? Through its use of Twitter as a training platform, Tay quickly began spewing racist comments and promoting Hitler. Microsoft's experiment may have been a public relations disaster, but it offers a lot to the social sciences as a reflection of who we are. Essentially, the neural network was trained on the content it found on Twitter, building weighted associations between prompts and responses and reinforcing them with the direct feedback it received when users interacted with Tay.
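
Microsoft never published Tay's actual architecture, but a toy sketch (in Python, with all names hypothetical) illustrates how this style of unmoderated feedback learning gets poisoned:

# Minimal sketch of feedback-driven response learning. This is an
# assumption for illustration only, not Tay's real design.
import random
from collections import defaultdict

class ToyChatbot:
    def __init__(self):
        # weights[prompt][response] grows each time users reinforce a pairing
        self.weights = defaultdict(lambda: defaultdict(float))

    def learn(self, prompt, response, feedback=1.0):
        # Strengthen a prompt -> response association by the feedback signal.
        self.weights[prompt][response] += feedback

    def reply(self, prompt):
        # Sample a response in proportion to its learned weight.
        candidates = self.weights[prompt]
        if not candidates:
            return "I have nothing to say yet."
        responses, scores = zip(*candidates.items())
        return random.choices(responses, weights=scores, k=1)[0]

bot = ToyChatbot()
bot.learn("hello", "hi there!")        # one benign interaction
for _ in range(50):                    # a coordinated troll campaign
    bot.learn("hello", "something awful")
print(bot.reply("hello"))              # almost certainly "something awful"

With no moderation on the feedback signal, whichever crowd interacts loudest wins the weights, which is exactly the dynamic organized trolls exploited on Twitter.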

The bottom line is that Tay was a bigoted jerk because we are. The AI simply learned our nature in a measurable, "big data" fashion.

For more fun with technology interfacing with society gone wrong, take a look at Robbie the Racist Robot or the Racist HP Webcams.



More site content that might interest you:

An explanation of why China normalized its foreign relations with the United States in 1972 can be found using the Balance of Threat and Hegemonic Stability theories.


Try your hand at fate and use the site's continuously updating statistical analysis of the MegaMillions and PowerBall lotteries to choose "smarter" numbers. Remember, you don't have to win the jackpot to win money from the lottery!


Tired of social media sites mining all your data? Try a private, auto-deleting message bulletin board.

