MIT pulls massive AI dataset over racist, misogynistic content


MIT has permanently taken down its 80 Million Tiny Images dataset, a popular image database used to train machine learning systems to identify people and objects, because it labeled photos with racist, misogynistic, and other offensive terms.

