When a machine moderates content, it evaluates text and images as data, using an algorithm trained on existing data sets. The process for selecting that training data has come under fire because it has been shown to introduce racial, gender, and other biases.
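To make the mechanism concrete, here is a minimal, purely illustrative sketch of how a text-moderation classifier picks up patterns from labeled training data, and how skewed labels carry bias straight into its decisions. It is not any platform's real system; the training examples, labels, and function names are all invented for illustration.

```python
# Hypothetical sketch: a toy word-count "moderation model" trained on labeled posts.
# If human labels were biased, the model reproduces that bias at scale.
from collections import Counter

def train(examples):
    """Count how often each word appears in posts labeled 'remove' vs 'keep'."""
    counts = {"remove": Counter(), "keep": Counter()}
    for text, label in examples:
        counts[label].update(text.lower().split())
    return counts

def score(counts, text):
    """Crude per-word vote: words seen more often in removed posts flag the text."""
    words = text.lower().split()
    remove_votes = sum(counts["remove"][w] for w in words)
    keep_votes = sum(counts["keep"][w] for w in words)
    return "remove" if remove_votes > keep_votes else "keep"

# Invented training set: moderators disproportionately removed posts using
# certain dialect terms, so the model inherits that pattern.
training_data = [
    ("yall come to the cookout", "remove"),    # biased historical label
    ("yall stay safe tonight", "remove"),      # biased historical label
    ("everyone come to the barbecue", "keep"),
    ("everyone stay safe tonight", "keep"),
]

model = train(training_data)
print(score(model, "yall have a great day"))      # -> "remove": bias inherited
print(score(model, "everyone have a great day"))  # -> "keep"
```

The point of the sketch is that nothing in the algorithm itself mentions race or dialect; the disparity comes entirely from which posts were labeled for removal in the training set.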

  • PhilipJFry@lemmy.world · 1 year ago

    Having largely undisclosed, privately owned platforms and algorithms dictate our cultural exchange, the spread of news, the topics of discourse, and other societally important interactions is such a horrible idea. I wish this were more obvious to the public so that governments would put an end to it. It divides societies, poisons public discourse, and skews it toward racist bias and hatred.