Victor Magalhães

Supposedly 'Fair' Algorithms Can Perpetuate Discrimination - Joi Ito's Web

Today's computer scientists are in many ways more sophisticated than the actuaries of yore, and they are often sincerely trying to build fair algorithms. The recent literature on algorithmic fairness usually doesn't simply equate fairness with accuracy; instead, it defines various trade-offs between fairness and accuracy. The problem is that fairness cannot be reduced to a simple, self-contained mathematical definition: fairness is dynamic and social, not a statistical issue. It can never be fully achieved, and in a democracy it must be constantly audited, adapted, and debated. By merely relying on historical data and current definitions of fairness, we will lock in the accumulated unfairness of the past, and our algorithms and the products they support will always trail social norms, reflecting past norms rather than future ideals and slowing social progress rather than supporting it.
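The claim that fairness cannot be captured by one self-contained mathematical definition can be made concrete with a small sketch (mine, not from the article, with entirely hypothetical data): on the very same predictions, two common statistical fairness criteria, demographic parity and equal opportunity, disagree whenever the groups have different base rates in the historical data.

```python
# Hypothetical sketch: two standard statistical "fairness" definitions,
# evaluated on the same predictions, reach opposite verdicts.

def demographic_parity_gap(preds, groups):
    """Difference in positive-prediction rates between group 1 and group 0."""
    rate = lambda g: sum(p for p, grp in zip(preds, groups) if grp == g) / groups.count(g)
    return rate(1) - rate(0)

def equal_opportunity_gap(preds, labels, groups):
    """Difference in true-positive rates between group 1 and group 0."""
    def tpr(g):
        pos = [p for p, y, grp in zip(preds, labels, groups) if grp == g and y == 1]
        return sum(pos) / len(pos)
    return tpr(1) - tpr(0)

# Hypothetical data: the two groups have different base rates (6/8 vs 2/8
# positives), as historical records often do.
groups = [0] * 8 + [1] * 8
labels = [1, 1, 1, 1, 1, 1, 0, 0] + [1, 1, 0, 0, 0, 0, 0, 0]
preds = labels[:]  # a classifier that reproduces the labels exactly

print(demographic_parity_gap(preds, groups))         # -0.5: "unfair" by parity
print(equal_opportunity_gap(preds, labels, groups))  #  0.0: "fair" by opportunity
```

Satisfying one criterion here means violating the other, so which definition to enforce is a social and political choice, not something the mathematics alone can settle.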