From the Wall Street Journal, dated 10/14/2016
Algorithms Aren’t Biased, But the People Who Write Them May Be
How well someone drives has little to do with how much they pay for insurance, according to a Consumer Reports analysis. PHOTO: STEVE RINGMAN/THE SEATTLE TIMES VIA ASSOCIATED PRESS
By JO CRAVEN MCGINTY
Oct. 14, 2016 1:35 p.m. ET
A provocative new book called “Weapons of Math Destruction” has inspired some charged headlines. “Math Is Racist,” one asserts. “Math Is Biased Against Women and the Poor,” declares another.
But author Cathy O’Neil’s message is more subtle: Math isn’t biased. People are biased.
Dr. O’Neil, who received her Ph.D. in mathematics from Harvard, is a former Wall Street quant who quit after the housing crash, joined the Occupy Wall Street movement and now publishes the mathbabe blog....
From her perspective as a data scientist and social activist, she has written a book about how algorithms discriminate. The problem isn’t caused by math, she reports. It’s rooted in the biases of people who encode their notions in algorithms and apply them en masse to the public in ways that are largely invisible and therefore difficult to challenge....
In particular, she is concerned about mathematical models that rank or score individuals, institutions or places, often by using proxies to stand in for things the modelers wish to measure but can’t.
Usually you can read pretty much any current WSJ article for free by searching in Google for it, e.g. "WSJ article on Algorithms Aren’t Biased"; you should then be able to pull up the article and read it.
I would enjoy your comments on this article, but only after you have read it.