Reddit Reddit reviews Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy

We found 26 Reddit comments about Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Here are the top ones, ranked by their Reddit score.

Business & Money
Books
Business Education & Reference
Business Statistics
Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy
Crown Publishing Group NY
Check price on Amazon

26 Reddit comments about Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy:

u/OrelHazard · 23 pointsr/chicago

The information feeding these rankings is unscientific and lacks any grounding in statistical analysis. There is no measurement of learning, nor of much of the actual student experience. Instead there is lots of measurement of easy-to-collect but far less important information, such as the percentage of alumni who donate money or the opinions of college administrators collected in surveys.

In fact, when the US News study began, it was a profile in weak study design, producing worthless yet popularly quoted results: all it did was survey college presidents. That this obviously empty practice is what gave us the US News college rankings in the first place should make everyone slow their roll about this "authoritative" study.

Source: Weapons of Math Destruction, Cathy O'Neil, Crown.

u/race_bannon · 22 pointsr/technology

> make his claims but not provide source code showing how a bias could be hidden in an algorithm without it being immediately obvious to many coders at google

Because with machine learning and AI, even the developers don't understand how the decisions are made.

You should read Weapons of Math Destruction by Cathy O'Neil, which goes into how biased training data, programmers, etc., can result in biased algorithms. It's pretty fascinating.

u/richiebful · 13 pointsr/Futurology

Honestly, the more acute danger is shitty pattern matching. Machine learning models applied to targeted policing lead to more people of color getting locked up, for example. Live in a zip code with a lot of delinquent borrowers? You have to pay a higher mortgage rate. Weapons of Math Destruction explains this really well.
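
The zip-code point can be sketched in a few lines. This is a toy illustration, not anything from the book: all names, zip codes, and numbers here are invented, and real mortgage pricing is far more complex. The point is just that a model which never sees a protected attribute can still penalize people through a correlated proxy feature.

```python
# Hypothetical illustration: a pricing model that never looks at race or
# income directly, but uses zip code as a feature. Because zip codes
# correlate with demographics, the "neutral" model reproduces the disparity.

BASE_RATE = 4.0  # baseline mortgage rate, percent (invented number)

# Invented data: historical delinquency rate by zip code.
DELINQUENCY_BY_ZIP = {
    "60601": 0.02,  # low-delinquency area
    "60624": 0.12,  # high-delinquency area
}

def quoted_rate(credit_score: int, zip_code: str) -> float:
    """Quote a rate from an individual score plus a zip-level penalty."""
    individual_adj = (700 - credit_score) * 0.005
    zip_penalty = DELINQUENCY_BY_ZIP[zip_code] * 10  # neighborhood surcharge
    return round(BASE_RATE + individual_adj + zip_penalty, 2)

# Two borrowers with identical credit histories...
a = quoted_rate(700, "60601")
b = quoted_rate(700, "60624")
print(a, b)  # the second borrower pays more purely for where they live
```

The borrower in the high-delinquency zip code is quoted a higher rate even though their individual credit score is identical, which is exactly the proxy effect the comment describes.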

u/moraisaf · 8 pointsr/MachineLearning

I liked this one: Weapons of Math Destruction.

u/DevFRus · 6 pointsr/pbsideachannel

With mathwashing and related discussions of algorithmic bias, you guys have scratched the surface of an amazing discussion on bias and the ethics of Big Data. Cathy O'Neil is an awesome writer to follow on this topic. Just last week she released a new book, Weapons of Math Destruction, which discusses how algorithms are used to oppress and marginalize people throughout their lives under the guise of 'objectivity'. Here is a link if you want a quick review; there are countless others.

I'd love to hear more from Mike on this topic and the injustices perpetuated by algorithms for the sake of efficiency.

u/doodcool612 · 5 pointsr/law

There's a great book called Weapons of Math Destruction. If you're interested in these kinds of problems, this is a quick resource to get up to speed.

u/draka1 · 4 pointsr/datascience

I highly recommend Weapons of Math Destruction to understand the impact of data science applied in the wrong way:
https://www.amazon.com/Weapons-Math-Destruction-Increases-Inequality/dp/0553418815

u/Ezili · 4 pointsr/userexperience

I wouldn't describe any of those situations as unethical per se. Bad business decisions, yes, but it's not inherently unethical for a company to make a bad product due to crappy or poorly run research.

That being said, if you worked for a government agency with a duty of care, then perhaps.
Or if you were conducting research to form the basis of an algorithm that could have a social impact, such as approving mortgage loans, and were pressured to do an incomplete job that might harm, for example, a particular minority group. But by and large, doing bad research is just bad business.

You might be interested in a book called Weapons of Math Destruction, which investigates how algorithms and other models used by businesses and governments can have social impact, although it's less about user research specifically and more about poor or limited research in general.

u/svenhof · 3 pointsr/datascience

Good list of books.

I've also heard good things about Weapons of Math Destruction written by one of the authors of Doing Data Science. Haven't read it myself though.

https://www.amazon.com/Weapons-Math-Destruction-Increases-Inequality/dp/0553418815

u/velos · 3 pointsr/datascience

Don't want to play devil's advocate here, but I think everyone interested in getting into this field should read the book Weapons of Math Destruction by Cathy O'Neil.
What 'good' DS can do has been well promoted everywhere... what 'disaster' it can bring, few want to shine a spotlight on... Pursue this field knowing both its light and dark sides...

u/thequeensucorgi · 3 pointsr/onguardforthee

You're amazingly optimistic, I'll give you that.

I know I am not eloquent enough (or even picking the right arguments here) to convince you.

I encourage you to read Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy by Cathy O'Neil. She details a lot of the ways that the "data" Big Tech gathers to help governments ends up doing a lot of harm. She's way smarter than I am (and it's a really good book).

u/Yeti_Not · 2 pointsr/ChapoTrapHouse

IIRC, they already tried this in New Orleans, and before that, with the military (!).

Speaking as someone who has heard a bit about this sort of thing IRL, I'll make a couple of points about this:

(1) It's about the datasets as much as anything else. Ghouls like Thiel will roll out these half-baked products and 'work with' local authorities, and in the process get access to loads of sensitive data to train their technology on. For people working in AI, these sorts of mass public datasets are extremely valuable. So, this is not only about racism, but about privacy and the enclosure of the intellectual commons.

(2) If we don't get proactive, the end result of this sort of thing (also being trialled in e.g. university admissions, and potentially in all sorts of stuff) will be a crappy version of Gattaca. As with eugenics, the technology doesn't actually have to be premised on truth to work as a technology of control.

(3) I really should get around to reading this book.

u/mcdowellag · 2 pointsr/ukpolitics

There is a book which is kicking up a lot of fuss about this - "Weapons of Math Destruction" https://www.amazon.co.uk/Weapons-Math-Destruction-Increases-Inequality/dp/0553418815. I haven't read the book, but I have heard various comments on it by its author and others. E.g. http://www.econtalk.org/archives/2016/10/cathy_oneil_on_1.html

Some of this has been argued about before - if you run the numbers and find that women have fewer car accidents than men, is it OK to charge them less for car insurance?

Some of this boils down to "what this company is doing offends my political principles. That can't be right!"

If you really decided that the government had the right to regulate algorithmic decisions by companies, and that doing so made sense, there is still no requirement for the government to vet particular algorithms. Instead you could have a government algorithm that monitored the company algorithm. One obvious way of doing this would be with some sort of quota system: our insurance company would be free to set different insurance rates for different women drivers as it chose, according to some trade-secret formula, as long as the average female driver was charged the same as the average male driver. So in this particular case of equality, female drivers would subsidize male drivers. Come to think of it, I wonder if insurance companies would advertise more in women's magazines; there have got to be lots of clever ways to game this particular system.
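
The "monitor the averages, not the formula" idea above can be sketched in a few lines. This is purely illustrative of the commenter's hypothetical, with invented function names and numbers; no real regulator works this way.

```python
# Hypothetical sketch of the "quota" idea: the regulator never inspects the
# company's secret pricing formula, it only checks that the average premium
# charged to women matches the average charged to men, within a tolerance.

def group_averages(quotes):
    """quotes: list of (gender, premium). Returns {gender: average premium}."""
    totals, counts = {}, {}
    for gender, premium in quotes:
        totals[gender] = totals.get(gender, 0.0) + premium
        counts[gender] = counts.get(gender, 0) + 1
    return {g: totals[g] / counts[g] for g in totals}

def passes_parity(quotes, tolerance=0.01):
    """Regulator's check: group averages may differ by at most `tolerance`."""
    avg = group_averages(quotes)
    return abs(avg["F"] - avg["M"]) <= tolerance

# The insurer prices individuals however it likes (invented premiums)...
quotes = [("F", 480.0), ("F", 520.0), ("M", 450.0), ("M", 550.0)]
print(passes_parity(quotes))  # True: both groups average 500
```

Note how this also makes the gaming opportunity concrete: the insurer can shift who it attracts (e.g. via targeted advertising) so the group averages balance while individual prices stay whatever the secret formula says.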

As far as I know there is no link between this sort of concern and concerns about things like the safety of automated cars. Safety-critical software is a very specialized and extraordinarily expensive area, because it is enormously difficult to guarantee that software doesn't have dangerous bugs. I think the concern here is that the software is working properly, in that it takes decisions that are competitive in whatever the company's market is, but somebody has an objection to whatever that winning strategy turns out to be.

u/Booie2k1 · 2 pointsr/datascience

This was an interesting and thought provoking read. Not too long either.

Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy https://www.amazon.co.uk/dp/0553418815/ref=cm_sw_r_cp_apa_pp5UBbY4DGFYA

u/linusrauling · 2 pointsr/math

>The majority of adults today, even highly educated, do not know basic math.

Yes.

>they rely heavily on technology to do easy calculations

I don't think reliance on technology is the issue.


>they do not understand basic statistics.

Yes, but most adults have never seen a course in basic statistics/probability so this is to be expected.

> Do you think this is an issue?

Absolutely.

> Do you think this affects the society as a whole?

Without a doubt. For a little slice of this, check out Weapons of Math Destruction, or for an explanation of how Republicans are able to maintain their grip on Congress, see Gerrymandering.


u/Neltsun · 1 pointr/GiftIdeas

You can try an alternative like FIXD. They work in Canada.

Also, Weapons of Math Destruction is a great read for anyone who loves or works with large sets of data.

u/puppy_and_puppy · 1 pointr/MensLib

Weird how I just finished the book Designing Data-Intensive Applications, and it ended with a section on ethics in computer science/big data that ties into this article really well. I'll add some of the sources from that section of the book here if people are curious. Cathy's book is in there, too.

u/itacirgabral · 1 pointr/brasil

>In theory, in a logical and impartial way. It would coldly and calculatingly analyze all the positive and negative aspects of each candidate

Don't trust algorithms too much; they are made by humans. They reproduce the same system of values, only 100% automatically and systematically. And they start from a training set that is not ideal.

"Algorithms are weapons of math destruction"

https://youtu.be/_2u_eHHzRto

Besides, politics is on another plane; it is not a technical question of optimization. For example, the dismantling of education is a power project, not an accident of bad management. If the ruling class's goal is to exploit and extract wealth, the problem is the system itself, not its architecture.

Do we want a police force that isn't corrupt?

https://youtu.be/2NYtJ9LrXhk

former Rio de Janeiro Civil Police chief Hélio Luz

u/TheBlackUnicorn · 1 pointr/AntiFacebook

>“There is a tendency to want to see AI as a neutral moral authority,” Riedl told BuzzFeed News. “However, we also know that human biases can creep into data sets and algorithms. Algorithms can be wrong, and there needs to be recourse.” Human biases can get coded into the AI, and uniformly applied across users of different backgrounds, in different countries with different cultures, and across wildly different contexts.

This is the Garbage-In-Garbage-Out problem. For more on this check out this book:

https://www.amazon.com/Weapons-Math-Destruction-Increases-Inequality/dp/0553418815
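
A minimal sketch of the garbage-in-garbage-out point, with invented data (not from the article or the book): a "learner" that just memorizes the majority label per group will faithfully reproduce whatever bias is baked into its training labels.

```python
from collections import Counter, defaultdict

def train_majority(examples):
    """examples: list of (group, label). Learns the majority label per group."""
    by_group = defaultdict(Counter)
    for group, label in examples:
        by_group[group][label] += 1
    # Predict whatever label was most common for each group historically.
    return {g: c.most_common(1)[0][0] for g, c in by_group.items()}

# Historical decisions that were themselves biased against group "B"...
history = [("A", "approve")] * 9 + [("A", "deny")] * 1 + \
          [("B", "approve")] * 1 + [("B", "deny")] * 9
model = train_majority(history)
print(model)  # {'A': 'approve', 'B': 'deny'}: the old bias is now "objective"
```

Nothing in the training code mentions bias; it simply compresses the labels it was given, which is exactly why a biased dataset yields a biased yet superficially neutral model.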

TL;DR of this article

• It's going to take a long time for this to be of any use at all.

• AI can totally be biased anyway.

• Even worse, AI APPEARS to be neutral but is in fact biased by its human creators.

• AI got us into this mess in the first place.

I'd like to add an additional issue: the same powerful AI tools that Facebook may one day have access to in order to clear up fake news on the platform will ALSO be used by the powerful nation-state actors that are trying to make fake news and bot posts go viral on Facebook.

u/a1studmuffin · 1 pointr/australia

This whole mess-up is a textbook "weapon of math destruction"... for anyone looking to learn more about how big data is making the wealth gap wider, this book is a great read.

u/cavedave · 0 pointsr/MachineLearning

This isn't a job posting. I am posting this for discussion, prompted by a website I have no connection with.

Firstly, these are interesting ideas and seem ideal for blockchain-based business models.

Secondly, I think the question at the end about whether these suit men or women is a good one.

Thirdly, on a Weapons of Math Destruction level, what does it mean to do jobs affecting people's lives that involve only maths and not meeting the people?

I posted this to start a discussion about these particular ideas and the concept of interaction-free jobs, and I'd like to hear your opinion.

u/thisaccountmaybemine · -9 pointsr/newzealand

I disagree - we need transparent and honest systems. If we're going to use racial / gender discrimination, we shouldn't be able to hide behind 'big data'.

> Mr Murray denied it was racial profiling and said immigrants' gender, age and the type of visa they whītiki would all be fed into the data sets.

If you don't think this guy is either a liar or an idiot, I recommend reading Weapons of Math Destruction (or reading about it).