The Scariest Ways Big Data is Used Today

In this post, I look at some of the scary ways big data can be used. The post first appeared in my column for Data Informed.

Like any information, data — big or small — can be used for good or for ill. There’s nothing inherently evil about data; it’s what people choose to do with it that can be beneficial or harmful. 

But big data has some inherent problems that need to be addressed. First is privacy. Because we’ve never been able to collect or analyse data on this scale before, we’ve never had to come up with rules governing privacy around things like your shopping habits, Internet browsing habits, or even walking habits if you’re using a health tracker. Privacy laws and regulations haven’t caught up with the data-collecting capabilities of big data.

Second, any data set is full of hidden biases. Humans create the data sets, we decide what should be included and excluded, and then we interpret those data sets. Many people believe that “data” is infallible; numbers can’t be fudged, right? But that’s simply not the case. One example is correlating student test scores with teacher performance. Test scores measure how well students answered the questions posed to them; we must infer from that whether or not the teacher was effective, and that’s where things get muddy.

Finally, as statistician Nate Silver says, “We love to predict things — and we aren’t very good at it.” Much has been made of Google’s ability to predict flu outbreaks based on search traffic, but the model failed in 2014. Google itself admits that just because you search for “flu symptoms,” it doesn’t mean you’re sick.

5 Scary Uses of Big Data Happening Today

This isn’t all the stuff of science fiction or futurism, either. Because the technology for big data is advancing so rapidly, rules, regulations and best practices can’t keep up. Here are a few examples I’ve seen where big data can feel a little more like Big Brother: 

  1. Predictive policing. In February 2014, the Chicago Police Department sent uniformed officers to make “custom notification” visits to individuals on a computer-generated list of people identified as likely to commit a crime. The idea was to prevent crime by telling certain individuals about job training programs, or warning them about increased penalties for people with certain backgrounds. But many community groups cried foul and called the practice profiling. Just one step towards Minority Report? Time will tell.
  2. Hiring algorithms. More and more companies are turning to computerized learning systems to filter and hire job applicants — especially for lower-wage, service-sector jobs. These algorithms may be putting jobs out of reach for some applicants, even though they are qualified and want to work. For example, some of these algorithms have found that, statistically, people with shorter commutes are more likely to stay in a job longer, so the application asks, “How long is your commute?” Applicants who have longer commutes, less reliable transportation (using public transportation instead of their own car, for example), or who haven’t been at their address for long will be scored lower for the job. Statistically, these considerations may all be accurate, but are they fair? (A sketch of how this kind of scoring can work appears after this list.)
  3. Marketers target vulnerable individuals. Data brokers have begun selling reports that specifically highlight and target financially vulnerable individuals. For example, a data broker might provide a report on retirees with little or no savings to a company selling reverse mortgages, high-cost loans, or other financially risky products. Data brokers have been around for decades, but the quality and amount of data they can collect, and the clarity with which they can analyse it, have never been greater. Very few rules or regulations exist to prevent the targeting of vulnerable groups.
  4. Snapshot insurance may put you in the wrong category. Since 2011, car insurance companies like Progressive have offered a small device you can install in your car to analyse your driving habits — and, hopefully, get you a better rate. But some of the criteria for these lower rates are inherently discriminatory. For example, insurance companies like drivers who stay off the roads late at night and don’t spend much time in their cars, but poorer people are more likely to work the late shift and to commute further to work — both strikes against them when it comes to calculating their rates.
  5. Walmart and Target determine your life insurance rates. OK, not directly, but Deloitte has developed an algorithm, based on “non-traditional third-party sources,” that can predict your life expectancy from your buying habits. Deloitte claims it can accurately predict whether an individual has any one of 17 diseases, including diabetes, female cancer, tobacco-related cancer, cardiovascular disease, and depression, by analysing their buying habits. So be careful what you pick up at the drugstore — it might mean your insurance rates increase.
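
To make the scoring logic in examples 2 and 4 concrete, here is a minimal sketch in Python of how a feature-based screening score can work. Every feature, weight, and threshold below is invented for illustration (the applicant_score function is hypothetical, not any real vendor's model), but it shows how inputs that are statistically predictive in aggregate can double as proxies for income and neighbourhood.

```python
# Hypothetical applicant-screening score. All features and weights are
# invented for illustration; this is not any real vendor's algorithm.

def applicant_score(commute_minutes: float,
                    owns_car: bool,
                    years_at_address: float) -> float:
    """Return a 0-100 'retention' score from proxy features.

    Each feature may genuinely correlate with how long someone stays in
    a job, yet each one is also a proxy for income and neighbourhood,
    which is how a statistically 'accurate' model can still be unfair.
    """
    score = 80.0
    score -= min(commute_minutes, 60) * 0.5   # long commute: predicted churn
    if not owns_car:
        score -= 10.0                         # public transport penalised
    score += min(years_at_address, 5) * 2.0   # residential stability rewarded
    return max(0.0, min(100.0, score))

# Two applicants with identical qualifications diverge purely on
# circumstances unrelated to their ability to do the job:
print(applicant_score(commute_minutes=10, owns_car=True, years_at_address=5))   # 85.0
print(applicant_score(commute_minutes=55, owns_car=False, years_at_address=1))  # 44.5
```

The same pattern drives the telematics and buying-habits examples: the inputs are cheap to measure and predictive on average, but they systematically penalise groups such as late-shift workers with long commutes.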

Have you heard of any uses of big data that you find scary? I’d love to hear your examples in the comments below. (Hat tip to the excellent report “Civil Rights, Big Data, and Our Algorithmic Future” for many of these examples.)

Thank you very much for reading my posts. Here at LinkedIn and at Forbes I regularly write about management, technology and the mega-trend that is Big Data. If you would like to read my regular posts then please click 'Follow' and feel free to also connect via Twitter, Facebook and The Advanced Performance Institute.

About: Bernard Marr is a globally recognized expert in big data, analytics and enterprise performance. He helps companies improve decision-making and performance using data. His new book is 'Big Data: Using Smart Big Data, Analytics and Metrics To Make Better Decisions and Improve Performance'.

You can read a free sample chapter here.

Photo: Shutterstock.com

Matthew Camp

Ignorance is a privilege you can only claim after you have put in all of the work.

Good read. Really makes you think.

Bhojraj Parmar

Service to Humanity | Cyber Security | Public Speaking

This is great, Bernard Marr. Policy makers and corporates are defining the consumers of data analytics at this stage, and if privacy concerns are not addressed and checkpoints are not put in place now, imperfection today becomes the default state of tomorrow; we humans have a tendency to accept the norm. I hope we collectively spend more energy on what is positive about Big Data while being mindful of the things that can go wrong, and create checkpoints to prevent the risks. Your article works well in highlighting some of those risks. I also read through one of your older posts, Big Data: The Predictions For 2015, and enjoyed it. I'd be interested to know whether you will collate a list of the amazing things Big Data has done and will do for society and public services; in other words, the positive impact it has made or could make on the general population's life, if used in constructive ways.

Benjamin Thomas

Business professional with a passion for analytics

In light of this abuse of data and privacy, I think we should add more noise to the signal.

Lana Maskell

z/OS MVS Mainframe Software Developer, Architect, Support

Here's my question: if Deloitte can predict illnesses based on analysis of purchasing patterns, wouldn't there be a medical application there? Instead of just raising someone's health and life insurance rates, wouldn't it be an idea to notify the customer about the medical danger they think they're seeing? As in: Dear Mr Smith, we noticed you're buying a lot of Twinkies and lollipops lately; have you considered getting tested for diabetes? Early detection can improve outcomes, and that helps keep insurance rates down... Just sayin'. It doesn't always have to be the dark side.
