From the course: Ethics and Law in Data Analytics
Privacy law and analytics
- The sources of US privacy law are identified here on the screen for you. We've talked a little about the US Constitution, and we're going to talk more about statutory law, which comes up mostly in a business context; we'll cover that in module three. For purposes of this video, and in module two, where we're talking about the relationship between the individual, society, data analytics, and artificial intelligence, we're going to focus on the common law source of privacy law in the US, which is tort law. Tort law is law that protects us from emotional, financial, and physical harm. It is state law, and it's common law, meaning we find the rules of law in the cases we read about what these torts are. These torts were all created by our court system in the United States. Many scholars are looking at tort law as one aspect of privacy law that might be used very successfully to protect individuals from the use and misuse of their data. The answer to this question is still outstanding. We don't know the answer, but we're opining at this point that this may be an area where we can find some protections for individuals who feel their private information has been used and misused in a way that causes them emotional, physical, or financial harm. There are four intentional torts in privacy: intrusion into seclusion, appropriation of name or likeness, public disclosure of private facts, and false light. Each one of these addresses a specific, different aspect of privacy, but what they have in common is that in each case we have to be able to identify that the individual who's alleging harm had a reasonable expectation of privacy. That's not always easy with these new technologies, but it's definitely mandatory that we be able to show it. Intrusion into seclusion, the first one, is basically what it sounds like: someone has intruded into the seclusion of somebody else.
The classic case is people in a women's dressing room and somebody peering in, trying to see into that private dressing room space. In the context of new technologies, some are arguing that the Internet of Things in particular, this interconnected network of sensor data from consumer devices like Fitbits, smartphones, smart appliances, and so forth, could give rise to this tort: that the misuse of that information might result in an intrusion into seclusion. It's a tough argument, because you have to show that there has been an unconsented-to use of the data, and you also have to show that the intrusion is highly offensive to a reasonable person. But we're going to revisit this in module three, when we look at some of the statutory protections as well, and at some of the cases related to the Internet of Things, particularly those that have been addressed by the FTC. The second one, appropriation of name or likeness, sometimes called misappropriation of name or likeness, protects our identity. If somebody uses an aspect of our name, likeness, picture, or identity to their commercial advantage without our permission, that is an appropriation, and that's a privacy violation, because we have an expectation of privacy in our identity. The classic case is somebody taking a picture of a famous person and putting it on T-shirts that they then sell to their commercial advantage. In the context of new technologies, there's a question of whether data brokers and other entities and organizations in this new space, who are using parts of our identity, are committing an appropriation. If the use has been unconsented to, it could be, but it's not clear, because the commercial-advantage piece has to be established, and again, we also have to show that there's a highly offensive misuse. The third one, public disclosure of private facts, occurs when somebody discloses private facts about another, those private facts are highly personal, the disclosure is public, and it is highly offensive to a reasonable person.
So if somebody discloses to a community, without consent, that their neighbor has a highly infectious disease, that might be considered the public disclosure of a private, true fact. Because it's offensive, it's a violation of that person's privacy and private life. In the new technology space of data analytics and artificial intelligence, aspects of this can happen as well, where things about us that are true, but private, come out. Again, it's questionable whether, in this age of sharing forward so much information about ourselves, we can argue that we're offended, precisely because we share so much; but at least some scholars are saying there's a possibility that this kind of disclosure could be actionable in this new technology arena. The problem here is that the publicity piece is not clear. Publicity isn't necessarily equated with the trading of our information. So when data brokers are trading forward our information, is that publicity? Is it highly offensive? It's not entirely answerable at this point. The last one is false light. This occurs when a person is publicly put in a false light; essentially, it's a misrepresentation of who we are. So taking a picture of somebody and putting it under a newspaper headline that says something false would portray that person falsely. This one is really, really new in data analytics and artificial intelligence. We haven't seen a lot of it, but again, it's possible that you could be falsely portrayed; certainly you've seen cases where our data has resulted in false information and false conclusions about us, such that we are falsely portrayed, and that causes an emotional, physical, or financial harm to us. Still, we have yet to see any of these cases; no data traders have been sued under US tort law, to my knowledge.
This is all speculation, but to the degree that we can stretch these torts a little bit further than we can stretch statutory law, there's an argument that there is protection here for individuals.