Hold Platforms Accountable for Targeted ‘Dark’ Ads – Monash Lens
A recent briefing by investigative journalists from The Markup revealed how Facebook uses detailed information about what people do online – the websites they visit and the search terms they use – to enable drug companies to target people based on the medical conditions in which they have shown interest.
This marketing strategy is based on the fact that one of the first places people turn to when they learn or suspect that they or their loved ones might be sick is the internet.
The fact that platforms know more about us than our doctors reflects an apparent paradox – even as public programs like My Health Record face widespread skepticism, online platforms largely escape scrutiny as they assemble detailed portraits of any health issues with which we might be associated.
This information is fueling a profound structural change in the dominant advertising model, towards advertising that is at once more pervasive and less accountable.
Proxy Mechanisms in Mass Media
The era of mass media was defined by a handful of familiar communication channels – terrestrial television and radio, newspaper advertisements and billboards, and an array of other broadcast media.
Since the mass media had few technological mechanisms for targeting specific groups of people, advertisers developed very rough proxies – concentrating advertisements for household products during the day to reach housewives, for example (hence the term “soap opera”), or placing toy ads alongside Saturday morning cartoons.
The ads tracked the content and, in some cases, its timing and geography. These ads were accessible to large groups of people, and therefore available for public scrutiny – and often became the subject of concern regarding stereotypes and predatory marketing tactics.
Ads, while privately controlled and administered (in many cases), have remained – in an important sense – public.
Read more: Twitter bans political ads – but the real battle for democracy is with Facebook and Google
We know of the historic struggles that have taken place over racist and sexist forms of advertising; struggles that have highlighted the role played by the advertising system in strengthening particular sets of values and cultural assumptions.
As media historian Michael Schudson puts it:
“Advertising, whether or not it sells cars or chocolate, surrounds us and enters us, so that when we speak we can speak in or with reference to the language of advertising, and when we see we can see through schemas that advertising has made salient for us.”
Advertising, in other words, is not just the padding between content – it is a form of content that plays an important role in the reproduction of social and cultural values.
The rise of the consumer society – a dramatic social change – would have been impossible without it.
It is therefore crucial that advertising be the subject of public scrutiny and discussion as part of our ongoing reflection on the society we live in and how to build a better one.
Multiple reasons for accountability
This may be the main reason to care about advertising, but there are other important reasons to hold it accountable.
Advertising is not just about selling household products and services. It is also used to rent or sell housing, to promote political candidates and to recruit employees – and, in some cases, to discriminate in these areas based on age, ethnicity or gender.
The advertising delivery model and its associated problems have not gone away. Just as broadcast television and newspapers remain, their basic approaches to advertising remain, even if they operate in more sophisticated ways, with more detailed audience data.
Yet in digital contexts, the degree of consumer tracking has led to new advertising methods that are disrupting the “public character” of historical forms of advertising.
The rise of online advertising represents an epochal change in advertising, one that raises the specter of powerful new forms of discrimination that can be difficult to detect.
During the 2016 US presidential campaign, for example, Donald Trump’s digital strategy adviser Brad Parscale bragged about targeting African-American voters in swing states with ads claiming Hillary Clinton had once described young black men as “super predators”. (She was referring to gang members in language which, while not explicitly racially coded, nonetheless led to a subsequent public apology on her part.)
The purpose of this ad buy was not so much to win voters for Trump, who had very low support among black voters, but to prevent them from voting at all.
Because targeted advertising tracks individual users rather than particular forms of content, and is viewed on a personal device, it was impossible for those who received the ads to know they were being targeted as part of a voter suppression campaign.
The same goes for other forms of online advertising. There is no way to know if, for example, a job offer encountered while browsing the Internet is only shown to people of a certain age, ethnicity or gender.
Indeed, Facebook was fined US$5 million after it was revealed that its ad-buying system enabled discrimination in ads for housing, employment and credit.
Read more: Facial recognition technology and the end of privacy for good
A range of methods is being developed to hold “dark ads” accountable, so called because they are fleeting and targeted. Facebook has made many ads available through its Ad Library, although this functionality is limited because it only provides general information about how the ads are targeted.
The NYU Ad Observatory tracks political advertising with the help of volunteers who install a browser extension that captures ads served on Facebook. ProPublica has developed a similar tool, which we have adapted to provide visibility into how individuals are targeted online.
Our tool, which people interested in contributing to the project can install on a Chrome browser, collects basic demographic information so that we can see how people are being targeted by variables such as age, gender, and location.
Anyone who installs it can also use it as a personal ad tracking tool to see how Facebook is targeting them over time.
We tested this with 136 people to demonstrate how a tool like this could work, and even with this relatively small sample, we were able to show people how their online behavior shaped their advertising environment.
One volunteer, for example, was targeted based on information she had searched for online about her child’s health condition. In the abstract, we know this is how online advertising works, but it can be difficult to see just how detailed and comprehensive the monitoring and tracking is – and how much behaviour we might not disclose publicly, such as drinking and gambling activity, serves as raw material for advertisers.
Make invisible patterns visible
Equally important, the tool allows us to see aggregate trends that are invisible to individual users – how men might be targeted differently from women, or older people from younger ones.
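To illustrate the kind of aggregate analysis involved, here is a minimal sketch of how ad observations collected by such a browser extension might be grouped by demographic attributes to surface differential targeting. The data records and field names are hypothetical, invented for illustration; they are not the actual schema used by our tool, the NYU Ad Observatory, or ProPublica.

```python
from collections import Counter, defaultdict

# Hypothetical ad observations of the kind a browser extension might collect:
# each record pairs a viewer's demographic attributes with the category of the
# ad they were shown. Field names here are illustrative assumptions.
observations = [
    {"age_band": "18-24", "gender": "F", "ad_category": "parenting"},
    {"age_band": "18-24", "gender": "F", "ad_category": "gambling"},
    {"age_band": "55+",   "gender": "M", "ad_category": "finance"},
    {"age_band": "55+",   "gender": "M", "ad_category": "finance"},
    {"age_band": "18-24", "gender": "M", "ad_category": "gambling"},
]

def targeting_breakdown(records, by):
    """Count ad categories per demographic group, e.g. by='gender'.

    Returns a mapping like {'F': {'parenting': 1, ...}, 'M': {...}},
    making visible patterns no single user could see from their own feed.
    """
    groups = defaultdict(Counter)
    for record in records:
        groups[record[by]][record["ad_category"]] += 1
    return {group: dict(counts) for group, counts in groups.items()}

breakdown = targeting_breakdown(observations, by="gender")
```

With enough volunteers contributing records, the same grouping by `age_band` or location would show whether, say, gambling ads skew towards one demographic group – the kind of pattern that remains invisible to any individual user.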
The more data we can capture with this tool, the clearer a picture we will have of the new and old forms of stereotyping enabled by dark ads, and of how they shape our information environments.
We know we’ll never have a clearer picture than Facebook, but it’s crucial that we find ways to hold it accountable for the potential and actual abuses that take place in the world of online advertising.