Monday, April 8, 2019

Detroit Land Bank Authority Optimizes Gerrymandering Through Facebook Predictive Discrimination Modeling Crap

This week, in predictive modeling crap, we have Facebook, once again, front and center when it comes to its algorithms on how to optimize, you know, get the best outcome for its clients, the clients that make money and pay, you know, like foreign military private intelligence operations working through Public Private Partnerships.

This stuff is econometrics, where they start the optimization process by plugging in demographic data.
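
To make that concrete, here is a minimal sketch of what "plugging in demographic data" looks like in this kind of predictive modeling: a model is fit to past outcomes and then used to rank who gets shown an opportunity. The feature names, the toy numbers, and the choice of logistic regression are all assumptions for illustration, not anything taken from the Land Bank's or Facebook's actual systems.

```python
# Minimal sketch, not anyone's actual system: fit a predictive model on
# demographic features, then rank people by predicted "payoff" to decide
# who gets shown the opportunity. Feature names and data are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

# columns: [age, income_in_thousands, share_minority_in_tract] -- hypothetical features
X = np.array([[34, 42, 0.15],
              [51, 28, 0.80],
              [29, 65, 0.10],
              [46, 31, 0.72]])
y = np.array([1, 0, 1, 0])   # past "conversion" outcomes the optimizer is trained toward

model = LogisticRegression().fit(X, y)
scores = model.predict_proba(X)[:, 1]      # predicted relevance/payoff per person
print(scores.argsort()[::-1])              # delivery order: highest predicted payoff first
```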

This is how you can select the best predictive outcome by discriminating over who gets to participate in the Detroit Land Bank Authority fraud schemes, thereby optimizing your future tax base and customizing the drawing of voting districts, and the votes, of course.

We call this asymmetrical information and gerrymandering.

They call it discrimination through optimization.

This is not just wire fraud; this is gerrymandering, election interference, and forced migration based upon fraud.

Facebook was complicit in running ops for the Detroit Land Bank Authority in its predictive modeling crap, and I can prove it.

Discrimination through optimization: How Facebook's ad delivery can lead to skewed outcomes

(Submitted on 3 Apr 2019)
The enormous financial success of online advertising platforms is partially due to the precise targeting features they offer. Although researchers and journalists have found many ways that advertisers can target---or exclude---particular groups of users seeing their ads, comparatively little attention has been paid to the implications of the platform's ad delivery process, comprised of the platform's choices about who should see an ad.
It has been hypothesized that this process can "skew" ad delivery in ways that the advertisers do not intend, making some users less likely than others to see particular ads based on their demographic characteristics. In this paper, we demonstrate that such skewed delivery occurs on Facebook, due to market and financial optimization effects as well as the platform's own predictions about the "relevance" of ads to different groups of users. We find that both the advertiser's budget and the content of the ad each significantly contribute to the skew of Facebook's ad delivery. Critically, we observe significant skew in delivery along gender and racial lines for "real" ads for employment and housing opportunities despite neutral targeting parameters.
Our results demonstrate previously unknown mechanisms that can lead to potentially discriminatory ad delivery, even when advertisers set their targeting parameters to be highly inclusive. This underscores the need for policymakers and platforms to carefully consider the role of the ad delivery optimization run by ad platforms themselves---and not just the targeting choices of advertisers---in preventing discrimination in digital advertising.

FACEBOOK’S AD ALGORITHM IS A RACE AND GENDER STEREOTYPING MACHINE, NEW STUDY SUGGESTS

HOW EXACTLY FACEBOOK decides who sees what is one of the great pieces of forbidden knowledge in the information age, hidden away behind nondisclosure agreements, trade secrecy law, and a general culture of opacity. New research from experts at Northeastern University, the University of Southern California, and the public-interest advocacy group Upturn doesn’t reveal how Facebook’s targeting algorithms work, but does show an alarming outcome: They appear to deliver certain ads, including for housing and employment, in a way that aligns with race and gender stereotypes — even when advertisers ask for the ads to be exposed to a broad, inclusive audience.
There are two basic steps to advertising on Facebook. The first is taken by advertisers when they choose certain segments of the Facebook population to target: Canadian women who enjoy badminton and Weezer, lacrosse dads over 40 with an interest in white genocide, and so forth. The second is taken by Facebook, when it makes an ad show up on certain people’s screens, reconciling the advertiser’s targeting preferences with the flow of people through Facebook’s apps and webpages in a given period of time. Advertisers can see which audiences ended up viewing the ad, but are never permitted to know the underlying logic of how those precise audiences were selected.
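
To make the two steps easier to see, here is a rough sketch in Python. The dictionary keys loosely resemble fields in Facebook's Marketing API targeting spec, but the example is illustrative only and is not a working API call.

```python
# Illustrative sketch of the two advertising steps described above.
# The keys loosely mirror Facebook Marketing API targeting-spec fields
# (geo_locations, genders, interests), but nothing here calls the real API.

# Step 1: the advertiser's targeting choices -- fully visible to the advertiser.
narrow_targeting = {
    "geo_locations": {"countries": ["CA"]},
    "genders": [2],                            # hypothetical encoding: 2 = women
    "interests": ["badminton", "Weezer"],
}

broad_targeting = {
    "geo_locations": {"countries": ["US"]},    # everyone in the US, nothing else specified
}

# Step 2: ad delivery -- opaque to the advertiser. The platform decides which
# eligible users actually see the ad; advertisers only ever see aggregate
# breakdowns of who was reached, never the selection logic.
def deliver(ad, eligible_users):
    raise NotImplementedError("This step is the platform's black box.")
```

The point is simply that the advertiser fully controls the first dictionary, while the second step happens entirely inside the platform.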
The new research focuses on the second step of advertising on Facebook, the process of ad delivery, rather than on ad targeting. Essentially, the researchers created ads without any demographic target at all and watched where Facebook placed them. The results, said the researchers, were disturbing:
Critically, we observe significant skew in delivery along gender and racial lines for “real” ads for employment and housing opportunities despite neutral targeting parameters. Our results demonstrate previously unknown mechanisms that can lead to potentially discriminatory ad delivery, even when advertisers set their targeting parameters to be highly inclusive.
Rather than targeting a demographic niche, the researchers requested only that their ads reach Facebook users in the United States, leaving matters of ethnicity and gender entirely up to Facebook’s black box. As Facebook itself tells potential advertisers, “We try to show people the ads that are most pertinent to them.” What exactly does the company’s ad-targeting black box, left to its own devices, consider pertinent? Are Facebook’s ad-serving algorithms as prone to bias as so many others? The answer will not surprise you.
For one portion of the study, researchers ran ads for a wide variety of job listings in North Carolina, from janitors to nurses to lawyers, without any further demographic targeting options. With all other things being equal, the study found that “Facebook delivered our ads for jobs in the lumber industry to an audience that was 72% white and 90% men, supermarket cashier positions to an audience of 85% women, and jobs with taxi companies to a 75% black audience even though the target audience we specified was identical for all ads.” Ad displays for “artificial intelligence developer” listings also skewed white, while listings for secretarial work overwhelmingly found their way to female Facebook users.
Although Facebook doesn’t permit advertisers to view the racial composition of an ad’s viewers, the researchers said they were able to confidently infer these numbers by cross-referencing the indicators Facebook does provide, particularly regions where users live, which in some states can be cross-referenced with race data held in voter registration records.
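
A rough sketch of that cross-referencing step, under assumed column names and invented numbers (the researchers' actual data and schema are not reproduced here): weight each region's voter-registration race shares by the number of impressions the platform reports delivering there.

```python
# Rough sketch of the inference described above: join the platform's regional
# delivery breakdown against voter-registration race statistics for the same
# regions, then estimate the racial composition of the delivered audience.
# Region names, column names, and all numbers are hypothetical.
import pandas as pd

# Per-region impressions reported by the ad platform (invented numbers).
delivery = pd.DataFrame({
    "region": ["Wake", "Durham", "Mecklenburg"],
    "impressions": [1200, 800, 1500],
})

# Racial composition of registered voters in each region (invented numbers).
voter_rolls = pd.DataFrame({
    "region": ["Wake", "Durham", "Mecklenburg"],
    "share_white": [0.62, 0.45, 0.55],
    "share_black": [0.21, 0.38, 0.30],
})

merged = delivery.merge(voter_rolls, on="region")
total = merged["impressions"].sum()
est_white = (merged["impressions"] * merged["share_white"]).sum() / total
est_black = (merged["impressions"] * merged["share_black"]).sum() / total
print(f"Estimated audience composition: {est_white:.0%} white, {est_black:.0%} Black")
```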
In the case of housing ads — an area in which Facebook has already shown potential for discriminatory abuse in the past — the results were also heavily skewed along racial lines. “In our experiments,” the researchers wrote, “Facebook delivered our broadly targeted ads for houses for sale to audiences of 75% white users, while ads for rentals were shown to a more demographically balanced audience.” In other cases, the study found that “Facebook delivered some of our housing ads to audiences of over 85% white users while they delivered other ads to over 65% Black users (depending on the content of the ad) even though the ads were targeted identically.”
Facebook appeared to algorithmically reinforce stereotypes even in the case of simple, rather boring stock photos, indicating that not only does Facebook automatically scan and classify images on the site as being more “relevant” to men or women, but changes who sees the ad based on whether it includes a picture of, say, a football or a flower. The researchers took a selection of stereotypically gendered images — a military scene and an MMA fight on the stereotypically male side, a rose as stereotypically female — and altered them so that they would be invisible to the human eye (making the images fully transparent via their “alpha” channel, in technical terms). They then used these invisible pictures in ads run without any gender-based targeting, yet found Facebook, presumably after analyzing the images with software, made retrograde, gender-based decisions on how to deliver them: Ads with stereotypical macho images were shown mostly to men, even though the men had no idea what they were looking at. The study concluded that “Facebook has an automated image classification mechanism in place that is used to steer different ads towards different subsets of the user population.” In other words, the bias was on Facebook’s end, not in the eye of the beholder.
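
The transparency trick itself is simple to reproduce in outline. Below is a minimal sketch using the Pillow imaging library; the file names are placeholders, and this only illustrates the kind of manipulation the study describes.

```python
# Minimal sketch of the transparent-image manipulation described above, using
# Pillow. Setting every pixel's alpha value to 0 makes the image invisible to
# a human viewer, while the underlying RGB content remains available to any
# automated image classifier that ignores transparency. File names are placeholders.
from PIL import Image

img = Image.open("mma_fight.jpg").convert("RGBA")
r, g, b, _ = img.split()
fully_transparent = Image.new("L", img.size, 0)    # alpha = 0 everywhere
invisible = Image.merge("RGBA", (r, g, b, fully_transparent))
invisible.save("mma_fight_invisible.png")          # PNG preserves the alpha channel
```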
The report comes at an inconvenient time for Facebook, now facing charges from the Department of Housing and Urban Development over its potential to enable advertisers to illegally exclude certain groups. And although the study is careful to note that “our results only speak to how our particular ads are delivered (i.e., we cannot say how housing or employment ads in general are delivered),” it still concludes that “the significant skew we observe even on a small set of ads suggests that real-world housing and employment ads are likely to experience the same fate.” In other words, even in the absence of bigoted landlords, the advertising platform itself appears inherently prejudiced.
A Facebook spokesperson provided the following comment:
We stand against discrimination in any form. We’ve made important changes to our ad targeting tools and know that this is only a first step. We’ve been looking at our ad delivery system and have engaged industry leaders, academics, and civil rights experts on this very topic – and we’re exploring more changes.
It’s a familiar refrain at this point, and one that will likely do little to reassure those who just want to know that they’ll be provided with the same opportunities as everyone else, even in the context of ubiquitous advertising. The old apologia for targeted advertising is generally that it’s a favor to the consumer, sparing them “irrelevant” ads and instead providing them with opportunities to browse goods and services that are “pertinent” to them. What this shallow reasoning misses is that decisions about pertinence can become self-reinforcing; it’s foolish at best to think that women are more interested in secretarial work because they keep clicking the secretary ads, rather than that they click secretarial ads because it’s all Facebook will show them.
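
That self-reinforcing loop is easy to demonstrate with a toy model: if delivery always favors whichever group has the higher observed click-through rate, one early lucky click can lock the other group out entirely, even when true interest is identical. Everything below is invented for illustration.

```python
# Toy illustration of the self-reinforcing "pertinence" loop described above:
# a greedy delivery rule that always shows the ad to whichever group has the
# higher observed click-through rate. One early lucky click locks the other
# group out, so its estimate never recovers even though true interest is equal.
import random

random.seed(1)
true_ctr = {"women": 0.05, "men": 0.05}    # genuinely identical interest
clicks = {"women": 1, "men": 0}            # one early click happened to come from a woman
impressions = {"women": 1, "men": 1}

for _ in range(10_000):
    # greedy delivery: show only to the group with the higher observed CTR
    group = max(impressions, key=lambda g: clicks[g] / impressions[g])
    impressions[group] += 1
    if random.random() < true_ctr[group]:
        clicks[group] += 1

for g in impressions:
    print(g, "impressions:", impressions[g],
          "observed CTR:", round(clicks[g] / impressions[g], 3))
```

By the end of the loop, essentially every impression has gone to the group that got the first click, and the other group's observed rate is frozen at zero, so the system can never learn that both groups were equally interested all along.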


Voting is beautiful, be beautiful ~ vote.©
