The ACLU showed that Amazon's facial recognition system thinks members of Congress are felons, so now Congress is taking action

May I still purchase denture adhesive [half the supermarket price] from Amazon and not feel guilty?

3 Likes

So… the program is not just bad at identifying “known arrestees”, it’s also racist. Working as designed then?

2 Likes

I only just realised what he wrote. “Racialized”? How does one get “racialized”? Is it like “radicalized”? I guess it’s some sort of new third level euphemism, but sometimes bending the language breaks it.
Still, as others have pointed out, many politicos are criminals, and given racial profiling in other law and order spheres, count me unsurprised about this. :wink:
As ever, it’s not “can we” but “should we”.

3 Likes

racialize [rey-shuh-lahyz]
verb (used with object), ra·cial·ized, ra·cial·iz·ing.
3. to categorize or differentiate on the basis of membership in a racial group.

Not so much.

It’s much more “they” and much less “we”. I know I wasn’t consulted, and I’m an Amazon Prime member in good standing.

He could have called and asked, “What do you think, Collin? Should we be helping refine algorithms for mass identification and profiling of people for various government security agencies?” and I would have said, “No, Jeff. I feel it’s not a good idea.”

But nothing. Not even a text.

5 Likes

It’s a well-established term in academic discussions of the history and structure of racism. It’s a recognition that racial hierarchies are imposed upon the underclass by human agency rather than natural law.

1 Like

Well I had not come across this usage before, so thanks (and @Pensketch) for the education. That said, it still does not chime well with me in this context. Looking at the definitions I can see a usage that says “the software racialised many members of Congress” but not quite what was written here: “The false positives disproportionately targeted racialized members of Congress” which reads as if they were already racialised and then the software (or its false positives) proceeded to target them. Maybe it was a slightly off usage that set my language alarm bells ringing.

(ETA: perhaps if the word ‘targeted’ had been removed from the sentence it would have read better.)

1 Like

NorCal ACLU busted a move. So, here’s for your Friday.

Seems to be working as intended. Though 28 seems a bit low to me.

2 Likes

…so now Congress is taking action…

In what may be a related story, several cases of these were recently seen being delivered to Capitol Hill.

2 Likes

Oh my god, I would so like to see all the Republicans in Groucho glasses and all the Democrats in juggalo paint. C-SPAN could go pay per view and wipe out the national debt.

5 Likes

Over half of the 28 look like WASPs. So, conservatives could argue that the algorithm isn’t racialized.

You are correct, though one of the applications of this is in deciding whether or not to use a particular screening test in a particular context. I’m most familiar with the health context, where we weigh positive/negative predictive values against “what is the prevalence of the disease in the group we’re testing?”, “can we identify a more targeted subgroup?”, “what are the consequences of a false negative?” (e.g. is it fatal?), and “what are the consequences of a false positive?” (e.g. will it prompt expensive or high-side-effect further testing/treatment?)

And in this case, I guess the argument is “applying this en masse to a population is not an appropriate context to use the test” and “do officers recognise the high false positive rate, esp. in non-white people?”
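The base-rate point above can be sketched in a few lines of Python. The numbers below are purely illustrative assumptions (they are not Rekognition’s actual sensitivity, specificity, or the real prevalence of “known arrestees” in a crowd), but they show why even a seemingly accurate matcher produces mostly false positives when the thing it’s looking for is rare:

```python
# Illustrative sketch only: made-up accuracy figures, not Rekognition's.
# Demonstrates why prevalence matters when choosing whether to deploy a
# screening test at all, per the health-screening analogy above.

def positive_predictive_value(sensitivity, specificity, prevalence):
    """Fraction of positive matches that are true matches (Bayes' rule)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Assume a matcher that is 95% sensitive and 95% specific, scanning a
# crowd in which only 1 in 1,000 people is actually in the mugshot set.
ppv = positive_predictive_value(0.95, 0.95, 0.001)
print(f"PPV: {ppv:.1%}")  # about 1.9% -- the vast majority of "hits" are wrong
```

Under these assumed numbers, fewer than 2 in 100 flagged people are genuine matches, which is exactly the “not an appropriate context to use the test” argument.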

1 Like

Yeah, that’s the intended meaning. People are generally racialized by the racial norms of society at large. Then, having been “put in a box”, they experience specific discrimination based on the box they were sorted into.

2 Likes

This topic was automatically closed after 5 days. New replies are no longer allowed.