Blackballed by machine learning: how algorithms can destroy your chances of getting a job


Cory, you had me until here:

They slash expenses by replacing human resources professionals with machines,

I’m trying hard to think of a machine that could be more pigheaded, less understanding, or harder to work with than some “human resources professionals” I have known, and so far I’m drawing a blank. My list includes Marvin and HAL 9000, too.


No. They are far worse. Trust me. There are millions of HR drones out there, and only a small number who know what they are doing. The machine filtering is worse still: it cuts you out of the process before you even get the chance for a half-competent person to look at your skill set.


I have an outside chance of figuring out how an algorithm works.

But if Catbert has a grudge against me or my unit of the company (I’ve been there) I am completely fucked over with no hope.


Ah, but the algorithm won’t even let you in the door. It evaluates you based on things the human is not allowed to consider, because the human knows this and avoids it. The algorithm is like that Microsoft chatbot, and isn’t audited at all. Like Cory said in his post, these tools are never corrected, as neither the developer nor the purchaser has any real incentive to expend effort – rejecting potentially excellent employees never occurs to them.

Human Resources is the key here: employees are resources, not stakeholders.


We’re talking from slightly different perspectives here. I’m seeing things from my view as a manager who is trying to hire the best, not an employee who is trying to get in the door.


Funny, I’ve never had a problem with HR. Sometimes they need a little explaining, that’s all. Maybe you just keep getting the bad ones?

The only actual example I’ve heard of algorithms destroying chances is the case of a person who’s been unemployed for more than a few months. They blackball the hell out of those folks. Maybe poor credit is another? What other examples are there?

More of a statement about recruiters. Not HR overall. I have worked with some incredibly dedicated HR people.


So, basically: no one is doing cross-validation. Or, at least, cross-validation is only being done on the limited set of people you actually hired.
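A toy simulation makes that selection-bias point concrete: if you only validate on the people you hired, a noisy filter looks accurate, while the good candidates it rejected never show up in the numbers. (Every distribution and threshold below is invented for illustration; this is nobody’s real system.)

```python
import random

random.seed(0)

# Each candidate has a true quality, but the screening score is only
# a noisy proxy for it.
candidates = []
for _ in range(10_000):
    quality = random.random()
    score = quality + random.gauss(0, 0.3)  # noisy proxy for quality
    candidates.append({"quality": quality, "score": score})

hired = [c for c in candidates if c["score"] > 0.7]  # the algorithm's filter

def is_good(c):
    return c["quality"] > 0.5

# Validating only on hires looks great, because the filter pre-selected them.
hire_precision = sum(is_good(c) for c in hired) / len(hired)

# What never gets measured: good candidates the noisy score rejected.
missed = sum(is_good(c) and c["score"] <= 0.7 for c in candidates)
all_good = sum(is_good(c) for c in candidates)
miss_rate = missed / all_good

print(f"precision among hires: {hire_precision:.2f}")
print(f"good candidates filtered out: {miss_rate:.1%}")
```

The precision among hires comes out high, while a large fraction of genuinely good candidates never got past the filter — and that second number is exactly what nobody computes.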


Then the algorithm is an even worse aid, because, as in the example, many excellent employees are getting filtered out by crappy algorithms, and no one is maintaining them, which means they get progressively worse over time as they fail to keep up. You as a manager are not getting the cream; you are getting some bland, nonthreatening mix of mediocrity at best, and most likely a bunch of deadweights who can game the algorithms. Only you don’t see it, or how they game the criteria.


I’m no capitalist-dogmatic, but in this case the “market” ought to sort this out right quick. If your algorithm is shitty enough to exclude potential employees on spurious grounds, your algorithm will fail and thus your company will fail.


An important difference with a professional athlete is that their achievements are publicly broadcast. I can’t imagine McDonald’s telling Burger King how the folks they passed over are working out. And if they did, that’s starting to feel collusive.

That said, this is a common problem in machine learning: you just use a percentage of hires to explore alternative hypotheses. Usually pick the best applicant, sometimes pick somebody your model might be wrong about.
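The standard sketch of that idea is epsilon-greedy selection: trust the model most of the time, and deliberately deviate a small fraction of the time so its mistakes become observable. (The applicant names, scores, and epsilon below are all invented.)

```python
import random

random.seed(1)

def choose(applicants, score, epsilon=0.1):
    """Epsilon-greedy selection: usually take the model's top pick, but
    occasionally take someone else so the model's blind spots get tested."""
    if random.random() < epsilon:
        return random.choice(applicants)   # explore: test the model
    return max(applicants, key=score)      # exploit: trust the model

# Hypothetical applicants and model scores.
scores = {"alice": 0.9, "bob": 0.4, "carol": 0.6}
picks = [choose(list(scores), scores.get, epsilon=0.1) for _ in range(1000)]
top_share = picks.count("alice") / len(picks)
print(f"top pick chosen {top_share:.0%} of the time")
```

With epsilon at 0.1 you still take the model’s favorite the vast majority of the time, but the occasional off-model hire gives you ground truth about the people the model would have rejected.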

As for the algorithms using features that are proxies for race/status/etc., that seems more solvable here than in the traditional model. There have been plenty of studies showing proxy features are used by HR currently (e.g. send out resumes with a “white” and a “black” name and see what sort of responses you get). Just mandate that these systems be auditable in some way (e.g., report the primary reasons for a rejection back to the applicant so they can work on those issues, or aggregate the decisions and provide them to public advocacy groups). If “your name is Jacob” shows up in the “pro” column, the threat of a lawsuit should be enough to have that particular feature turned off.

Racism in software is a serious issue that we as a culture need to take much more seriously, but oddly it feels eminently more solvable to me than racism in people.
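A minimal sketch of that kind of audit, assuming the screening model exposes per-feature weights (the feature names and weights below are invented for illustration):

```python
# Hypothetical audit of a linear screening model: if a feature that should
# never matter (like the applicant's first name) carries any weight, flag it.
weights = {
    "years_experience": 0.8,
    "relevant_certs": 0.5,
    "first_name_is_jacob": 0.3,   # exactly the proxy the comment warns about
}
FORBIDDEN = {"first_name_is_jacob"}

flagged = {f: w for f, w in weights.items() if f in FORBIDDEN and w != 0}
for feature, weight in flagged.items():
    print(f"audit flag: {feature} (weight {weight}) must be turned off")
```

This only works when the forbidden proxies are explicit named features; indirect proxies (zip code, school, employment gaps) need statistical auditing of outcomes rather than a lookup in a blocklist.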


And that same algorithm will let in a bunch of dewy-eyed, fresh-from-college kids with a cert or two on the list, though they have no clue how things work in a large company, but not the guy with 10+ years of experience and no recent certs, even though that is the guy you are looking for.
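A toy version of that kind of keyword screen shows how it inverts value (the cert keyword and both resumes are invented for illustration):

```python
# Require a recent cert keyword, and the veteran without one
# never reaches a human.
def passes_screen(resume):
    return "cert-2016" in resume["keywords"]

veteran = {"years": 12, "keywords": ["cert-2009", "led two migrations"]}
new_grad = {"years": 0, "keywords": ["cert-2016"]}

print(passes_screen(veteran), passes_screen(new_grad))  # False True
```

The filter never looks at the years of experience at all; one missing token decides the outcome.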


Will it? The guys writing the algorithms are selling the emperor new clothes, only there will be no parade, no kid calling out the emperor. And no emperor will admit to being scammed, so the tailors keep moving on. Like @Polama states, there are no controlling metrics, no way to measure whether the one that got away would have been the better choice.


I have in-laws who cannot get a job if it goes through one of these methods. The outcome is they work total shitshow jobs that nobody else would want. You know the kind: small operations with weird rules, and whatnot. The last one my brother-in-law got into had a hiring process where they actually gave the resumes to a psychic (who never picked non-white names…). It ends up being low pay, and crazy as hell.


The “just” part may be a bit of a problem, depending on the algorithm that’s being used. You might be able to do something like that with a linear regression model, but deep learning models are pretty opaque. Getting “reasons” out of them is, as far as I know, an open research topic.
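For what a linear model *can* do, here’s a sketch of pulling “reasons” out of per-feature contributions. (The features, weights, and applicant are invented; a deep net offers no comparable per-feature breakdown without extra interpretability work.)

```python
# In a linear model, each feature's contribution to the score is just
# weight * value, so the "reasons" fall out directly.
weights = {"years_experience": 0.6, "recent_certs": 1.2, "gap_months": -0.4}
applicant = {"years_experience": 10, "recent_certs": 0, "gap_months": 8}

contributions = {f: weights[f] * applicant[f] for f in weights}
score = sum(contributions.values())
reasons = sorted(contributions.items(), key=lambda kv: kv[1])  # worst first

print(f"score: {score:.1f}")
for feature, contrib in reasons[:2]:
    print(f"hurting the application: {feature} ({contrib:+.1f})")
```

That decomposition is exact for linear models; for anything nonlinear you’re into approximation methods, which is why the “just report the reasons” mandate gets hard.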


Making the systems auditable would help, but I think you’re underestimating how subtle and hard-to-test the “proxies” can be when there is a computer doing the analysis.

Also, what data sources do these algorithms generally consider? Is it just a resume? Or does it also search linkedin and facebook and the kinds of things background checks cover? Does the system disqualify people who were arrested on trumped-up charges because of their race and where they grew up, for example?


I know for a fact that age plays a big role. I spent two years after college trying to get a job in my field. Maybe one or two callbacks. The week I turned 24, I suddenly started getting callbacks from places that had completely ignored me before.

“Yes, Mr. LDoBe, your resume looks very interesting to us.”

“Why didn’t it look interesting six months ago when I last submitted? You were hiring for the same position then. And my credentials and experience haven’t changed.”

“Uh, I’m not sure we’ll have to look into that.”

“I’m sure you won’t.”


People in business and government act like algorithms don’t have owners and authors, but this has been settled in the arts for decades.


In an effective, intelligent, and ethical market that ought to be true. The present so-called market based economy has little-to-none of those qualities.

From the article: “The company may be satisfied with the status quo, but the victims of its automatic systems suffer.”

Judging by my own circle of acquaintances, young and old, I observe that it has become pretty much a standard expectation when hawking around for a decent career. It’s yet another symptom of the much greater systemic dysfunction driven by the same vices that fuel the obscene economic divide. It’s unsustainable on many levels, and when it inevitably falls apart, my hunch is that these mindless IT-based “solutions” will get tossed into the tank first.