Really Human Resources
AI has infiltrated the personnel selection process, gamifying it
Personnel selection has been transformed into a calculation problem: the assumption is that a life is measurable and comparable. The curriculum vitae is the mould of this idea. It forces workers to compress their history into keywords, roles, results, and dates. Everything that does not fit into a list (contradictions, care contexts, irregular paths) must be removed. Empty sections are read as flaws. So what? Let us think outside the box…
This is only the beginning[^1] of the dehumanising chain: the aspiring worker must produce a competitive synthesis of themselves, and then someone else must infer what this person can do. It is already a brutal filter when human; in the so-called era of automation, the automated system does not even listen: it ranks, producing a list from which the first three or five candidates are selected for the subsequent stages.[^2]
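The ranking step described above can be sketched in a few lines. This is a deliberately crude toy, not any vendor's actual algorithm: a keyword-overlap score over CVs, where only the top k candidates are ever seen by a human. Everything that does not tokenise as a keyword simply does not count.

```python
# Toy illustration of automated CV ranking (NOT a real ATS):
# score each CV by keyword overlap with the job description,
# then forward only the top k candidates to human review.

def rank_cvs(cvs: dict[str, str], job_description: str, k: int = 3) -> list[str]:
    keywords = set(job_description.lower().split())

    def score(text: str) -> int:
        # Count how many job-description words appear in the CV.
        return len(keywords & set(text.lower().split()))

    ranked = sorted(cvs, key=lambda name: score(cvs[name]), reverse=True)
    return ranked[:k]  # everyone below rank k is never seen by a human

shortlist = rank_cvs(
    {"A": "python sql ten years of care work",
     "B": "python sql cloud agile",
     "C": "irregular path contradictions context"},
    "python sql cloud role",
    k=2,
)
print(shortlist)
```

Note what the toy makes visible: candidate C scores zero not because they cannot do the job, but because their history does not fit the keyword grid.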
Breaking the circuit: siege, stress, narrative
This is where Really Human Resources enters. We do not ask for a technology that tries to be fair; this process cannot be saved. We want to make the idea that a person is calculable unliveable.
We “help” job seekers apply en masse and in a coordinated way[^3], with true stories but optimised texts[^4], until the application chain becomes pure noise with very little signal. A process designed to optimise away human effort is forced to pay the price of its own abstraction: more calls, more inconclusive interviews, more wrong hires[^5], more time burned on both sides. When the error becomes systemic and measurable, it stops being individual bad luck and becomes a political matter.
The operational sequence
Various components are at play. Follow the progress on XXX-SITE-TODO, and given the open-source and decentralised nature of the effort, consider participating:
- A leaking platform with strong anonymity, to surface internal rules, metrics, and procedures.
- A cartography of filters, to understand where automated rejection occurs: the software used by companies, the portals and their filters. The aim is being able to name the problem.
- Open assisted-application tools built with developers and job seekers: they reduce the cost of applying and make the process measurable. Political side effect: breaking trust in the numbers.
- A weekly bulletin of stress observation, serial and public: how many applications have we supported?
- Measurement and publication: response times, bounce rates, prematurely closed listings, invasive requests, inconsistencies. For this we need to follow the application process from start to finish, or receive reports from those going through it.
- Escalation where possible: engagement with trade unions and, in more advanced cases, collective actions linked to automated decisions. As noted above, a citizen should not be subjected to automated choices that impact their life; but rights must be won, and it is not enough for them to be written into data protection regulations while technology and practice go in other directions.
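The measurement step listed above can start from something very small: aggregating per-application records into a handful of publishable numbers. A minimal sketch follows; the record fields (`replied`, `reply_days`) are hypothetical assumptions, not a defined schema of the project.

```python
# Minimal sketch of the measurement-and-publication step: turn raw
# per-application records into the aggregate figures named in the text
# (bounce rate, response times). Field names are illustrative only.

from statistics import median

def summarise(applications: list[dict]) -> dict:
    replied = [a for a in applications if a.get("replied")]
    bounced = [a for a in applications if not a.get("replied")]
    return {
        "total": len(applications),
        # Share of applications that never received any answer.
        "bounce_rate": len(bounced) / len(applications) if applications else 0.0,
        # Median days until a reply, among those that got one.
        "median_reply_days": median(a["reply_days"] for a in replied) if replied else None,
    }

report = summarise([
    {"replied": True, "reply_days": 4},
    {"replied": True, "reply_days": 30},
    {"replied": False},
    {"replied": False},
])
print(report)
```

Even numbers this simple, published serially, shift the burden of proof: a 50% bounce rate is no longer one candidate's bad luck but a measured property of the process.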
[^1]: The rapid evolution of generative models (LLMs, generative AI, text-to-image, deepfakes) enables the creation of convincing texts, identities, and careers with very little human effort. The study “Unmasking Fake Careers” is from 2025: a sign that we are already deep in the “age of synthetic generation”. (arXiv)

[^2]: The increase in application volume (global scale, remote jobs, online platforms) makes full human review of all CVs impractical. This is why many selection processes remain entrusted, a priori, to automated systems: but these, as studies show, are vulnerable to data poisoning and manipulation. (arXiv)

[^3]: Automated form filling, email sending, and applications generated “tailored” to the job description. Tools and services that exist today allow exactly this. (loopcv.pro)

[^4]: Even without falsifying data, a candidate can use AI to generate very clean text, optimised for automated screening systems (ATS), behind which there may be very little real competence or experience. This increases the probability that underqualified people pass the initial filtering. (Job in Tourism)

[^5]: Companies trying to optimise time and costs seriously risk lowering the quality of the hiring process, perhaps hiring unqualified people, with legal, reputational, and organisational consequences. (Bradley)