Finding a Place to Live in an Ordinal Society
Let’s begin with three vignettes, each featuring a young person in a different city—London, York and Sydney—all navigating the challenges of finding a private rental property in a global housing market increasingly shaped by digital risk profiling technologies.
Vignette 1: “Open Banking Scrutiny” – London, UK
Jordan receives a job offer with a higher salary but must undergo open banking checks to apply for a rental. The system flags a recent spike in medical expenses as “irregular.” Despite confirmation from Jordan’s employer that the salary is sufficient, the final rating remains “amber.” The landlord demands six months’ rent upfront—a sum Jordan can’t afford—ultimately costing them the chance to relocate closer to work (Figure 1).
Vignette 2: “The Thin Credit File” – York, UK
Mary, a freelancer with a consistent record of paying bills, applies for a modest city apartment. However, the digital referencing system flags her as “high risk” due to her limited credit history and irregular deposits. Despite evidence of steady freelance income, the landlord’s agency accepts the algorithm’s decision and rejects her application within the hour. No one investigates her unique but adequate financial situation—she simply fails to meet the algorithm's criteria.
Vignette 3: “Tenant Passport Pressure” – Sydney, NSW, Australia
Aditya uses a “tenant passport” service, uploading bank statements, references and income details for a quicker rental process. The system assigns a “tenant score” based on every purchase, from groceries to entertainment. While Aditya secures an apartment, suggestions such as cutting back on leisure spending feel intrusive. Even after moving in, Aditya remains concerned that future referencing checks will scrutinize his everyday expenses.
Figure 1. A street of predominantly rented properties in York, UK. Photograph by Roger Burrows.
What is happening here?
In an epoch when virtually everything we do leaves a digital footprint, sociologists Marion Fourcade and Kieran Healy suggest that social stratification now turns less on class, race or gender and more on a constantly shifting web of scores, rankings and metrics. The so-called ordinal society positions personal data at the center of decisions about who can access loans, housing and other basics of life. Algorithms crunch financial transactions, online interactions and behavioral patterns to produce quantifiable measures of “worthiness.” Supporters of these systems claim they expand fairness by applying uniform standards to everyone. Yet beneath the promise of data-driven objectivity lurks a troubling reality: historically embedded biases slip into the code, reshaping old prejudices into new quantitative forms and perpetuating inequality under an ostensibly neutral façade. The result is a world in which mathematical processes play a decisive role in granting or withholding life chances, often amplifying disadvantages they claim merely to measure.
One concept illuminating this shift is eigencapital, which builds on Pierre Bourdieu’s notion of economic, social and cultural capital by introducing a fourth domain: digital value. In this new paradigm, an individual’s “data self”—formed by the products they purchase, the online conversations they initiate or join and the platforms they frequent—accrues or loses worth, depending on whether it aligns with algorithmic standards of stability or reliability. Traits as mundane as the timing and frequency of expenditure can become indicators of “trustworthiness,” while irregular habits or gaps in employment records lower one’s score. Crucially, this process is not presented as a moral judgment but as the mere output of neutral software, lending the entire operation a veneer of impartiality. Nonetheless, it encodes moral assumptions about what behavior should be rewarded. Eigencapital becomes a vehicle for labeling specific lifestyles as “stable,” effectively privileging those who fit conventional norms of financial conduct and penalizing those who deviate for reasons that may have nothing to do with their capacity to pay rent or repay a loan.
The mathematics underpinning eigencapital exposes how such judgments become encoded. Drawing on linear algebra, algorithms identify the “weight” of certain traits—like steady income or consistent spending on everyday necessities—and treat these as strong indicators of low risk, awarding higher scores to individuals whose data reflect them. Meanwhile, those living on precarious or irregular incomes are flagged as high risk, regardless of any record of meeting obligations over time. This numerical process filters directly into moral categorization, implying that people with minimal financial variation are more virtuous. The result is that patterns once viewed as simply “different”—such as working several short-term contracts or occasionally making a large cash withdrawal—are now read as warning signals of potential instability. Essentially, the code codifies certain socio-economic assumptions, elevating them into gatekeeping tools that increasingly determine access to essential services.
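The weighted-trait logic described above can be sketched in a few lines of code. This is a deliberately simplified illustration, not any real referencing product: the feature names, weights and example figures are all invented for the purpose of showing how a linear score rewards regularity rather than actual capacity to pay.

```python
# Toy illustration of a linear "risk" score. All features and weights
# here are hypothetical assumptions, chosen to mirror the essay's point:
# the model measures regularity, not whether rent was ever actually paid.

def risk_score(applicant: dict) -> float:
    """Return a score in [0, 1]; higher is read by the model as 'lower risk'."""
    weights = {
        "income_stability": 0.5,      # low variance in monthly income
        "spending_regularity": 0.3,   # consistent everyday spending
        "credit_history_depth": 0.2,  # years of a 'thick' credit file
    }
    return round(sum(weights[k] * applicant[k] for k in weights), 2)

# A salaried applicant with a thick credit file scores highly...
salaried = {
    "income_stability": 0.9,
    "spending_regularity": 0.9,
    "credit_history_depth": 0.8,
}
# ...while a freelancer with variable income ranks as 'high risk',
# even if she has never missed a rent payment.
freelancer = {
    "income_stability": 0.3,
    "spending_regularity": 0.6,
    "credit_history_depth": 0.2,
}

print(risk_score(salaried))    # 0.88
print(risk_score(freelancer))  # 0.37
```

Note what the sketch makes visible: no term in the sum records rent-paying history, so the freelancer's demonstrated reliability is simply invisible to the score.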
A particularly vivid illustration of eigencapital's impact is found in the private rented sector (PRS), where tenant-referencing platforms use open banking to harvest granular data about prospective renters. In the United Kingdom, renters are often compelled to divulge spending histories, monthly outflows and streams of income to appear attractive to landlords and letting agents. This deep inspection is promoted as beneficial for those with thin credit files, but in practice, it functions coercively: people learn that to have a fighting chance at housing, they must consent to levels of financial scrutiny that would have seemed overly invasive only a few years back. The algorithmic ranking then amplifies or penalizes specific patterns, propelling those deemed “stable” to the top. Overdrafts, sporadic work schedules, and certain spending categories labeled “discretionary” can act like red flags, plunging applicants to lower tiers. Similar dynamics play out in Australia, where platforms like Snug generate “Match Scores” that, despite proclaiming neutrality, routinely deprioritize welfare recipients or freelance workers whose incomes fluctuate. In the United States, tenant-screening tools such as RealPage incorporate eviction histories, credit data, and sometimes even social media profiles to produce an overall rating.
Within this landscape emerges the tenant passport: a comprehensive digital profile that consolidates credit scores, references, employment data and spending patterns into a single package for quick evaluation. Marketed as a convenience, the tenant passport shifts the onus onto applicants to ensure their digital footprints conform to algorithmic ideals. Any irregularity—a gap in employment, a period on welfare, or spending patterns deemed problematic—can tarnish the passport and significantly reduce the prospects of securing a decent home. Lost in these screenings is the ability to demonstrate one’s genuine reliability through nuance, context, or personal explanation. The tenant who consistently pays rent despite juggling multiple short contracts, or who went through a single financial rough patch, may appear “unfit” simply because their data points do not align with a pre-approved shape of predictability. What is masked by these digital composites are the resilience, adaptability, and interpersonal skills that do not translate neatly into numeric fields.
“In an ‘ordinal society,’ data-driven scores now shape access to housing, often rebranding old inequalities as seemingly neutral metrics.”
This process encapsulates the symbolic violence described by Bourdieu, as it legitimizes particular forms of social dominance by framing them as the natural output of objective metrics. People who happen to have predictable cash flows and conventional employment histories reap benefits, while those living in more precarious conditions face algorithmic suspicion. Moreover, the ostensibly neutral judgments exert tremendous power in dictating life chances, making it seem inevitable that certain groups will be labeled high risk. Algorithms or data sets that incorporate historically biased patterns reinforce and even magnify discrimination, all while purporting to be impartial. Minor deviations, such as spending on “flagged” items or maintaining a less traditional work schedule, are branded as irresponsible, regardless of actual rent-paying capacity. Under these conditions, many people feel compelled to sanitize their digital footprints, adjusting their spending to conform more closely to the norm so that their finances are not scrutinized or misinterpreted—an effort that disproportionately burdens lower-income renters, who lack the resources to camouflage financial irregularities.
Despite the stark inequalities these systems can produce, their expansion is not necessarily inevitable or irreversible. The key issue lies in using them as wholesale substitutes for a more nuanced, contextual form of evaluation. Strong policy interventions could mitigate their worst effects. At a minimum, transparency should be mandatory: tenants deserve to know how their data is collected, what weights different factors carry, and how they might correct or contest inaccurate or decontextualized information. Open banking, for example, need not operate as an all-or-nothing proposition; people who decline to share intimate transaction histories should not automatically be classified as unreliable. Regulators should examine the design of these algorithms to uncover embedded biases and mandate adjustments or outright bans on data points shown to perpetuate injustices. Allowing tenants to explain the reasons for irregularities—unforeseen emergencies, caring responsibilities or seasonal work—would inject some critical humanity and context into these otherwise brittle calculations.
The stakes could not be higher, for housing is a fundamental necessity rather than a luxury. Reducing it to a contest of who can present the most sanitized financial record risks normalizing a new form of digital discrimination. Fourcade and Healy's critique of eigencapital underscores the broader truth that data-driven tools are never purely neutral; they reflect and reinforce cultural values. Yet they could, with careful design and oversight, be tuned to recognize the diversity of human circumstances and reward genuine capability rather than penalize financial irregularities. The questions that remain—whose values shape these algorithms, who oversees their creation and whom they ultimately serve—speak to core ethical dilemmas in a society increasingly governed by digital scoring.
As reliance on algorithmic decisions deepens, it becomes ever more important to scrutinize claims of efficiency and impartiality. The promise of a perfect data-driven system can quickly devolve into a dystopia for anyone whose life does not conform to the numeric template of ideal financial behavior. We risk designing a world in which housing opportunities and other essentials hinge on the ability to demonstrate unwavering adherence to middle-class spending norms. Notably, even some landlords acknowledge that standardized metrics do not capture qualities like honesty, neighborliness, or stability in crisis—traits that might matter just as much for successful tenancies. Nonetheless, the systems persist, because the immediate convenience of pushing a button to generate a score often overrides the more labor-intensive process of understanding someone’s story.
If we accept that housing is a basic human right, then challenging the encroachment of purely data-based eligibility tests becomes crucial. Under the pretense of objectivity, these systems can re-embed structural inequalities into new frameworks, effectively outsourcing moral judgments to automated processes. Protecting renters and ensuring equitable access requires that policymakers, developers and citizens collectively resist surrendering decision-making power to undisclosed algorithms. No single score can reflect the complexity of human lives. Data-driven tools can still serve the broader good, provided they remain accountable, transparent and open to revision. Yet in a society eager for quantitative certainty, it falls on all of us to remember that people should not be reduced to numbers, especially when that reduction determines whether they can keep a roof over their heads.[1]
Citation
Roger Burrows, “Finding a Place to Live in an Ordinal Society,” PLATFORM, Feb 10, 2025.
Notes
[1] For a more detailed treatment of this subject, see Alison Wallace, David Beer, Roger Burrows, Alexandra Ciocănel, and James Cussens, “Algorithmic tenancies and the ordinal tenant: digital risk-profiling in England’s private rented sector,” Housing Studies, January 27, 2025.