Fight for Our Right
Data, Compute, and the Conditions of Justice in Informational Societies
Stefaan Vossen
1. Introduction
We may currently be witnessing a natural stage in humanity’s adaptation to a new global resource: data.
Throughout history, societies have reorganised themselves around dominant forms of power and resource control. In early societies, physical power determined survival and governance as decisively as access to healthcare does today. In agricultural civilisations, land and food production shaped social order. In later economic systems, financial capital and industrial capacity became the defining resources.
Today, the emerging resource is informational. Increasingly, social, economic, and institutional decisions are derived from the interpretation of data.
As societies adapt to this new resource, the question of how informational power should be structured and governed becomes unavoidable. The purpose of this essay is to argue that the current global debates surrounding data, artificial intelligence, and computational power represent another stage in a recurring historical pattern: the emergence of new technological power structures followed by attempts to construct moral and legal frameworks to regulate them.
Recognising this pattern may help decision-makers respond more constructively. Rather than treating current debates as purely ideological conflicts, we can analyse them as part of a broader process of technological and institutional adaptation.
The central question addressed here is therefore this:
If modern societies increasingly interpret reality through data and computational analysis, should access to the information used to describe and evaluate individuals not be considered a matter of fundamental justice?
2. Historical Pattern
Human societies consistently reorganise around the dominant resources available to them.
As communities adapt to new resources, whether physical, agricultural, financial, or technological, their institutions evolve accordingly. Legal systems, governance structures, and social norms develop to regulate the distribution and use of these resources.
Seen in this light, the current transformation driven by data and computational systems is not unprecedented. It represents another stage in a recurring historical cycle: technological innovation produces new forms of power, and societies respond by developing moral and regulatory frameworks intended to limit abuse and preserve social stability.
In earlier eras, societies struggled to regulate the dominant resources of their time:
physical power
land and resource ownership
industrial capital
Today the central resource awaiting regulation is informational: data and the analytical systems that interpret it.
3. Data as a Social Resource
Data differs from earlier resources in an important way. While gold, land, or physical power could be controlled directly, data fundamentally represents observations about the world and the individuals within it. Each datum therefore remains linked, by association, to the people and things it describes; that link is intrinsic to its metadata.
Modern institutions increasingly rely on informational representations of reality and metadata-based digital twins. Medical records, financial histories, behavioural data, and algorithmic profiles are used to evaluate individuals and guide decisions affecting their lives.
In this sense, data, including its metadata, can be understood as representing a fundamental social resource: the informational perspective through which reality is interpreted.
This raises an important moral and legal question: how should the informational systems holding data representing individuals be governed? If decisions affecting individuals increasingly depend on informational representations of those individuals, then the accessibility of that information becomes ethically significant and defines the asymptotic path toward justice.
A system in which institutions possess extensive informational representations of individuals, while those individuals cannot access or evaluate the data used to assess them, creates a structural asymmetry in control and moral standing. This asymmetry could be resolved by insisting, globally, on individual access to that data as a fundamental right. Good-faith actors can reasonably agree to such a measure, since the process can be shown to have no morally unjustified impact.
Traditionally, such asymmetry raises concerns about fairness, accountability, and the capacity of individuals to understand and contest decisions affecting their lives. From this perspective, the emerging challenge is not simply technological. It is institutional.
The question becomes whether societies will construct informational systems that preserve individual agency and transparency, or restrict access to data within systems that concentrate interpretive power in a small number of institutions. What is demanded here is not access to those institutions' software, only to what that software did with an individual's data in order to assess them.
This essay argues that a foundational principle for informational societies should be the following:
Individuals possess a fundamental right to access the information used to evaluate them and to meaningfully interpret the systems through which those evaluations occurred.
3.1 Why This Matters
If societies increasingly operate through informational representations interpreted by computational systems, then justice may depend not only on the distribution of material resources or legal rights, but also on the distribution of interpretive capacity.
The emerging politics of data and computation therefore represent more than a technological transition. They represent a new stage in humanity’s long-standing effort to construct institutions capable of managing power while preserving individual freedom.
4. The Politics of Compute
If data has become the primary medium through which modern societies represent reality, computation has become the mechanism through which that reality is interpreted.
Data alone does not produce meaning. Raw information only becomes useful when it can be analysed, compared, and interpreted within context. Increasingly, this interpretive work is performed through complex computational systems.
Statistical modelling, machine learning systems, predictive analytics, and simulation environments now play a central role in interpreting complex informational structures. These systems allow institutions to identify patterns within datasets far larger and more complex than those that individuals can reasonably interpret unaided.
In this sense, computation functions as the lens through which data becomes meaningful to the operator of the computation.
The emergence of large-scale computational interpretation introduces a new dimension to political and social power. Those who control advanced computational infrastructure gain the ability to interpret informational representations of reality at a far greater scale and resolution than those who do not.
Historically, power was often derived from the control of physical resources such as land, labour, or industrial capacity. In informational societies, a growing share of power derives from control over interpretive capacity.
The politics of computation therefore concerns not only technological development, but also the distribution of the ability to interpret reality itself.
5. Interpretive Power
Interpretation has always played a role in social power.
Legal systems interpret laws. Scientific institutions interpret evidence. Economic systems interpret markets. In each case, the ability to define how information is interpreted influences the outcomes of social processes.
What distinguishes the current moment is the scale and automation of interpretation.
Advanced computational systems can now analyse large populations, complex economic systems, and detailed behavioural patterns. Corporations use these systems to predict consumer behaviour. Governments use them to model social trends. Financial institutions use them to assess risk.
These developments allow organisations with sufficient computational resources to interpret informational environments in ways that were previously impossible.
The result is the emergence of a new form of structural power: interpretive power.
Interpretive power refers to the ability to shape how informational representations of reality are analysed, categorised, and used in decision-making processes.
When interpretive capacity becomes concentrated within a small number of institutions or actors, those actors gain influence over how reality is understood within social systems. Some concentration is unavoidable; it is the basic premise of all social conventions. But the absence of systems to assess that concentration fails the basic test by which societies establish criminality and harm. Unrestricted individual access to this data and metadata is therefore a basic requirement for any form of regulation in the age of AI.
6. The Concentration of Interpretive Capacity
The emergence of computational interpretation does not occur in an institutional vacuum. While interpretive capacity exists in principle wherever individuals analyse information, large-scale computational interpretation requires substantial infrastructure.
Training advanced artificial intelligence systems, analysing global datasets, or running complex predictive models demands access to large computational clusters, specialised hardware, and extensive energy resources. These capabilities are expensive to build and maintain, which has resulted in a relatively high concentration of computational infrastructure within a small number of organisations.
At present, the majority of large-scale interpretive compute is controlled by a limited set of actors. These include major technology companies operating hyperscale cloud infrastructure, organisations developing frontier artificial intelligence systems, and state actors operating national computing facilities.
A small number of technology companies operate the world’s largest computational platforms, providing cloud infrastructure through which governments, corporations, and research institutions conduct large-scale data analysis.
The result is a technological landscape in which the capacity to interpret large informational systems is distributed unevenly across society. This unevenness is only harmful in the absence of direct, individual access as a fundamental human right. Were such access guaranteed, these same actors would become storage facilities, akin to the home computers and servers on which the internet was built.
This concentration does not necessarily arise from deliberate or nefarious attempts to monopolise interpretive power. Rather, it reflects the economic and technical realities of building large computational infrastructures.
However, the consequences of this concentration are politically and ethically significant and must be addressed as a matter of human rights.
When interpretive capacity is concentrated, a relatively small number of institutions gain the ability to analyse, model, and predict large segments of social reality. We need the collective capacity to check on that. Treating metadata as merely a commercial product, and parsing it away on that basis, is no longer correct: metadata is method, and evaluating the legality of an action requires evaluating method, motive, and outcome. Keeping this data inaccessible to individual users therefore jeopardises the very basis of every legal framework.
The question raised here is therefore not whether such computational infrastructures should exist or be regulated, but how societies should govern the informational environments they create.
7. Epistemic Asymmetry
The concentration of interpretive capacity produces what can be described as epistemic asymmetry.
Epistemic asymmetry arises when some actors possess significantly greater ability to interpret informational representations of reality than others.
In modern informational systems this asymmetry is increasingly visible. Corporations model consumer behaviour. Financial institutions construct risk profiles. Governments analyse social and economic trends.
Meanwhile individuals frequently lack access to the computational tools, datasets, or analytical models used to evaluate them.
This creates a structural condition in which individuals are increasingly interpreted by systems they cannot meaningfully interpret themselves.
When individuals cannot access or evaluate the informational systems that influence decisions affecting their lives, their ability to contest or understand those decisions becomes limited.
8. The Justice Problem
Justice traditionally concerns the fairness of institutions, laws, and decision-making processes.
In informational societies, the fairness of these processes increasingly depends on the informational conditions under which decisions are made.
If individuals are evaluated through informational systems they cannot access or meaningfully interpret, a fundamental imbalance arises.
From this perspective, justice may depend not only on legal rights and institutional design, but also on the structure through which interpretive capacity is distributed within society.
9. Access to Compute as an Emerging Right
Human rights have historically developed in response to structural changes in social organisation.
Legal equality, education, and freedom of expression emerged as fundamental rights because societies recognised that individuals could not fully participate without them.
In informational societies another condition may be emerging.
If individuals are increasingly evaluated through computational interpretation of data, then access to the informational and computational tools required to interpret those systems may become an important condition for meaningful agency.
Individuals should possess meaningful access to the information used to evaluate them and to computational tools sufficient to interpret that information.
9.1 Interpretive Data and Algorithmic Judgement
A distinction should be made between full disclosure of computational systems and access to the informational consequences of their use.
This essay does not propose that all computational systems must be fully transparent; forced full disclosure would raise moral problems of its own. However, when computational systems produce judgements about individuals, such as classifications, risk assessments, behavioural predictions, or eligibility determinations, those outputs become informational objects associated with those individuals.
Therefore these outputs may be understood as interpretive data.
If interpretive data is used in decisions affecting individuals, it should be accessible to those individuals under the very same principles that govern access to other personal data.
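Purely as an illustration of the concept, not a proposal for any real system (every name here is hypothetical), the notion of interpretive data as an informational object attached to an individual, accessible to that individual by default, could be sketched as:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class InterpretiveDatum:
    """A hypothetical record of an algorithmic judgement about a person.

    It captures the *output* of a computational system (the judgement),
    not the system's internals, mirroring the essay's distinction between
    full disclosure of software and access to its informational consequences.
    """
    subject_id: str     # the individual the judgement is about
    system_name: str    # which system produced the judgement
    judgement_type: str # e.g. "classification", "risk assessment"
    output: str         # the judgement itself, in accessible form
    produced_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

    def accessible_to(self, requester_id: str) -> bool:
        """Under the essay's principle, the subject can always access
        interpretive data about themselves."""
        return requester_id == self.subject_id

# A risk score produced about one individual:
record = InterpretiveDatum("person-42", "credit-model-v3",
                           "risk assessment", "low risk")
assert record.accessible_to("person-42")      # the subject may access it
assert not record.accessible_to("person-99")  # a third party may not
```

The design choice the sketch encodes is the essay's own: the record stores what the system concluded about the person, never how the system works internally.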
In informational societies, justice increasingly depends on the conditions under which individuals can access and interpret the information used to evaluate them.
10. The Asymptotic Path
Human knowledge is inherently incomplete. Interpretations of complex informational systems are always subject to revision as new information becomes available.
For this reason, improvements in justice rarely occur through sudden institutional transformations.
Instead, societies move gradually toward better outcomes by improving their capacity to understand the systems they inhabit.
This relationship can be expressed simply:
As understanding improves, error decreases.
As error decreases, justice increases.
As justice increases, peace is approached asymptotically.
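Purely as an illustrative formalisation, and not a claim of the essay itself (the symbols and functional forms below are assumptions introduced only to make the asymptotic shape of the argument visible), the relationship might be sketched as:

```latex
% U(t): shared understanding, E(t): interpretive error,
% J(t): justice, P(t): peace -- all evolving over time t.
E(t) = E_0 \, e^{-\alpha\, U(t)}
\qquad \text{(error decreases as understanding improves)}

J(t) = J_{\max} \left( 1 - \frac{E(t)}{E_0} \right)
\qquad \text{(justice increases as error decreases)}

P(t) \longrightarrow P_{\max} \quad \text{as} \quad J(t) \to J_{\max}
\qquad \text{(peace is approached asymptotically, never reached)}
```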
Peace does not arise from eliminating disagreement, but from improving the shared capacity to understand the consequences of actions within complex systems.
11. A Principle for Informational Societies
Societies have already recognised the importance of expanding interpretive capacity: through education, through libraries, and, more recently, through access to the internet.
Education increases the ability of individuals to interpret information about themselves and their environment. In this sense the principle is not new; the transformation described in this essay extends it to informational infrastructures, constructing an appropriate framework for today's information highway.
As computational systems increasingly mediate the interpretation of individual human reality, individuals must, as a matter of legal principle, retain meaningful access to the information used to evaluate them and to the interpretive outputs generated by those systems.
Such access does not guarantee perfect understanding, but it preserves the conditions under which individuals can understand, contest, and participate in the informational systems shaping their lives.
Peace grows where understanding improves.
Stefaan