Datafication to ‘Tequiology’: Shaping Tomorrow’s Institutions
Embracing ‘Tequiology’: The Future of Data-Driven Institutions
Society's recent obsession with automation has profoundly reshaped our relationships with one another and transformed our way of life. The shift toward automatic societies presents both opportunities and challenges, requiring us to adapt, innovate, and establish ethical and regulatory frameworks to navigate the complexities of an automated world.
Data > Algorithm > Automation
Data, algorithms, and automation are the interconnected building blocks of modern technology. Data is collected and parsed; an algorithm processes that data and makes a decision or takes an action; an automation system then executes tasks or processes based on the decisions the algorithm has made.
The combination of data, algorithms, and automation enables organizations to make data-driven decisions, optimize operations, and deliver enhanced user experiences. Technologies like self-driving cars, recommendation systems, predictive maintenance in manufacturing, and intelligent virtual assistants are based on these fundamentals working together to enable automation and ‘intelligent’ decision-making (Artificial Intelligence).
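The data → algorithm → automation chain described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration only; the sensor readings, the 30.0° threshold, and all function names are invented for this example and do not come from any real system:

```python
# Hypothetical sketch of the data -> algorithm -> automation pipeline.

def collect_data(raw_readings):
    """Data: gather and parse raw input (here, temperature readings as strings)."""
    return [float(r) for r in raw_readings]

def decide(readings, threshold=30.0):
    """Algorithm: process the data and make a decision."""
    average = sum(readings) / len(readings)
    return "cool" if average > threshold else "idle"

def execute(decision):
    """Automation: carry out the task implied by the algorithm's decision."""
    actions = {"cool": "fan switched on", "idle": "no action taken"}
    return actions[decision]

readings = collect_data(["31.2", "29.8", "33.1"])
print(execute(decide(readings)))  # the average (~31.4) exceeds 30.0, so the fan is switched on
```

Even in this toy form, the key point of the essay is visible: whoever chooses the data collected, the decision rule, and the action taken holds the power in the system.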
However, automatic societies also reflect the biases of our society and exacerbate inequality. Automated systems raise further concerns around data sovereignty, privacy, surveillance, security, and ethics.
Decoloniality and Automatic Societies
There are hegemonic visions of the world in terms of knowledge, bodies, institutions, languages, technologies and relationships. When bias-ridden systems and infrastructures are allowed to exist, they perpetuate inequality.
In systems that rely on biased data for decision-making, such as those used in criminal justice, lending, and hiring, marginalized groups can face discriminatory outcomes. This not only deepens social inequalities but can also lead to injustices and violence. Additionally, when algorithms reinforce stereotypes and prejudices, they may contribute to hostile behavior and discriminatory practices, ultimately posing a threat to social harmony. Addressing these biases in algorithmic decision-making is crucial for creating a more equitable and just society.
In automatic societies, epistemic violence manifests through epistemic processes such as datafication, algorithmization, and automation:
- Datafication — the transformation of our lives and our environments into data
- Algorithmization — the mediation of algorithms to comprehend this data
- Automation — the process by which states and corporations make decisions about our societies
These processes are epistemic and socio-technically mediated, and they reinforce global orders of classification that lead to epistemic, racial, gender, economic, social, cultural, and environmental injustice.
Decoloniality is not a theoretical framework. It is a political praxis.
~ Dr. Paola Ricaurte
‘Tequiology’ : technology of radical care
‘Tequio’ (Nahuatl): an ancestral practice of collective work to achieve a common goal.
When we think of technology and innovation in terms of tequio, as something made and used for the common good, we are encouraged to think relationally, about how we relate to one another. The idea of ‘tequiology’ is to promote autonomy, self-determination, sovereignty, communality, pluriversality, and sustainability. Participation in technologies centered on collaboration rather than competition will ‘foster forms of technological development that emphasizes living with dignity’ (Yasnaya Aguilar, 2020).
Interpreting the Manifesto
We have a shared responsibility to defend the possibility of a dignified future for all; a future where we resist and re-exist.
- We foster relationships and technologies of radical care.
- We repair, reconnect, rewrite our histories.
- We recover our environments and body-territories.
- Our affections and our sensibilities transform institutions and forms of organization.
- We build inclusive infrastructures for all and transform exclusionary knowledge systems.
- We work towards fair labor conditions.
- We manifest intersubjective relations that are relationships of care.
- We strive towards new imaginations and new narratives.
- We keep our memories and decolonise our desire.
- We imagine our new future in common.
We have to repair the relational nature of existence to be able to imagine alternative futures that allow for being, thinking, feeling, doing and living together.
~ Dr. Paola Ricaurte
Anticolonial AI
In the grand narratives of AI ethics, there is a noticeable absence of critical spaces that engage with decoloniality. AI, often envisioned within oppressive systems, can also be perceived as something wild and untamed, existing beyond the boundaries of established norms. This prompts an intriguing question: can we reimagine AI at its very core, starting with its mathematical underpinnings? AI and data technologies relate to epistemologies and metaphysics as a complex assemblage, one that requires specific knowledges and logical frameworks to function. The mathematics of probability, and the art of probability hacking, invite contemplation of metaphysical dimensions: quantification is not merely the measurement of uncertainty but carries profound philosophical implications. Beyond this, the lifecycle of AI at the design level becomes crucial, calling for an inclusive vision of anti-colonial AI that challenges established paradigms and redefines how we perceive, interact with, and shape the future of artificial intelligence.
AI as Tequiology?
Technology as sociotechnical assemblages; AI is an algorithmic assemblage
~ Dr. Paola Ricaurte
It’s imperative to consider not just the technology itself but the institutions that wield it. These institutions shape knowledge production systems, economic structures, cultural processes, and social relations. To forge a path toward feminist and decolonial AI, we must realign the matrix of power, asking pivotal questions about control, ownership, and the distribution of costs in tech development. The inevitability of technology warrants scrutiny; we must ponder whether we truly need it, weighing the technologies of war against those that promote life, humanity, and radical care. This vision extends beyond anthropocentrism, positioning humans as integral components of the ecosphere. Community-driven infrastructures, rather than market-oriented ones, could provide the foundation for more equitable and inclusive technological landscapes.
This post was inspired by a presentation by Dr. Paola Ricaurte at the 50th Anniversary Speakers’ Series at the SFU School of Communication.