Economikit

The Context - Why?

The following attempts to distill the shared context we have developed over the last year and a half. The guiding question motivating this work is, “what is it about the confluence of digital technology and global economic circuits that needs to change?”

Economikit emerged in reaction to a debt-creating engine meant to produce capital from capital, in a vicious, self-reinforcing loop that perpetuates the system. Our work lives as a set of experiments responding to this situation.

Here we are, amid the collapse of American Capitalism, the rise of Black Lives Matter, and a rising tide of social awareness in the White world. Movements of consciousness-raising and techno-utopianism become entangled in the promise of Bitcoin, blockchain, and other distributed tech.

But are these technologies anything other than the last gasps of the Western mind to make some “good” of the materialist and separatist mindsets that they’ve become beholden to under Scientism? Can technology ever be generative and sustainable, or does it inherently depend on externalising the suffering of other beings? Is trying to engineer our way out really a valid path forward, or is this all just spiritual bypassing?

Whatever the answers to these questions, posing them is a necessary first step. We believe it is important to begin this process with a very clear picture of what we perceive these harms to be and how they are related. We hope to find solidarity with other groups practising and experimenting with anti-colonial patterns, a solidarity born of the increasingly universal nature of the harms of capitalism as we live into the sixth mass extinction.

The Problems of Social Technology

With the globalization of capital flows, Western wealth and economics have taken on an increasingly abstract, increasingly volatile character. This abstraction both enables violence and obscures it.

  • In the early 70s, the US dollar was decoupled from gold and became a purely fiat (debt-based) money, backed only by its own word (and military). That debt-based currency leaned on credit cards to expand buying power while deepening indebtedness.
  • In the 80s, corporations continued the shift from creating product to creating brand. The derivatives market came to dwarf the stock exchange. Abstractions became more valuable than the material economy. Globalization became neoliberal policy.
  • In the 90s and into the 2000s, corporations shifted from material economics to digital economics and turned their attention to the invisible web of relationships between people.

It is as if value must be extracted from ever more distant places in order to fuel the physically impossible (as in, impossible by physics) idea of perpetual growth. (Despite The Limits to Growth coming out in the early 70s.)

The use of surveillance as an effective form of social control has roots in the U.S. military’s counter-insurgency campaigns against leftist movements all over the globe. Companies like Google pointed these snooping technologies toward U.S. citizens (and others in the Western world, and of course beyond) in the name of an advertising business that automatically selects and recommends content to users. Their algorithms are described in popular media as wielding mystical, God-like powers to see and know us, as individuals, better than we know ourselves.

The images we have of humanoid robots and other Jetsons-like oddities distract us from the fact that this unwieldy surveillance apparatus is itself a highly advanced form of artificial intelligence with vast social ramifications. Moreover, rather than eliminating jobs, it has contributed to what David Graeber called the increased “bullshitization of labor”, especially in healthcare and education, where an outsized amount of time is spent on administration and on figuring out how to share tasks with machines.

Ethical Consequences for Social Beings

Globalizing flows of capital and communication create further abstractions. Both the effects of our actions and the people affected by them are obscured, making alienation and powerlessness deeply common experiences. The abstractions leave us suspicious. Social technology brings us closer together, across the geographical and nation-state boundaries that once held some aspects of our identities. And yet we feel watched, in this intimacy, by advertisers or even national governments.

Social chilling is an umbrella term for the effects of the surveilling gaze. That gaze stunts free expression and intimate disclosure, because one does not know what agency is watching or why. When major hacks make mainstream news, for example, or when past browsing history eerily reappears verbatim in some other corner of the web, the “average user” catches a glimpse of what we might call the promiscuous data economy of today’s social web.

Lesser-known data sharing events, like police profiling of high school Facebook users in Florida in the name of gun threat preparedness, or clinical researchers scraping social media to build models that detect early warning signs of mental illness, reveal the breadth of possible gazing agents.

And of course, there is also the frighteningly unethical, rapid-fire social experimentation that the research teams of major Silicon Valley platforms continue to undertake daily.

Social chilling doesn’t only describe the effects of government or corporate surveillance, though. It is also a product of excessively public speech and the context-mashing of speaking in a loudspeaker fashion. The membrane-free world of social media makes the stakes of peer-to-peer surveillance unreasonably high. We do not know who can see what (my boss? my uncle? a future dictator? a casual blackmailer with access to any of the former?). This surveillance takes advantage of our distinctly human desire to build shared memories - a process crucial for reflecting on what we are doing and where we are going together, in relation to each other and the natural world. Sociality is our opportunity for building intersubjective coherence, shared awareness, stories, and understanding.

The profit-driven and/or power-grabbing use of social data that diminishes our collective ability to create shared stories and memories might be described as “archival control”. AI/ML also holds up a mirror that reflects how racism and bigotry radiate outward from the presumption of white men as “normal people”. Even if these systems were corrected for bias, interpretation is one thing data analytics simply cannot do. An equitable interpretation, in the context of any human group culture, is an emergent political phenomenon that requires multiple interacting perspectives in order to better society. Archival control stages a struggle for socially-useful data between those vested in keeping things the same and those who want to collaborate for collective betterment.

Tying these points together in a singular case: Google fired Timnit Gebru, a Black woman on its research team, for daring to comment on the obscene volume of carbon emitted in training large language models.

Failed Social Dynamics

The profit motivation of major platforms is met by keeping users occupied with each other in a semi-dissociated, anxiety-ridden, ultra-performative manner, so that advertisers may also profit from our simply being there. This is to say that we are not elsewhere, taking risks that lead to serendipitous encounters. The engineered push for user relations built on mutual fascination is felt as superficial interactions, virtue signalling, one-upping, good news only, strong opinions on everything, and cruelty-as-humor (bullying). Inner life, too, suffers from provocations (and means) to air all thoughts and knee-jerk responses, however nasty or hateful these may be. We become abstractions to others and even to ourselves, in a devastating process of understanding active, living relations as things.

Failed Economic Well-Being

21st century (informational) economics is built upon our sociality. The corporate gaze is more than a gaze: it exerts psychic influence and also feeds back into the design of features, functions, and form. Social media companies and their clients (advertisers) construct an environment mutually beneficial to one another. Social media expresses its value to advertisers and shareholders as a function of the volume of users and use.

This is, perhaps, the ultimate expression of the capitalistic preference for quantity over quality. On the social web this attitude dominates our relationship to ourselves and others (the reproduction of society per se), rather than the production of washing machines and hula hoops.

This, we believe, leads directly to:

  • Massive concentration of monetary wealth via predatory behaviour and network effects on one-stop-shop platforms - the Walmart-ization of the social web.
  • Increased dependence on advertising-fueled social mediation, entrenching consumerism into the very heart of being with others.

The legal and financial systems that have historically aimed to hold capitalism’s process of abstraction and commodification in check are overwhelmed. After decades of robber-baron-reminiscent wealth concentration and inequality, we begin to see a renewed interest in antitrust legislation. There is now some attention to data management practices, but it is difficult to enforce and further blurs juridical boundaries. Yet these early legal and financial shifts do not address the systemic drivers.

The industry of Surveillance Capitalism creates self-reinforcing feedback loops in which economic drivers get encoded into the tools and apps we use daily. These apps create and then recreate the social consequences of anxious, performative sociality. Everybody is on display to the world, vying for acceptance in spaces where the stakes are high and outrage is instantaneous. These designs and their side effects reverberate in ways that increase disparities and oppression.

Using Feedback Loops to Realign Relationships between People and Planet

We therefore aim to build tools that heal the wounds of abstraction (not seeing and hearing, not being seen and heard) and suspicion (lack of trust) arising from the dynamics described above.

Making small optimizations within these reinforcing loops is unlikely to accomplish this, nor to lead to significantly different social outcomes. The overarching feedback loops of Surveillance Capitalism must themselves be challenged and changed. Proponents of the economist Karl Polanyi, and more recently of Kate Raworth’s ‘Doughnut Economics’ model, have long named the re-embedding of the economy in society and nature as one of the great and necessary challenges of our time.

A grand challenge like this requires that we create a solid ethical-social layer as the human foundation on which to enable new economic patterns. Our basic human needs (material, social, emotional and spiritual) must be informed by the needs of our biosphere and bioregions, and both must come together to direct the behaviour of our economy rather than the other way around.

Economikit is one of many groups hoping to rearchitect society in this way. We aim to create the tools of a new socioeconomic order: tools which support alternative patterns and processes for social organizing, with the ethical imperatives of our historical moment embedded in their roots. Where will this path lead? We are not entirely certain, but we are confident that success hinges on our ability to embed these ethical imperatives into the artifacts we create and the way in which they are created.

In future articles, we’ll share some of the theories, practices and products we’ve been stewarding, which we believe can be useful components in meeting the challenges of our time with compassion and integrity.