Why The Arts (Not Technology) Will Define The Future Of Geopolitics

To avoid a Third World War, leaders must adopt cyberresilient strategies. In doing so, they will need to understand what that means for a diversity of sovereign states. In this piece, I take a stab at defining Cyberresilience and explore what such strategies and partnerships should involve.

Think of Cyberresilience as the upgraded definition of Cybersecurity: everything needed to make a technical system secure, and more. Just as Sustainable Finance has become mainstream finance, Cyberresilience is about responsible investments and governance frameworks that take into account the environmental, social and governance (ESG) implications of AI.

Cyberresilience is the science and practice of understanding the social, cultural and political environment in which artificial intelligence operates, at every step of the Society-Data-AI-Society sociotechnical loop and value chain.

Dangerous Assumptions

In the March/April 2023 issue of Foreign Affairs, Eric Schmidt published an article titled Innovation Power: Why Technology Will Define the Future of Geopolitics. His argument relies on the assumption that the new force of international politics is technological innovation, as opposed to social innovation. A number of experts, including myself, believe he is wrong and propose that, on the contrary, social innovation is the force of international politics. Schmidt rests his vision on a US-China competition without much regard for what that implies for the rest of us. Furthermore, he sees tech platforms as a means to use culture to spread American values, rather than as a medium to share a diversity of cultural expressions.

In short, he presents a Big Tech-centric approach that fuels the risks of a Third World War. This approach is symptomatic of a broader, systemic and gendered problem in AI governance, one where men’s political power games in AI will kill women (see our policy brief on Gender Equality and the Environment in Digital Economies).

As countries, including the G7, come together to call for concerted efforts towards the governance of AI, I call upon leaders to consider human- and planet-centred Cyberresilience Strategies. I believe there is much to be gained from strategies and partnerships that focus on social innovation, on closing the digital gap and its underlying causes, and on trisectoral innovation ecosystems; on a peaceful, sustainable and mutually beneficial shift in global centres of power (via data partnerships, for example); on the arts and culture as a tool and a space where we can learn together and shape a different landscape; and on reserving the use of AI and data for the Sustainable Development Goals, for Reconciliation with Indigenous Peoples, and for human rights. An overwhelming majority of AI and tech innovation strategies focus on economic, military, and technological competition, leaving the required social, cultural and intellectual innovation and freedom for later, forgetting that a polarized society is a paralyzed society.

AI strategies that fail to address the root causes of digital gaps, disinformation wars, and polarization in political and media debates are doomed, and they put our kids’ futures at risk.

Watercolour on film captured with a macro lens (Nikon 40mm)


Power is watercolour.

Watercolour, for those who haven’t dabbled in this simple yet challenging art form, is a relationship between different coloured pigments, water and papers of varying degrees of absorbency. That relationship is influenced by pressure and movement and, for most dabblers and artists, forever keeps an element of uncertainty. For the artist, it requires the courage to accept that unknown and let go of control. Only a few works will result in an inspiring assembly of shapes and colours that will be admired and cherished for centuries.

Watercolour very much resembles the art of, and the power required for, navigating the chaos brought on by a revolutionary technology: AI.

Quality, independent content creation, be it journalism, media content, or art, is pivotal to the future of democratic societies. It is also vital to the survival of industry, which wouldn’t be able to make millions without it. It isn’t surprising that UNESCO recently published a report calling for support of creative voices and for greater collaboration between artists and journalists as vital supporting pillars of democracies.

From Narrative Ethics to Normative Ethics, the Path is Political

The ethical implications of AI are political choices that citizens must be able to make with information free of industry-led communication and investment strategies. It is widely accepted that AI Ethics Principles include fairness, such as unbiased algorithmic outputs, and that bias in AI impacts some social groups more than others, causing unfair access to knowledge, financing, freedom, etc. AI’s cultural and economic impacts affect the political power of these social groups, making AI ethics more of a political science than a computer science. Inaction in the face of such impacts (for example, the use of facial recognition for the surveillance of a racialized community) is an unethical and irresponsible path, and, in fact, the emerging AI laws hope to take action against such negative impacts. That being said, many still promote the separation of AI ethics from AI politics. In response, I argue the following.

Narrative ethics are a form of engagement through the arts that leads to informed, independent and diverse perspectives in policy innovation. Normative ethics, on the other hand, examine the political and economic implications of new technologies. These critical considerations can then inform regulatory innovation and shape laws that protect human rights and promote equal opportunity. Artists (inter-arts, storytelling and documentary making, interactive creative media, etc.) who engage participants in a conversation on the ethical, social, cultural, economic, political and legal implications of AI create a path where a broader, more diverse group of citizens can take part in informing and shaping the future of AI. In other words, the ethics and politics of AI are not two separate fields, but one continuous flow of learn-think-act interventions along AI’s sociotechnical pipeline that facilitates (and accelerates) civic engagement with AI systems and their governance.

In the context of a Cyberresilience strategy, this means that the Arts can be an important tool that contributes to the prevention of disinformation and polarization, which are among the most important risks to National Security in Canada.

Mixed media inter-arts on Law and Power.


Disconnected Discourses: a Risk Factor in Cyberresilient Societies

I recently had an interesting discussion with a group of media start-ups who use social media platforms to relay information to users. When questioned, they struggled to position their role in relation to democracy. Despite efforts to come up with business models that would allow them to create quality, independent, unbiased content, our short discussion led to more questions than answers. In the same venue, at the same time, two world-leading experts discussed the risks AI presents to democracy. At a different conference during the same two days, an international group of academic researchers gathered to discuss the role of current media in shaping the narrative on AI, while struggling to imagine the future of news-sharing forms of media altogether.

During those two days, in different spaces of the same city, influential investors in AI (who expect a return on investment), tech start-ups, government innovation support organizations, and leading academic researchers in AI and in the social sciences, including art and action-based research in Digital Humanities (who expect society to be just and fair), were all gathered.

The discourses of these groups are disconnected, and that is a weakness in our capacity to deploy AI in a safe and responsible manner. Cyberresilience leaders will need to address this disconnect.

The private sector operates on old financial theories that make profit a priority. The public sector’s role is to govern; however, a resource-poor public sector could lead to a dangerous imbalance of power that would be easily exploitable for ill-intended purposes. Furthermore, AI and data governance represent a new expense for all governments, with a globally uneven distribution of the potential to benefit from such AI technologies and data ownership. The third sector, the civil sector, ensures trust and balance of power in democratic systems. These organizations are the closest to citizens, who are the reason we put political systems in place in the first place. Making sure Civil Society Organizations have a seat at the table means that citizens have a say in how they want AI to improve public services, how they want to be protected from harmful uses of AI, and how they can take part in shaping the future of their children and the planet. Besides, research in the past 10 years has shown that innovation ecosystems that work with civil society (trisectoral collaboration) are more efficient and beneficial to society*.

In short, having a different discourse in different social groups is normal; having a disconnected discourse that is increasingly polarized on AI governance, on the other hand, is dangerous for all of us.

Conclusion

So in answer to Schmidt’s article, I say that Innovation Power led by humans, collaboration and independent artists will define the future of geopolitics!

That’s a little short and an abrupt ending, but hey, it’s Friday, way after 5 pm. I leave you with these thoughts for now. I will come back and add some thoughts at a later time, and will translate into French soon. I left a few typos so you know ChatGPT didn’t write this. I feel like this might become my new official excuse! :-) Feel free to chime in with suggested readings on emerging cyberresilience, and/or tell me what you think about all of this. P.S. All the art is mine. Please don’t use it without permission (OpenAI, don’t you dare).

*I will dig out a specific publication I have in mind on that and update with that information soon.

UPDATE: Who has the right to define art? It shouldn’t be left to the courts, it shouldn’t be left to AI research institutes, and it certainly shouldn’t be left to big tech companies. It should be in the hands of artists and the organizations that represent and support them.

This is the heart of the conversation that the AI Impact Art Coalition, initiated by AI Impact Alliance and supported by nearly 2,000 signatories, aims to facilitate.

Sign the petition and join this conversation.

Self-portrait, Acrylic and collage.
