Highly recommended reading - How Artificial Intelligence is Reshaping Repression
Some points of note:
“Around the world, AI systems are showing their potential for abetting repressive regimes and upending the relationship between citizen and state, thereby accelerating a global resurgence of authoritarianism.”
AI technology, primarily in the form of facial recognition, is being adopted by security forces, driven in part by investment from China.
Zimbabwe, a country that recently carried out large post-election crackdowns, is implementing facial recognition.
Other technologies, such as biometrics, hacking, and disinformation, are coupled with AI to worrying effect.
Since AI reduces the need for labor-intensive security operations, it enables wider-scale repression. In addition, unlike human agents, AI is not going to rise against its masters: repression at a lower cost.
There is a concept of a “chilling effect” with AI-based monitoring - hiding becomes harder, and citizens often conform. To quote from 1984: “It was terribly dangerous to let your thoughts wander when you were in any public place or within range of a telescreen. The smallest thing could give you away.”
In the post-Cold War era, authoritarian regimes have most often lost power through popular uprisings and electoral defeats. AI can now be used by these same regimes to control messaging, spread disinformation, and monitor popular discontent at broad scale, especially with the rise of social media.
If protests arise, AI can help identify these areas before they get out of control. In China, WeChat already uses AI to produce heat maps showing crowd density.
In outlying regions where ethnic minorities make up a large portion of the population, AI can be used to control these areas effectively. In China’s Xinjiang province, predictive policing is one such method being employed.
Using AI to create disinformation is on the rise. Hyperpersonalized targeting, the identification of key influencers, and the creation and modification of content are all fuelling an increase in effective disinformation.