This page is a mini-dictionary of important notions and key terms that are useful for understanding today’s digital world, and the place that humane technology can have in it.
Humane technology:
Technology that is designed to align with human needs and desires, and to allow human sophistication to express itself fully. Humane technology is the opposite of the technology we have today in the biggest social media platforms, search engines and so on, which is designed to make as much money as possible by making users think in simpler and more predictable ways.
Persuasive technology:
Technology that exploits human psychological weaknesses and cognitive loopholes to persuade users to adopt certain behaviours that are deemed more profitable. Persuasive technology combines known methods of persuasion with computational power, acting as an extremely efficient and relentless persuasion machine that outmatches the human brain by several orders of magnitude.
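To make this mechanism more concrete, the short Python sketch below is a purely hypothetical illustration of one common ingredient of persuasive technology: an automated experiment that keeps testing notification wordings on users and settles on whichever nudge they find hardest to ignore. The notification texts, open rates and parameters are invented; real systems combine many such optimisation loops with far richer behavioural data.

```python
import random

# A minimal, purely illustrative sketch of machine-scale persuasion: an
# epsilon-greedy bandit that keeps testing notification variants on users
# and automatically converges on whichever nudge they find hardest to ignore.
# The notification texts and open rates below are invented for illustration.

variants = ["You have 3 new followers", "Someone mentioned you", "Your friends posted today"]
true_open_rates = [0.10, 0.25, 0.15]    # hidden user behaviour the system learns to exploit

sent = [0] * len(variants)
opened = [0] * len(variants)
epsilon = 0.1                           # small chance of trying a different variant

def simulate_user(variant_index):
    """Stand-in for a real user, who opens the notification with some probability."""
    return random.random() < true_open_rates[variant_index]

def observed_rate(i):
    """Open rate measured so far for variant i."""
    return opened[i] / sent[i] if sent[i] else 0.0

random.seed(0)
for _ in range(10_000):                 # a machine can run this loop relentlessly, at scale
    if random.random() < epsilon or sum(sent) == 0:
        choice = random.randrange(len(variants))                # explore a random variant
    else:
        choice = max(range(len(variants)), key=observed_rate)   # exploit the best one so far
    sent[choice] += 1
    if simulate_user(choice):
        opened[choice] += 1

best = max(range(len(variants)), key=observed_rate)
print("Most effective nudge:", variants[best], f"({observed_rate(best):.0%} open rate)")
```

The loop needs no understanding of why a given wording works; it simply measures what users respond to and sends more of it, thousands of times per second if allowed.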
Race to the bottom of the brain stem:
A process in which the different platforms of the digital world compete to trigger reactions in users ever more efficiently, by showing them content that appeals ever more strongly to lower instincts.
Cognitive downgrading:
A process in which users are conditioned to adopt increasingly repetitive and predictable behaviours in order to be more valuable to attention-seeking platforms. Examples of cognitive downgrading include shortening attention spans, making users think in simpler and less intelligent terms about the people around them, and conditioning users to accept certain uncivil attitudes as normal.
Surveillance capitalism:
A term coined by Professor Shoshana Zuboff, who provides eight definitions of it at the beginning of her book, The Age of Surveillance Capitalism. Although it is best to read her 500+ page book to understand what surveillance capitalism really is, it can be briefly described as a new economic process, invented by Google in 2002 and adopted by the rest of the world in the following years, in which ubiquitous surveillance is used as a tool to produce certainty, with the goal of making exponential profits.
The attention economy:
An online extractive market system in which the most valuable resource is people’s attention, because it is the only finite resource. This then becomes the basis for everything else: everything is geared first and foremost towards capturing as much attention as possible, no matter the means.
Filter bubble:
An online user is in a ‘filter bubble’ when the content presented to them is filtered by an algorithm in order to maximise engagement and profitable behaviour modification. As a result, the user is isolated from certain things because they only see a very specific portion of the world, one which may have nothing in common with the content seen by another user in another filter bubble; effectively, each user has their own unique filter bubble.
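As a rough illustration of the mechanism described above, the hypothetical Python sketch below ranks posts for each user purely by predicted engagement with what they already respond to. The users, posts, topics and scores are all invented; the point is only that two users with different histories end up with feeds that do not overlap at all.

```python
# A minimal, hypothetical sketch of engagement-based feed filtering.
# All users, posts, topics and scores are invented for illustration;
# real platforms use far richer behavioural signals and models.

# Candidate posts, each tagged with topic weights (invented data).
posts = {
    "post_outrage":  {"politics": 0.9, "outrage": 0.8},
    "post_rant":     {"politics": 0.6, "outrage": 0.9},
    "post_cute_cat": {"pets": 0.9},
    "post_research": {"science": 0.9, "pets": 0.1},
}

# Per-user interest profiles, as inferred from past clicks (also invented).
users = {
    "user_a": {"politics": 0.8, "outrage": 0.7, "pets": 0.1, "science": 0.1},
    "user_b": {"pets": 0.9, "science": 0.7, "politics": 0.0, "outrage": 0.1},
}

def predicted_engagement(profile, topics):
    """Score a post by how closely it matches what the user already responds to."""
    return sum(profile.get(topic, 0.0) * weight for topic, weight in topics.items())

def build_feed(profile, top_k=2):
    """Keep only the posts most likely to hold this user's attention."""
    ranked = sorted(posts, key=lambda name: predicted_engagement(profile, posts[name]),
                    reverse=True)
    return ranked[:top_k]

for name, profile in users.items():
    print(name, "sees:", build_feed(profile))
# The two feeds share no posts at all: each user is shown more of what they
# already engage with, and simply never sees the rest.
```

Nothing in this sketch asks whether the filtered-out content was important, true or shared by other users; the only criterion is predicted engagement, which is exactly how bubbles form.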
Information ecosystem:
The complex intertwining of information flows in society: which information is created, what kind of information it is, how it is transmitted, to whom, how often, in which quantities, for which purposes, and so on. The information ecosystem can be understood as the way people in a society interact with information and with the devices that transmit and display it. Like a natural ecosystem, an information ecosystem can be conceived as healthy or unsustainable, ordered or chaotic, asymmetric or well-balanced, etc.
Overwhelming human weaknesses:
While artificial intelligence may not yet have reached the singularity, that is to say the point beyond which machines understand humans better than humans understand machines, persuasive technology has reached the point where it can exploit human psychological weaknesses so effectively that it holds more power over humans than humans hold over themselves.