Updated: Jan 4
Internet of Communities blog series part 1 of 8
My first book, Human Interactions (2005), was about a theory of collaboration that emerged from research in Information Technology (IT). Somehow, the IT aspect ended up in sharper focus, and the next 6 books I contributed to were IT-related. I'm aiming to get back on track with my new book, Supercommunities (out in Jan 2021), and look at how better collaboration and other game-changing ideas can be used to improve society. For this reason, the new book discusses IT only where it is essential to creating antifragile communities. However, there is a wider takeaway, which anyone involved in IT should think about.
Like any form of technology, from smelting and writing through the printing press and steam engine to nuclear power and gene editing, IT isn't good or bad in itself. There is no such thing as good tech, only good uses of tech. So, where is our society right now with IT? I would say that we are at a crux point, poised for either utopia or dystopia, depending on choices made by IT thought leaders over the next few years.
A 2019 survey by PwC lists 11 technology breakthroughs. This kind of analysis is useful, but it is hard to take an overview with so many different things to think about. As a high-level analysis, I separate IT mega-trends into just 2 types. Some advances in IT are primarily about observing the world - big data, the internet of things, artificial intelligence, and cloud computing. Others are about acting in the world - DNA sequencing, advanced robotics, nanotechnology, mobile payments, mobile internet, energy storage, and 3D-printing. Looking through this lens, the key question becomes clearer. Who will be doing the observing, then acting on the observations?
Since 2016, we have learnt just how effectively IT can be used to monitor and manipulate people. Arguably, without Cambridge Analytica there would have been no President Trump and no Brexit, and the tsunami of fake news continues to wash over us. How many of the 47% of American voters who backed Trump in 2020 believe his claims that, really, he won? So, let's look on the bright side, for recent counter-examples of IT being used to enlighten and empower us. Sadly, I'm struggling to think of any. Instead, we learnt about inequality through books such as The Spirit Level and Capital in the Twenty-First Century, and took action through mass movements such as Occupy and Black Lives Matter. Similarly, we woke up to climate change through books such as This Changes Everything, television such as Blue Planet II, and protests by charismatic activists such as Greta Thunberg, then took action through the movement Extinction Rebellion. You couldn't get more old-fashioned.
To be fair, we couldn't have created coronavirus vaccines so quickly without IT. Medicine has clearly been enabled by IT, as has education. However, the benefits are not evenly distributed, or even unequivocal. The number of people with diabetes rose from 108 million in 1980 to 422 million in 2014, and the above-mentioned books on inequality show how better education does not necessarily lead to higher wellbeing. Technologists tend to be passionate about the benefits of their own developments, and often do not see the implications of their research until it's too late - the classic example being Einstein, who spent the latter part of his life struggling to come to terms with nuclear weapons.
Technologists should not, and perhaps cannot, walk away from important ideas. However, as the instigators of step change, they do have a responsibility to look up from the lab bench. At the same time that we start to realise the parlous state of Mother Earth, we are mining the seabed for minerals to build better IT devices, using supercomputer modelling to support exploration of the earth's mantle, and extending the reach of IT networks far out into space. What is the endgame here?
We need to agree what is desirable and what is unacceptable, then legislate accordingly, but this requires us to consider IT as part of a process. That process may be the age-old cycle in which the rise of oligarchic power breaks down stable government. Alternatively, it may be a new and emergent process, by which communities develop the ability to govern themselves in the interests of their members.
John Boyd's OODA loop shows us how effective governance goes from Observe to Orient to Decide to Act, then back round again. Communities have always had the ability to Orient and Decide, but lacked the means to Observe (for which we need a new kind of data) and Act (for which we need a new kind of funding). It is these 2 gaps that new IT developments must fill, and they correspond exactly to the above 2 groups of IT mega-trends. So there is hope. The question is whether new IT developments will be used to monitor and manipulate, or to enlighten and empower.
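For readers who think in code, the cycle can be sketched as a simple loop. Everything here is an illustrative assumption on my part - the function names and the toy "world" are hypothetical, not anything Boyd or the book specifies:

```python
# A minimal sketch of an OODA cycle for a community, under assumed toy data.
# Observe and Act are the two stages the text argues IT must newly enable.

def observe(world):
    """Gather data about the current state (the 'new kind of data' gap)."""
    return {"signal": world["state"]}

def orient(observation, values):
    """Interpret the observation in light of the community's shared values."""
    return (observation["signal"], values)

def decide(assessment):
    """Choose an action from the oriented assessment."""
    signal, _values = assessment
    return "invest" if signal > 0 else "hold"

def act(world, decision):
    """Apply the decision (the 'new kind of funding' gap)."""
    if decision == "invest":
        world["state"] += 1
    return world

def ooda(world, values, rounds=3):
    """Run the loop round and round, as Boyd describes."""
    for _ in range(rounds):
        world = act(world, decide(orient(observe(world), values)))
    return world
```

For example, a community starting from `{"state": 1}` and running 3 rounds would invest each time, since its signal stays positive. The point of the sketch is only the shape of the cycle: Observe feeds Orient, Orient feeds Decide, Decide feeds Act, and Act changes the world that is observed next.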
It is irresponsible for IT innovators to throw their hands up and claim that, as mere technologists, they have no say in the uses made of their work. For one thing, developers can choose how they build and target their products. For another, their children and grandchildren will live in the world they bring into being, and as Roman Krznaric says, we all need to be good ancestors. For an IT innovator, that means using your power to help create a world in which knowledge and agency are available to all.