
New Dark Age

Technology and the End of the Future

By James Bridle
16-minute read
Audio available

Synopsis

New Dark Age (2018) investigates the fundamental paradox of our digital age: as new technologies allow us to gather more and more data on our world, we understand less and less of it. Examining the history, politics and geography of the complex digital network we are enmeshed in, James Bridle sheds new light on the central issues of our time, from climate change to wealth inequality to post-factual politics, and explains how we can live with purpose in an era of uncertainty.

Who it's for

  • Tech skeptics and tech enthusiasts
  • Critical thinkers fascinated by the geopolitics of our networked world
  • Anyone interested in the silly and profound ways technology shapes our lives

About the author

James Bridle is an artist, publisher, and writer on technology whose work has appeared in the Guardian, Wired, Frieze, the Observer, the Atlantic, and many other publications. New Dark Age is his second book.

Key idea 1 of 10

Modern computation originated in military attempts to control the weather.

What do computers have to do with the weather, and what does the weather have to do with the military?

Well, everything. For decades, devising methods to predict and control the weather was a chief concern for Western armies — and in that project lies the origin of modern computation.

The first person to apply numerical calculation to atmospheric conditions in order to predict the weather was the mathematician Lewis Fry Richardson. He worked out his method during World War I, while volunteering as an ambulance driver on the Western Front.

Richardson even came up with a thought experiment that could be considered the first description of a ‘computer’: he envisioned a vast hall filled with thousands of human mathematicians, each calculating the weather conditions for one square of a grid laid over the world and passing their results to their neighbors for further calculation. Such a machine, Richardson dreamed, would be able to accurately predict the weather anywhere, at any moment in time.
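Richardson's scheme is, in modern terms, a parallel grid computation: each square updates from its own state and its neighbors' states, then exchanges results. The following sketch illustrates that pattern in Python; the neighbor-averaging "physics" and the function name are invented for the example and are not Richardson's actual equations.

    # A toy version of Richardson's forecast factory: the world is a grid of
    # cells, and each cell's next value depends only on itself and its
    # neighbors, the same pattern his human calculators would have followed.
    # The averaging rule below is a stand-in, not real meteorology.

    def step(grid):
        """One time step: every cell averages itself with its neighbors,
        mimicking calculators exchanging results with adjacent squares."""
        rows, cols = len(grid), len(grid[0])
        nxt = [[0.0] * cols for _ in range(rows)]
        for r in range(rows):
            for c in range(cols):
                cells = [grid[r][c]]
                if r > 0:
                    cells.append(grid[r - 1][c])
                if r < rows - 1:
                    cells.append(grid[r + 1][c])
                if c > 0:
                    cells.append(grid[r][c - 1])
                if c < cols - 1:
                    cells.append(grid[r][c + 1])
                nxt[r][c] = sum(cells) / len(cells)
        return nxt

    # A single pressure anomaly in the middle of a 5x5 world spreads outward
    # step by step, as each square's result feeds its neighbors' calculations.
    grid = [[0.0] * 5 for _ in range(5)]
    grid[2][2] = 100.0
    for _ in range(3):
        grid = step(grid)

Because each cell needs only its neighbors' latest values, the scheme parallelizes naturally, whether across Richardson's imagined hall of human calculators or the cores of a modern supercomputer.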

His futuristic idea lay dormant until World War II, when heavy military research spending spurred the advent of machine computation. The Manhattan Project, the US military research program that produced the atomic bomb, is closely linked to the development of the first computers. Many of these early machines, such as the Electronic Numerical Integrator and Computer (ENIAC), unveiled in 1946, performed automated calculations simulating the behavior of bombs and missiles under different weather conditions.

Often, however, the military origins and purposes of the computers were concealed.

In 1948, for example, IBM installed its Selective Sequence Electronic Calculator (SSEC) in full view of the public, behind a shop window in New York. But while the public was told the computer was calculating the positions of the moon and the planets, it was actually running a secret program called Hippo, carrying out calculations to simulate hydrogen bomb explosions.

From the beginning, the complex, hidden workings of computers provided a convenient cloak for their actual functions.

Most of the time, though, computers didn't even carry out those actual functions all that well. The history of computation is full of anecdotes illustrating how machines' oversimplified view of the world, their inability to distinguish reality from simulation, and plain bad data can have serious consequences for their human users. The US computer network SAGE, which integrated atmospheric and military data during the Cold War, became infamous for near-catastrophic errors, such as mistaking a flock of migrating birds for an incoming fleet of Soviet bombers.
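What such a misclassification can look like in code: the rule and the numbers below are invented for illustration and have nothing to do with SAGE's real logic. The point is only that a model with no concept of "bird" will happily label one a bomber.

    # Invented illustration, not SAGE's actual logic: a naive rule-based
    # threat classifier that knows only size, speed and heading.
    def classify(track):
        # Oversimplified worldview: anything big, fast and inbound is hostile.
        if track["size"] > 50 and track["speed_mph"] > 40 and track["inbound"]:
            return "POSSIBLE BOMBER FLEET"
        return "ignore"

    # A flock of migrating birds is large on radar and fast enough to clear
    # both thresholds; the model cannot tell it apart from an attack.
    flock = {"size": 80, "speed_mph": 45, "inbound": True}
    print(classify(flock))  # POSSIBLE BOMBER FLEET: a false alarm from bad data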
