Why Lone Wolves Finish Last in the Digital Age
“Solitude is a catalyst for innovation,” according to Mahatma Gandhi. And according to a lot of other people, he’s exactly right.
Most of us believe that the greatest innovators are also lone wolves. The real geniuses, we seem to think, withdraw from reality in favor of the fantastical chambers of their own minds. There, through personal labor and insight, they change the world. But this picture holds true only with some serious modifications, almost all of which involve other people.
Walter Isaacson’s long-awaited book The Innovators limns how even the most introverted of innovators didn’t actually go it alone. Rather, they were nurtured by a particularly collaborative culture or a group of colleagues that led them to their greatest discoveries. Isaacson’s book dismantles the ideal of the lonely genius. Be it a decentralized organizational model or boozy late-night parties, more often than we think, it’s collaboration and an ethos of sharing that leads to groundbreaking innovations.
Not yet a believer? Below, we’ve pulled out seven examples, from Intel to Atari, of how sharing and collaboration have led to giant leaps in the digital age.
1. The team that turned a concept into a (really useful) contraption
Since its invention, the telephone has made sharing across geography and time zones a snap. It’s only fitting, then, that at Bell Labs, AT&T’s legendary research arm, sharing came naturally. Bell’s collaborative environment proved that innovation blossoms when talented minds come together. Without it, who knows when we would’ve been gifted with the transistor?
John Bardeen, William Shockley and Walter Brattain at Bell Labs, 1948.
The transistor is as important to the digital age as the steam engine was to the Industrial Revolution. Small but mighty, transistors helped us integrate electronic devices into our lives by nestling serious processing power in small computers, calculators, music players, and even the nose cones of rocket ships. By 1939, Bell physicist William Shockley had already started playing with the notion of using semiconductors instead of the big, slow vacuum tubes that, up until then, had powered computers. To see the idea through, he assembled a research team that included John Bardeen and Walter Brattain. On December 16, 1947 – after two years of collaborative experimentation and theorizing – Bardeen and Brattain managed to wiggle all the components into place to create the transistor. The 1956 Nobel Prize in Physics went to all three men, demonstrating how the collaboration of multiple talents produced one of the most important discoveries of the twentieth century.
2. The supplementary notes that envisaged the modern computer
Charles Babbage was the first to conceive of a general-purpose machine, something he called the “Analytical Engine.” But it was Ada Lovelace, daughter of the poet Lord Byron, whose belief in the project’s possibilities made it famous.
To express her deep understanding of its workings, Lovelace spent time between 1842 and 1843 translating a French transcript of a lecture that Babbage had given on his machine. She signed off on her work and sent it to the man himself. Not only was Babbage tickled by her efforts, he also suggested she supplement the transcript with her own set of notes in which she’d be free to share her reflections. Ada Lovelace delivered. In her notes, she presaged the possibilities of the Analytical Engine (the computer) as a device that could process music, patterns, and poetry. Her supplement, more than twice as long as Babbage’s portion, became famous.
A portrait of Ada Lovelace, circa 1840.
Without Lovelace’s willingness to share her ideas and Babbage’s welcoming stance to her contribution, his idea for the Analytical Engine may have faded into ignominy. Collaboration like theirs leads to beautiful and innovative ideas – ideas that might even make it to the history books.
3. The beer bashes that launched the video game industry
Ever heard of Nolan Bushnell? He’s responsible for establishing video games as an industry that would influence the future of modern devices. Nolan is also the founder of the company Atari, whose philosophy was that sharing – and partying – is caring.
Nolan was a huge fan of Spacewar, one of the earliest digital computer games. In homage, he invented a game console he named Computer Space. In short order, he’d sold 1,500 of these consoles and acquired a cult following. Nolan’s next move was to found Atari, where he made the radically simple and extremely successful ping-pong video game Pong. Voilà: the video game industry was born.
Atari founders Ted Dabney and Nolan Bushnell with Fred Marincic and Al Alcorn. © Computer History Museum.
It might all sound pretty busy – and pretty solitary – but Nolan made sure from the get-go that socializing was baked into Atari’s culture. The company was renowned for throwing beery parties with plenty of pot, providing the opportunity for creative digital thinkers to sit back, unwind, and bat around ideas. Today, Atari stands as a prime example of the philosophy that helped define Silicon Valley: encourage friendly collaboration and nonconformity, question authority, and nourish creativity – even if all you’ve got to feed it is beer.
4. The egalitarian organization model that birthed the modern microprocessor
Intel’s unconventional corporate culture and management style were matched to the personalities of its founders, Robert Noyce and Gordon Moore. Both men were staunchly anti-authoritarian, unpretentious, and averse to hierarchy and confrontation. Far from stymieing the company, their style resulted in a flat organizational structure and a culture that included flexible hours, stock options, and, yes, more boozy parties.
Gordon Moore and Robert Noyce at Intel in 1970.
The “Intel way,” which developed in the late 1960s, posited that the more open and unstructured the workplace, the “faster new ideas would be sparked, disseminated, refined, and applied.” Everyone worked in similar cubicles, and with no rigid chain of command, the Intel team had no choice but to be enterprising. One such novel idea came from Ted Hoff, a former teacher and engineer who, within months of arriving at Intel, developed a general-purpose logic chip. His idea was to broaden what computers could do by programming a single chip to run a variety of applications rather than pairing a different microchip to each function. Today, these microprocessors are found in all kinds of smart devices – from coffeemakers to personal computers – thanks in large part to Intel’s encouraging workplace.
5. The many hands that built the first personal computer
Pop quiz: who owns the patents to the first computer?
If you answered “nobody,” you win. The ENIAC was the first electronic general-purpose computer to be built, and its creators, J. Presper Eckert and John Mauchly, first put it into operation in November 1945. However, the patent they were granted for their work almost 20 years later ignited a legal imbroglio over whether their ideas were actually original. Mauchly had visited the talented physicist John Atanasoff for four days in 1941 and examined the computer that Atanasoff had built. Citing that visit, the judge ruled the ENIAC patent invalid and stated that Eckert and Mauchly had derived their subject matter from Atanasoff.
ENIAC scientists holding various parts of the computer.
Inventions as complex as the computer very rarely come from one – or even two – individuals. Rather, innovations are usually what’s left after a collaborative brainstorm. Their coming to light depends, however, upon minds like Mauchly and Eckert’s that are able to synthesize ideas from multiple sources.
6. The “tech nerd” meetups that were the crucible for the modern computer
It started in the 1960s, when a potent mix of cultures began to gather in the San Francisco Bay Area. Mostly, it was hippies and hackers – free-thinking technology enthusiasts who believed in “the hands-on imperative.”
Steve Jobs and Steve Wozniak with Apple-1 computer. © Computer History Museum.
One initiative to come out of this collective was the Homebrew Computer Club, where so-called tech nerds could meet to exchange ideas and where the prevailing philosophy was that counterculture and technology were a perfect match. Attendees could admire and be inspired by one another’s work and ideas, an exchange that proved pivotal for developments like the first personal computers. Further kudos to the Homebrew Computer Club: Steve Wozniak reports that it was during one of these meetings that he got the idea of creating an all-integrated, fully packaged personal computer – the little gem that would eventually become the first Apple computer.
7. The ethos of open that made the World Wide Web into an efficient collaboration tool
Imagine a world in which the Web was governed by a single authority who received money for every click and controlled what information could be distributed. The digital world (and the IRL one, too) would be totally different. The shocker is that this could have been the reality if one particularly generous guy hadn’t refused to patent the World Wide Web.
This NeXT workstation (a NeXTcube) was used by Tim Berners-Lee as the first Web server on the World Wide Web.
Tim Berners-Lee launched the Web as a collaboration tool. His vision was of an entity that could make random associations and string together clever links just as an imaginative human might. Tim believed that information should grow and evolve organically, so when administrators at CERN, which had helped fund the idea, wished to patent it, he refused. Instead, he insisted that a freer Web is a better Web, and that openness was critical if the Web was to evolve into an efficient collaboration tool. Thus, the Web became what he’d always dreamed: a platform for invention, association, sharing, and collaboration.
More inspiring stories of collectives that did great things for the digital age can be found in Isaacson’s new release, The Innovators. To get the book’s key lessons, check out the 19-minute summary on Blinkist.