In 1900, anything seemed possible.
In just 25 years, America had been introduced to the typewriter, telephone, phonograph, affordable photography, skyscrapers, gas-powered cars, and durable electric lights. Within the next three years, people would learn that messages, and then people, could be sent through the air.
No problem seemed beyond the reach of technology, and the United States was at the forefront of innovation. It remained there throughout the century, introducing the world to rockets, television, nylon, digital computers, the polio vaccine, nuclear power, the internet, and the smartphone. And it all occurred in such a short time. In 1927, Charles Lindbergh met Orville Wright, and in 1969, he met Neil Armstrong. A single lifetime easily covered the history of flight from Kitty Hawk to the moon.
But how often do we hear about breakthroughs today? Research shows that our rate of disruptive, life-changing innovation has slowed.
In 2016, two leading academics at Northwestern University — Robert Gordon of the School of Social Science and Joel Mokyr, professor of economics and history — debated "Are America's Best Years of Growth Behind Us?" Gordon said that innovations like electricity, indoor plumbing, automobiles, and antibiotics made immense changes to Americans' lives. But since 1970, innovation has focused on entertainment, information, and communication technology. Americans' lives, said Gordon, haven't significantly changed since the 1980s.
We rely on what is essentially old technology. According to The Wall Street Journal, most of America's electricity is generated by steam turbines, which were invented in 1884. In the past 100 years, their energy efficiency has increased by only about 1.5 percent annually. Energy efficiency in the steel industry has improved by just 2 percent annually since 1950. And the energy density of batteries has improved, on average, just 2 percent a year since 1900.
One reason for the slowdown is that some technologies have reached their physical limits. Take Moore's Law, the observation that the number of transistors on a microchip doubles roughly every two years. That doubling depended on making transistors ever smaller, and, simply put, they can't be made much smaller. The rate of performance improvement for single-core microprocessors has slowed from 52 percent per year in 1986–2003 to 23 percent per year in 2003–2011, then 12 percent per year in 2011–2015, and just 3 percent per year in 2015–2018. This slowing affects the rate of innovation in many industries, including telecommunications, AI, weather prediction, infrastructure, and consumer electronics.
Another reason innovation may be slowing is that researchers must master far more knowledge before they can reach the boundaries where innovation is possible, according to Nature. In the early days, an inventor could make a revolutionary discovery by trial and error: Thomas Edison could produce a long-lasting lightbulb by simply trying 2,700 different types of filament before finding one that worked. Today, most of the basic breakthroughs in science have already been made.
Yet another obstacle to innovation is that researchers at universities are under pressure to complete and publish their work as soon, and as often, as possible. So they’re reluctant to take on the long-term studies needed to explore new ground.
Meanwhile, corporate research and development has declined for myriad reasons, including a decoupling from university research, the rise of more nimble start-ups, and outsourcing, according to a working paper from the National Bureau of Economic Research.
The federal government has also cut back. Its support for R&D has shrunk from 1.2 percent of GDP in 1976 to 0.8 percent in 2016, according to Time magazine — a level of support that hasn't been seen since 1957.
Is it possible to recapture the spark of creativity and invention that drove Edison or the Wright Brothers? In December 2022, Derek Thompson of The Atlantic pointed to the remarkable success of Operation Warp Speed. The high mortality rate of COVID-19 offered a clear incentive to fund prioritized research and clear the path for developing, testing, and distributing a vaccine.
Back in 2016, when Professors Gordon and Mokyr were debating the future of America's innovation, Mokyr foresaw a brighter tomorrow: a new industrial revolution arising from 3-D printing, increases in human intelligence, and a deeper understanding of the brain that would better address mental health and the problems of aging.
“This is a frontier,” he said, “on which I think the progress will be enormous in the next 50 years.”
We may no longer live in an era that let Edison try 2,700 filaments before finding one that worked, but with public funding, corporate research, and good old American stick-to-it-iveness, we may yet invent a better light bulb.