Technology won’t save us from climate collapse. Where did the innovation crisis come from?
The promised future, with flying cars, the end of work, and vacations on Mars, has not arrived. We have even stopped waiting for it. Content to simulate progress, we have abandoned the creative technological development of the first half of the 20th century.
In 1900, the World’s Fair was held in Paris with the aim of presenting global scientific and cultural achievements. Among the remnants of the show are illustrations of how, at the end of the 19th century, people imagined the year 2000. The most common themes are flying vehicles and machines that do people’s daily chores: cleaning the apartment, shaving customers, or applying makeup.
It’s hard not to smile at these caricatured winged vehicles and robots serving people in 19th-century costumes. The contrast between those colorful ideas about the future and the real 21st century is downright grotesque. It’s a pity, because the fact that our ancestors’ predictions amuse us is itself a symptom of an unfavorable change that has taken place over the last century.
Letdown
The illustrations perfectly reflect the spirit of the first half of the 20th century. At that time, scientists and artists alike were pouring out promises of technological marvels just around the corner: flying cars, the end of work, colonies on other planets, teleportation, androids, wireless energy transfer, meals in pill form, the transfer of minds to machines. None of these promises was realized.
Science fiction cinema is a telling reflection of how successive generations’ approach to the future has changed. As David Graeber points out in his book The Utopia of Rules, there has been a fundamental shift between the productions of the post-war decades and contemporary visions of science fiction. Older sci-fi movies usually take place at a specific point in time. “2001: A Space Odyssey,” directed in 1968, takes place in 2001; “Blade Runner,” made in 1982, takes place in 2019. In a future only a few decades away, the heroes of these productions travel in time, fly into space, or use androids. Modern sci-fi movies take place in an unspecified period. We do not know when the action of the new “Mad Max” or “Interstellar” takes place. The future has ceased to be a time that is coming for us and has become instead an alternative dimension to our own.
Graeber poses a legitimate question in this context: what would be the reaction of a fan of 1960s science fiction movies, accustomed to plastic models of spacecraft, if you suddenly showed him the latest Star Wars with all its amazing special effects? He would probably be disappointed. In the 21st century we were supposed to live in such a world, not simulate it ever more efficiently.
End of innovation
The point is not that our ancestors had naive expectations of technological progress. The spread of washing machines, microwave ovens, contraceptives, lasers, television, cars, planes, new energy sources, air conditioning, and vaccines within a single generation gave every reason to believe that within the next several decades robots would do all the work for us while we rested on Mars.
Hopes for the conquest of space were understandable in people during whose lifetime the maximum speed at which a human being could travel rose from just over 100 kilometers per hour (a 19th-century train) to 39,897 kilometers per hour (the Apollo 10 spacecraft). Dreams of immortality should come as no surprise when average life expectancy in the United States increased by as much as 25 years between 1900 and 1980. Meanwhile, not only did most of the anticipated technologies fail to appear, but development in many areas stopped altogether. The Apollo 10 speed record of 1969 has not been broken to this day, and life expectancy has increased by just four years since the 1980s.
We can also compare the equipment of an average kitchen in 1900, which often had no running water or electricity, with that of an American middle-class family in the 1970s, equipped with a microwave, gas stove, refrigerator, dishwasher, and oven. Since then, few new appliances have appeared in our kitchens, and certainly none of the caliber of the washing machine or the stove.
The same goes for working time. In 1930, the eminent economist John Maynard Keynes predicted that with the advancement of technology and successive increases in productivity, his contemporaries’ grandchildren would work fifteen hours a week. It was a reasonable vision, considering that during Keynes’ lifetime working time in most sectors of the economy had shrunk from a dozen or so hours to eight hours a day. Since then, despite multiple increases in productivity, statutory working hours have not decreased by a single minute.
People of the first half of the 20th century had reasonable grounds to believe that at least some of their predictions would come true. This, unfortunately, did not happen. There is, however, no reason to believe that we have hit some predetermined ceiling of innovation. Rather, at some point the character of scientific and technological progress changed, so that economic development is no longer driven by structural innovation.
Capitalism and simulation of progress
What exactly happened? “In the 1970s, there was a profound transformation, abandoning investment in technologies that opened the way to alternative models of the future in favor of investment in technologies that strengthen labor discipline and control mechanisms,” says Graeber. The reasons for this change were, on the one hand, economic, related to the spread of neoliberalism, and on the other, bureaucratic, related to remaking public institutions in the image of corporations.
The economic sources of this transformation are inherent in the very nature of the capitalist system. Competition between enterprises prompts them to constantly reduce production costs, including through the mechanization of work. Thanks to mechanization, an innovator can either offer a product at the same price as the competition while paying less for its production, or increase market share by offering the product at a lower price. But as soon as an innovative solution is copied by competitors, the profits immediately disappear. Moreover, ever more effective technology raises productivity to the point where the cost of producing each new copy gradually approaches zero. This is why mechanization, which benefits individual firms in the short term, brings about a decline in the rate of profit for all firms in the long term.
Entrepreneurs who noticed this in the 1960s halted the further robotization of factories and instead transferred production to the countries of the global South. This was made possible by the development of computer science, the only field in which rapid progress has continued since the 1960s. Its development enabled new information processing techniques, ideal tools in the service of control, coordination, and communication on a global scale. Enterprises gained the ability to move production to places where labor is cheaper and where they do not need to make expensive investments in order to remain competitive on price.
After all, the clothes we wear were not made by robots, although technological development would allow it. They were sewn by Bangladeshi seamstresses earning half a dollar an hour and then shipped to the other side of the world. Nowadays it is cheap labor that is replacing mechanization, not the other way around. Moreover, this trend is also visible in the high-technology sector.
Last year, entrepreneur Sachin Dev Duggal launched the Engineer.AI platform. Using artificial intelligence, it was supposed to build personalized mobile applications and websites. Mechanizing the work of the development team was supposed to make it possible to build applications tailored to our needs three times faster and at half the cost. The idea was not a stupid one, and the startup raised nearly thirty million dollars to implement it. Unfortunately, it turned out that this artificial intelligence had from the beginning been an entirely non-artificial group of Indian programmers writing the code by hand.
Likewise, Facebook swears that it protects its community of users from child pornography and violent recordings of terrorist executions using artificial intelligence. The truth revealed by the documentary “The Cleaners” is quite different. This artificial intelligence consists of thousands of Filipino workers whose only task is to view, for eight hours a day, tens of thousands of drastic videos and photos containing scenes of suicide, rape, and execution. With an allowed margin of three mistakes a month, they decide manually what to leave up and what to delete.
To whom do we owe the iPhone?
The gradual relocation of production to Third World countries coincided with the growing popularity of neoliberal ideology, according to which the role of the state in producing innovation is, and should be, marginal. Convinced of this, politicians drastically cut corporate taxes, believing that private investment in a free market would generate technological progress. Contrary to that intuition, however, the changes in the tax system cut off the stream of funds that had until then flowed into research and development. As long as profits remained highly taxed, it was better for a company to allocate them to R&D than to pay them out to shareholders. Now that barrier is gone. At the same time, the financialization of the economy, advancing along with the development of IT, gave rise to new compensation schemes, namely remunerating management with company shares.
As a consequence, managers began to spend their financial surpluses on buying their own company’s shares instead of on research and development or pay raises for employees. This inflated the value of shareholders’ stock portfolios, although it did nothing for the long-term development of the company.
The narrative, pushed by neoliberal reformers, of the private sector as the main source of technological innovation was taken apart in Mariana Mazzucato’s book The Entrepreneurial State. As the author points out, there would be no iPhone without the Internet, GPS, semiconductors, LCD screens, or lithium-ion batteries: key technologies developed by state agencies and universities. Steve Jobs’s genius lay in putting together already developed ideas and effective marketing rather than in creating real innovations from scratch. That is why Apple’s research and development spending during the period of its major product launches could be as low as 5 percent of the company’s revenue.
There would also be no Elon Musk if it were not for government contracts and subsidies for Tesla at an early stage of its operation. The company received $2.4 billion in subsidies and tax credits. Contrary to popular opinion, it is not venture capital funds that are responsible for financing breakthrough technologies, because they avoid real risk, counting on returns within a few years, a horizon far too short for the development of truly radical innovations such as renewable energy sources or nanotechnology.
According to Mazzucato, it is the state that finances breakthrough technologies and creates a market for them through public procurement. It also bears the risk of research which, although often unprofitable over a horizon of several years, has a chance to completely change the fate of humanity.
The private-sector fairy tale, meanwhile, serves to maintain patent law in a form that allows companies to profit from inventions created in a process stimulated by the state. The problem is best illustrated by pharmaceutical companies. Since we bear the risk and costs of conducting research, the corporations argue, we deserve a reward in the form of protection of our intellectual property. The data, however, show that state institutions are responsible for funding as much as 75 percent of new breakthrough drugs, while the average pharmaceutical company spends 2.5 times as much on marketing as on its own research. The most notorious case of recent years was that of Valeant. The company decided that conducting its own research was simply too risky, so it reduced spending on developing new drugs to 3 percent of revenues, concluding that it was more profitable to buy up patents and aggressively inflate drug prices. The company judged that raising the price of a life-saving drug from $890 to $26,000 was simply a better business model.
We are dealing here with the classic mechanism of socializing costs (after all, it was taxpayers who paid for the research) and privatizing profits (companies can drive drug prices up to dizzying levels). Such a morally questionable distribution of benefits is also economically ineffective, as it favors the dismantling of the infrastructure responsible for long-term development. The doctrine of free-market liberalism, dominant since the 1970s and supposed to bring innovative prosperity, in fact stalled the whole process and removed key players from the game.
Profitability regime
The second cause of the innovation crisis stems from the adoption of corporate management methods by state agencies and universities. In principle, these methods were supposed to increase the efficiency of those institutions. To that end, a range of measurable indicators were introduced to monitor how well the adopted objectives were being met. Unfortunately, a results-oriented organizational culture, while it works great when we want to introduce a new product to the market, fails completely when our goal is to develop original research. This is because corporate architecture, paradoxically, multiplies the number of unproductive administrative tasks and rewards unoriginal ideas whose results can be predicted in advance.
To prove this, David Graeber cites observations from his many years of work at universities. Even though universities employ an ever-growing number of office workers (an increase of 240 percent from 1985 to 2005; significantly, the increase at private universities was twice that at state universities), filling out forms takes up more and more of scientists’ time. Grant applications, staff performance assessments, process evaluations, project definitions: paperwork instead of conducting research or teaching classes. These tools were introduced to foster competition between researchers and thereby increase the efficiency of the institution. As a result, however, scientific work has been reduced to a contest over who can sell themselves more skillfully.
The fixation on profitability and efficiency, borrowed from the corporation, kills real innovation in the bud. As the physicist Don Braben put it: “Many scientists who made groundbreaking discoveries in the 20th century, such as Einstein and Planck, would not get funding for their research today. Until 1970, good scientists were guaranteed resources, just like that. They did not have to waste time proving on forms that their work was profitable and would bring concrete results in the future. They could just work.”
Graeber adds that the social sciences suffer from the same ailment. A present-day Michel Foucault or Pierre Bourdieu would likely not find a job at a university. That, according to the author, is why no breakthrough work of sociology has appeared in the United States for thirty years. “Once upon a time, the university was a haven that society offered to brilliant eccentrics, geniuses, and people whose work did not translate directly into practical results. Those times are gone. Today, professional self-promotion and marketing reign supreme,” Graeber concludes.
Imagination Crisis
We have carried out a bizarre reversal of means and ends: bureaucracy is no longer a tool for making fantastic visions, like flying to the Moon, come true. Today, even in the face of climate destruction, the most powerful country in the world warns against taking overly ambitious steps.
How do we get out of this impasse? Graeber suggests starting by rejecting the notion that technology is an ideologically neutral force that develops independently of society. According to the author, claims about the inevitability of capitalism follow from just such a belief. And if we have stopped believing that any other economic system is possible, we can no longer envision a radically better future, and therefore no revolutionary new technology either.
Of course, it does not feel as though we have reached the end of innovation; that would mean that capitalism does not generate progress. That is why we are assured that new miracles of technology appear every now and then. Yet most of them amount, at best, to improvements on existing solutions (a slightly better camera in a phone) and, at worst, to such bizarre Silicon Valley ideas as Elon Musk reinventing the metro (except that it carries only five people), developing a juicer for already-squeezed juice, or pinning absurd hopes on blockchain.
Do we still have a chance to experience technological breakthroughs like those our ancestors lived through? The answer is no. At least not in the current system, which prioritizes exploitation over structural innovation; which shies away from attempting “unprofitable” projects; which focuses on the growth of abstract indicators; and which colonizes every type of institution with corporate thinking. What we can do, therefore, is stop looking for sham solutions within this rational calculus and start thinking about a system in which technology will begin to serve the majority of us.