The shock of the virus’s first wave exposed the inner workings of our interconnected system of food creation and delivery—and its weak spots—to many of us who’d never given it a second thought. That system is, of course, a result of decades’ worth of technological advances, from globe-spanning shipping and refrigeration networks to commodity markets (running on high-speed internet and massive cloud-computing infrastructure) that provide the capital to make it all run. There may yet be more unpleasant surprises in store for millions of people around the world as the pandemic plays out. But this moment offers us an opportunity to examine how we got to this point, and how to change things for the better.
The cost of growth
Simply put, the modern food system is a product of the forces inherent in free-market capitalism. Decisions on where to invest in technological research and where to apply its fruits have been guided by the drive for ever greater efficiency, productivity, and profit.
The result has been a long, steady trend toward greater abundance. Take wheat production as an example: thanks to the railways, the introduction of better equipment, and the adoption of higher-yield varieties, output in the US tripled between the 1870s and the 1920s. Similarly, rice production in Indonesia tripled in 30 years after the mechanized, high-input methods of the Green Revolution were adopted in the early 1970s.
But as we all know, overproduction in the US in the early 20th century led to widespread soil erosion and the Dust Bowl. The steady march of higher yields was achieved by using large quantities of fertilizers and pesticides, as well as by discarding local crop varieties that were deemed unfavorable. Farmland became concentrated in the hands of a few large players; the US had about one-third as many farms in 2000 as in 1900, and on average they were three times as big. In the same period, the proportion of the US workforce employed in agriculture shrank from slightly over 40% to around 2%. Supply chains have continued to be optimized for speed, reduced costs, and increased returns on investment.
Consumers have been mostly happy to enjoy the increases in convenience that have come with these trends, but there has also been a backlash. Products that are distributed globally can come across as soulless, removed from local culinary tradition and cultural contexts—we can find blueberries in the middle of winter and the same brand of potato chips in remote corners of the planet. As a reaction, more affluent eaters now look for “authenticity” and turn to food as an arena in which to declare their identity. Suspicions or outright critiques of technology have emerged within the so-called food movement, together with a frequent and uncritical embrace of pastoral fantasies that at times reflect the preferences of richer (and often whiter) consumers.
Such attitudes fail to acknowledge the obvious: the availability, accessibility, and affordability of industrial food has been a major force in reducing food insecurity around the world. The number of people suffering from undernourishment fell from around 1 billion in 1990 to 780 million in 2014 (though hunger is rising again), while the world population grew by 2 billion in the same period.
And criticizing the mass production of food per se is misguided. It is indeed a very flawed endeavor that produces a lot of calorie-dense, nutrient-poor foods. But it is not doomed to ruin our planet and our well-being. Not if we make choices that take factors other than profit into account.
The value of values
The shutdown of slaughtering and meatpacking plants in response to covid-19 caused problems upstream, forcing farmers to kill and dispose of livestock that were too expensive to feed without the certainty of sales. This is what happens when a system fine-tuned for efficiency, productivity, and profit collides with a shock.
Technology, however, is not inherently opposed to sustainability and resilience. In fact, many of the problems commonly blamed on technology in the food system derive from the legal and financial framework in which it develops. Intellectual property is a central issue here; patent owners have used their patents almost exclusively to maximize profit, rather than to improve food security and food quality.
Genetic modification is a great example. For the most part, its techniques have been applied to commercial crops such as soybeans, corn, and cotton, grown in huge quantities and traded internationally. The goal is single-minded: increase yields, even when that requires heavier use of pesticides and fertilizers—which are often patented by the same companies that own the patents to the GMOs.
Investment in genetic modification and agrotechnology is lacking, however, for many crops that serve as staples for millions of smallholders around the world—from taro in the Pacific Islands, South Asia, and West Africa to cassava in Latin America and large areas of Africa. Applied to those crops in the pursuit of food security rather than profit, genetic technologies could help create stronger, more resilient local agriculture and a healthier food system. But they aren't, because that wouldn't generate returns large enough to interest the private biotech sector. To make matters worse, many low-income countries have historically been forced to accept trade and financing deals from the IMF, World Bank, and World Trade Organization that open their markets to those heavily globalized commercial crops, regardless of farmers' or consumers' customs and needs.
And yet, most debates about GMOs focus on their supposed danger to human health—for which there is little scientific evidence—rather than on the way they tilt the playing field against small farmers and the communities they feed. In short, by focusing on spurious technological problems, we are ignoring very real legal and social ones.
The way forward, then, lies in making choices that align technological advances with the causes of sustainability, resilience to shock, and people's well-being, instead of purely with the bottom line of large corporations. There are plenty of examples already. The Navdanya Community Seed Banks, initiated in India by activist Vandana Shiva, train local practitioners (mostly women) to become seed keepers, making endangered varieties available to farmers who can then grow and cross-breed them. These low-cost conservation technologies help maintain agrobiodiversity by identifying, selecting, and protecting disappearing genetic material.
The question of ownership and control also touches other aspects of the entanglement between technology and the food system. There’s a list a mile long of sleek gadgetry that promises to revolutionize the gritty work of conjuring food from the land. Farmers can wire their fields with internet-enabled sensors, monitor their crops and livestock with agricultural drones, or manage inventory using a blockchain. They can use their cell phones to access data on weather, pests, and the cost of inputs and crops. But the incentives of the companies behind such innovations are to sell as many apps and devices and data streams as possible, not to feed and nourish as many people as possible. If the companies change their business model, discontinue a product or service, or simply fold, farmers are at their mercy.