For researchers trying to figure out how to feed a world of 10 billion people later in this century, the great objective over the past decade has been to achieve what they call “sustainable intensification.” It’s an awkward term, not least because of conventional agricultural intensification’s notorious record of wasting water, overusing fertilizers and pesticides, and polluting habitats. But the ambition this time is different, proponents say: to figure out, almost overnight, how to grow the most food on the least land and with minimal environmental impact. The alternative, they say, is to continue plowing under what’s left of the natural world — or to face food shortages and political unrest.
Up to now, the tendency in discussions of sustainable intensification has been to focus on the supply side and on exciting technological innovations of one sort or another, from gene editing to satellite monitoring. In his new book Half-Earth, even E. O. Wilson invokes the idea, not too hopefully, that “a new Green Revolution can be engineered” to spare the half of the world he argues should be set aside for nature.
But achieving consensus about what sustainable intensification should mean — or whether it’s the right objective in the first place — has proved complicated and increasingly contentious. “Depending on how one defines it,” one researcher commented, “I’m in favor of it, or against it.”