This will surely be controversial but personally I’m convinced.

It’s reminiscent of the “garden Earth” theory. This holds that, whether we like it or not, there are basically no truly wild places left. Humans have turned the Earth into a de facto garden - and they now need to own that fact and behave like better gardeners. I was skeptical (even a bit outraged) at first, but I’m coming round to the logic.