I may, or may not, be asked to participate in a radio show/podcast about algorithms and the environment….
This is my initial spur of the moment thinking…
I’d start by talking about the difficulty of building algorithms for a complex system. The whole point of complex systems is that they are unpredictable in their specifics, while possibly being predictable in terms of trends. For example, we cannot predict the weather accurately for a specific place three months from now, but we can predict that average temperatures will continue to rise. Initial conditions matter enormously to outcomes in complex systems, yet there are always prior conditions (i.e. there is a sense in which ‘initial’ conditions do not exist), and because so much is happening and interlinking, there are always problems in determining what is important to the model and what the consequences of an action actually were. Another problem with complexity (as far as I understand it) is that it can only be modelled to a limited extent by any system which is not the system itself.
Then the model tends to be taken for reality, so we act as if we knew something and were working directly on the system itself, rather than working on a model which may increasingly diverge from reality as time passes….
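The divergence point can be made concrete with a minimal sketch in Python, using the logistic map as a stand-in for any complex system (the map, the parameter r = 4.0, and the step counts are my own illustrative choices, not anything from the discussion above). Two runs that start from almost identical conditions soon disagree completely, yet their long-run averages stay close — unpredictable in specifics, predictable in trend:

```python
# A toy illustration of why complex systems are unpredictable in their
# specifics while remaining predictable in terms of trends. The logistic
# map x -> r*x*(1-x) with r = 4.0 is a standard textbook example of a
# chaotic system; here it stands in for any complex model.

def logistic_trajectory(x0, r=4.0, steps=2000):
    """Iterate the logistic map from starting state x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two "measurements" of the same initial condition, differing by one
# part in a billion -- far finer precision than any real-world data.
a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-9)

divergence = [abs(x - y) for x, y in zip(a, b)]
print(f"difference at step 5: {divergence[5]:.2e}")          # still tiny
print(f"max difference, steps 30-60: {max(divergence[30:61]):.2f}")

# Yet the long-run averages of the two runs agree closely: the trend
# remains stable even though the specific trajectory does not.
mean_a = sum(a[100:]) / len(a[100:])
mean_b = sum(b[100:]) / len(b[100:])
print(f"long-run averages: {mean_a:.3f} vs {mean_b:.3f}")
```

The tiny measurement error is amplified at every step until the two trajectories are unrelated, which is exactly the sense in which a model can silently drift away from the reality it was fitted to.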
Then there is the issue of power relations. We know that one simple way of proceeding with climate change is to phase out coal and other fossil fuels and increase the use of renewable energies. However, we cannot make even this transition at the speed we need because of established power relations and habit (power is often the ability to trigger established pathways of behaviour) – and even if we could, we cannot guarantee there would be no unexpected side effects. For example, we may not succeed in replicating anything like our current social life with renewables, or we may construct them in a way that itself harms the environment.
We also seem to need to absorb greenhouse gases as well as cut back on emissions, but absorption can be used to delay reduction (again through power relations), and there is, as yet, no established way of dealing with the GHG that has been removed which is safe or long-term. Algorithms cannot successfully model the effects of things we don’t know how to do…
On top of that there is the potential power consumption of the algorithms themselves. While hopefully this will not be too bad, there is some evidence that bitcoin (which is a kind of complex algorithm) could end up being the most energy-hungry thing on the planet…. In which case our efforts to save ourselves could intensify the crisis.
Now, to be clear, I’m not saying that computational algorithms are never of use, but that they tend to be used without testing because they depend on fictional stories which carry a high level of conviction, and are treated as if they were the reality we are working with rather than models of that reality. If the model or algorithm tends to advantage some group more than others, and those applying it are loyal to that group, then it will probably be harder to curb if incorrect, and more likely to be taken for correct. The same is probably true if the model reinforces some cherished group belief. The point is that models tend to become political (consciously or unconsciously) because their axioms seem like common sense.
According to some theories, humans tend to confuse the ‘map’ for the ‘terrain’ (to use the General Semantics slogan) almost all the time, unless it is visibly and hopelessly failing them and there is an easy alternative. If so, that could be one reason why science is so difficult and so relatively rare, and so easy to ‘corrupt’ when it becomes corporate science.
If we are going to model what we do in the world then we absolutely need something like computer modelling, but we also need to emphasise that these models are unlikely ever to be totally accurate, will always require modification and change, will get caught up in politics, and could always be wrong.
If we don’t do this, then the aids meant to help us model what we are doing and what we need to do could well make things worse.