Should Suffering-Reducers Focus on AI?

If we want to reduce suffering over the long term, should AI be a top priority? As capabilities advance and talk of AGI and the singularity fills the airwaves, some argue we should focus heavily on AI. Others are skeptical. In their view, steering AI may be just one important cause among many—and prioritizing it risks neglecting more reliable ways of reducing suffering. Who is right?

Pro

The strongest case for focusing on AI begins with the claim that it could be a genuine hinge-of-history technology—not just another tool, but something with imminent world-shaping potential. If we reached human-level or superhuman systems, the paradigm of “smart” humans controlling “dumb” tools would be over. All bets would then be off, but it seems reasonable to expect tremendous impacts. [...] 

Read more

The Animal Gap in AI Governance

By Alistair Stewart

How can we steer AI development, today and in the future, to reduce animal suffering rather than increase it? One place to look is AI governance: the set of norms, policies, laws, and institutions whose purpose is to influence how AI is developed and used. [...] 

Read more