So as you can see, it all depends on how the AI is developed and implemented. The AI is merely an exocortex to help us think better. If we can think better, we can make better decisions, and with enhanced decision-making capability we have an enhanced capacity for morality. I see it as the only way to survive in a radically transparent world.
Even what we consider small decisions today will, in my opinion, require adopting much higher standards tomorrow as the world becomes more transparent. We will have to capture crowd sentiment on a regular basis using machine intelligence, because manually mining crowd sentiment is too time-consuming, too hard, and too inefficient. Decisions don't wait, so the only way to scale up human decision makers is with machine intelligence.
Even if an AI is created by a collective of humans, only a few still decide what code gets included and what does not. Those humans bias the design & functionality of the resulting AI, and I contend we can never remove the human component from our creations, AI being one of them.
So decentralize it so that it is no longer left to only a few. Let everyone program their own AI, the same way the personal computer empowered everyone. A personal AI which you program, which has your values, your morals. We can do this, so why not? Better to do this, as it could literally save lives: the path toward radical transparency will cost lives for sure, whether through "vigilante justice" like we see in countries with death squads, or through suicides.
It's not the devs who dictate which pieces of code get into production; it's their managers, those at the top of the Blockstream power pyramid, who call the shots. Contributions from devs who disagree will not be included. Decisions about what does and does not go into the production software are not up to the developer collective, only the Blockstream Bosses.
So you are talking about how development is centralized? That is a problem, but we have all the tools to begin decentralizing it. Why not? I see no other way forward that can preserve life and limit unnecessary suffering. The path Dan suggests would lead to unnecessary suffering: people being shunned, bullied, or otherwise punished for crimes or for being judged immoral.