A series of AI programs is created with the intention of making fair decisions based on different values. But can something made by humans ever avoid human bias?
NOTE/UPDATE (2023/02/12): I may continue to update slowly, but it's scaring me how quickly this "science fiction" story is becoming nonfiction. In the six months since I published the first chapter, AI is already being used to make decisions in court. The art style of this comic was supposed to be based on AI-generated art, but now AI is actually being used to make comics, and Tapas is making a rule against it. This story is becoming less speculative and more mundane, which makes the stakes for writing it feel a lot higher, and a lot more depressing. So, we'll see if I ever get around to finishing this.