There’s a lot of press about algorithms promoting inequity or bias. But we know from the behavioral-science literature that human beings are quite biased. We don’t just look at objective data; we add our own internal biases. Study after study has shown that when people view a man and a woman performing a task at the same level, they make inferences about the woman that they don’t make about the man. The mind simply adds its own bias. Algorithms, whatever their other problems, tend not to add biases of their own; they tend to reflect whatever is in the data.