Garbage In, Garbage Out!
Jonathan Sacks, morality and Facebook algorithms
“We have met the enemy and he is us.”
–Pogo, Walt Kelly
Facebook whistleblower Frances Haugen probably thought she’d driven her message home when she recently recommended the removal of algorithms from Facebook, telling the US Congress it was “because I think we don’t want computers deciding what we focus on!” When it comes to computers telling people what to do, whose blood would not boil?
Of course, as with most such reductionist lines, Haugen was able to appeal to our emotions through oversimplification. But computers don’t decide; we do. And even if we allow computers to steer us, those computers (more precisely, their algorithms) were all designed by human beings who are ultimately no different from us.
I know. I tried it. Although the default on the Facebook timeline is what the algorithms calculate you “most want to see,” up until very recently there was an icon on the sidebar that allowed you to see everything posted by your contacts chronologically. If you are anything like me, you would have lasted less than two minutes! Do you really want to see what someone you barely know is having for breakfast? Or whether anybody can help with their carpool in Minneapolis?
True, there are many ways that algorithms can determine what you see, some more manipulative than others. And I am not saying that there is no room for more responsibility from the companies that use algorithms, or that government has no role to play in regulating them. But there is an important part of the equation that never seems to get mentioned here.
The cries for more responsibility are all aimed at government or industry. Yet as Rabbi Jonathan Sacks (whose first yahrzeit we are now marking) repeatedly pointed out, in a liberal state, these institutions are not primarily designed to promote morality or to enforce it. Of course, they have a role to play: Industry should understand that the legitimate desire for profits does not make everything legitimate, and government needs to support whatever basic moral consensus still exists. But as Sacks wrote in his aptly titled last major book, Morality, morality’s home is primarily in the third sector – voluntary communities formed around tighter and more rigorous definitions of what we should be doing to maximize who we are as human beings.
Accordingly, one of Sacks’s most valiant crusades was the call for individuals and communities to step up and take responsibility for the moral state of society. He argued that we have reached a crisis point because we have spent too much time going to the wrong addresses when the most important address is right at our doorstep.
The issue becomes clearer with a Biblical metaphor, as explained by the famous 19th-century rabbi and commentator Malbim (Rabbi Meir Leibush Wisser). After telling us that “He who tends a fig tree will enjoy its fruit, and he who cares for his master will be honored,” the Bible tells us that “As in water, face answers to face, so the heart of man to man” (Proverbs 27:18-19). The metaphor is based on the mirror image of our face that we see when we look at water. So too, claims the Bible, is the response of a person’s heart. Malbim understands this quite literally: for him, the verse is saying that the blood pumped out of a man’s heart is the exact same blood that returns to that heart. As for the teaching, he expands it broadly, telling us that what happens to us is often a direct reflection of how we act in a wide variety of contexts.
Does this not sound a little (a lot!) like algorithms? These programs don’t make up anything on their own. Their output – like the reflection of our face in the water – is completely responsive to our input. In this respect, then, the blame society is aiming at social media algorithms is like throwing a rock at the water for reflecting the ugliness of our own face.
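To make the mirror image concrete, here is a minimal sketch of how an engagement-based feed ranker behaves. It is a toy written in Python, not Facebook’s actual system: the Post fields, the weights and the engagement_score function are all invented for illustration. What it shows is the point above: the ranker creates nothing; it only sorts posts by signals that record our own past behavior.

```python
# A toy engagement-based ranker (illustrative only, not Facebook's algorithm).
# It has no opinions of its own; it merely mirrors the signals users supply.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    clicks: int           # how often users like us clicked posts like this
    shares: int           # how often we shared them
    dwell_seconds: float  # how long we lingered on them

def engagement_score(post: Post) -> float:
    # Hypothetical weights; real systems learn them from user behavior,
    # which is exactly the point: the inputs are our own choices.
    return 1.0 * post.clicks + 3.0 * post.shares + 0.1 * post.dwell_seconds

def rank_feed(posts: list[Post]) -> list[Post]:
    # The "decision" is just a sort over reflections of past behavior.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Thoughtful long-form essay", clicks=40, shares=2, dwell_seconds=90.0),
    Post("Outrage-bait partisan post", clicks=900, shares=150, dwell_seconds=12.0),
    Post("Neighbor's breakfast photo", clicks=5, shares=0, dwell_seconds=3.0),
])
for post in feed:
    print(post.title)
# The outrage-bait rises to the top only because we clicked and shared it most.
```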
The result is that we turn to forces outside and tell them, “Show me a prettier face.” In the short term, that may happen. Algorithms can be adjusted to appeal to our better sides or, at least, to mitigate some of the more significant negative outcomes. However, as Facebook has already anticipated, that will lead to less user time, meaning less business. And that may lead to other companies finding a way to fill the vacuum and supply us with what we seem to want.
For if we allow ourselves to wallow in partisan hate and never look at the other side, it means that on some level this is what we prefer. If we are willing to read things whose reliability is questionable, it means that this is what we want. If we let ourselves be drawn to the bizarre, the silly and the sexually enticing, this too is what we are ultimately choosing. As in real life, knowing that any of these practices is not optimal is not the same as deciding to live otherwise. No doubt others, including Mark Zuckerberg, share in the blame. But what about ourselves?
A more serious and introspective society would understand that there is a deeper problem that goes beyond Facebook and the lack of government regulation.
As Rabbi Sacks never got tired of reminding us, the home of that problem is within ourselves.