“Streaming services often allow account holders to create multiple, separate profiles, which I appreciate. I want the recommendations I get to reflect my taste and not my partner’s. Is this selfish? Is there any virtue in sharing a profile with others?”
—Island in the Stream
Sharing, at least as it’s often understood, is virtuous only in cases of finite resources. It is generous for a child to share her lunch with a classmate who has none or for the wealthy to give money to the less fortunate. But I find it hard to believe that forfeiting an individual profile would be laudable when there are enough to go around. What’s bothering you isn’t the fear of selfishness but the realization that you see other people’s inclinations and preferences as a form of contamination, a threat to the purity of your personal algorithm. To insist on your own digital fiefdom suggests you believe your taste to be so unique and precise that any disruption to its pattern will compromise its underlying integrity.
At a basic level, prediction engines are like karma, invisible mechanisms that register each of your actions and return to you something of equal value. If you watch a lot of true-crime docs, you will eventually find yourself in a catalog dominated by gruesome titles. If you tend to stream sitcoms from the early 2000s, your recommendations will turn into an all-you-can-eat buffet of millennial nostalgia. The notion that one reaps what one sows, that every action begets an equal reaction, is not merely spiritual pablum, but a law encoded in the underlying architecture of our digital universe. Few users really know how these predictive technologies work. (On TikTok, speculations about how the algorithm functions have become as dense as scholastic debates about the metaphysical constitution of angels.) Still, we like to believe that there are certain cosmic principles at play, that each of our actions is being faithfully logged, that we are, in each moment, shaping our future entertainment by what we choose to linger on, engage with, and purchase.
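That karmic bookkeeping can be made concrete with a toy sketch. Nothing here reflects how any real platform works; the titles, genres, and function are invented for illustration. Each watch event adds weight to a genre, and the catalog is then re-ranked so that past viewing tips future recommendations:

```python
from collections import Counter

def recommend(watch_history, catalog, top_n=3):
    """Rank catalog titles by how often their genre appears in the watch history."""
    genre_weights = Counter(genre for _, genre in watch_history)
    # Every logged action shifts the ranking: you reap what you have sown.
    return sorted(catalog, key=lambda title: genre_weights[title[1]], reverse=True)[:top_n]

history = [("Making a Murderer", "true-crime"),
           ("The Staircase", "true-crime"),
           ("Scrubs", "sitcom")]
catalog = [("The Jinx", "true-crime"),
           ("30 Rock", "sitcom"),
           ("Night Stalker", "true-crime"),
           ("Planet Earth", "nature")]

print(recommend(history, catalog))
```

Two true-crime views outweigh one sitcom, so the gruesome titles float to the top of the suggestions, exactly the buffet effect described above.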
Perhaps it would be worthwhile to probe a little at that sense of control. You noted that you want your recommendations to align with your taste, but what is taste, exactly, and where does it come from? It’s common to think of one’s preferences as sui generis, but our proclivities have been shaped by all sorts of external factors, including where we live, how we were raised, our ages, and other relevant data. These variables fall into discernible trends that hold true across populations. Demographic profiling has proved how easy it is to discover patterns in large samples. Given a big enough data set, political views can be predicted based on fashion preferences (L.L. Bean buyers tilt conservative; Kenzo appeals to liberals), and personality traits can be deduced from what kind of music a user likes (fans of Nicki Minaj tend to be extroverted). Nobody knows what causes these correlations, but their consistency suggests that none of us is exactly the master of our own fate, or the creator of a bespoke persona. Our behavior falls into predictable patterns that are subject to social forces operating beyond the level of our awareness.
And, well, prediction engines couldn’t work if this weren’t the case. It’s nice to think that the recommendations on your private profile are as unique as your thumbprint. But those suggestions have been informed by the behavioral data of millions of other users, and the more successful the platform is at guessing what you’ll watch, the more likely it is that your behavior falls in line with that of other people. The term “user similarity” describes how automated recommendations analogize the behavior of customers with kindred habits, which means, essentially, that you have thousands of shadow-selves out there who are streaming, viewing, and purchasing many of the same products you are, like quantum-entangled particles that mirror one another from opposite sides of the universe. Their choices inform the options you’re shown, just as your choices will inflect the content promoted for future users.
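“User similarity” sounds mystical, but one common way to measure it is simply the angle between two users’ rating vectors: the smaller the angle, the more kindred the habits. Here is a minimal sketch with invented ratings over the same five titles (no real platform’s method is implied):

```python
import math

def cosine(u, v):
    """Cosine similarity between two users' rating vectors (1 = identical taste)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

you     = [5, 4, 0, 1, 0]   # your ratings of five titles
shadow  = [4, 5, 0, 0, 1]   # a stranger with kindred habits, your "shadow self"
partner = [0, 1, 5, 4, 4]   # a housemate with divergent taste

print(cosine(you, shadow))   # near 1: the engine treats you as neighbors
print(cosine(you, partner))  # near 0: little overlap, for now
```

A recommender built on this idea would promote to you whatever your high-similarity shadow-selves watched next, which is how their choices come to inform the options you’re shown.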
Karma, at least in popular culture, is often regarded as a simplistic form of cosmic comeuppance, but it’s more properly understood as a principle of interdependence. Everything in the world is connected to everything else, creating a vast web of interrelation wherein the consequences of every action reverberate through the entire system. For those of us who have been steeped in the dualities of Western philosophy and American individualism, it can be difficult to comprehend just how intertwined our lives are with the lives of others. In fact, it’s only recently that information technologies — and the large data sets they create — have revealed to us what some of the oldest spiritual traditions have been teaching for millennia: that we live in a world that is chaotic and radically interdependent, one in which the distance between any two people (or the space between any two vectors) is often smaller than we might think.
With that in mind, Island, sharing a profile might be less an act of generosity than a recognition of that interdependence. The person you’re living with has already changed you in countless ways, subtly altering what you believe, what you buy, the way you speak. If your taste in movies currently diverges from theirs, that doesn’t mean it always will. In fact, it’s almost certain that your preferences will inch closer together the longer you share a home. This is arguably a good thing. Most of us have experienced at some point the self-perpetuating hell of karmic cycles, the way one cigarette leads to an addiction or a single lie begets a string of further deceptions. Automated recommendations can similarly foster narrowly recursive habits, breeding more and more of the same until we’re stuck in a one-dimensional reflection of our past choices. Deliberately opening up your profile to others could be a way to let some air into that dank cave of individual preferences where the past continually reverberates, isolating you from the vast world of possibilities that lies outside.