A test subject with the pplkpr heart rate band (all images courtesy the artists)

Your social life? There are hundreds of apps for that. Options range from WhatsApp, which allows us to text overseas, to Snapchat, which lets us send detailed snapshots of our daily lives to our friends, to Twitter, which distills our inner narratives to 140-character missives, to good old-fashioned email, which… well, you get the point. Social media technologies have proliferated, affording us an almost overwhelming number of ways of sharing our lives with others. The logical extreme of all this virtual interfacing is an app that manages our social lives for us, sparing us the difficult task of curating our online presences ourselves.

So artist-programmers Lauren McCarthy and Kyle McDonald created pplkpr (pronounced “people keeper”). Part art project, part philosophical experiment, and part functional product, pplkpr “tracks, analyzes, and auto-manages your relationships,” according to its website. The app urges users to record their emotional states when they’re with a particular person (users have the option of wearing a “heart rate wristband” that measures their emotional responses), records and aggregates the resulting data, and then makes decisions for them. In addition to providing users with a comprehensive analysis of their relationships, pplkpr blocks the people it deems unhealthy and messages friends of its own accord. When McCarthy and McDonald tested pplkpr at Carnegie Mellon, they found it was especially successful in romantic situations.
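
The workflow the artists describe is easy to sketch: log who you’re with and how you feel (by hand or via the heart rate wristband), aggregate the readings per person, then block or message accordingly. The Python below is purely illustrative, with invented names (PeopleKeeper, log_interaction, a 0.7 “block” threshold); it is not the artists’ actual implementation.

    from collections import defaultdict
    from statistics import mean

    # Illustrative sketch only, not the artists' code: each interaction pairs a
    # person with a stress reading (e.g. one derived from heart rate data), and
    # decisions are made from the per-person averages.
    class PeopleKeeper:
        BLOCK_THRESHOLD = 0.7  # assumed cutoff for an "unhealthy" relationship

        def __init__(self):
            self.readings = defaultdict(list)  # person -> stress scores in [0, 1]

        def log_interaction(self, person, stress_score):
            self.readings[person].append(stress_score)

        def report(self):
            # Rank everyone by average stress, most stressful first.
            averages = ((p, mean(s)) for p, s in self.readings.items())
            return sorted(averages, key=lambda pair: pair[1], reverse=True)

        def auto_manage(self):
            # Block anyone above the threshold; reach out to everyone else.
            return {
                person: "block" if avg > self.BLOCK_THRESHOLD else "message"
                for person, avg in self.report()
            }

    keeper = PeopleKeeper()
    keeper.log_interaction("A", 0.9)
    keeper.log_interaction("A", 0.8)
    keeper.log_interaction("B", 0.2)
    print(keeper.auto_manage())  # {'A': 'block', 'B': 'message'}

However pplkpr actually weighs its data, the interview below turns on exactly this kind of thresholding: who sets the cutoff, and what “better” means.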

What does all of this mean? Can an app make better social decisions than we can? And even if it can, should we outsource our socializing to a machine? Hyperallergic got in touch with McCarthy and McDonald to find out.

*   *   *

The pplkpr interface and heart rate monitor

Becca Rothfeld: What motivated you to undertake the project? How did you come up with the idea for it?

Lauren McCarthy and Kyle McDonald: We were looking at this increasing trend toward wearables and quantified life and we wondered: when does it go too far? Not taking the time to figure out your own relationships seems ludicrous. We want to think we’re more than bots. Yet we’re constantly complaining about emails in our inbox, too many social notifications, and FOMO. The idea of an algorithm tracking and managing your social life feels creepy, but what if it actually works? What if it actually improves your relationships and emotional life?

Then, what happens when the algorithm gets it wrong? Just like we can chalk up our faux pas to autocomplete or our spam filters, could an app like this provide the excuse or justification we need to say and do what we really feel?

For Kyle, this builds on an interest in machine understanding of human experience and interaction, including face tracking, eye tracking, and other topics from computer vision and experimental interaction design.

For Lauren, this builds on an interest in social relationships, identity, and what it means to be a person at a moment when our lives are becoming more automated.

Together, we share an interest in challenging social norms and assumptions about interaction, and in creating spaces that allow us to ask more questions.

BR: Have you used the app? If so, what are your experiences with it? Did it make you uncomfortable?

LM and KM: We used the app a lot together in testing and we were surprised at what an intense effect it had. You think that you know how you’re feeling and you’re being honest about it, but when there’s a sensor tracking your emotions and acting on that data, you suddenly feel a lot more exposed. It’s somewhere in between a heightened sense of awareness and surveilled paranoia.

BR: Do you think that the app will be able to make decisions about our relationships “better” than we can? If so, what does that mean exactly? More rationally?

LM and KM: pplkpr is relatively simple in how it accounts for emotions to make decisions. Most of the decisions it makes are ones you’d have trouble making yourself, and having something else make them can be a relief, or it can simply help you make up your mind. But there’s no question that a more complex or elegant algorithm would be able to make “better” decisions all around, even in common, day-to-day situations. What “better” means exactly is complicated, and that’s part of what we want to ask with this app. Fitness trackers introduced the goal of 10,000 daily steps as a baseline for activity. What is the baseline for daily emotional activity? And who gets to decide?

One of the most complicated parts of this project conceptually was deciding what “better decisions” meant for us. If you end a terrible relationship, but you learn something important in the process, would it really have been “better” to end it sooner? If we learn from our mistakes, why are we always worried about making mistakes?

A screenshot of the pplkpr login

BR: Do you see the app more as an artwork and philosophical experiment designed to probe the bounds of relationships in a digital age, or do you see it more as a functional app? Do you advocate its usage in daily situations?

LM and KM: It is both. The app is a critical response to trends we see in quantified self, big data, and surveillance, but we don’t believe any of these things are black-and-white. Too often the conversations around these topics become reduced to gut reactions and fearmongering. We wanted to create a piece that addressed more of the nuance and contradictions of these new technologies. It was important to build a functioning app: this goes beyond speculative design fiction. Because it is a real app, when you encounter it, you are faced with choices and questions. Will you download it? Will you use it? What happens if it actually improves your life?

BR: What is the relationship, in your view, between a person’s “self” and that person’s digital presence? Do you regard text messages sent by the app as text messages sent by the user? 

LM and KM: A person’s digital presence is just another facet of their “self.” Our digital interactions create emotions just as real as any physical interaction. pplkpr then becomes part of your digital presence and your self. When you use pplkpr, it tells you things like “X makes me feel most stressed,” emphasizing that the app is not a separate entity or companion, but an extension of your self.

pplkpr is available for free download from the iTunes App Store.

Becca Rothfeld is assistant literary editor of The New Republic and a contributor to The Los Angeles Review of Books, The New York Daily News’ literary blog, The Baffler, and Slate, among other publications.