Pplkpr – in with the data, out with the people…
… or at least, out with those who cause you discomfort, uneasiness, or high levels of stress. A fascinating application described in Wired re-positions the role of algorithms and the permissible level of intervention of the “machine” in the human social fabric under the guise of an art project. It is aptly named “pplkpr”.
“Developed by artists Lauren McCarthy and Kyle McDonald, pplkpr lets you quantify the value of your relationships based on a few data streams. A heart rate wrist band measures the subtle changes in your heart rate, alerting you to spikes in stress or excitement. This biometric data is correlated with information you manually input about the people you’re hanging out with. Based on patterns, algorithms will determine whether you should be spending more time with a certain person or if you should cut him out altogether.”
This brief description extracted from the short article perfectly summarizes the mechanics of pplkpr. In an embedded film, the artists explain their take on their own project, viewed both as an ironic comment on what is yet to come and as a possibility to be taken seriously and possibly developed. Pplkpr, left to its own devices, would erase from the phone the names and contact details of the people who cause the owner the most stress, and would automatically schedule meetings with those who cause pleasure and excitement. While all this may sound at first glance like an inoffensive, experience-enhancing application, I cannot help but think of an Eastern saying somewhere along the lines of “the best guru is s/he who upsets you the most”. Indeed, the nuance is also caught in the Wired article, which ends with the warning that pplkpr has the potential to “nullify the things that makes us actually human”: our ‘out of control’ relationships. Insulating oneself in sameness, or in excitement, does not necessarily make one evolve, individually or as a species. As two of my friends, both humanists, observed, this may be the path to the age of mediocrity.
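To make the mechanics described above concrete, here is a minimal sketch of the kind of logic the article implies: stress readings derived from heart-rate data are tagged with whoever you logged as present, and each contact is scored as worth keeping or cutting. All function names, data shapes, and thresholds here are my own assumptions for illustration, not the artists' actual implementation.

```python
# Hypothetical sketch of pplkpr-style contact scoring (not the real app's code).
from collections import defaultdict
from statistics import mean

def score_contacts(sessions):
    """sessions: list of (person, stress) pairs, where stress is a 0..1 value
    derived from heart-rate spikes during a logged encounter (assumed scale)."""
    readings = defaultdict(list)
    for person, stress in sessions:
        readings[person].append(stress)

    recommendations = {}
    for person, levels in readings.items():
        avg = mean(levels)
        if avg > 0.7:      # consistently stressful: the app would "cut them out"
            recommendations[person] = "cut out"
        elif avg < 0.3:    # consistently pleasant/exciting: schedule more time
            recommendations[person] = "schedule more time"
        else:
            recommendations[person] = "neutral"
    return recommendations

# Example: two logged encounters per contact (invented data).
sessions = [("Alex", 0.9), ("Alex", 0.8), ("Sam", 0.2), ("Sam", 0.1)]
print(score_contacts(sessions))
```

Even this toy version makes the article's point visible: the decision rule has no notion of why an encounter was stressful, only that it was.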
Beyond these observations, one may glimpse the fact that, for the moment, we, as makers, do not yet endow algorithms with empathy, although AI may later develop empathy on its own terms. A few observations:
1. We build technology mostly around our imaginary of what technology is, not necessarily of what it could be. Therefore, it appears to lack empathy and to make cold decisions.
2. Like all other objects, digital applications reflect culture and ideologies. The way pplkpr functions reflects an individualist, inward-turned, instant-gratification ideal in society. The artists could just as easily have imagined an application that measures the level of stress or excitement of the person we are addressing and gives us an indicator of it, so that we can adjust our own behavior; in other words, one that sensitizes us to an empathic type of communication. Or could it be that easy to imagine, if it does not really reflect our current values and culture? The throat-cutting gesture McCarthy makes when she refers to erasing somebody from the address book is rather telling.
3. In today’s political landscape, in which radicalization on all sides seems to be the rule, plunging further into insulated spaces of confirmation bias is not the solution. Separation will only generate and increase misunderstanding, stereotyping, and ultimately violence. We need to be careful not only about how we use algorithms, but also about what kind of algorithms we put out there.
The biggest question remains: what kind of A.I. will we be building? The one that would take over the world, as some, among them Stephen Hawking and Bill Gates, fear and warn us against? I think the danger consists not in the technology itself, but in the kind of technology we imagine: a cold, algorithmic reflection of our current cultural values.