Addictive technologies and the race for our attention

Let me start with a confession. A (not-so) well-kept secret. In the five years since I started this blog, this is the first post I’ve written about technology. The very first. This might make me a terrible computer scientist. Or it might make me a really focused one. Let me explain.

Dr. Eric Baumer studies the disuse of technology – why some people consciously avoid or limit their use of Facebook and other social media. Having never used social media myself (exceptions: LinkedIn and Twitter) and being a late adopter of new technology trends, I would have been a prime participant for his studies. Which is why, when Tristan Harris posed the question “How do you ethically steer people’s thoughts?” about designing technology, I was more than intrigued.

[Image: Freshly picked wildflowers at my old desk in the People and Computing Lab, University of Zurich.]

Harris begins his TED Talk by saying that there are a hundred people in a control room who “shape the thoughts and feelings of a billion people”. He argues that the design of technology is not evolving randomly but rather “in a very specific direction” – that is, in “the race for our attention”. As Netflix’s CEO recently said, “our biggest competitors are Facebook, YouTube and sleep.” Perhaps this is why Netflix designed its interface to auto-play the featured show on your homepage without waiting for a click; why Facebook’s notification badges are red (to take advantage of our visual perception system) rather than blue; and why teens go to almost any lengths to maintain their Snapchat Snapstreaks.

Technology design is not neutral. Why? Because designers of technology are real, live human beings – like you and me – with values, goals and intentions. The designer may or may not be conscious of such values, goals and intentions, but nonetheless, they are there. Harris argues that technology companies are actively manipulating their users in the race for more attention and more screen time. “The reason it feels like it’s sucking us in,” Harris states, is because it was intentionally designed to do so. In his previous role as a design ethicist at Google, he would know.

[Image: An art piece made out of floppy disks, in Paris. (Unfortunately, I forgot to jot down the artist’s name.)]

But why does this matter? Why should we care? Harris argues this is the most urgent problem of our day – “because this problem is underneath all other problems. It’s not just taking away our agency to spend our attention and live the lives that we want, it’s changing the way that we have our conversations, it’s changing our democracy, and it’s changing our ability to have the conversations and relationships we want with each other. And it affects everyone, because a billion people have one of these [smartphones] in their pocket”.

As human beings, then, our attention and focus – our ability to do deep work – is our most precious resource. Without a conscious awareness of how technology usage can negatively impact our attention and, in turn, our quality of life, we are losing, I argue, everything it means to be human. To connect and engage authentically and empathetically with our family, friends and strangers. To be creative, innovative and inspired, both when alone and when together. To critically reflect on the problems facing our world today and to collectively work towards solutions. And most importantly, to have the time and space to develop clarity on what we want in life, not only to protect against what we don’t want.


Harris proposes three radical changes to address this problem:

1) To recognize that we human beings (as users of technology) are easily (and almost laughably) persuadable;

2) To encourage technology companies to be more ethical, accountable and transparent in their persuasive technology design. As Harris argues, “the only form of ethical persuasion that exists is when the goals of the persuader are aligned with the goals of the persuadee.”

3) For technology companies to design technology that empowers users by aligning with their goals and values. In other words, you say ‘I want this’, and the technology responds, ‘I’ll help you get there’. Harris offers a simple example: “Let’s say you wanted to post something super controversial on Facebook, which is a really important thing to be able to do, to talk about controversial topics”. Instead of the current design – a big comment box asking what you want to type, to keep you on the screen – “imagine instead that there was another button saying, ‘What would be the most well spent time for you?’ And you click ‘Host a dinner.’ And right there underneath the item it said, ‘Who wants to RSVP for the dinner?’ And so you’d still have a conversation about something controversial, but you’d be having it in the most empowering place on your timeline – which would be at home that night, with a bunch of friends over to talk about it”.

So let me end this post now, by turning off my computer, getting some tech-free time and doing what feels most empowering to me – cooking with my loved ones, then doing some painting.

I wish you, dear readers, a reflective and empowered day!

 
