‘We are becoming less rational, less intelligent, less focussed’ – Stolen Focus, Chapter 7 Notes

Posted on: June 10, 2022
Post Category: Book Notes

About #onepageonepoint

#onepageonepoint aims to summarise new ideas from books on personal and professional development – with (approximately) one point for each page. Read more about this project here.

Today for #onepageonepoint, we have summary notes for Stolen Focus – for chapter 7: ‘Cause Six: The rise of technology that can track and manipulate you (Part Two)’.

If you are interested in getting yourself a copy or learning more about the book, click here.

Chapter 7: ‘Cause Six: The rise of technology that can track and manipulate you (Part Two)’

  • Tristan poses the following question: why doesn’t Facebook have a feature that tells you when a friend is physically nearby, even though such a feature would likely be hugely popular? Because it wouldn’t increase your screen time on the platform – people would spend their time in a more fulfilling way, and Facebook’s stock price would collapse, since Facebook’s business model depends on making money for every extra second a person looks at the screen.

  • Social media companies make money in two ways: (1) advertising, and (2) selling a profile of you – every time you send a message or status update on Facebook/Snapchat/Twitter, and every time you search for something on Google, everything you say is scanned and stored to build a profile of you, which is then sold to advertisers.

  • Aza explains that: ‘Inside of Facebook and Google’s servers, there is a little voodoo doll, and it is a model of you’ (Hari 2022, p. 120) built on everything you click on, everything you search for and every little detail of your life online.

  • Free and cheap services like Google Maps, Amazon Echo and Google Nest Hubs cost far less than they cost to make, because they are used to collect information – to improve the voodoo doll of you.

  • Professor Shoshana Zuboff (of Harvard University) coined the term for this business model, in which your personal information is collected and sold using technology: ‘surveillance capitalism’.

  • Tristan comments on these social media giants: ‘Their business model… is screen time, not life time’ (Hari 2022, p. 123).

  • The insights from Hari’s interviews with Tristan and Aza suggest that the greatest damage to our attention comes not from smartphones themselves but from the way apps are designed (and the incentives of the people designing them) – to maximally grab and hold our attention.

  • The Facebook news-feed algorithm is designed to keep you looking at your screen – and unfortunately, on average, we will stare at something negative and outrageous for a lot longer than we will stare at something positive and calm – because of humans’ negativity bias.

  • Similarly, on YouTube, video titles with words like ‘hates’, ‘obliterates’, ‘slams’ and ‘destroys’ get picked up by the algorithm. Moreover, a study at New York University found that every word of moral outrage in a tweet drives the retweet rate up by 20%. Furthermore, a study from the Pew Research Center showed that expressing indignant disagreement in a Facebook post will double your likes and shares.

  • Because of these algorithms, people are exposed more to enraging content – building a culture of condemning more and understanding less, ‘turn[ing] hate into a habit’ (Hari 2022, p. 126).

  • Based on what he learnt, Hari presents six ways in which this technology harms our attention: (1) frequent rewards (making you hungry for hearts, likes and retweets), (2) task switching (switching to your phone and switching between tabs/apps), (3) they learn how to ‘frack’ you (with things that specifically distract you), (4) they make you angry (which screws with your ability to pay attention), (5) they make you feel surrounded by other people’s anger (which makes you vigilant, thereby making it harder to attain slower forms of focus), and (6) they set society on fire (making it harder for people to identify their collective problems and find solutions).

  • ‘A study by the Massachusetts Institute of Technology found that fake news travels six times faster on Twitter than real news, and during the 2016 US presidential election, flat-out falsehoods on Facebook outperformed the top stories at nineteen mainstream news sites put together’ (Hari 2022, p. 129); this forces us to pay attention to nonsense and makes it harder for us to identify and solve our collective problems.

  • On YouTube, the algorithm picked up on our negativity bias and now recommends more outrageous, shocking and extreme content after you watch a video – ‘If you watched a factual video about the Holocaust, it would recommend several more videos, each one getting more extreme, and within a chain of five or so videos, it would usually end up automatically playing a video denying the Holocaust happened’ (Hari 2022, p. 130).

  • YouTube’s algorithm once recommended videos by Alex Jones (a far-right conspiracy theorist who claimed that the 2012 Sandy Hook massacre was faked) and his website InfoWars 15 billion times. After some of the victims’ parents were inundated with death threats, Jones was sued and later admitted in court that the massacre was real.

  • Furthermore, Facebook has supercharged anti-democratic forces. Jair Bolsonaro used Facebook as a platform to say outrageous things (including homophobic comments and praise for people who tortured innocent people), which turned him into a social media star. He then defeated the opposition by invoking a non-existent threat (circulating a video warning that Fernando Haddad wanted to turn all the children of Brazil into homosexuals), and this propelled him to the presidency.

  • ‘These sites harm people’s ability to pay attention as individuals and pump the population’s heads full of grotesque falsehoods to the point where they can’t distinguish real threats to their existence from non-existent threats’ (Hari 2022, p. 134).

  • ‘[Tristan] was especially worried about this… because we are now, as a species, facing our biggest challenge ever – the fact that we are destroying the ecosystem we depend on for life by triggering the climate crisis. If we can’t focus, what possible hope do we have to solve global warming?’ (Hari 2022, p. 136).

If you are interested in getting yourself a copy or learning more about the book, click here.

Interested in reading more? See my notes for Chapter 8.

About the author

Jason Khu is the creator of Data & Development Deep Dives and currently a Data Analyst at Quantium.