Over the past few months, social media companies have come under increasing scrutiny from media critics, watchdog groups, and US congressional committees.
Much of the criticism has focused on how Facebook and Twitter facilitated the propagation of inflammatory messages created by Russian agents during the 2016 US presidential election, ostensibly to polarize American voters. Self-serve advertising, ‘filter bubbles,’ and other features of social media have made mass targeted manipulation easy and efficient.
Yet some are voicing deeper concerns about the social, psychological, cognitive, and emotional effects of social media, particularly as they affect children.
For example, Facebook has come under attack from an unlikely group of critics: some of its own former executives. Their comments coincide with the debut of ‘Messenger Kids,’ Facebook’s latest product. According to reports, its target audience is 6- to 12-year-old children. (Like most other social media platforms, Facebook does not allow people younger than 13 to create accounts.)
Despite Facebook CEO Mark Zuckerberg’s recent resolution to ‘fix’ Facebook in 2018, ‘Messenger Kids’ reveals a different agenda: to scoop up a new generation of users, habituate them to the virtual life, increase market share, and develop brand loyalty in a highly competitive marketplace. Facebook’s first president, Sean Parker, acknowledged late last year that its creators intentionally designed the platform to consume as much of users’ time and attention as possible. According to Parker, ‘likes’ and ‘posts’ serve as “a social validation feedback loop” exploiting the psychological need for social acceptance. “God only knows what it’s doing to our children’s brains,” he said (quoted in Allen 2017).
Why would the architects of Facebook, Google Plus, Twitter, and other social media platforms resort to these techniques? Facebook’s business model is based on revenue generated from advertising. An early Facebook investor, Roger McNamee (2018), recently wrote:
Smartphones changed the advertising game completely. It took only a few years for billions of people to have an all-purpose content delivery system easily accessible sixteen hours or more a day. This turned media into a battle to hold users’ attention as long as possible. Why pay a newspaper in the hopes of catching the attention of a certain portion of its audience, when you can pay Facebook to reach exactly those people and no one else?
Sean Parker and Roger McNamee aren’t alone. Venture capitalist and former Facebook VP Chamath Palihapitiya admitted last month that he regrets helping the company expand its global reach. (Facebook has more than two billion users worldwide and is still growing.)
“We have created tools that are ripping apart the social fabric of how society works… you are being programmed,” Palihapitiya told an audience at the Stanford Graduate School of Business. He added, “No civil discourse, no cooperation, misinformation, mistruth. And it’s not an American problem; this is not about Russian ads. This is a global problem… Bad actors can now manipulate large swathes of people to do anything you want. It’s just a really, really bad state of affairs” (Palihapitiya 2017).
Yet another former Facebook executive, Antonio García-Martínez, went public last summer with his criticism of the company’s techniques:

If used very cleverly, with lots of machine-learning iteration and systematic trial-and-error, the canny marketer can find just the right admixture of age, geography, time of day, and music or film tastes that demarcate a demographic winner of an audience. The ‘clickthrough rate,’ to use the advertiser’s parlance, doesn’t lie… Facebook has and does offer “psychometric”-type targeting, where the goal is to define a subset of the marketing audience that an advertiser thinks is particularly susceptible to their message… Sometimes data behaves unethically… Facebook will never try to limit such use of their data unless the public uproar reaches such a crescendo as to be un-mutable (García-Martínez 2017).
Such statements are startling, but not unprecedented.
For years social scientists have warned about how technology can trigger behavioral addictions. MIT anthropologist Natasha Schüll, who conducted research on Las Vegas casinos over a 20-year period, learned that slot machines pull some gamblers into a disorienting “machine zone.” (Schüll’s research builds upon the pioneering work of UC Berkeley anthropologist Laura Nader, who developed the concept of “controlling processes”: how individuals and groups are persuaded to participate in their own domination.) After interviewing machine designers, casino architects, and hardcore gamblers, among others, Schüll concludes in her book Addiction by Design that the magnetic attraction of slot machines is due in part to their deeply interactive features. Gambling industry experts openly talk about maximizing “time-on-device.” As one consultant told Schüll, “The key is duration of play. I want to keep you there as long as humanly possible; that’s the whole trick, that’s what makes you lose.”
This article has been excerpted from: ‘Starting Them Young: Is Facebook Hooking Children on Social Media?’ Courtesy: Counterpunch.org