An internal report produced by Facebook executives reportedly states that the company can monitor people's emotions in real time and respond to those emotions.
The report states that Facebook can monitor posts and photos in real time to determine when young people feel “stressed,” “defeated,” “overwhelmed,” “anxious,” “nervous,” “stupid,” “silly,” “useless,” and a “failure.”
The report was used by top Facebook executives David Fernandez and Andy Sinn in a sales pitch to win advertising business from a top international bank in Australia.
The report shows how Facebook gathers psychological insights on high schoolers, college students and young working people.
Now that the report has become public, Mark Zuckerberg's social media company is trying to walk it back a bit.
How powerful and invasive is Facebook?
More than most of us know.
The executives told the bank that Facebook has a database of more than 1.9 million high schoolers, 1.5 million college students, and 3 million young workers.
The document said that Facebook has "detailed information on mood shifts of its young users" based on "internal Facebook data" that is not available to the public.
Facebook has faced criticism in the past for "trying to alter the emotions of its users without their consent." The company's response to this story on Sunday offered seemingly contradictory statements.
In any case, as you can imagine, Facebook is promising to "open an investigation" into the practice.
Former Facebook executive Antonio Garcia Martinez has responded in an article of his own.
He says Facebook founder Mark Zuckerberg is sometimes "disingenuous (to put it mildly)..."
Martinez says that during his time at Facebook, there was a lot of talk about the fact that the company could throw an election if it decided to do so.
He says, "I was at Facebook in 2012, during the previous presidential race. The fact that Facebook could easily throw the election by selectively showing a 'Get Out The Vote' reminder in certain counties of a swing state, for example, was a running joke."
But can Facebook actually do that, and if so, would it do what it was promising the bank it could do?
Martinez says, "Without seeing the leaked documents... it is impossible to know precisely what the platform was offering advertisers."
He does say that Facebook offers "psychometric"-type targeting, where the goal is to define a subset of the marketing audience that an advertiser thinks is particularly susceptible to its message.
"However," he says, "knowing the Facebook sales playbook, I cannot imagine the company would have concocted such a pitch about teenage emotions without the final hook: 'and this is how you execute this on the Facebook ads platform'. Why else would they be making the pitch?"
He says, "The question is not whether this can be done. It is whether Facebook should apply a moral filter to these decisions."
His response probably reflects the thinking of his former employer, Facebook.
He says, "Let's assume Facebook 'does' target ads at depressed teens. My reaction? So what. Sometimes data behaves unethically."
Martinez says, "The hard reality is that Facebook will never try to limit such use of their data unless the public uproar reaches such a crescendo as to be un-mutable. Which is what happened with Trump and the 'fake news' accusation: even the implacable Zuck had to give in and introduce some anti-fake news technology. But they'll slip through that trap as soon as they can. And why shouldn't they? At least in the case of the ads, the data and the clickthrough rates are on their side."
He says, "In our current world, there's a long list of Truths That Cannot Be Stated Publicly, even though there's plenty of data suggesting their correctness..." citing these as examples:
- African Americans living in postal codes with depressed incomes likely respond disproportionately to ads for usurious "payday" loans.
- Hispanics between the ages of 18 and 25 probably do engage with ads singing the charms and advantages of military service.
It really is all about clickthrough rates.
Facebook already monitors content, and any content the company deems unacceptable, for whatever reason, is rejected.
Yesterday, Facebook announced it is hiring 3,000 people to monitor live video feeds for violence.
Mark Zuckerberg said the 3,000 new hires will help the company's global operations "do better for our community."
Ultimately, Zuckerberg, an avowed globalist, will decide what is "better for the community."
As of January 2017, there were 214 million Americans on Facebook, with 52.82 million of them between the ages of 25 and 34.
While there are a number of other social media sites (Snapchat, Twitter, and so on), Facebook remains the most popular social network in America, according to recent studies. It also continues to be the most popular among millennials in the United States, with 15% of all millennials saying they dream of working for Facebook at some point.
So...are we suggesting everybody should get off Facebook and other social media?
No.
My point is this: Be informed and be wise about where you go on the Internet and what you share. Be aware of who and what is shaping your thoughts and beliefs, because there is a real battle for your mind and your emotions.
Most importantly, I remind those of you who are Christians to remember Paul's instruction regarding who influences your mind:
Romans 12:2 "And do not be conformed to this world, but be transformed by the renewing of your mind, that you may prove what is that good and acceptable and perfect will of God."
Be Informed. Be Discerning. Be Vigilant. Be Aware. Be Prayerful.