Your Emotions Are For Sale
This week we explore the relationship between AI, voice assistants and neuroscience through the lens of philosophy. Many thanks to Jaron Lanier, Guillaume Chaslot and Zeynep Tufekci for their expert contributions to this episode.
It’s a Friday night.
You have no plans, you’re home by yourself.
You pick up your iPhone, start browsing through Facebook and Instagram because you’re slightly
curious about what everyone else is up to.
You spend a fair bit of time scrolling and then you realise… you’re feeling lonely.
Perhaps you should have organised something.
People love to report that technology makes us sad, even depressed, saying that using social
media instils a fear of missing out, and that seeing a curated version of everyone else's
beautiful lives is a recipe for depression.
But what if there’s more to it?
What if… it’s not just your friends who are manipulating your emotions, but the platforms?
Emotions have always been a hot topic in advertising.
And even more so on social media platforms.
Study after study has shown that emotional posts, especially negative ones that induce
fear or anger, have a much higher chance of going viral than positive posts.
Now advertisers are pushing personalization further, taking advantage of the special
place emotions have in our hearts in new ways, using anything from neuroscience to artificial
intelligence to what sounds like sci-fi technology.
What exactly is happening?
Behind the facade of the app, tech companies can make use of a bunch of tools to determine
how you’re feeling.
The words you choose to include in a status update or comment, and your typing speed, can
be analysed to determine your feelings.
On your device, the accelerometer captures how your phone moves, the microphone listens
for background noise, and – potentially – surreptitious photos of your face taken through your webcam
can be run through facial expression recognition software.
Sometimes we simply volunteer this information, like through Facebook's "choose a feeling" function.
It allows you to express yourself, but more importantly for Facebook, it gives the company
valuable data on how you're feeling.
These tools all work together to paint a picture of your emotional state.
And this feedback influences the posts, news and advertisements that you see.
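As a rough illustration of how such signals could be fused, here is a minimal Python sketch. The word lists, signal names and weights are all invented for this example; they are not anything a platform has disclosed.

```python
# Hypothetical sketch of signal fusion; the lexicon, signal names and
# weights below are invented for illustration.

POSITIVE = {"great", "happy", "love", "excited", "fun"}
NEGATIVE = {"sad", "lonely", "tired", "angry", "bored"}

def text_sentiment(status: str) -> float:
    """Score a status update between -1 (negative) and +1 (positive)."""
    words = status.lower().split()
    raw = sum((w in POSITIVE) - (w in NEGATIVE) for w in words)
    return max(-1.0, min(1.0, 5.0 * raw / max(len(words), 1)))

def estimate_mood(status: str, typing_score: float, face_score: float) -> float:
    """Weighted average of signals, each already normalised to [-1, 1]."""
    weighted = [
        (text_sentiment(status), 0.5),  # what you wrote
        (typing_score, 0.2),            # e.g. slow, hesitant typing reads negative
        (face_score, 0.3),              # from expression-recognition software
    ]
    return sum(value * weight for value, weight in weighted)
```

The point of the sketch is that no single signal has to be accurate; averaging several noisy ones can still produce a usable estimate.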
So what happens when you go on Facebook or when you go on YouTube is that you've got
this, hidden to you, artificial intelligence machine that's like: what will I show her
that will make her stay here?
That's what it's figuring out.
And as humans we have emotions and they're important to us.
The most influential companies have gotten so good at capturing your attention that now
it seems like they're coming for your emotions.
This is called mood targeting.
It’s not exactly a new idea, but it’s becoming more important as new technologies
become better at reading our emotions and even manipulating them.
Media companies, too, have jumped on the bandwagon.
In September, The New York Times, ESPN and USA Today rolled out ad products that they
say can use artificial intelligence to predict the emotional response a piece of content
evokes, and then match ads to people in certain moods.
The New York Times for example, allows advertisers to target their ads on content predicted to
evoke emotions like self-confidence or adventurousness in the readers.
For ESPN, this could mean showing travel ads to people whose team is winning, and not
bothering to advertise to those whose team is losing.
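A toy version of that kind of mood-to-ad matching is little more than a lookup rule; the mood labels and ad categories below are invented for illustration, loosely modelled on the ESPN example.

```python
from typing import Optional

# Hypothetical mood-targeting rule: serve certain ad categories only to
# readers predicted to be in a receptive mood. Labels are invented.

AD_RULES: dict = {
    "adventurous": "travel",
    "self-confident": "luxury_goods",
    "anxious": None,  # don't waste the impression
}

def pick_ad(predicted_mood: str) -> Optional[str]:
    """Return an ad category for this mood, or None to hold the ad back."""
    return AD_RULES.get(predicted_mood)
```

In practice the "predicted mood" input would itself come from a model; the rule table is the easy part.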
Advertisers have always tried to show people ads when they are most receptive to them.
But using artificial intelligence to target them based on their mood sounds more manipulative.
Still, it stops just short of manipulating their actual mood.
Or does it?
This came to our attention in 2014 in what we’ll call ...The Facebook Experiment...
when we found out that Facebook researchers experimented with more than 600,000 people.
Now, in science it’s known that longer-lasting moods, like happiness or depression, can be
contagious – they can be transferred through your in-person networks.
So, the Facebook researchers wondered, is the same thing true in online networks?
The researchers randomly selected users, and controlled the number of positive and negative
posts that appeared in their news feed.
When the researchers reduced positive expressions in the news feed, people posted fewer positive
things and more negative things.
And when people saw fewer negative posts in their news feed, they posted fewer negative
things and more positive things.
So it looked like the emotions expressed by others on Facebook do influence our own.
And yes, emotional states can be contagious at a mass scale over social networks.
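The experiment's core logic – a user's next post drifting toward the tone of the feed they were shown – can be written as a toy model. The 0.3 "contagion" weight here is an invented number for illustration, not the study's measured effect size.

```python
# Toy model of emotional contagion: a user's next post drifts toward the
# average tone of the feed they just scrolled. Tones are in [-1, 1].

def expected_post_tone(feed_tones: list,
                       baseline: float = 0.0,
                       contagion: float = 0.3) -> float:
    """Expected tone of the user's next post: their own baseline mood,
    nudged toward the mean tone of the feed they saw."""
    feed_mean = sum(feed_tones) / len(feed_tones)
    return (1 - contagion) * baseline + contagion * feed_mean
```

Feeding the model a mostly upbeat feed yields a positive expected tone, and a mostly negative feed yields a negative one, which is the pattern the researchers reported.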
When the results were published an outcry followed.
Was it ethical to run an experiment on people without their knowledge that they were participating?
Was it OK to induce sadness across a population that almost certainly included people at risk
of depression or suicide?
But the bottom line was that, even outside the study, Facebook's algorithms manipulate your news feed
all the time, as they try to optimize for content that keeps you on the platform longer.
These findings only highlighted just how far they can go.
Are they actually using these techniques though?
It’s not quite clear.
But in 2017, The Australian published a leaked document from two Facebook executives on mood
targeting – where advertisers can target ads to people in certain emotional states.
Using posts, pictures and reactions, they collected data on the emotional state of more
than 6 million young Australians and New Zealanders – many of them teenagers – to find "moments
when young people need a confidence boost."
Facebook said they were undertaking research to better understand emotions rather than
using the data to serve ads.
Though just five weeks later, Facebook was granted a patent that outlines uses of emotion data.
And beyond Facebook, many companies have looked into leveraging users’ emotions.
There are hundreds of patents for emotion-sensing technology, some so creepy that hopefully
they're filed just in case they're useful and will never see the light of day.
Even Apple, which makes a tiny fraction of its profit from ads and is known for advocating
user privacy, has done research on mood targeting.
It claims it can determine mood based on different types of data, including heart rate, blood
pressure, adrenaline level, body temperature and verbal cues, the data gathered by fitness trackers.
It can also use signals like what type of content a user is viewing, which apps they're
using and when, what kind of music they're listening to, as well as interactions with
social networks, all to triangulate the user's mood.
Social media can even be used to monitor the mood of a crowd -- Snapchat, for example,
has filed a patent detailing how to detect mood from selfies and images to find the aggregate
mood of a crowd at an event.
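In spirit, aggregating a crowd's mood is straightforward once each image has been classified. A minimal sketch, assuming per-selfie emotion labels already exist (the labels themselves are invented here):

```python
from collections import Counter

# Hypothetical aggregation: assume each selfie has already been classified
# into one emotion label, then report the crowd's dominant mood and the
# share of each label.

def crowd_mood(labels: list):
    """Return (dominant mood, {mood: fraction of crowd})."""
    counts = Counter(labels)
    total = len(labels)
    shares = {mood: count / total for mood, count in counts.items()}
    dominant = counts.most_common(1)[0][0]
    return dominant, shares
```

The hard part, classifying faces in the first place, is what the patent's detection machinery would do; the aggregation on top is trivial.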
All this technology to read our emotions is cutting edge and kinda cool.
But it’s not being developed to help you in those sad moments or make you happier.
It’s used to figure out when it’s the best time to show you the next ad.
Unfortunately, the kinds of emotions that can be picked up on, amplified, and used
by the addiction and manipulation algorithms the best tend to be negative ones.
So it's a lot easier to make somebody anxious than to make them feel comfortable.
Unfortunately, it's intrinsically a negativity machine.
Now, not all of this technology is actually being used.
At least not publicly.
But there’s no guarantee that businesses are going to stop here.
We'll be more and more vulnerable.
So we'll just stop being in control if we're not careful.
So at any point like we can take back control, but we have to realize first before taking
back control that all these systems are trying to hack us.
They're trying to, uh, to derail us from what we want to do to something else in order to
make us watch more ads.
Now, advertising has always tried to appeal to our emotions.
A television network tries to keep us glued to the screen to sit through more ad breaks.
Competing for eyeballs – whether it’s on TV or Facebook – is what defines the
Attention Economy that underlies advertising.
But now, as more and more research makes it possible to understand and take advantage
of our emotional states, the Attention Economy shifts into an Emotion Economy.
Your emotional state is reflected in your online behavior.
Your activity timeline on social media can be a digital mental health footprint.
So the more connected you are, the more data you provide for piecing together an accurate
snapshot of your mood at any instant.
I really want you to stop and think about this.
Would you be ok if an algorithm figures out you shop more when you’re tired and sad,
and so shows you tempting ads for shoes once it figures out you are a bit down?
Would you be ok if you are shown luxury apartment ads when you are feeling ambitious?
For some people this signifies manipulation; for others it may just be better ads.
But mood targeting doesn’t stop with ads.
All these platforms benefit from keeping us engaged, so even news and information can
be delivered to us based on how we're feeling. If you are sad one day, you may see an online
world literally different from the one a happy person would see.
There’s potential for an unprecedented level of manipulation when businesses have so much
access to our information and the technical ability to decipher our inner feelings.
Is there an ethical line that should not be crossed?
Your Emotions Are For Sale – December 17, 2018