Monday, November 14, 2016
Filter Bubbles Change Outcomes (or Facebook EdgeRank Is Flawed)
Disclaimer: it's 2:45 in the morning. Insomnia has me thinking wildly, and this may not be good logic or reasoning, but here goes (caveat lector).
Facebook, and other systems like Google's search results, insulate us and cut us off from people who think differently than we do. This is bad for many reasons: it polarizes us and leads to bad outcomes for everyone. This "filter bubble" problem was pointed out at least five years ago, and it continues and grows today.
To believe that Facebook did not accidentally affect the election with hoax stories may be to misunderstand math, or to forget that small things are still things. Not understanding math is something Mark Zuckerberg cannot believably claim. Hoax stories shared on Facebook probably impacted the election results. An impact can be small and still be real, and maybe even decisive.
EdgeRank is the part of Facebook that chooses what you are shown in your newsfeed. If EdgeRank decides not to show you a post, you have to go to someone's page or profile manually to see it; posts won't show up in your newsfeed unless EdgeRank puts them there. On average, only about 16% of your friends or fans will be shown any given post (unless you pay to promote it).
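To make that filtering concrete, here is a toy sketch of a feed that delivers a post only to the top-scoring fraction of friends. This is my own illustration, not Facebook's actual code: the real EdgeRank algorithm and its scores are not public, and the ~16% figure is just the average reach cited above.

```python
# Toy illustration of newsfeed filtering -- NOT Facebook's actual algorithm.
# Assumption: each friend has some affinity score, and only the top ~16%
# of friends (by score) are shown the post.

def deliver_post(friend_scores, reach=0.16):
    """Return the subset of friends who would be shown the post."""
    ranked = sorted(friend_scores, key=friend_scores.get, reverse=True)
    cutoff = max(1, int(len(ranked) * reach))
    return set(ranked[:cutoff])

# Made-up scores for 100 friends:
friends = {f"friend_{i}": i % 10 for i in range(100)}
shown = deliver_post(friends)
print(len(shown))  # 16 -- the other 84 friends never see the post
```

The point of the sketch: whoever falls below the cutoff simply never sees the post in their feed, no matter how much they might have wanted to.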
Some at Facebook claim that EdgeRank didn't have any impact on the election. But small things can make a big difference.
On Saturday night, Mr. Zuckerberg posted a lengthy status update to his Facebook page with some of his thoughts on the election.
"Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes," Mr. Zuckerberg wrote. "Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other."
Small does not mean unimportant. A number of the very size Zuckerberg is dismissing could change major world events.
Under 1% is a very small amount. 0.52% is a more specific small amount. Somewhere under 1% of stories on Facebook, according to Zuckerberg, are hoaxes. 0.52% is, according to AP, the margin between Hillary Clinton's and Donald Trump's vote totals, as a share of their combined votes (as of November 13, 2016, AP has Donald Trump at 60,350,241 votes and Hillary Clinton at 60,981,118).
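The arithmetic behind that 0.52%, using the AP counts just quoted, works out like this:

```python
# Margin between the candidates as a share of their combined votes,
# using the AP counts quoted above (as of November 13, 2016).
trump = 60_350_241
clinton = 60_981_118
margin = clinton - trump            # 630,877 votes
share = margin / (trump + clinton)  # fraction of the combined vote
print(f"{share:.2%}")               # 0.52%
```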
I am not calling Mark Zuckerberg wrong, but Facebook is calling Mark Zuckerberg wrong. Facebook says people do things based on what's shown to them in their Facebook stream:
"Because Facebook Ads are placed in the [news] stream of information people view on Facebook, they’re more likely to see your ads and take action"
(emphasis added; from Facebook Business: Facebook Ads, retrieved November 14, 2016).
Facebook tells its paying customers that people act on what they see in their news streams on Facebook. Zuckerberg says less than 1% of those stories people act on are hoaxes. Could 0.52% of those stories be something people acted on recently? ;-)
For us as actors (and this post is decidedly not focused on our work, but on our online lives), EdgeRank is the machines choosing who sees what we post, including pictures and videos. Machines are not great at programming our work the way the artistic director of a theatre, a cinema owner, or the traffic department of a TV network is (traffic as in what's on air, not what's on roads). It would be nice if everyone who would be thrilled to see the work we share were shown it, but no machine in the world can curate that well. Right now, EdgeRank, or spending money to promote imperfectly (Facebook sometimes promises a wider reach than a spend will actually deliver), is all Facebook allows.
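To see how thin the machine's signal is, here is a hypothetical sketch (my own, with made-up weights; again, not Facebook's code) of a ranking score built only from interaction counts:

```python
# Hypothetical engagement-only score -- the reduction this post criticizes.
# Assumption: the only inputs are like/comment/share counts, with made-up weights.

WEIGHTS = {"likes": 1.0, "comments": 2.0, "shares": 3.0}

def engagement_score(post):
    """Score a post from exactly three database entries -- nothing else."""
    return sum(WEIGHTS[k] * post.get(k, 0) for k in WEIGHTS)

quiet_read = {"likes": 0, "comments": 0, "shares": 0}   # read by many, clicked by none
viral_post = {"likes": 50, "comments": 20, "shares": 10}

print(engagement_score(quiet_read))  # 0.0 -- invisible to this model
print(engagement_score(viral_post))  # 120.0
```

A post people read carefully but never click on scores zero here, exactly the blind spot described below.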
If Facebook wants to improve EdgeRank, including in the news stream some of the posts we don't typically interact with is a straightforward option. Remember: liking, sharing, or commenting on a post are essentially the only signals EdgeRank uses. To imagine people don't want to see, or aren't affected by, posts they don't like, share, or comment on is to imagine human behavior can be reduced to three types of database entries. It can't be. That's not even mentioning all the things we do in life that aren't strictly behavior. But this ventures into human-computer interaction, engagement measurement, and metrics, and all of that is probably best left for another post on another day...or night. Right now: good night!