Filter/Content Bubbles
Tom Clark
Quick History
○ Search engines did not personalize information
○ Simply used keywords to find pages
○ 2005: Google implemented a personalized search algorithm for ALL users
○ Social media followed suit
○ Facebook
○ Twitter
○ Reddit (later on)
○ Many more
○ Able to profile individual inquirers
○ “Shallow”: just knows specific events
○ Goal: provide relevant information
○ “You like this, so you must like this”
○ News outlets would pick sides and cover content relevant to their readers
○ A light filter bubble of content for their readers
○ They published what their subscribers wanted
○ You chose sources that aligned with your views
○ Made it profitable for them as well
○ If you clicked on something, you want more of it
○ Location history
○ Regardless of whether you have an account
○ “Likes” mean you enjoy it
○ Clicks mean you’re interested
○ How fast/slow you scroll
○ Websites you visit (tracking pixels)
○ Facial recognition in tagging = locations
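Signals like these can be combined into a crude interest profile. As a minimal sketch (the weights, signal names, and fields below are invented for illustration, not any platform’s real algorithm):

```python
# Hypothetical sketch of signal-weighted interest scoring; all weights and
# names are made up, not any real platform's ranking system.
from collections import defaultdict

# Assumed weights: explicit "likes" count more than clicks, and slow
# scrolling (dwell time) counts a little.
WEIGHTS = {"like": 3.0, "click": 1.0, "slow_scroll": 0.5}

def interest_profile(events):
    """Aggregate (signal, topic) events into per-topic scores."""
    scores = defaultdict(float)
    for signal, topic in events:
        scores[topic] += WEIGHTS.get(signal, 0.0)
    return dict(scores)

def rank_feed(posts, profile):
    """Show topics the user already engaged with first -- the feedback
    loop that narrows a feed into a bubble."""
    return sorted(posts, key=lambda p: profile.get(p["topic"], 0.0), reverse=True)

events = [("like", "politics"), ("click", "politics"), ("slow_scroll", "sports")]
profile = interest_profile(events)  # politics scores 4.0, sports 0.5
feed = rank_feed([{"id": 1, "topic": "sports"}, {"id": 2, "topic": "politics"}], profile)
```

Because past engagement drives what ranks first, the user sees more of the same topic, which generates more engagement with it: the bubble reinforces itself.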
○ User searches show what movies they should add to their service
○ Movie suggestions
○ Suggests people to follow
○ What order to show tweets in
○ You see topics you’re interested in
○ Opinions that agree with you
○ Googling “restaurants nearby” uses location
○ As a CS student, googling “MIPS” should show assembly language content
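A toy way to picture that kind of query disambiguation: the same query resolves differently depending on a stored user-context signal. (The lookup table and context labels below are invented placeholders, not real search results.)

```python
# Toy example of context-aware search: identical queries return different
# results for different user profiles. All entries are invented placeholders.
RESULTS = {
    ("MIPS", "cs_student"): "MIPS assembly language tutorial",
    ("MIPS", "general"): "MIPS (disambiguation)",
    ("restaurants nearby", "boston"): "Restaurants in Boston",
}

def personalized_search(query, context):
    """Look up a result keyed by (query, user context), with a generic
    keyword-only fallback when no context entry matches."""
    return RESULTS.get((query, context), f"Top pages matching {query!r}")
```

The fallback branch corresponds to the pre-2005 keyword-only behavior described earlier; the keyed lookup stands in for the personalized path.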
○ Lack of information diversity
○ “...are looking for, but its lack is problematic when you do not” - Thomas Simpson
○ Can’t receive all information if some is hidden
commonality and not fragmentation.
○ “Consumers reportedly use the filters to expand their taste rather than limit it”
○ Looked at how they voted in the 2012 election versus their history
○ Web searches and social media contributed to ideological segregation
○ Found they were only being shown pages from their side of the spectrum
○ Twitter users have access to a wider span of viewpoints directly from political actors or through their friends/relatives.
○ Pull from thousands of feeds into one place
○ Buttons to “see less of this”
○ Ability to hide all stories from a specific source
○ Similar to the Facebook feed
○ Analyzes scrolling speed, location, clicks
○ The more your feed is filtered, the more isolated you become
○ Censorship: blocking content the government deems bad
○ Gives government control of what ideas are passed through networks
○ Reduces minority opinions
○ Russians used fake accounts to influence voters through social media
○ Worked to further separate opinions
○ Echo chambers of potentially fake information
○ Leads to “information blindness”
○ Makes users more susceptible to confirmation bias
○ “Fake news” effect
○ Cambridge Analytica’s voter profiles highlight problems with filter bubbles
○ Christopher Wylie: “...The firm had the ability to develop “psychographic” profiles of those users [to] shape their voting behavior”
○ Leads to better ad recommendations and thus more money for companies
○ Increases user happiness and gives valid information
○ However, people are becoming more isolated in their ideas, i.e. information blindness
○ Facebook/Cambridge Analytica → Yes!
○ Companies sell user metadata to advertisers
○ If the search engines are giving relevant information, is it bad?
○ Invalidates the categorical imperative
○ The question is not whether personalization is useful, but whether filter bubbles are ethically right
○ The intent is to provide relevant information; the side effect is the bubble
○ Better recommendations
○ More relevant information
○ Increased user happiness
○ Information blindness
○ “Us versus them” mentality
recommendations.
recommendation they have.
○ Does the act conform to moral principles and rules?
intellectual isolation.
○ Filtering is not the right thing to do
○ Even though it increases happiness
○ Morality depends on the intrinsic characteristic of the act and not the result, even though the result is good
○ Scientists are divided on whether it is good or bad for users
○ “...better than we know our own” - Eli Pariser