The Filter Bubble
What The Internet Is Hiding From You
Eli Pariser
You're getting a "free" service from Google and Facebook, and the cost is information about you. And Google and Facebook translate that pretty directly into money.
Personalization not just of ads, but of news, and of the sites we see when we search. That applies to what you see on a dating site like OkCupid or a restaurant-recommendation site like Yelp. So the algorithms are starting to orchestrate our lives.
All recorded data from the dawn of civilization up to 2003 amounts to about 5 billion gigabytes. Now we're creating that much data every two days.
When you read books on a Kindle, Amazon monitors which phrases you highlight, which pages you turn, and whether you read the book right through or jump around the text - and then uses that data to recommend other books you might enjoy, and to personalize/rearrange the Amazon website when you visit, so that you see choices tailored just for you.
Metcalfe's Law: the utility of a network increases at an accelerating rate as you add each new person to it. Pointless to be the only person on it, but if everyone else you know uses it, it's a huge disadvantage not to be in the loop.
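The book doesn't spell out the arithmetic, but the standard reading is: a network of n people allows n(n-1)/2 pairwise connections, so its potential value grows roughly as n squared - 10 users give 45 possible connections, 100 users give 4,950 - which is why each new member makes the network more valuable to everyone already on it.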
A little-known company hit the news after 9/11. It turned out that Acxiom knew more about eleven of the nineteen hijackers than the US Govt did, including their past and present addresses and the names of their housemates. Acxiom stores up to about 1,500 pieces of data on 96% of Americans and half a billion others worldwide, including how often they pay their credit card bills, what cars they own, what pets they keep, and what medications they take.
They sell data in imaginative ways. For example, if you go on Travelocity to look for flight info, Travelocity sticks a cookie on your computer. It then sells that bit of data to Acxiom, which auctions it to, say, United Airlines. The whole process, from the moment the cookie is placed to the sale to United, takes less than a second. United can then follow you around the web, placing ads for relevant flights in front of you as you browse.
Personalization not just on your computer or phone - if you sit down in an airline seat, the ads you see displayed are different from the ones your neighbour sees; the airline, after all, knows who you are.
"Freedom of the press used to be just for those who owned one. But now everyone does."
A CIA debate led to an analysis of how analysts made intelligence decisions. "Analysts should be self-conscious about their reasoning process. They should think about how they make judgements and reach conclusions, not just about the judgements and conclusions themselves."
Big problem is that we tend to believe that the world actually is as we see it, even though we should be aware that we do not have complete information. But we tend to believe we have all the facts (we need) and that the patterns we see in them are facts as well.
Need to understand the distorting lenses through which all information reaches us. Some of it is external - a lopsided selection of data, for example - and some is internal - jumping to conclusions based on appearances, for example.
Our brain is very good at compressing data - almost all of what we see never makes it through to conscious notice. And we do the same thing with ideas. When we see/hear news, we forget bits we don't think important, and we ignore most of the context.
A political scientist, Philip Tetlock, invited a range of experts to make predictions about the next ten years - would the Soviet Union fall, in what year would the US economy start growing again, and so on. And he asked the same questions of 'men in the street' - plumbers and teachers with no special expertise. The results were a surprise - not just that the average folks' predictions beat the experts'; the experts' predictions weren't even close.
Experts have a lot invested in the theories they've developed to explain the world. And after they've been working on them for a few years, they start to see everything in terms of those theories. And they ignore any data that contradicts their theories.
Big weakness of personalized news is absence of serendipity. When we read a newspaper we see a headline about floods in Pakistan. We don't care about that so we don't read the story, but we are briefly reminded that there is a place called Pakistan. In the filter bubble we never even see the headline.
In the early days of the Internet, anonymity ruled. It was seen as a strength - everyone would be treated as an equal, without prejudice. But today that anonymity is vanishing, for two reasons. Sites need to know who they are dealing with and how trustworthy those people are, and need to be able to trace criminals or trolls. And advertisers want to know who is looking at their products, and what factors are influencing purchase decisions. PeekYou has patented ways of linking pseudonyms with real identities. Phorm inspects packets en route through ISPs' networks, for advertising and security purposes. And BlueCava is compiling a database of every computer and smartphone, so that even with the highest privacy settings, your hardware can tie you to your browsing activity.
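A minimal sketch of the idea behind device fingerprinting (the attribute list and hashing here are illustrative assumptions, not BlueCava's actual method): a handful of innocuous hardware and browser details, hashed together, is often enough to re-identify a machine even after cookies are cleared.

```python
import hashlib

def device_fingerprint(attributes: dict) -> str:
    """Hash a set of hardware/browser attributes into a stable ID.
    The attributes are illustrative; real fingerprinting services
    use many more signals (fonts, plugins, canvas rendering, etc.)."""
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Two visits with cookies cleared still produce the same ID,
# because the underlying hardware/software mix hasn't changed.
visit = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen": "1920x1080",
    "timezone": "UTC-5",
    "gpu": "NVIDIA GeForce RTX 3060",
}
print(device_fingerprint(visit))  # same value every time for this machine
```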
Court case Wisconsin vs Yoder, in which Amish parents sought to prevent their children attending public school so that they wouldn't be exposed to modern life. A commentator argued that this was a real threat to the children's freedom: not knowing that it was possible to grow up to be an astronaut was just as unfair as being prohibited from becoming one.
Different people respond to different selling pitches. Some like to have an expert vouch for the product. Others go for the most popular item, or the money-saving deal, or a trusted brand. Some people like a 'high cognition' message where they have to figure out the story. And some types of pitches really turn some people off - some think a discount offer means the product is lower quality. These preferences are fairly constant. The type of books you read may not predict what sort of clothes you'll buy, but if you go for a "20% off this week only" deal for one product, you will probably respond the same way next time the same incentive is offered. So Amazon offers you different types of deals, works out your persuasion profile, and then sells that to the market.
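A toy sketch of how a persuasion profile might be applied (the pitch labels and logic are assumptions for illustration; the book doesn't give an implementation): log which pitch type each customer actually responds to, then lead with that type next time.

```python
from collections import Counter

def best_pitch(response_history: list[str]) -> str:
    """Return the pitch type this customer has responded to most often
    ('discount', 'expert_endorsement', 'most_popular', ... - labels are
    illustrative). Falls back to 'most_popular' if we know nothing yet."""
    if not response_history:
        return "most_popular"
    return Counter(response_history).most_common(1)[0][0]

# A customer who has twice taken a "20% off this week only" offer
# gets shown discount-framed pitches again on the next visit.
print(best_pitch(["discount", "expert_endorsement", "discount"]))  # -> "discount"
```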
The problem with algorithms: overfitting/stereotyping. Pattern-finding programs like Netflix's recommender go wrong when they jump to unfounded conclusions. The Army might look at your Facebook friends, see that six of them have enlisted, and decide that you are a prospect too. That's fine, because the deduction is obvious and doesn't pre-empt your decisions. But what if a bank sees that some of your friends, or someone who likes the same things you do, are bad debtors, and refuses you a loan? You aren't told why, and you don't get a chance to explain why the prediction doesn't hold for you.
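A deliberately crude sketch of the 'guilt by association' rule the book warns about (data and threshold invented for illustration): the model flags a reliable borrower purely because of a pattern among their friends, and never tells them which pattern tripped the decision.

```python
def flag_as_credit_risk(friend_defaults: int, total_friends: int,
                        threshold: float = 0.3) -> bool:
    """Flag a person if 'too many' of their friends have defaulted -
    a correlation that says nothing about this individual."""
    return total_friends > 0 and friend_defaults / total_friends >= threshold

# A dependable borrower with a few unlucky friends gets refused.
print(flag_as_credit_risk(friend_defaults=4, total_friends=10))  # True
```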
Idea of a 'local maximum'. You could write a simple program to help a blind man climb a mountain. It could say: "Feel around you for the part of the terrain which goes up. Go one step in that direction. Repeat." Which is fine, except it will probably leave the blind man on top of a small hillock in the foothills - a 'local maximum'. Similarly, a movie-prediction program that just builds off previous choices will end up stuck down a narrow alley (a genre). It needs the random wildcards that lead to change.
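A minimal sketch of that blind-climber program (the terrain function and numbers are invented for illustration): the climber ends on the first hillock it finds, because no single step from there goes up - exactly the trap a recommender falls into when it only ever takes the locally 'best' next item.

```python
from math import exp

def hill_climb(height, start, step=1.0, max_steps=1000):
    """Greedy 'blind climber': move one step in whichever neighbouring
    direction is higher; stop when neither neighbour goes up."""
    x = start
    for _ in range(max_steps):
        best = max([x - step, x + step], key=height)
        if height(best) <= height(x):
            return x  # a local maximum - not necessarily the summit
        x = best
    return x

# Invented terrain: a small hillock near x=2 and the real peak near x=8.
def terrain(x):
    return 3 * exp(-(x - 2) ** 2) + 10 * exp(-(x - 8) ** 2)

print(hill_climb(terrain, start=0.0))  # ends at 2.0 - stuck on the hillock
```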