Try this. Get five friends with their laptops in a room. All go to one of the top search sites. All type in the same search phrase. Compare the results.
The chances are they will all be different. Different because the search engines are all trying to personalise our search results, and the algorithms written into each search engine determine that personalisation. In a sense it is an impersonal personalisation.
When you browse the web you most likely accumulate a trail of cookies that indicates where you have been and how much you interact with any particular site. That information, along with other data, is used to display search results that are more likely to be of direct interest to you.
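To make that concrete, here is a deliberately simplified sketch of the idea: re-ranking a "neutral" result list so that sites appearing in a user's browsing trail float to the top. The site names, the trail, and the ranking rule are all hypothetical — real engines weigh far more signals than this — but it shows why two people typing the same phrase can see different orderings.

```python
# Toy illustration of cookie-based personalisation.
# The cookie trail, site names, and scoring rule are invented for the example.

def personalise(results, cookie_trail):
    """Re-rank results so sites the user has visited more often rank higher."""
    visits = {site: cookie_trail.count(site) for site in set(cookie_trail)}
    # More recorded visits to a site -> that site's result is boosted.
    return sorted(results, key=lambda site: visits.get(site, 0), reverse=True)

# A hypothetical trail left in cookies, and a "neutral" result order.
trail = ["news-a.example", "news-a.example", "blog-b.example"]
neutral = ["wiki-c.example", "blog-b.example", "news-a.example"]

print(personalise(neutral, trail))
# → ['news-a.example', 'blog-b.example', 'wiki-c.example']
```

A user with a different trail would get a different order from the same query — which is exactly the experiment with the five laptops.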
So that’s a good thing, right?
Maybe. The technology, although in its infancy, certainly has, in my opinion, the potential to transform the ability to monetise content. But there are two reasons why I think things need to change.
The first is social. If you have a political bias, or a bias towards a certain news source, then over time you will only ever be presented with that viewpoint. You never see the counter-argument. Your personal beliefs are strengthened and never challenged. For a vibrant, open democracy I think that is dangerous: it leads to the balkanisation of beliefs and to intolerance.
The second is content monetisation. It has been a long-held belief that the better the advertising industry can target adverts to consumers, the more the industry will pay for each “target”. Jason Kilar of Hulu wrote a great piece explaining the progress they are making in that regard.
There has been a “contract” between TV viewers and advertisers: sit through some (poorly targeted) adverts and get to watch free or subsidised content. But it seems to me dangerous that the very technology that delivers this personalisation is largely hidden from the consumer. The consumer either doesn’t know it is being done, or has no idea what part of their online wake is being mined for the purpose.
We need an open approach where that same contract can be made: where you and I fully understand the value ascribed to our personal data, and get to choose if, and how, that data is used. If you want to be fully anonymous and get random adverts and no content subsidisation – you should have that choice. If I want to share my every waking move and get highly targeted ads and lots of free content, I should know what data is being used.
More openness is important here. Allow individuals to see what they are potentially missing through search personalisation. Allow individuals to choose what data is used, and how. Get too clever and hide too much, and there will be a drive towards technologies like Adblock that simply block the adverts and the tracking wholesale. Secrecy and obfuscation will also engender a lack of consumer trust that will be hard ever to regain.