Whether we are shopping, listening to music online or streaming a movie, in just a few years we have grown used to complex algorithms toiling away behind the scenes, trying to know us even better than we consciously know ourselves, all, presumably, in the pursuit of helping us make better decisions, faster and more conveniently.
What these invisible lines of code essentially do is cobble together all the information they can gather about our tastes and preferences, then use it to deliver an increasingly personalised experience.
One such technique, known as collaborative filtering, has allowed some of today's largest and most popular companies, such as Spotify, Facebook and Netflix, to painstakingly cultivate key technological advantages in highly competitive marketplaces. Behind those advantages lies constant, relentless refinement of the technique.
The meteoric leaps that collaborative filtering has allowed companies to make in personalisation may prompt one to wonder what the future has in store for us. If so, research being carried out by scientists at the University of Copenhagen and the University of Helsinki may illuminate the way forward.
Until now, collaborative filtering has largely centred on the explicit behaviour of a user or a group of users. Take a simple example involving Netflix: if Netflix finds that you enjoyed Batman Begins and The Dark Knight, it is likely to recommend the final instalment of the trilogy, The Dark Knight Rises. That recommendation, however, hinges on you actually having watched the two previous instalments in the caped crusader franchise.
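The Netflix example can be sketched as a tiny user-based collaborative filter. Everything here is illustrative: the viewing matrix, the users in the comments and the similarity weighting are all invented, and real recommender systems operate at vastly larger scale with far richer signals.

```python
import numpy as np

# Hypothetical viewing data: rows are users, columns are films
# (Batman Begins, The Dark Knight, The Dark Knight Rises).
# 1 = watched and enjoyed, 0 = no signal. All values are invented.
ratings = np.array([
    [1, 1, 0],  # you: enjoyed the first two films
    [1, 1, 1],  # user A: enjoyed all three
    [1, 1, 1],  # user B: enjoyed all three
    [0, 1, 0],  # user C: only the second
], dtype=float)

def cosine_sim(a, b):
    """Cosine similarity between two users' rating vectors."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return (a @ b) / denom if denom else 0.0

def predict_score(ratings, user, item):
    """Predict a user's interest in an unseen item as the
    similarity-weighted average of other users' ratings for it."""
    others = [o for o in range(len(ratings)) if o != user]
    sims = np.array([cosine_sim(ratings[user], ratings[o]) for o in others])
    item_ratings = np.array([ratings[o, item] for o in others])
    return (sims @ item_ratings) / sims.sum() if sims.sum() else 0.0

# How likely are "you" (row 0) to enjoy the third film (column 2)?
score = predict_score(ratings, user=0, item=2)
```

Because the two users most similar to you both enjoyed the third film, the predicted score comes out high, which is exactly why the recommendation appears.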
The European researchers, though, wondered whether they could predict a person's response based purely on their pattern of brain activity, coupled with the patterns of others. To find out, they placed EEG electrodes on the heads of the study's participants and showed them various images of faces. The electrical activity recorded from the brain was then analysed with machine learning techniques to identify which faces each participant found more attractive and which less so.
Then, with another machine learning model, the researchers used brain-based data from a larger pool of participants to predict which new facial images a given participant would find attractive. The prediction was therefore not based solely on the individual's own brain signals but also, to some extent, on how other participants had responded to the images.
The fact that participants needed EEG electrodes strapped to their heads means that, for now, the technique's applications don't extend far beyond the laboratory. But by analysing brain activity - "an untapped source of information" - Dr Tuukka Ruotsalo of the University of Copenhagen's Department of Computer Science posits that "the method can probably be used to provide much more nuanced information about people's preferences than is possible today."
Yet the researchers believe the applications go well beyond improved personalisation for advertisers and streaming services seeking to sell products or retain users. They think the technique could even help us learn more about ourselves.
Says Keith Davis, the lead author of the study, “I consider our study as a step towards an era that some refer to as 'mindful computing,' in which, by using a combination of computers and neuroscience techniques, users will be able to access unique information about themselves. Indeed, brain-computer interfacing as it is known could become a tool for understanding oneself better.”