Muhammad Salim

Reflect on that precise moment when you are about to open the YouTube app, telling yourself that you will watch just one specific five-minute video – that is it, just a little enjoyment. Once you press play, one video leads to another until you realise you have spent far more time than you intended, because the YouTube recommendation system keeps surfacing ever more interesting things to engage with. The way this attention-catching recommender technology operates is not always healthy or intellectually enriching. As one study pointed out: “YouTube’s recommendation algorithms, which determine 70% of what billions of people watch, have found that a great way to keep people watching is to suggest content that is more extreme, more negative, or more conspiratorial.” But what is this YouTube recommendation system? How does it grip our attention so powerfully? How does it know what we like and what we don’t? Why are YouTube’s algorithms so good at predicting exactly what will keep us watching video after video? These questions offer useful insights for critically evaluating our psycho-cognitive relationship with sophisticated emerging technological systems like the YouTube recommender.

63% of YouTube’s watch time comes from mobile devices (Statista, 2021).

YouTube’s Recommendation System:

The YouTube recommendation system comprises a set of complex algorithmic models. From a layperson’s perspective, it first requires input in the form of raw data, i.e., the videos available from different interconnected sources. In the initial stage, a filter network called the ‘Candidate Generation Network’ selects a set of videos based on the user’s personal information tracked through their login, including likes, subscriptions, watch history, watch time, geolocation, age, gender and so on. This network is designed to personalize the selected videos as closely as possible to the user’s interests and virtual affiliations. In the second stage, the selected videos are passed to another network called the ‘Ranking Network’. This network takes the candidates as a referential pool, enriched with videos from sources beyond the user’s log data – such as newly uploaded material that may contain unseen videos matching the user’s interests – and scores them in order of predicted relevance. In this way, as explained in the relevant literature, the outputs of both networks are combined and surface as recommendations. Throughout these processes, YouTube takes great advantage of advanced technologies such as artificial intelligence and deep learning.

                Architecture of YouTube’s video recommendation system
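The two-stage funnel described above can be sketched in a few lines of code. This is a deliberately toy illustration: the video catalogue, feature names, and scoring weights below are all invented for the example, and the real system replaces both stages with deep neural networks trained on vastly richer signals.

```python
# Toy sketch of a two-stage recommender funnel (candidate generation + ranking).
# All data and weights are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    topic: str
    freshness: float       # 0..1, newer uploads score higher
    avg_watch_time: float  # minutes viewers typically spend watching

def generate_candidates(catalog, user_interests, limit=100):
    """Stage 1: narrow a huge catalogue down to a small pool of videos
    that match the user's logged interests (history, likes, subscriptions)."""
    matches = [v for v in catalog if v.topic in user_interests]
    return matches[:limit]

def rank(candidates):
    """Stage 2: score each candidate with a richer feature set and order
    by predicted engagement (here a toy weighted sum)."""
    def score(v):
        return 0.7 * v.avg_watch_time + 0.3 * v.freshness
    return sorted(candidates, key=score, reverse=True)

catalog = [
    Video("a1", "cooking", 0.2, 6.0),
    Video("b2", "chess",   0.9, 4.5),
    Video("c3", "cooking", 0.8, 8.0),
    Video("d4", "travel",  0.5, 7.0),  # filtered out at the candidate stage
]
recs = rank(generate_candidates(catalog, {"cooking", "chess"}))
print([v.video_id for v in recs])  # most engaging matching videos first
```

Note how the two stages play different roles: candidate generation trades precision for breadth and personalization, while ranking spends more computation per video on a much smaller pool.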

YouTube recommender as a persuasive technology:

The YouTube recommendation system acts as a powerful persuasive force that pushes us to view as many videos as possible. As mentioned above, YouTube has access to billions of videos and to the related users’ personal information, which enables it to model an individual’s behaviour and cognitive priorities. Given this wealth of relevant data, the recommender chooses only those videos that resonate with the user’s psycho-emotional context. It does so by engaging the brain’s reward system: in simple terms, the recommender shows selected videos that directly or indirectly trigger dopamine release by providing highly personalized informational rewards. This process keeps reinforcing itself through a constant supply of further relevant recommendations – the more views, the more recommendations, and vice versa.

Amidst such impulsively interactive connections, YouTube further increases viewing probability by engaging users in a way that makes them feel autonomous. It makes the user feel completely anonymous and free to express themselves, though this is not the case. It is because of these addiction-inducing practices that the recommendation system is the main source of views on YouTube, as mentioned earlier. From the above discussion, it becomes clear that a technology like YouTube can never be a neutral tool, as the idea of instrumentalism would have it.

“If something is a tool, it genuinely is just sitting there, waiting patiently. If something is not a tool, it’s demanding things from you. It’s seducing you. It’s manipulating you. It wants things from you. And we’ve moved away from having a tools-based technology environment to an addiction- and manipulation-based technology environment. That’s what’s changed. Social media isn’t a tool that’s just waiting to be used. It has its own goals, and it has its own means of pursuing them by using your psychology against you” (Tristan Harris in The Social Dilemma).

                                  YouTube dominates kids’ attention

Why not selectively choose what deserves our attention?

The YouTube recommendation engine provides an extremely engaging yet narrow selection of videos. The algorithms mostly offer the same kind of content a user has previously watched. This channels the user’s attention towards a specific area of interest without taking socio-ethical values into account. In doing so, it creates an individualized bubble and echo chamber that prevents users from being exposed to broader and more diverse perspectives. The system has no sense of responsibility, of right or wrong; all it optimizes for is whatever keeps the user on the platform. For example, if a user is interested in certain radical ideologies, the recommendation system will further radicalize them by both serving more of the same material and thwarting access to better alternatives. In that case, the user has no incentive for thoughtful engagement; he or she can only choose among the few options that appear on the screen.
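The narrowing effect described above can be illustrated as a simple feedback loop: each watch reinforces the weight of its topic, and recommendations are always drawn from the highest-weighted topics, so the set of topics the user ever sees shrinks. The topic names, weights, and update rule below are invented for the sketch; they only demonstrate the dynamic, not YouTube’s actual model.

```python
# Toy simulation of the recommend-watch feedback loop ("the more views,
# the more recommendations"). All numbers are hypothetical.

def recommend(weights, k=3):
    """Recommend the k topics the user has engaged with most."""
    return sorted(weights, key=weights.get, reverse=True)[:k]

def watch(weights, topic):
    """Watching a video reinforces its topic's weight."""
    weights[topic] *= 1.5

# A tiny initial nudge towards one topic is enough to start the loop.
weights = {"news": 1.0, "science": 1.0, "music": 1.0,
           "sports": 1.0, "politics": 1.02}
seen = set()
for _ in range(10):
    top = recommend(weights)
    seen.update(top)
    watch(weights, top[0])  # the user clicks the first recommendation

print(recommend(weights))  # the same topic now dominates every list
print(sorted(seen))        # some topics were never recommended at all
```

After ten rounds, the initially tiny preference for one topic has compounded into dominance, and two of the five topics were never shown to the user – a minimal model of the echo-chamber effect.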

Another major concern is that the recommendation system may suggest problematic content. YouTube users have noticed that it often brings up videos they would never have thought of searching for. Such content is socio-politically unhealthy given the wide reach and societal influence of the platform. For example, as one study pointed out, “users could reach conspiratorial content via the recommender system from videos about fitness, firearms, gurus, and even small houses.” Problematic materials like conspiracy theories and reactionary political content are very effective at grabbing users’ attention and keeping them online longer. By recommending such harmful content, the YouTube recommendation system creates an environment prone to myriad social issues, including political polarization, hatred, intolerance and epistemic violence, contributing to what Johann Hari in his book Stolen Focus calls “an attentional pathogenic culture” – an environment in which sustained and deep focus is harder for all of us.

Last but not least, the YouTube recommendation system is a sophisticated technology that needs to be understood within its socio-technological context. The system works in such a way that, for a non-expert, it is immensely difficult – if not impossible – to fully comprehend the actual mechanism behind the recommended content. Given the capitalistic motives of the modern tech industry, what the recommender suggests not only distracts users’ attention but also fuels ample socio-political problems. The recommendation system damages users’ psycho-cognitive capabilities by distorting and dividing their attention. There is an urgent need for pertinent ethical values and rules that can help users regain control over such an exploitative and irresponsible socio-technological system.

The author, Muhammad Salim, is a Master’s student at University College London. This blog represents his personal views and does not represent the views of Chitral Academics.
