Recruiting through Algorithms – How Social Media Grey Zones Are Exploited

By Franziska Kuehl

Instagram is a safe space for aesthetically pleasing pictures, videos, and humorous content. Sure, politicians use the platform to communicate with their constituents and push their political agendas, but such content is easy to avoid. The platform's algorithm is designed to persuade you to spend ever more time on it, because that is how the company earns money, and it does so by showing you more of the kind of content you already engage with. Hence, if you do not actively seek out political content, it will in all likelihood not be suggested to you. Yet platforms such as Instagram are among the main tools extremist organizations use to recruit new members.
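How engagement turns into "more of the same" can be pictured with a toy content-based recommender. This is only a minimal sketch with invented posts and tags; Instagram's actual ranking system is far more complex and not public.

```python
# A minimal, illustrative content-based recommender: invented posts and tags,
# not Instagram's real (and non-public) ranking system.
from collections import Counter

posts = {
    "p1": {"travel", "nature"},
    "p2": {"politics", "patriotism"},
    "p3": {"fitness", "lifestyle"},
    "p4": {"nature", "homeland"},
    "p5": {"politics", "homeland"},
}

def recommend(engaged_ids, candidate_ids, top_n=2):
    """Rank candidates by how much their tags overlap with past engagement."""
    interest = Counter()
    for pid in engaged_ids:
        interest.update(posts[pid])
    ranked = sorted(
        candidate_ids,
        key=lambda pid: sum(interest[tag] for tag in posts[pid]),
        reverse=True,
    )
    return ranked[:top_n]

# A user who engaged with nature/homeland content is shown more of the same.
print(recommend(["p1", "p4"], ["p2", "p3", "p5"]))  # -> ['p5', 'p2']
```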

Although that may sound far-fetched, a recent in-depth investigation by the independent journalism collective CORRECTIV shows exactly how it works. In the months leading up to the publication of their report on 6 October 2020, the team analyzed an enormous amount of data: 4,500 Instagram accounts, 333,000 connections between individual accounts, and 830,000 posts [1]. Using a fake account, the journalists were able to map out connections between various actors setting trends in the German right-wing scene, from politicians to activists. Those connections also show how subtly ideological messages can be fed into the algorithm.
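The mapping the journalists describe is, in essence, a network analysis: accounts are nodes, connections are edges, and heavily connected accounts stand out as likely trendsetters. A minimal sketch of that idea, with invented account names rather than anything taken from the report:

```python
# Toy network analysis: accounts as nodes, connections as edges; the accounts
# with the most connections surface as likely trendsetters. All names invented.
from collections import Counter

connections = [
    ("account_a", "account_b"),
    ("account_a", "account_c"),
    ("account_b", "account_c"),
    ("account_d", "account_a"),
]

degree = Counter()
for src, dst in connections:
    degree[src] += 1
    degree[dst] += 1

for account, count in degree.most_common(3):
    print(account, count)  # account_a appears as the best-connected node
```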

Some people might assume that the promotion of populist or radicalizing ideologies is a marginal problem, a niche phenomenon affecting only a small number of people. The report reveals this to be untrue. The fact that extremist messages have such a large yet subtle reach can be attributed to various factors, and it presumably has little to do with the intelligence or education of the consumers. Instead of blaming the consumer, one should take a closer look at the messengers.

That young populist or extremist organizations are more tech-savvy than their older predecessors is not a revelation. A growing number of organizations and companies can be found on Instagram with professional profiles, and it makes sense that these groups use the platform as the free advertising machine that it is. However, telling whether someone is selling something, be it a product, a service, or an idea, can no longer be reduced to looking for #ad. In fact, for someone openly promoting an ideology, the visual experience can be seen as an even more effective way to do so.

According to an insider and former Identitarian Movement (IM) activist, "The girls are responsible for beautiful pictures." The knowledge and skills needed to create visually pleasing content are not ingrained in women's DNA; they were taught in workshops organized by the Young Alternative (YA) or the IM [1]. The YA is the youth organization of the Alternative für Deutschland (AfD), a populist political party. The supporters producing this kind of content are supposed to appear young and attractive, and a photographer working for the YA explains why: Instagram users want to connect with positive people, not with organizations. Hence, it makes sense that not only official organizational accounts promote a certain mindset or hashtag, but individuals do too [1]. The difference is that those individuals do not necessarily disclose their ideological convictions.

Why women in particular are used as the public face of far-right ideas is also addressed in the report. Katrin Degen, a researcher at the University of Bamberg who focuses on gender roles in the right-wing scene, points to a long history of such groups consciously deploying women when it suits them. They were the ones marching in protests with signs and running for seats on local school boards. Now, women are strategically employed to introduce their followers to conservative, far-right ideologies through the back door [1]. The logic behind this decision? Unlike men, women are perceived as less aggressive and more peaceable [1].

How the recruitment process works in practice is described by the former IM activist in the CORRECTIV report. Individuals pushing a right-wing narrative publish Instagram stories with general content, usually nature photography or traditional, homeland-themed motifs. If followers or other Instagram users show interest, the person posting the story encourages personal contact to gauge whether the interested individual's views overlap with their own ideology. If they do, he or she is added to the "Close Friends" list, which means more private stories are shared with them [1]. Those more private stories can contain openly racist or far-right content, as the former IM activist explains. She remembers a story showing Adolf Hitler with a birthday cake on his birthday [1].

In addition to recruiting, women appear to serve as connective tissue between different organizations. Here the CORRECTIV team introduces the public profile of a woman who is linked to both the YA and the IM, even though such ties violate official AfD guidelines [1]. Due to its right-wing extremist and anti-democratic ideology, the IM is under observation by the German domestic intelligence service, the Verfassungsschutz. Furthermore, these women tend to choose hashtags with a generic surface meaning but ideological implications, such as #Heimatliebe (love of homeland), alongside hashtags relating to lifestyle, travel, or sports. Insiders recognize language that right-wing organizations have used for decades, while to an outsider it reads entirely differently [1].

Whether the user selects far-right ideological content intentionally or not, more of that kind of content will be suggested to them. Sooner or later, the feed of their personal account could consist primarily or exclusively of similar subject matter, ranging from the embrace of traditional gender roles to xenophobic or islamophobic commentary. The journalists describe how their fake account's feed gradually turned into a continuous stream of warnings about the Islamization of Western culture, uncritical patriotism, and the rejection of modern feminism as something that destroys femininity [1]. The bubbles users create by preferring certain content or following certain people can suggest that their Instagram feed reflects reality [1], even though it far more likely mirrors their personal preferences and interests.
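The narrowing the journalists observed can be illustrated with a simple feedback loop: if the feed is refilled in proportion to past engagement, and the user engages with one kind of content only slightly more often, that small difference compounds. The rates and numbers below are invented purely to show the mechanism.

```python
# Toy feedback loop: the feed is refilled in proportion to past engagement,
# and the user engages with ideological posts slightly more often than with
# other posts. A small difference in engagement compounds within a few rounds.
engagement = {"ideological": 10.0, "other": 10.0}    # neutral starting point
engage_rate = {"ideological": 0.6, "other": 0.4}     # invented, mild preference

for round_no in range(1, 6):
    total = sum(engagement.values())
    feed = {topic: 100 * count / total for topic, count in engagement.items()}
    for topic, shown in feed.items():
        engagement[topic] += shown * engage_rate[topic]
    share = feed["ideological"] / 100
    print(f"round {round_no}: ideological share of feed {share:.0%}")
    # prints roughly 50%, 57%, 61%, 64%, 66% over five rounds
```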

Another issue arising with the polarization of social media is the difficulty of enforcing community guidelines. Of course, it would be ridiculous to insinuate that platforms such as Instagram willingly lend themselves as advertising machines to extremist ideologues. Yet assuming that community guidelines prevent this from happening would be equally wrong. Yes, guidelines exist that forbid the promotion of hate speech and similar messages, but they are not enforced effectively, leaving a grey zone for such actors to exploit. This can be traced back to various reasons, of which the report presents these as the main ones:

Firstly, Instagram's algorithm is only as good as the engineers who created it [1]. This translates into a simple flaw: if an engineer building content-analysis algorithms does not know the terminology, symbols, or language used by a given online subculture, that vocabulary will simply not be a dimension of the algorithm, and detecting it falls to some other mechanism.
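That blind spot is easiest to picture with a naive keyword filter: it can only flag terms its authors thought to put on the list, and coded language passes straight through. The sketch below is deliberately simplified and uses invented placeholder terms; real moderation pipelines rely on machine-learning classifiers rather than plain keyword lists.

```python
# Deliberately naive keyword filter: it can only flag terms its authors put on
# the list, so coded or euphemistic language passes straight through unflagged.
FLAGGED_TERMS = {"explicit slur", "banned symbol"}   # invented placeholder terms

def is_flagged(caption: str) -> bool:
    text = caption.lower()
    return any(term in text for term in FLAGGED_TERMS)

captions = [
    "Rally this weekend, banned symbol in bio",    # caught: listed term appears
    "Nothing but #heimatliebe on today's hike",    # missed: coded hashtag not listed
]

for caption in captions:
    print(is_flagged(caption), "->", caption)
```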

Secondly, extremist content is supposed to be caught by a global team of about 350 people dedicated to analyzing trends in symbols and language, with the aim of preventing extremist individuals, organizations, and content from circulating on the platform [1]. Although that sounds somewhat reassuring, it is questionable whether such a team can cover Instagram's roughly 500 million daily active users [1]: that works out to well over a million users per team member. Additionally, the German-speaking team members would need to understand the niche the German right-wing scene occupies in a global context and how it adapts to changes in platform guidelines and algorithms.

Thirdly, there is strong evidence that far-right activists are well aware of the aforementioned grey zones and the opportunities they present. The CORRECTIV team found that certain symbols and pictures were pixelated or otherwise covered, with the posting users encouraging their followers to find the uncensored picture on Telegram [1]. This demonstrates not only that these subcultures know how to circumvent community guidelines, but also that their members want to extend their following to other platforms and messengers, immersing their followers in ever more black-and-white content that predicts an exaggerated worst case brought about by those who look or think differently.

The CORRECTIV report goes into more detail about the monetization of ideologies on Instagram and about how the German AfD is connected to other populist, right-wing and partly anti-democratic organizations without explicitly saying so. Even that is not the whole picture, but rather a sample from one specific subculture: it may have started off as a niche presence, but it has grown into an open, widely accessible force that succeeds by having women introduce their followers to the ideology in a highly nuanced way.

What becomes increasingly clear is that the algorithms social media platforms currently use amplify the polarization of their users, and that our perception of reality is shaped by the content we consume through our screens and digital environments. Above all, the report demonstrates that the strategies currently employed to counter this are not working.

Sources

[1] https://correctiv.org/top-stories/2020/10/06/kein-filter-fuer-rechts-instagram-rechtsextremismus-frauen-der-rechten-szene/#wie-tausende-rechte
