In the era of communication, internet and social networks, memes have acquired a huge power to disseminate and influence people. This phenomenon is mainly due to the ease with which a message resonates with people and with one's friends and acquaintances, to the lack of regulation of digital communication channels, and to the way social media algorithms are programmed: they tend to favour the platforms' financial profit rather than the fight against bots, fake accounts and the propagation of misleading messages.
In this paper, I will analyse and explain why humorous satirical memes have come to be used in influence actions and campaigns and why they are more effective than other techniques, especially among educated people. The argumentation is built on two dimensions: that of genetic baggage and programming, together with the chemistry of the human body, where hormones play a very important role; and that of the social, tribal dimension, where man is a being who wants and needs to belong to a group in order to feel safe, accepted and valued.
Keywords: meme; memetic warfare; influence; disinformation; manipulation.
1. Influencing
The influencing of the receiver must be done in such a way that they reach by themselves, through logical deduction, the conclusion desired by the manipulator (reflexive control theory) (Stamatin 2017); for this, the communication channels, the timing, the selection, the form and the packaging of the information delivered are vital for guiding the receiver's logic and thinking (Dobrescu and Bârgăoanu 2003). Keep in mind that less than 7% of people are immune to disinformation and manipulation (Volkoff 2009, 125).
Four decades of studies and research have shown that people are not very good at identifying lies in a text, with an average success rate of only 54% (Oprea 2021, 219), and that 28% of people do not understand the satire of the messages they come into contact with and consider them to be true (Munafo 2021, 31). If we also add the "sleeper effect", in which the message is more important than the sender, meaning the human inability to remember over time whether a piece of news came from a credible source or not (Dobrescu and Bârgăoanu 2003, 149), we see how vulnerable an individual is to manipulation.
Now, in the digital age, when traffic lights have been embedded in the asphalt at pedestrian crossings, for instance in South Korea, because people keep their eyes on their phones all the time (The Korea Times 2022), we live in dependence on technology and connectivity and run into the paradox of informational abundance. Every one to two years, the information available online doubles (Taylor 2022). Being assaulted daily with information on all communication channels, each individual can find and select information that confirms their beliefs, without that information necessarily being entirely false; as McIntyre said, "depending on what we want to be true, some facts matter more than others" (Bârgăoanu 2018, 85). For profitability reasons, online social networks apply technological agenda setting, delivering only the information that feeds our beliefs, and thus amplify the polarization of opinions and the segregation into informational bubbles and resonance chambers.
In 2023, 5.5 billion people have access to the Internet out of a total population of 8 billion (Worldometers 2023), and nearly 3 billion have online social media accounts, with a penetration rate in Western Europe of 80% (Dixon 2022). Looking at these data and their evolution, we notice that the fertile ground for digital manipulation and disinformation is already laid out.
Subliminal messages have the property of going unnoticed by the conscious mind while generating effects on the subconscious (a notion popularized by Freud), modifying perception and inserting doubt; where doubt occurs, more doubt is generated, and this is the most important step in influencing a receiver. "Manipulation must leave the impression of freedom of action", wrote Sonia Stan in her book Manipulation through the Press (Stan 2004, 12).
Information processing is of two kinds. The first is central processing: conscious, analytical and systematic, but slow, limited to only three to five simultaneous items and about 1,000 bits per second, with an average attention span of only six seconds (Oprea 2021). This type of analysis consumes many of the individual's resources; it is tedious and consequently avoided and rarely used. The human brain prefers the second form of information analysis, which consumes fewer resources, namely peripheral or heuristic processing. This is faster because it relies on previous experiences that generate patterns, logical shortcuts and preconceptions, with the help of which it frames the surrounding information. This type of analysis is also sensitive to emotions. Because of the shallowness of processing and the role of emotions, heuristic analysis is highly vulnerable to manipulation. The human brain is not a truth-seeking device, but an apparatus that produces a viable reality (Oprea 2021, 18).
In the age of the Internet, of the new media of communication and socialization, when the amount of information is huge, and the narratives are increasingly dramatic, influencing the receivers has become simpler because "the human brain is vulnerable to situations in which it has to understand quickly" (Bârgăoanu 2018, 20).
The actor performing influence chooses and calibrates its techniques starting from the objectives, knowing a person's desires, needs and psychic mechanisms (Dobrescu and Bârgăoanu 2003, 56).
We have divided the influencing processes into three broad strategies.
The first strategy is to exploit the physical limits of the human body's perception. The limits of the visual and perceptual system can be tested with the 25th-frame technique. The cinematographic video standard is 24 frames per second, a rate at which the sequence of static frames is transformed into a fluid, continuous image in the human brain (Kurniawan and Hara 2023). Experiments have shown that it is possible to change or add a frame containing a specific message, and the viewer is not aware of this, but the subconscious perceives it and is influenced (Karremans et al. 2006). In the case of the limits of the auditory system, frequencies outside the consciously perceptible spectrum between 20 Hz and 20,000 Hz are exploited by transmitting messages outside this range, but still close enough that the conscious mind does not perceive them, while the body and the subconscious sense them and are influenced (Volkoff 2009; Stern 2015). Studies have shown that the olfactory system also has limits of perception and interpretation, and some substances, even apparently odourless ones, can be used to influence the hormonal system, brain activity, or the mood and behaviour of a receiver. The most famous case is that of pheromones (Sela and Sobel 2010).
The second strategy refers to the use of perceptible stimuli whose manipulative purpose is difficult to become aware of or is overlooked by the receiver. We would mention the arrangement of the environment, the space, the time of day or year, the choice of colours, the emanation of smells (Spence 2015), or the staging of situations in such a way that the human decision seems free and natural, but in reality it is influenced and channelled.
The third strategy encompasses all the strategies, techniques and tools of influencing through communication, from the magic bullet theory (Dobrescu and Bârgăoanu 2003, 131) to deepfakes and AI. "The sword is always defeated by the word!", the PSYOPS motto in the NATO KFOR mission, supports this strategy (Stan 2004).
We consider that the meme, as part of the vast communication environment, is a very powerful and effective influencing tool in today's online environment, especially in the case of receivers with a higher degree of education, and in this paper we set out to demonstrate this. Memes are of many kinds, but this theory is based on those that camouflage the message conveyed under the cloak of humour. The message thus aims to ridicule an important subject or distract attention from it, or - in the case of people - to ridicule them and cast doubt on their credibility, value and correctness, as well as on their image, prestige and authority, so that the aim is to compromise rather than to challenge that person. In the long term, humorous memes can also create an attitudinal background that responds in a desired way to a particular action, at a given time, by mocking and ridiculing certain targets (Dobrescu and Bârgăoanu 2003, 61).
Further, we will explain why funny memes pass filters and personal censorship, partly due to human genetic programming and the objectivity of chemical processes in the body, and partly due to social norms and canons.
2. Humour and Laughter: Exploring the Chemical Processes in Influence Strategies
The term "humour" originates from ancient Greek through the French branch, and in the conception of the "father of medicine", Hippocrates of Kos, it signifies the fluid in the body (blood, phlegm, yellow bile and black bile), fluid balance favouring peoples' well-being (Chelcea, 2019). The related term, "comedy", derives from Comus, the ancient God of fertility, who signifies perpetual rejuvenation (Eagleton 2019, 186).
In the 18th century, the philosopher David Hartley said that "laughter and mirth frustrate our search for truth, because they prevent our minds from perceiving the true nature of things". Thanks to humorous satire, messages and memes manage to pass the censorship and filters of people's beliefs, because jokes suspend social conventions and restrictions, distort meanings, mix hierarchies, and blur distinctions and identities (Eagleton 2019). Thus, the receivers either do not actually realize that they are being influenced by the message, or they are aware of its ability to influence but appreciate its humorous value, and not only pay attention to it but also distribute it, believing that, having become aware of the attempt, the influence will not be achieved and they can simply enjoy the humour conveyed. This perception is false, because if the message comes back in various forms many times, the illusory truth effect sets in.
In marketing, it is known that a customer is convinced to purchase a product after being invited to do so on average between five and seven times, and Joseph Goebbels said that "a lie repeated a thousand times remains a lie, but a lie repeated a million times becomes the truth" (Volkoff 2009, 64).
Everything that makes us laugh or amuses us is perceived and identified, both consciously and unconsciously, as something beneficial for our physical and mental health, being imprinted in our genetic baggage, so that the reaction to the triggering stimuli is one of reception and acceptance. Sigmund Freud argued that jokes are a release of the psychic energy that we normally invest in maintaining certain essential social inhibitions (Eagleton 2019, 22). When we enjoy something, the so-called "hormones of happiness" are secreted in our body: the neurotransmitters serotonin, endorphin and dopamine, which dictate our "good" mental and physical state. The release of these hormones, through the effects they have both at the level of the brain and at the level of other organs, reduces stress, improves the immune system by stimulating the production of white blood cells, lowers blood pressure through the calming effect it generates, improves blood circulation and cardiovascular functions, relieves certain physical pains and plays a very important role in improving mental mood by combating depression and anxiety.
These hormones have a fundamentally positive role, but due to their "mechanical" functioning, they can be manipulated in such a way that they in turn influence individuals' mood, perception and behaviour. The main security loophole of the hormonal system that is exploited is the addiction these hormones generate, having a structure very close to opioids. This effect makes us very receptive and permissive every time we encounter a stimulus in this direction, in order to receive another dose of "good mood". Concretely, when we see a funny meme on social networks, we read it, watch it, analyse it and have fun with it even if the message transmitted contradicts our principles and beliefs, because the chemical processes in our body that satisfy our addiction to well-being through the secretion of "happy hormones" take precedence. Ten minutes of laughter is equivalent to an hour of Zen meditation (Ravich 2017, 108). Of course, there are also situations in which a person is so indoctrinated on a certain subject that they refuse and reject from the start even the best joke on that subject, but these cases are still few, being exceptions rather than the rule.
As manipulators exploit the chemical processes in the body, it is crucial to understand what these hormones are and how they function. Serotonin is considered the most important bioregulatory substance involved in people's emotional states, having a critical role in their good mood and in preventing anxiety and depression. It can alter the activity of neurons by changing their permeability to sodium, potassium and chloride ions, which influence the electrical voltage of nerve cells. When serotonin decreases the permeability to sodium ions and increases the permeability to chloride ions, neural activity is reduced and thus a state of well-being is established (Houellebecq 2022). Serotonin is produced 80-90% in the gastrointestinal tract and the rest in the pineal gland, in the brain. It was discovered in 1948 by a team of US doctors from a clinic in Cleveland (Dăscălescu 2022).
Endorphin was discovered accidentally in 1960 at the University of California, San Francisco, by the neurochemist Choh Hao Li, while he was analysing fat metabolism, but he did not realize the importance of the discovery at the time. Fifteen years later, Hughes and Kosterlitz described the effect of endorphins, and Hao Li returned to his experiments and discovered the amazing effects of the substance. Endorphins are neurotransmitters that have properties similar to opioids and play an important role in regulating pain and well-being. Injected into the brain, they are 48 times stronger than morphine, and injected intravenously, 3 times stronger. Like morphine, endorphins are highly addictive (Ulieriu 2007).
Dopamine, known as the "hormone of pleasure", is a neurotransmitter closely related to endorphin. It is part of the brain's reward system and plays an important role in creating addictions of any kind. It also regulates the strength and nature of emotions. Laughter and humour can activate the same areas of the brain associated with reward and pleasure, which can trigger the release of dopamine; even anticipating a funny situation can increase dopamine levels, improving mood and creating a sense of well-being. Dopamine was first synthesized in 1910 by George Barger and James Ewens of the Wellcome Laboratories in London, and its function as a neurotransmitter was discovered in 1958 by Arvid Carlsson and Nils-Åke Hillarp, in the Swedish Pharmacology Laboratory of the National Cardiology Council (Nestler et al. 2009).
On today's online social networks, every "like" we receive from virtual friends gives us a shot of dopamine that keeps us coming back online to social networks dozens of times a day. The exploited psychological process that triggers the release of dopamine is that of recognition, appreciation and social belonging.
3. The Role of Humour and Laughter in Social Dynamics
Man, a result of animal evolution, has a few primal instincts inherited for thousands of years: fight or flight (in dangerous situations), feeding, mating, competition, and belongingness (group membership) (Beer 2017). He evolved as a hunter, but discovered over time that he has a better chance of survival in a group and, as a result, began to develop his tribal capacities for coexistence and conformity with others (Oprea 2021, 17).
Humour is a way to facilitate communication, the exchange of information and ideas, and socialization between people. At the same time, it is a social coagulant: people are social beings and need to understand and use this tool to be able to get closer, integrate into the community and achieve social cohesion. Making jokes and laughing with other group members increases the sense of connection and helps strengthen relationships. From an evolutionary perspective, humour also contributed to the survival and perpetuation of the individual who possessed this ability, by increasing the capacity to deal with difficult situations: reducing stress, improving mental health, developing communication skills, forming social bonds, avoiding and de-escalating conflicts. People who had the ability to make and understand jokes were more likely to be accepted and appreciated in the group, which gave them survival and reproductive advantages. Happiness, a sense of security, inner peace and the satisfaction of being appreciated and validated by the group depend on social inclusion, which is based on the human capacities of communication, interaction and making ourselves liked. These aspects are decisive for the psychological development of human beings and for the stability of their mental and physical health (Psychologies 2016).
Historically, Aristotle assigned humour first place in the category of the three social virtues, next to friendship and sincerity (Eagleton 2019, 114), and the first laughter in Western literature occurs in Book 1 of the Iliad, when the gods ridicule the lameness of Hephaestus, the god of fire.
From the point of view of social persuasion, many studies have shown that humour plays an important role because it transmits a positive emotion that contributes to increasing the effectiveness of persuasion (Oprea 2021, 114). Humour can function as a means of defusing tension and making difficult, delicate, taboo or tense topics less threatening and easier to discuss. People perceived as having a developed sense of humour are considered by those around them to have better social interaction skills (Bell et al. 1986) and to be more intelligent and more attractive to the opposite sex, which makes it easier for them to find a partner (Greengross and Miller 2011). A sense of humour also plays an important role in creating popularity, and popularity attracts more popularity (Bârgăoanu 2018, 119), a characteristic that generally helps the individual in social interactions. Humour also contributes to obtaining material benefits: O'Quin and Aronoff carried out an experiment demonstrating that when a financial request is made with humour, individuals tend to accept it in a greater proportion (O'Quin and Aronoff 1981). People prefer the company of those who reward and resemble them, and are more easily influenced by those they like or admire.
Today's popular online social networks are built on exploiting people's need for recognition and belonging, because they fear isolation even more than error (Bârgăoanu 2018, 18) and the sense of humour plays an important role in this equation.
Laughter, a response to humour, is contagious and can have tremendous power over people, reaching the point where it can no longer be controlled. In the past, there have been laughing epidemics in China, Siberia and Africa that have engulfed entire towns (Eagleton 2019, 17).
According to the theory proposed by Rod Martin and Thomas Ford, there are four main types of humour, each with a specific role in social interaction.
The first type, affiliative humour, focuses on consolidating social bonds by expressing positive emotions and creating pleasant experiences. This type of humour can be observed in jokes that emphasize common experiences or positive traits of those around us.
The next type, self-protective/self-enhancing humour, is used to cope with stressful or unpleasant situations by finding humorous aspects in a difficult situation. For example, in contexts involving evaluations or tests, people may use this type of humour to reduce stress and tension.
The third type, aggressive humour, involves hostile or harmful behaviour towards a certain group or individual, through jokes or humour. This type of humour may be considered unpleasant or offensive by those targeted and can lead to social tensions or conflicts.
Finally, self-deprecating/self-defeating humour involves self-irony and self-deprecation and is used to reduce social tension or avoid conflicts. This type of humour can be found in jokes that emphasize one's own mistakes or limitations, in order to make others feel more comfortable (Martin and Ford 2018).
In Jewish culture and religion, laughter is respected; there is even a saint with this name, Isaac ("the one who laughs") (Ravich 2017, 88), but other nations also pride themselves on the quality of their humour. Probably the most recognized are the British, where banter and mockery have been a real attraction among club members in political discussions since the 18th century, and among the population jokes still revolve around the conflict between class cultures (Eagleton 2019, 35). Romanians have also come to develop their sense of humour, probably stimulated by the years of communism, in which freedom of expression was limited and censorship was everywhere. People turned to humour both as an outlet for frustrations, by making fun of trouble, and as a method of protest against the rulers, inventing thousands of jokes and sayings, which they told in small groups of trusted members. Thus, the joke became an instrument of a popular culture of resistance against the communist regime, one that loosened the grip of restrictions and had the tacit purpose of creating connections between people with the same values and principles.
Anthropologist Mary Douglas stated, in 1968, that "all jokes are subversive" (Douglas 1968).
4. Memes
The word "meme" was coined by Oxford biologist Richard Dawkins and first published in his book The Selfish Gene in 1976. It has since been picked up and used by psychologists and cognitive science researchers. The first definition of the meme stated that it "is the basic unit of cultural transmission or imitation". Later, after the term was taken up in other fields, it also received other definitions adapted to the respective sciences. For instance, there is Plotkin's psychological definition: "The meme is the unit of cultural inheritance analogous to the gene. It is the internal representation of knowledge". Here is another cognitive definition, belonging to Dennett: "A meme is an idea, a type of complex idea that constitutes itself into a distinct memorable unit. It spreads through vehicles that are physical manifestations of the meme". The fourth definition found, that of Richard Brodie, sounds like this: "A meme is a unit of information in a mind, the existence of which influences events in such a way that several copies of it are created in other minds" (Oprea 2021, 169). Finally, we also suggest a definition: a meme is a representation of a message in the form of an image, video, audio or a combination of them, mostly humorous and having the ability to go viral online due to the societal clichés it addresses, the linguistic, attitudinal or behavioural fashion, with which many people resonate.
The science that studies memes is called memetics and is based on evolutionary psychology. It looks at how memes function, interact, evolve, propagate and multiply (Brodie 2015, 7). This new science combines biology, psychology and cognitive science and took shape in the 1980s, with Aaron Lynch, Howard Bloom, Susan Blackmore and Richard Brodie as its main researchers.
Humorous satirical memes are a funny critique of real or invented flaws and weaknesses of people and society. Their characteristics are irony, ridicule, exaggeration, the construction of unusual relationships of absurd appearance, and sarcasm - a term originating from ancient Greek, referring to the tearing of flesh (Eagleton 2019). The term "satire" was coined by Quintilian and comes from the Latin word "satura", which originally meant disorder or clutter. Satire is of three kinds: Horatian, in which humour is at ease and uses moderate comments and statements; Juvenalian, in which humour takes a back seat and criticism becomes more bitter and dark; and Menippean, in which serious moral judgments are used to address controversial topics (Gotlieb 2019). Most of the memes identified to date fall into the first two styles of satire, Horatian and Juvenalian.
Around 2003, with the emergence and then the development of large-scale online social platforms, memes rooted in satire also took off; they have since become part of Internet culture and are present daily in the feeds of social networks in the form of images, videos, text messages or combinations thereof.
Memes are a very important tool in disinformation and manipulation campaigns because, through humour, they go viral more easily, including among educated, sceptical, defensive and insightful receivers, and influence them more or less subliminally to adopt certain attitudes or behaviours. They are widely used in election campaigns for ridiculing press subjects, creating information diversions, denigrating and mocking political opponents or creating a favourable image of a certain political figure. One of the first election campaigns that made intensive use of the Internet and memes was the 2004 campaign between George W. Bush and John Kerry.
Major Michael Prosser of the United States Marine Corps, familiar with the teachings of the great Chinese strategist Sun Tzu, who claimed that the whole art of war is based on deception, warned as early as 2006 that memes can be effective tools for the disinformation and manipulation of people and called for the creation of a specialized structure to study this new weapon, to be integrated into the new types of conflicts; later, the Armed Forces introduced this discipline into the curriculum (Prosser 2006).
Since 2016, the influence capacity of memes has been officially acknowledged through the inclusion of the term memetic warfare in NATO reports (Giesea 2016). The strategy of using memes is considered part of guerrilla information warfare, with the aim of controlling narratives and the psychological space in order to disrupt, denigrate, undermine and manipulate the perceptions of the enemy and change their attitudes. The use of memes is an asymmetrical action, because the impact can be much greater than the resources invested (Memetic Warfare 2017).
Among the major global players, Russia has been and remains a master of disinformation and manipulation, with a whole history behind this strategy. Lenin began to perfect information warfare as a cheap, but effective solution in the fight against the richer, and more technologically advanced West - a strategy also adopted by later Soviet leaders.
5. Spreading Memes
In 1580, Pope Gregory XIII established the committee of cardinals De Propaganda Fide, for the propagation of the faith (Guilday 1921). Later, Napoleon propagated his influence through printing (Wayne 1998), and Hitler used radio and television for propaganda (Volkoff 2000). In recent times, Barack Obama, Donald Trump and Joe Biden have used the Internet and online social networks with real success in the fourth, digital industrial revolution to influence the electorate. We give the example of these three American presidents because their electoral campaigns are notorious in the media; however, they are not the only ones who have successfully used this technique - the practice of influencing the electorate is spread all over the planet, regardless of the political regime, the economic level and the cultural peculiarities of the country. The spread of disinformation and propaganda has stood the test of time and of geographical barriers and has always adapted successfully to new technological breakthroughs, because leaders have always believed that it is not necessarily the reality of life that matters, but rather what people perceive and believe.
Social networks are designed and optimized to keep users captive for as long as possible in order to deliver them a large number of advertisements, these being their main source of income (Hâncean 2018). To do this, algorithms select and deliver to users content relevant to their preferences, based on a profile created from hundreds of points of interest (Voicu 2018, 359). For example, from the analysis of likes, the algorithms can predict with 60% accuracy whether the parents of the analysed Facebook user divorced before that user turned 21 (Bârgăoanu 2018, 175). This selective delivery of content through precision targeting is good for the platform's income, but dangerous for the individual and even for society, because it creates so-called "personal bubbles" that exclude information and opinions contrary to the individual's beliefs, isolating them and then bombarding them with content that reinforces their beliefs; the result is a misperception of reality that leads to polarization and radicalization through mutual validation.
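To make this mechanism concrete, the following minimal Python sketch ranks candidate posts for a feed by a predicted-engagement score built from a user's inferred topic affinities. The weights, field names and profile values are illustrative assumptions, not any platform's actual algorithm; the point is only to show how belief-confirming, emotionally charged content rises to the top of the feed, producing the "personal bubble" effect described above.

```python
# Toy illustration of engagement-optimised feed ranking (illustrative only;
# the weights, fields and profile below are assumptions, not any real platform's code).
from dataclasses import dataclass

@dataclass
class Post:
    topic: str               # e.g. a political stance or interest tag
    emotional_charge: float  # 0..1, how emotionally loaded the content is

# A user profile inferred from past likes: affinity score per topic (0..1).
user_affinity = {"anti_establishment": 0.9, "neutral_news": 0.3, "opposing_view": 0.1}

def predicted_engagement(post: Post) -> float:
    """Higher when the post matches existing beliefs and stirs emotion."""
    affinity = user_affinity.get(post.topic, 0.0)
    return 0.7 * affinity + 0.3 * post.emotional_charge

candidates = [
    Post("anti_establishment", 0.8),
    Post("neutral_news", 0.2),
    Post("opposing_view", 0.9),
]

# The feed shows belief-confirming, emotional content first: the personal bubble.
feed = sorted(candidates, key=predicted_engagement, reverse=True)
for post in feed:
    print(f"{post.topic:20s} score={predicted_engagement(post):.2f}")
```

Even in this toy model, the contrary but highly emotional post loses to the belief-confirming one, because the ranking optimises predicted engagement rather than informational balance.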
At a higher level, personal bubbles turn into group bubbles, where hundreds, thousands or even millions of like-minded individuals group together and form the so-called resonance chambers. A good example of a giant echo chamber is Donald Trump's social network, Truth Social. In these chambers, the shared content is more easily received, accepted and assimilated because individuals' filters are much more permissive towards messages coming from people categorized as having similar ideals, goals and preferences. The credibility of the persuasive source plays a central role in persuasive processes (Chelcea 2006, 143). These bubbles and resonance chambers also favour the phenomenon of projection, through which the individual starts to believe that everyone around them is alike, and the phenomenon of identification, through which they believe that they are like the others; these false impressions strengthen their belief in their own ideas (Dobrescu and Bârgăoanu 2003, 256). This consolidation of ideas, favoured by the group, prepares the individual for contact with outside information and ideas. When the individual encounters ideas and information contrary to their beliefs, that is, from outside the bubble, they self-confirm and reinforce the validity of their own reasoning and beliefs, which at some point leads to exaggerated and potentially dangerous self-confidence (Oprea 2021, 130).
People are drawn to this isolation in personal bubbles or echo chambers, such as closed Facebook groups, for a number of reasons. The first is the psychological phenomenon of cognitive dissonance: a person feels psychological discomfort when encountering information or an idea contrary to what they already know and believe, and the instinctive reaction is generally to relieve that discomfort by denying or avoiding the contradicting information. Information bubbles are thus perfect for avoiding cognitive dissonance (Britt 2019).
Another reason why individuals prefer belonging to bubbles and echo chambers is given by a concept from social psychology, namely confirmation bias. This refers to the fact that, out of all the information they come into contact with, people select only the pieces that confirm and support their existing beliefs and ignore the rest. Confirmation bias is at the heart of online social media shares and of their groups and bubbles. Social network algorithms favour the creation and proliferation of these groups and of selective content because they do not want to lose the time users spend on the platform, which would happen if users were exposed to a lot of dissonant content. Extending the observation a little further into the Internet world, social networks work by imposing the content provided, while search engines have turned, in many cases, from information tools into confirmation tools. Online social networks are also the ideal ground for the formation, deployment and manipulation of phenomena such as social proof, whereby people tend to believe something not because of arguments, but because they have the feeling that many others believe the same thing, and the well-established herd (bandwagon) effect from politics, which gives an advantage to whoever is perceived to be in first place, because people tend to join the winning team (Chelcea 2006, 247). There is also the classic spiral of silence, theorized by Elisabeth Noelle-Neumann in 1974, which refers to people's fear of being ostracized and isolated because of their opinions; to avoid this, those who feel they are in the losing camp cease to voice and express their opinions. The theory of the spiral of silence is based on alignment with the majority opinion and social conformity (Dobrescu and Bârgăoanu 2003, 254). Summarizing the lines above, we would quote Walter Lippmann: "where everyone thinks alike, no one thinks much".
Given that online social media has emerged as the primary theatre of operations for disinformation and manipulation campaigns, it is imperative to introduce the forces engaged in this conflict. These forces are primarily responsible for generating and disseminating propaganda, disinformation and manipulation materials, as well as for fabricating artificial traffic and engagement to create the impression of widespread support and acclaim. This tactic, commonly known as astroturfing, is facilitated by the use of bots and cyber troops. Bots are software programs designed to mimic the online behaviour of individuals and are tasked with carrying out specific mission objectives set by campaign strategists. They can be used for a variety of purposes, including initiating and amplifying debates with the aim of polarizing audiences, supporting informational diversions, and facilitating the astroturfing phenomenon. Their main advantage is that they are difficult to attribute to a country or private structure, being, in theory, anonymized (Gîrdan 2020). In 2019, 37% of Internet traffic was generated by bots, and in 2020 Facebook estimated that 5% of its accounts were fake (Oprea 2021, 156). Cyber troops are teams made up of people, employed or volunteering in military, political or private structures, whose role is to influence public opinion on online social networks. These structures are often organized like journalistic newsrooms, where editors work on assignments, edit the content and then disseminate it, and the activity is later audited and rewarded according to performance.
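As a back-of-the-envelope illustration of why astroturfing is an asymmetrical, low-cost tactic, the short sketch below compares organic sharing with bot-driven sharing. Every figure in it is an assumption chosen for the sake of the example, not a measured value.

```python
# Toy arithmetic for astroturfing: how a small bot pool inflates apparent support.
# All figures below are illustrative assumptions, not measurements.

organic_followers = 50_000
organic_share_rate = 0.002      # assume ~0.2% of real followers share a given post

bot_accounts = 500
bot_shares_per_account = 20     # assumed scripted reshares per account per day

organic_shares = organic_followers * organic_share_rate
bot_shares = bot_accounts * bot_shares_per_account

print(f"organic shares: {organic_shares:,.0f}")
print(f"bot shares:     {bot_shares:,.0f}")
print(f"apparent support inflated {bot_shares / organic_shares:.0f}x over the organic baseline")
```

Under these assumed numbers, a few hundred automated accounts generate two orders of magnitude more visible engagement than a genuine audience of fifty thousand, which is exactly the impression of widespread support that astroturfing aims to manufacture.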
An important part of the manipulative content distributed by these dissemination structures on online social networks is made up of memes. In psychological warfare, state actors such as Russia, China and Iran, non-state actors such as the terrorist organizations ISIS, Al-Qaeda, Boko Haram and Al-Shabab, and even political organizations and politicians in the West act directly or through specialized companies armed with trolls and bots to disseminate narratives and memes and to build up their notoriety, support and relevance. This works because it exploits certain cognitive fallacies, such as validation through social interaction, where individuals will believe and do what they believe others believe and do (Oprea 2021, 15). Two examples are the 2016 presidential election campaign in the United States (Mueller 2019) and the campaign for the Brexit referendum, with the involvement of the companies Cambridge Analytica and AggregateIQ (AIQ) (Zimmer 2018).
Memes, as a product used in influence campaigns, are disseminated very quickly and easily, without being subjected to rigorous analysis, due to their apparently harmless character and the humorous message they carry. A very important role in this process is played by the means of propagation - in our study, the social platforms - which promote the spread of memes by the ease with which it can be done: with just one to three mouse clicks or screen taps, the meme goes out to be displayed to hundreds or even thousands of people. A study done in the US in 2021 showed that more than half of people share content on social networks without carefully analysing it (Pennycook et al. 2021). This happens, according to an MIT study, because fake and manipulative content is made more creative and exciting, specifically to stir up emotions, and thus ends up being 70% more likely to be shared than authentic and true content (Empoli 2019, 72), while political content spreads three times faster than that from other sources (Bârgăoanu 2018, 146). Interpersonal spreading is more effective than that generated by a top-down system because people tend to be more gullible when they see something shared by a friend, and in 59% of cases when they share it, they do so without analysing it carefully (Oprea 2021, 132). One suggested solution to combat such unscrutinized sharing was to multiply the steps required until the process is complete; in this sense, it was proposed to introduce a small self-assessment of the veracity of the content, by ticking on a scale the estimated degree of correctness of the content before the distribution is executed (Pennycook et al. 2021).
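A minimal sketch of that proposal, assuming a hypothetical share flow and function names (the idea of an accuracy self-assessment step follows Pennycook et al. 2021, but the code below is only an illustration, not their implementation), could look like this:

```python
# Sketch of an accuracy-prompt friction step before sharing (hypothetical UI flow,
# inspired by the self-assessment idea in Pennycook et al. 2021).

def ask_accuracy_rating(content_title: str) -> int:
    """Ask the user to rate how accurate they think the content is, 1 (false) to 5 (true)."""
    answer = input(f"How accurate is '{content_title}'? Rate 1-5: ")
    return int(answer)

def share(content_title: str) -> None:
    rating = ask_accuracy_rating(content_title)
    # The extra step itself is the intervention: it shifts attention back to accuracy.
    if rating <= 2:
        confirm = input("You rated this as likely inaccurate. Share anyway? (y/n): ")
        if confirm.lower() != "y":
            print("Share cancelled.")
            return
    print(f"'{content_title}' shared with a self-rated accuracy of {rating}/5.")

if __name__ == "__main__":
    share("Celebrity X endorses miracle cure")
```

The design choice is deliberate: the prompt does not block sharing, it merely adds a moment of reflection, which the cited study found is often enough to reduce the spread of low-quality content.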
Memes, like the advertising industry, exploit human genetic psychological weaknesses. For a meme to go viral, it does not matter how true it is; as Alfred North Whitehead said, "There are no absolute truths, there are only half-truths" (Brodie 2015, 19). The most effective memes from the point of view of propagation are those that amuse, anger, scare or conquer us; as Bogdan Oprea claims, "in the face of emotions, the truth has no chance" (Oprea 2021, 21). What matters is the ability of the conveyed message to arouse feelings and emotions and, very importantly, to feed a need, curiosity or conviction of the receiver. There is a saying in advertising: "Sell the sizzle, not the steak!" (Dalgleish 2005).
Memes have a technological advantage due to their mixed structure of text over images or video clips, their use of slang, and the fact that the message can be changed. Because of the ambiguity, subjectivity and interpretive nature of the messages conveyed, which often differ from one country or culture to another and in most cases have local specificities, they are very difficult to identify and combat at present by social media algorithms and, as a result, they spread very quickly (Fisher and Snyder 2021). They are thus considered a powerful tool of influence, because they are resilient, adaptable and infectious (Brodie 2015, 22-35).
Conclusions
Upon analysing online content, it becomes obvious that disinformation and manipulation techniques, whether old or new, have been adapted to suit the particularities of the digital environment, as well as the evolution or involution of human social and psychological capacities and characteristics. The COVID-19 pandemic and the conflict in Ukraine have served as catalysts for the deployment of various strategies, tactics, and techniques of online influence. These include the spiral of silence, the herd effect, social proof, authority, astroturfing, validation through social interaction, cognitive dissonance, confirmation bias, projection and identification, Freud's pleasure principle, Skinner's operant conditioning, and Thorndike's law of effect. These techniques target genetic vulnerabilities, such as the manipulation of neurotransmitters: serotonin, endorphin, and dopamine, commonly referred to as hormones of happiness, or mental vulnerabilities, such as the predominant exploitation of heuristic thinking or social and tribal habits that have been deeply ingrained in humans for thousands of years.
Online social networks have become the primary arena for psychological confrontations over the control of public opinion between state and/or private entities. Within contemporary hybrid conflicts, memetic weapons have emerged as a significant component of psychological guerrilla strategies. These weapons have the ability to generate disorder within a system and gradually instil attitudinal dispositions among the population to react in a particular way to an event, such as a war. Discrediting politicians, decision-makers and key institutions, as well as denigrating democratic values and patriotism, can create an attitudinal background for undermining democracy. Moreover, memetic weapons can be successfully employed for short-term objectives, such as influencing election outcomes by targeting specific politicians who are disapproved of and supporting alternatives.
When memes start to go viral, they acquire truth valences due to the availability bias, and their viral success is favoured by the fact that people tend not to analyse their content and sources carefully, particularly if they come from a friend, an acquaintance or a person with whom they share the same values and beliefs. We consider humorous satirical memes in the age of the Internet and social networks to be very effective in influencing receivers with a higher degree of education. This is because, even if under normal conditions such receivers filter and frame the received information fairly well, the humorous camouflage of messages is perceived as a communicative social act whose aim is to entertain rather than to convey something serious, and since jokes suspend social restrictions and conventions, they succeed in penetrating the receiver's censorship. Even if memes are perceived as an attempt to influence, what matters is that they reach the target and are not rejected from the start and categorically labelled as manipulative. In time, on the principle of the Chinese water drop, influence will be achieved.
The memetic weapon will still work in democracies where Article 19 of the Universal Declaration of Human Rights on freedom of opinion and expression is respected. Technological limitations in detecting manipulative content online are beginning to disappear thanks to the efforts of authorities and companies towards continuous improvements in AI-based algorithms; we have reached the point where technology meets ethics, and the question arises as to how far we should allow technology to censor. By observing attentively what is happening in countries with dictatorial regimes, such as China and Russia, we can draw conclusions beneficial for democracy.
Discussions and debates will revolve around establishing a fair balance between freedom of expression and protection against disinformation and manipulation. In my opinion, restrictions on expression must be minimal and limited to categorically disturbing topics and expressions, and the emphasis must be placed on warning messages about the source of information and its sources of income, on detailing the context, and on adding an explanatory note about the original source of an image or piece of footage, because taking things out of context and using real images from one place or event as if they were from another is frequent. If a piece of information or an image has already been identified as fake, it should be marked accordingly and linked to the page explaining the problem, and the established source should be given a publicly displayed credibility rating, based on the history of the content it has disseminated, a rating that would influence the frequency of display in individuals' content streams.
We consider that state and private organizations should be supported both legislatively and financially to combat disinformation. Such initiatives and measures aimed at tackling disinformation have been taken at the European Union level: in 2015, a department assigned to combat disinformation coming from Russia, the East StratCom Task Force, was established; in 2017, the High Level Expert Group (HLEG) on Fake News and Online Disinformation was created; and in March 2019, the European Commission decided to establish the rapid alert system for online disinformation.
There are private initiatives aimed at educating journalists and citizens who are aware of the issue of online manipulation, such as the "Identifying and Tackling Manipulated Media" project, which has received significant recognition from the world's largest press agency, Reuters, and is financially supported by Facebook.
Currently, the European Union finances the implementation of the European Digital Media Observatory (EDMO) project, intended to be a hub for fact-checkers, scientists and other involved structures, which will become a support for the development of public policies in the field to combat online disinformation.
Another example in this direction is the International Fact-Checking Network (IFCN), an international network launched in the United States in 2015 that brings together 71 verification platforms from around the world. It is our belief that the circulation of memes, news and content of a misleading nature, when accompanied by warnings and explanations, favours improving people's resilience to manipulation, through practical mass education by example.
In conclusion, the use of memetic weapons in online manipulation poses a significant threat to democratic values. The adaptation of traditional manipulation techniques to the digital environment, coupled with the vulnerabilities of human psychology, makes it easier for state and private entities to influence individuals and societies. However, technological advancements in AI-based algorithms, along with legislative and private initiatives aimed at combating disinformation, can offer a path towards a balanced approach between freedom of expression and protection against manipulation. We consider it crucial to continue supporting such initiatives and to educate the public on how to identify and tackle online manipulation, in order to strengthen our resilience to memetic weapons and preserve democracy.
BIBLIOGRAPHY:
Bârgăoanu Alina. (2018). #FakeNews. O nouă cursă a înarmării. Evrika Publishing.
Beer Colin. (2017). instinct - Freud's Trieb | Britannica. https://www.britannica.com/topic/instinct
Bell Nancy, McGhee Paul & Duffey Nelda. (1986). Interpersonal competence, social assertiveness and the development of humour. British Journal of Developmental Psychology, 7(1), 51-55. https://doi.org/10.1111/j.2044-835X.1986.tb00997.x
Britt Michael. (2019). Experimente psihologice. Meteor Press.
Brodie Richard. (2015). Virusul minții. Cum ne poate fi de folos o știință revoluționară. Memetica. Paralela 45.
Chelcea Septimiu. (2006). Opinia publică - Strategii de persuasiune si manipulare. Economică.
Chelcea Septimiu. (2019). Umorul în comunicarea persuasivă. Romania Sociala. https://www.romaniasociala.ro/umorul-in-comunicarea-persuasiva/
Dalgleish Scott. (2005). Sell the sizzle or sell the steak? Before spending money on images, spend money on a better quality product. Quality, 44(1), 26-27.
Dăscălescu Marius Adrian. (2022). Serotonina - aspecte terapeutice şi stil de viaţă. Cum stimulăm producţia de serotonină? Medic.Ro. https://www.medichub.ro/reviste-de-specialitate/medic-ro/serotonina-aspecte-terapeutice-si-stil-de-viata-cum-stimulam-productia-de-serotonina-id-6213-cmsid-51
Dixon Stacy. (2022). Global daily social media usage. Statista. https://www.statista.com/statistics/433871/daily-social-media-usage-worldwide/
Dobrescu Paul & Bârgăoanu Alina. (2003). Mass media si societatea (2nd ed). Comunicare.ro.
Douglas Mary. (1968). The Social Control of Cognition: Some Factors in Joke Perception. Man, 3(3), 361-376. https://doi.org/10.2307/2798875
Eagleton Terry. (2019). Umor. Baroque Books & Arts.
Empoli Giuliano. (2019). Inginerii haosului. ALL.
Fisher Sara & Snyder Alison. (2021). How memes became a major vehicle for misinformation. Axios. https://www.axios.com/2021/02/23/memes-misinformation-coronavirus-56
Giesea Jeff. (2016). It's time to embrace memetic warfare. https://stratcomcoe.org/pdfjs/?file=/publications/download/jeff_gisea.pdf?zoom=page-fit
Gîrdan Emil. (2020). Analiza de intelligence în social media. Editura Sitech.
Gotlieb Evan. (2019). What is Satire? | Definition & Examples. College of Liberal Arts. https://liberalarts.oregonstate.edu/wlf/what-satire
Greengross Gill & Miller Geoffrey. (2011). Humor ability reveals intelligence, predicts mating success, and is higher in males. Intelligence, 39, 188-192. https://doi.org/10.1016/j.intell.2011.03.006
Guilday Peter. (1921). The Sacred Congregation de Propaganda Fide (1622-1922). The Catholic Historical Review, 6(4), 478-494.
Hâncean Marian-Gabriel. (2018). Reţelele sociale în era Facebook. Polirom.
Houellebecq Michel. (2022). Serotonină. Humanitas.
Karremans Johan, Stroebe Wolfgang & Claus Jasper. (2006). Beyond Vicary's fantasies: The impact of subliminal priming and brand choice. Journal of Experimental Social Psychology, 42(6), 792-798. https://doi.org/10.1016/j.jesp.2005.12.002
Kurniawan Margaret & Hara Hiroshi. (2023). Guide to frame rates in movies | Adobe. https://www.adobe.com/creativecloud/video/discover/frame-rate.html
Martin Rod & Ford Thomas. (2018). The Psychology of Humor (2nd ed). https://www.elsevier.com/books/the-psychology-of-humor/martin/978-0-12-812143-6
Memetic Warfare - Part 1 - Vol 1 - Nr. 5. (2017). Retrieved on August 15, 2022, from https://www.act.nato.int/images/stories/media/doclibrary/open201705memetic1.pdf
Mueller Robert. (2019). Report on the Investigation into Russian Interference in the 2016 Presidential Election. 448.
Munafo Mauro. (2021). Fake News - Hateri şi cyberbulling. Curtea Veche.
Nestler Eric, Hyman Steve & Malenka Robert. (2009). Molecular neuropharmacology: A foundation for clinical neuroscience (2nd ed). McGraw-Hill Medical.
Oprea Bogdan. (2021). Fake News si dezinformare online: Recunoaşte si verifică. Polirom.
O'Quin Karen & Aronoff Joel. (1981). Humor as a Technique of Social Influence. Social Psychology Quarterly, 44(4), 349-357. https://doi.org/10.2307/3033903
Pennycook Gordon, Epstein Ziv, Mosleh Mohsen, Arechar Antonio, Eckles Dean & Rand David. (2021). Shifting attention to accuracy can reduce misinformation online. Nature, 592(7855). https://doi.org/10.1038/s41586-021-03344-2
Prosser Michael. (2006). Memetics - A Growth Industry in US Military Operations; Degree of Master of Operational Studies by Major Michael B. Prosser, United States Marine Corps, Academic Year 2005-2006. http://archive.org/details/a507172
Psychologies Revista. (2016, March 14). De ce avem nevoie de umor. Revista Psychologies Romania. https://www.psychologies.ro/dezvoltare-personala-cunoaste-te-2/cunoaste-te/de-ce-avem-nevoie-de-umor-2150994
Ravich Lenny. (2017). Terapia prin râs (2017th ed). Herald.
Sela Lee & Sobel Noam. (2010). Human olfaction: A constant state of change-blindness. Experimental Brain Research, 205(1), 13-29. https://doi.org/10.1007/s00221-010-2348-6
Spence Charles. (2015). Leading the consumer by the nose: On the commercialization of olfactory design for the food and beverage sector. Flavour, 4(1), 31. https://doi.org/10.1186/s13411-015-0041-1
Stamatin Georgiana. (2017). Teoria controlului reflexiv în logica războiului hibrid. Revista Intelligence. https://intelligence.sri.ro/teoria-controlului-reflexiv-logica-razboiului-hibrid/
Stan Sonia Cristina. (2004). Manipularea prin presă. Humanitas.
Worldometers. (2023). https://www.worldometers.info/ro/
Stern Victoria. (2015). A Short History of the Rise, Fall and Rise of Subliminal Messaging. Scientific American. https://doi.org/10.1038/scientificamericanmind0915-16b
Taylor Petroc. (2022). Total data volume worldwide 2010-2025. Statista. https://www.statista.com/statistics/871513/worldwide-data-created/
The Korea Times. (2022, January 11). In-ground traffic lights installed across Seoul for "smartphone zombies". Koreatimes. https://www.koreatimes.co.kr/www/nation/2023/04/281_322052.html
Ulieriu Marc. (2007). Endorfinele - Moleculele fericirii. Descopera. https://www.descopera.ro/stiinta/929550-endorfinele-moleculele-fericirii
Voicu Marian. (2018). Fake news, manipulare, populism. Humanitas.
Volkoff Vladimir. (2000). Dezinformarea - Armă de război. Incitatus.
Volkoff Vladimir. (2009). Tratat de dezinformare. Antet.
Wayne Hanley. (1998). The genesis of Napoleonic propaganda, 1796 to 1799. ProQuest. https://www.proquest.com/openview/a7eb49ac976ff769bf81dne014e99aa/1?pq-origsite=gscholar&cbl=18750&diss=y
Zimmer Bob. (2018). Democracy Under Threat: Risks and Solutions in the Era of Disinformation and Data Monopoly. 100.
1 PhD Student within the National University of Political Studies and Public Administration, Bucharest, Romania