Time, as Henri Bergson described, is a continuous flow that we, through the process of perception, chop up into intervals—into befores and afters. This demarcation allows us to construct fictions, stories we tell ourselves to establish continuity between past, present, and future. These fictions in turn offer us the ability to act and react to the world outside of our bodies, and in particular, to each other.
Yet in the process, we cut out a lot of information. The selections we make often give us the impression that the narratives we spin could just as easily diverge from the way we have constructed them. Everyone has different parameters, including and excluding different information. To overcome our differing views, we make agreements on the bounds of our individual truths. This is normalisation.
The problem is that some normalisations incorporate large losses of information, others smaller losses. Some are closer to what is considered universal, and some are more obviously fictitious. For all of us living inside these normalisations, however, it is hard to separate fact from fiction—our perceptions of the world external to us are hitched to a delicate network of fictions, so it is hard to tell which of our behaviours, fully enmeshed in these fictions, are responsive to facts and which to fictions.
In computation, our ability to separate fiction from reality is key to overcoming the differing views of individuals and our collective normalisations. The neutrality embedded in this concept is of the utmost importance to those who struggle against the irrationality of most normalisations, yet it is often forgotten that these normalisations are necessary for collective social behaviour to function.
And so the search for behaviour that is natural, that is universal, habitually drifts towards a discussion of the artificial, of fictions. The word artificial can be understood to denote anything actively created by people, while its supposed opposite, natural, is that which people have nothing to do with; what is natural exists regardless of human involvement. But every understanding of the non-human involves fictionalisation, lest it be incomprehensible; it is difficult to see where the fiction starts and where it ends.
Every moment, we create narratives to explain the phenomena that transpire before our eyes. We live through them, we act through them, we form communities through them. Even the hardest facts of life and of the universe, we turn into stories to communicate to others. In this sense, the entire world is artificial, in that the world exists, in our perception, only as a communicable narrative or series of narratives. To perception, nothing exists without its narrativisation and communication.
Yet humans are animals, and we know that other animals have some kind of determination in them, something that steers them and limits them. This principle, of biological need, is also wired into us. Everything we do, in the end, is something the forces of nature permit us to do. Buckminster Fuller suggests that there is therefore no true artificiality.
In my viewpoint, there is no meaning to the word artificial. People can only do what nature permits them to do. They do not invent anything. They make discoveries of principles operative in nature and often find ways of generalizing those principles and reapplying them in surprise directions. This is called invention. But they do not do anything artificial. Nature has to permit it, and if nature permits it, it is natural. There is naught which is artificial.
If this is true, how can everything be derived from nature, yet in derivation become all the more artificial? Nature permits us, even obliges us, to narrate ourselves. But it also permits us to choose that narration as we wish. This is only natural. We could then state that fiction is natural, being naturally allowed, even encouraged, by our genetic makeup. It is the natural state of things. Artificiality is then a natural evolution. There is nothing that is artificial, because everything is.
The subchapters to come include discussions and arguments by people stuck inside this logic. First, I discuss sociobiologists’ and social scientists’ attempts to look at mankind’s relationship to the natural and to filter out the malleable and artificial. Their goal is to find an indivisible account of existence that holds for all people, to which there are no exceptions. These discussions hinge on the question of whether nature determines our behaviour or if we determine it. The nature-nurture debate rages on, still, as we search for the origins of social conduct.
Similar discussions have been ongoing in the art world, with many in pursuit of a universal ideal of art and its purpose. This is in large part the conversation I aim to address here and in the following chapter. New technologies have always affected our understanding of what art is. Every generation of artists has tried to find purpose in their production, to develop a story to explain why their practice bears value, to prove the necessity of art and art practice altogether. This may take the shape of a search for true beauty, for the ultimate ratio, for a masterpiece that could transcend all ages and regimes. In any case, people have tried to find truth in their ever-changing surroundings. And when that truth is found, it is turned into yet another story.
What I want to bring forward in this chapter is the entanglement of fictions with our daily lives, and to ask whether these fictions are produced by our perception, by our understanding of the world, by our anticipation of the future, or whether we knew all along, in constructing these fictions—literary, political, economic, artistic—that they were indeed fictions. This, therefore, is my fiction—my explanation of the entwining fictions of our species and how they will be affected by current and oncoming waves of automation and computation.
2_0_FEEDING BACK
The formalisation of the indefinite repetition of the past into the future through computation alters the behaviour of individuals and communities drastically. Where computational systems of communication were first products based on people’s demands and necessities, computation eventually began to determine the demands and necessities. Decades of feeding the database with information are currently feeding back.
Of course, the dynamics of feedback are not unique to our age. What is different for our current fictionalisation is that we have become very skilled at storing and accessing the fictions of the past, with a resolution so fine-grained that it becomes increasingly difficult to understand these fictions as fictional. Any feedback appears to be a natural transition.
In this chapter’s first part, I will examine the influence of the tropes of fiction on our behaviour, enforced by the ever-limiting capacities of computation. In doing so, I will attempt to expose the relationship between the concept of the role—the identity, function, or character of the individual in society—and the communal desire to steer the collective future towards a repetition of the past, towards stability. The exploration begins with the following question: is the role a product of genetic determination, or is it the result of cultural normalisation? The answer to this question will allow us to determine whether a transformation of the role through computational thinking is possible.
The assumption of a role—knowing your place—has taken many forms through human history, in the classification and delineation of societal conduct and in the specialisation of trades and activities to normalise the future. From the separation of labour tasks and establishment of chains of command to eugenics, apartheid, feudalism, or caste systems, the role, either enforced, implied, coerced into being, or taken up voluntarily, has had a tremendous impact on the course of history. It is my understanding that this will not change in the future, however contradictory it may seem to my statement regarding the unpredictable future. On the contrary, taking into consideration the evolution of the role during the past 100 years, the concept of the role has not vanished at all, but has only been recast on a different foundation. Without comparing the validity of one social organisation to another, or placing them on an equal footing—I would never assume Nazi eugenics to be in any way similar to, say, the corporate ladder—I focus on the role as a means to establish identity either for the individual or the community. It is in this sense that the liberating ideology of modernity has hijacked the process of emancipation, which was a promise of freeing the individual, any individual, from the expectations of the division of roles at birth, and implemented a system of presumed self-regulation based on free will. The attachment of merit to societal position makes the role open to interpretation: if the role is not predetermined, then it has to be determined during the course of a lifetime, and this determination is, in theory, solely dependent on one’s personal desires and choices. This clean-slate principle, which holds that everyone is born an empty page that has to be filled through life, is the essence of twentieth-century Western-identity ideology, coming to full fruition in the twenty-first.
To put it in a nutshell, “individualization” consists in transforming human “identity” from a “given” into a “task” —and charging the actors with the responsibility for performing that task and for the consequences (also the side-effects) of their performance: in other words, it consists in establishing a de jure autonomy (although not necessarily a de facto one). No more are human beings ‘born into’ their identities. […] Modernity replaces determination of social standing with compulsive and obligatory self-determination.
It is, however, a fallacy to assume that anyone is born unburdened and pristine and that their course of life can be entirely composed by personal choice and desire. Nevertheless, as a persistent normalisation concerning the necessity to divide the expectations of people’s actions according to roles and functions, the idea that “we choose who we are” prevailed. Consequently, the development of sociobiological studies and a better understanding of human genetic mechanics have countered this idea by bringing natural determination—in which biological forces steer our behaviour—to the foreground.
If the choice of a role is dictated by culture, would it then be the cause of fictionalisation or the result? In other words, if roles are made up, does the creation of roles lead to the development of fictions, or are they influenced by fiction in the first place? Does art imitate life, or does life imitate art?
All the fictions we can contrive are the result of what nature permits us. So art must imitate life, which includes anything and everything we do. Everything that comes out of us is therefore nothing more than a natural process. Everything that we do is part of the natural flow of things. But we also create art because people before us have done so, following their fictions. We create stories because it is part of our culture, rather than part of our nature. We, then, do not initiate anything, we only continue: life imitates art.
The creation of stories can be seen as a natural development, but the definition of said stories is not. And so, the division of roles can be seen as a natural development, but the definition of said roles is not.
Religious fictionalisations, which fixed most people’s roles prior to the modern shift away from religious doctrine, needed replacement in the nineteenth century. Early modernity found new stories in science, in the stories of progress and development, and of knowledge and advancement. It also found them in the fictions of universality, reproducibility, and the totality of information. This master narrative set the scene for a new kind of role playing. When the role is not defined from the start but chosen along the way, it is not embodied in the same sense. The role is played rather than blindly accepted. The normalisations of science and research call for curiosity and doubt, and only when a firm position is found can a statement be confirmed.
The void the dissolution of religion’s power left in its wake was filled with all kinds of stories, including stories about statistics, probabilities, and possibilities. About certainties in science, about laws of physics, and theories in biology, but also stories by unreliable narrators, by fantasies, new mythologies, and new superstitions, by stories of voyages to the moon and to the core of the planet, of visits to exotic lands or otherworldly planes, and of political intrigues and social dilemmas, by newspapers and magazines, novels and circus shows, by comics, plays, and radio broadcasts, films, television programs and computer games. Millions of modes of fiction influence our view of the world, propagating ideologies, normalisations, and stories to help us make sense of the random, complex and infinite world.
Today, panels of test viewers watch scenes of TV shows and Hollywood movies to give their developers a sense of what plays well and what doesn’t. Data from millions of viewers’ opinions, billions of watched hours, and trillions in box-office sales determine the script of a narrative. Statistics on the probability of (financial) success decide whether a film is made, rather than passionate desires to make movies.
In any medium, a narrative can be thought of as a chain of events occurring in time and space and linked by causes and effects. . . . The basic principle of the Hollywood cinema is that a narrative should consist of a chain . . . that is easy for the spectator to follow. . . . The glory of the Hollywood system lies in its ability to allow its finest scriptwriters, directors, and other creators to weave an intricate web of character, event, time, and space that can seem transparently obvious.
The anti-ideological hypocrisy of computation is now everywhere. The scriptwriter follows a script, which is automated. The glory of Hollywood, as described here by film theorist Kristin Thompson, is little more than the ability of screenwriters and producers to follow the narratives of their own profession, to listen to market dynamics and to scientific analysis of storytelling. The analytical modus operandi of the film business not only creates fictional stories but also exists by the grace of fiction. The transparently obvious web of character, event, time and space is part of a blueprint always expected to work. Nothing is invented, as the knowledge of how to make movies that come across as sensible is transmitted from master to apprentice, from teacher to learner. But now, in the realms of computational thinking, the curriculum is fixed in the strata of marketing, audience analysis, viewer agreeability, and stardom. That which is expected from studios is presented as the way things are supposed to be, because it is general, averaged, and universal.
Interestingly, both filmmakers and consumers are gratifying this computational fiction. If computation presents the “final” script, the script that determines all scripts, then it must mean that there is no improvement possible. Repetition is the only way forward.
Stories produced by carbon-copying a successful script cultivate a culture of repetition. But they also produce a culture in which the lines between narrative fiction and normalising fiction tend to fade. Computation determines the ideal script based on what works best for the general audience, or in other words, based on what generates revenue. But the sweet spot of computational stability and its endless iteration generates our unconscious expectation that the same is true in real life.
Take, for instance, the following trope, common in soap operas.
The plot is always the same. In the first three minutes of the first episode the viewer already knows the novela will end with that same couple kissing each other. A telenovela is all about a couple who wants to kiss and a scriptwriter who stands in their way for 150 episodes.
People tend to identify with characters from TV shows—at least, that is often the goal of a screenwriter. She can portray a character in such a way that it joins up with real life, but then proceeds to put so many obstacles in its way that it becomes absolutely ridiculous. Nevertheless, the identification stays intact, leading to people believing that for a couple to kiss each other, they need intricate and elaborate stories to justify romance. It cannot be easy.
The separation between real life and fiction becomes even less apparent in reality shows. Here, reality is scripted one-to-one, dramatised, emotionalised. But all with the clear and shrewd objective of relating to viewers. The behaviour of the subjects is considered as real as it gets. And all the hardship and troubles they have to endure are real. So are their victories.
Of course, that is not how actual life unfolds. But the difference between real and fake, between spontaneity and script, is rendered vague. So the observed behaviour becomes the lead for expected behaviour. It becomes the normalisation, the culture. The computational understanding of behaviour feeds back.
It is this dynamic that I find particularly interesting in our journey towards a definition of the role of the artist in an automated future. In the next few chapters, we will go over the influence of computational normalisations on behaviour, both individually as well as in a group, and how the concept of the role is altered by the expectations of behaviour. We will look at the tropes of fiction as enablers for identity choice and as a guide for picking a role. And most importantly, we will look at roleplaying in its most extreme form, where narrative and identity intertwine completely, and use that model to envision the future under the influence of computation. This will lead us eventually to look at computer games as a perfect illustration of this model.
2_1_1_The Role
Without wandering too far off into ongoing discussions on identity politics, nationalism, tribalism and the like, I want to look deeper into the role of the individual in society—into what sets you and me apart, and how we create an us as separate from them. The individual identity, a tool we use to differentiate between people and find connection to others, is a product of normalisation altogether—a cultural invention, a fiction. Identity finds a place for us within a larger social whole, as it creates Belgians in Belgium or Germans in Greece, sons to parents or mothers to children, members of sports clubs or executives in firms. It defines a plumber as specifically not a trader, a priest as not an artist; it defines enemies of the state, intruders, and parasites as outside an attendant group, or model citizens, allies, and friends within. A friend in one group might be an enemy in another. Identity has no absolute value. It can be based on our passions, what we pursue, our jobs, our heritages, our burdens and punishments. The concept of the role is therefore ambiguous. Identity and personality are inextricably intertwined. It could be argued that personality induces identity: that personal motives, characteristics, behaviour and desires define who we are and how we then use those traits to construct an individual uniqueness towards society. Yet the opposite is also imaginable.
This is where the discussion of nature versus nurture enters the debate on the subject of identity. The idea that nature differentiates, divides, and produces uniqueness amongst mankind is one well supported by specific interpretations of biology and evolution. Sociobiologists believe that personality can be traced to biological attributes, rather than cultural inscriptions. Defenders of biological determinism argue that the assumption that every newborn is a blank slate is a slippery slope, because it also proposes that every person is malleable, ready to be cast into the molds of societal fictions. It justifies the ideology of the manufacturable world. If a person’s personality is shaped by input from his or her surroundings, then it should be possible to design those surroundings to influence personality. One of the critics of the blank slate theory, cognitive psychologist and linguist Steven Pinker, suggests that the concept has been a persistent axiom in twentieth-century psychological research, and ultimately in societal conduct altogether. He retraces the concept to a quote from the seventeenth-century philosopher John Locke, in which Locke refers to our minds as empty pieces of paper:
Let us then suppose the mind to be, as we say, white paper, void of all characters, without any ideas:—How comes it to be furnished? Whence comes it by that vast store which the busy and boundless fancy of man has painted on it with an almost endless variety? Whence has it all the materials of reason and knowledge? To this I answer, in one word, from experience.
While Locke attempted to expose the irrationality of dogmas in his time, such as the divine right of kings or the hereditary titles of lords, his words inadvertently also suggest the perfectibility of the human personality through steerable experiences. If experience alone makes up our behaviour, our desires, and our characteristics, then designing a series of experiences or modelling an environment can ultimately determine the shape of any single human personality. The doctrine, the normalisation, of the blank slate, according to Pinker, led to the proliferation of social-engineering projects, tabula rasa architecture, and notions of political equality wrapped up as political similarity—being equally blank means being equally accountable to the law:
Social scientists saw the malleability of humans and the autonomy of culture as doctrines that might bring about the age-old dream of perfecting mankind. We are not stuck with what we don’t like about our current predicament, they argued. Nothing prevents us from changing it except a lack of will and the benighted belief that we are permanently consigned to it by biology.
The denial of human nature as a predestined condition can and did bring about theories of malleability. But first it got rid of the dogmas with which the world of John Locke was riddled. Pinker goes on to state that this experiential discovery of the world shaping personality and behaviour not only liberated society from predestination, but made everyone blind to the immutable influence of genetics and evolution, to human nature.
I’ve argued that grounding values in a blank slate is a mistake. It’s a mistake because it makes our values hostages to fortune, implying that someday, discoveries from the field or lab could make them obsolete. And it’s a mistake because it conceals the downsides of denying human nature, including persecution of the successful, totalitarian social engineering, an exaggeration of the effects of the environment (such as in parenting and the criminal justice system), a mystification of the rationale behind responsibility, democracy, and morality, and the devaluation of human life on Earth.
Our values are indeed hostages to fortune, in the sense that values are normalisations. They change all the time. But Pinker here exposes an archetypical desire for universality—for an absolute human nature, one that is immutable and stable, an essence of humanity that is not prone to fortune. It is an almost melancholic experience to read his rants against the doctrine of the blank slate. And in this battle I do sympathise with him. Because I believe, too, that the blank-slate theory erroneously implies a completely manufacturable world, a theory so gullibly adopted by computational thinking.
Pinker chooses a rather unwieldy battle. He blames the theory of the malleability of mankind for the negation of human nature. But he then proceeds to frame human nature as being responsible for many aspects of human life and society that can hardly be associated with nature at all. Human nature, as I have proposed earlier, forces us to select. How that selection is made is entirely up to choice and chance. Pinker talks about the denial of human nature leading to the persecution of the successful—he may as well have meant the persecution of the unsuccessful. To consider this behaviour naturally determined is to misunderstand where values come from. To attribute the fortunes of a person to their innate abilities to gather wealth or prosperity is to negate the entire cultural system which enables wealth in the first place. Pinker, here, confuses biological success with cultural success.
Evolutionary biologist Stephen Jay Gould gives us a more nuanced version of human nature. Gould positions the concept of human nature in a rather ideological debate, with a dangerous deterministic stance on one side, and nihilistic open-endedness on the other. He therefore suggests defusing the concept, downplaying but not neglecting the influence of human nature altogether.
Most biologists would follow my argument in denying a genetic basis for most behavioral differences between groups and for change in the complexity of human societies through the recent history of our species. But what about the supposed constancies of personality and behavior, the traits of mind that humans share in all cultures? What, in short, about a general “human nature”? Some biologists would grant Darwinian processes a substantial role not only in establishing long ago, but also in actively maintaining now, a set of specific adaptive behaviors forming a biologically conditioned “human nature.” I believe that this old tradition of argument—which has found its most recent expression as “human sociobiology”—is invalid not because biology is irrelevant and human behavior only reflects a disembodied culture, but because human biology suggests a different and less constraining role for genetics in the analysis of human nature.
We try to find ant-like, or wolf-like, or ape-like behaviour in people. Yet we are not critters, following our instincts, preprogrammed to exhibit certain behaviours. We, as persons, as individuals, might be predetermined to an extent, but it is in our interactions with others that our biological aspects start to fade into complexity. To reduce one form of behaviour to little more than human nature is to neglect the intricacy of human conduct. The fact that our societies look completely different from what they did, say, 50 or 100 years ago proves that the continuity attributed to human biology does not have an extreme impact on the development of culture.
It is possible to argue against the idea that every creation of human conduct is natural because humans too are natural. And that the role in society is therefore part of human nature. The bestowal of roles may be part of human nature. But there is no proof that appropriation of the role follows a naturally predetermined pattern. In other words, we might be naturally forced to define roles, as we are determined to delineate the world of infinite possibilities in order to act and react, but what those roles are supposed to be is by no means inscribed into our biological build. At every instance in time, roles are re-interpreted and re-divided—so much so that the roles of hundreds of years ago seem completely silly or outlandish to us now.
It is very difficult to differentiate between nature and culture, as the latter affects our understanding of the former. As Pinker illustrated, this mix-up can lead to assumptions about biological determination that make no sense. But the opposite is possible too, leading to a denial of biological processes, of which (cognitive) selection is the most important.
If people are so similar genetically, and if previous claims for a direct biological mapping of human affairs have recorded cultural prejudice and not nature, then does biology come up empty as a guide in our search to know ourselves? Are we after all, at birth, the tabula rasa, or blank slate, imagined by some eighteenth-century empiricist philosophers? As an evolutionary biologist, I cannot adopt such a nihilistic position without denying the fundamental insight of my profession. The evolutionary unity of humans with all other organisms is the cardinal message of Darwin’s revolution for nature’s most arrogant species.
The fact is that we are all more similar than we would think, and equally more different than we would like. In grouping individuals—let’s say, into women and men—we compare one group to the other. Men are, biologically, different from women. At least when we put the average man next to the average woman. Yet this assumes the range of all women and all men to be the same, compacted into a mean, represented by an averaged-out archetype. Comparing every individual man to every individual woman would deliver an entirely different outcome. The difference between the parts of one group is much greater than the difference between the averages of two groups as a whole. An intuitive approach to this concept is illustrated by the exchange between homicide detective Del Spooner and the robot suspect Sonny in the movie I, Robot, loosely based on Isaac Asimov’s short story:
Spooner: Human beings have dreams. Even dogs have dreams, but not you. You are just a machine; an imitation of life. Can a robot write a symphony? Can a robot turn a... canvas into a beautiful masterpiece?
Sonny: [with genuine interest] Can you?
The same logic works for genetics in general. Our genetic code is far more similar than it is different: most of us are born with noses, eyes, legs and arms, a brain, and a heart. Taking humanity as a whole, we are all Homo sapiens, not of other species. This makes us all very similar, in a fairly obvious way. Yet it is undeniable that within the group of Homo sapiens, there is much difference. We can define mankind as one species, but this, in turn, does not make us equal by any means. Arguments based on the concept of human nature, then, are assumptions that address the totality of mankind but leave out our inherent differentiation.
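The arithmetic behind this claim is easy to make concrete. What follows is a minimal sketch in Python, using invented numbers rather than any empirical measurements, of how two groups whose averages differ only slightly can each contain a spread of individuals many times larger than that difference:

```python
import random

random.seed(0)

# Two hypothetical groups: their averages differ by a single unit,
# while individuals within each group vary across tens of units.
group_a = [random.gauss(100.0, 10.0) for _ in range(1000)]
group_b = [random.gauss(101.0, 10.0) for _ in range(1000)]

mean_a = sum(group_a) / len(group_a)
mean_b = sum(group_b) / len(group_b)

print(f"difference between group averages: {abs(mean_b - mean_a):.2f}")
print(f"range within group A alone:        {max(group_a) - min(group_a):.2f}")
```

Comparing the two archetypes yields a difference of about one unit; comparing individuals within either group yields differences on the order of sixty units. The averaged-out archetype hides almost all of the variation.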
I believe that human sociobiologists have made a fundamental mistake in categories. They are seeking the genetic basis of human behavior at the wrong level. They are searching among the specific products of generating rules […] while the rules themselves are the genetic deep structures of human behavior. For example, E. O. Wilson (1978, p. 99) writes: “Are human beings innately aggressive?” This is a favourite question of college seminars and cocktail party conversations, and one that raises emotion in political ideologues of all stripes. The answer to it is “yes.” As evidence, Wilson cites the prevalence of warfare in history and then discounts any current disinclination to fight: “The most peaceable tribes of today were often the ravagers of yesteryear and will probably again produce soldiers and murderers in the future.” But if some peoples are peaceable now, then aggression itself cannot be coded in our genes, only the potential for it. If innate only means possible, or even likely in certain environments, then everything we do is innate and the word has no meaning. Aggression is one expression of a generating rule that anticipates peacefulness in other common environments. The range of specific behaviours engendered by the rule is impressive and a fine testimony to flexibility as the hallmark of human behaviour. This flexibility should not be obscured by the linguistic error of branding some common expressions of the rule as “innate” because we can predict their occurrence in certain environments.
To the idea that personality produces identity, I would argue that the connection between the two is rather flimsy. The discussion on the source of personality—the nature-versus-nurture debate—has clouded the general view on identity, since both personality and identity are traits of the individual, which they either inherit or develop over time, and therefore seem similar. And while I don’t deny that personality reflects identity, I cannot reconcile the fluctuating and relative nature of identity with the static concept of personality. Regardless of the stance towards personality as being a natural given or a cultural conditioning, or a combination of the two, it evolves over time and develops during the span of a lifetime, but it does not radically change when the environment changes. Identity does. It adapts constantly, even when moving from one room to another. You are not the same person when talking to your high school friends as when engaging in networking at a reception. Your identity changes drastically when moving from one city to another, or from one country to another. Your identity is different when talking to your children than when talking to your parents. Identity is, in this sense, the shape of a person when interacting with society, with a group, with a situation. And while it is influenced by personality, it is not defined by it. We are never the same person in any environment. Every difference in environment requires a different identity, and thus requires us to play a different role. Which role that is depends on the expectations of said environment. For instance, within my own community, my nationality does not figure into the expectations of my identity, whereas when traveling abroad, the association with a nation becomes part of it. When outside the normalised situation, abnormality becomes a factor which differentiates, and thus becomes part of the identity. It steers towards the convenient role to play to fit the expectations of a particular scene. Fiction demands that personality become part of identity.
The many faces of identity set it apart from personality in that it is solely culturally dependent, and thus entirely normalised. Because of this, the evolution of computation has such a tremendous impact on it. If the ways we act in different environments are adapted to the narratives of those scenes, the fictions that inhabit groups and that create expectations of behaviour, then they are highly prone to the influences of automation designed to fix expectations and repeat those expectations indefinitely.
This brings us to the essential question of this chapter: Are we ever our true selves, unencumbered by the fictions of society? Are there ever moments where the innate characteristics of our personalities are not influenced by the expectations of our surroundings? The notions of the blank slate and the opposed theory of genetic determinism can never be fully tested or completely simulated, since we are confronted with environments from the moment we are born. Even the environments of computation, which we consider able to liberate us from social constructs, lift barriers of scrutiny, and presumably allow us to be our true selves, not only bring to life their own expectations, but also never fully detach from the cultural upbringings of the people behind them. It is true that the anonymity of the internet brings out the worst in people, who feel liberated from accountability, but to say that this behaviour reflects naked, almost primitive personalities, void of social restraints, would be incorrect. Online behaviour, and the identities that we tailor to condone it, is as much a fiction as any offline equivalent.
Considering a person to be a container of a personality, which it unquestioningly follows through any situation, remaining constantly “true to oneself,” is a myth that is mostly attributed in hindsight to unwavering, headstrong men, who take up real leadership and show us “the way” without compromises, or virtuous women following the same baseline principles in any situation. They are presumed to be true, genuine personalities. But the breadth, variation, and hypocrisy of holistic identities become fairly apparent when we look at celebrities or politicians, who are under the constant surveillance of the public. Because their comings and goings are recorded in the never-ending archives of computation, and dispersed as such, their change in identity according to different situations reveals itself. They are now simultaneously role models, leaders, parents, children, members of a club or a board, friends, family, sports women or men, writers, readers and so on. One cannot expect someone to behave identically in all situations, yet if they act differently in one situation, they are critiqued for it in all others. The immutability of identity demands that the role people play remain identical. Happenings in the past are laid out side by side to highlight inconsistencies, hypocrisies, in identity, undermining the credibility of a person in power. By constantly digging up the dirt, which, if we can believe popular fiction films and series, everyone has, the virtuous and unwavering people of the past have vanished. Any trust in consistent identities is lost.
Hypocrisy is impossible to avoid when the parameters of identity and personality are considered one and the same. But changing your language to fit the ears of a child, and immediately afterwards to address a construction worker, is a matter of understanding the conventions of human interaction. If the child-addressing language were part of your personality, you would be considered an outcast in many environments. Your role changes when you change environment, and with it your behaviour. We are all chameleons.
Nevertheless, the dynamics of computation reinforce the conjunction of identity and personality, by averaging out the behaviour of people, and classifying them by types and characters, by creating bubbles of similar people who presumably always behave in the same way, by imposing models of association which propose approximations of the true self, like a horoscope, applicable to anyone with enough imagination—fictions to explain the fluctuating identities of people, but ultimately to steer them towards stability. The role within the computational world becomes identity and personality as one: the desires of people similar to me become my desires; the actions of people similar to me become my actions; the behaviour of people similar to me becomes my behaviour. Identity, as a narrative, as an interpretation of a situation, solidifies in personality when the platforms through which we expose ourselves allow only for a prefabricated building set of traits and interests which fit within the model. We can be only our “true selves,” those designed in advance. Nothing is forgotten; every opinion, every “share” or action is kept, stored, and referred to in the future. Our identities are then expected to remain the same, and any alteration can be regarded as hypocritical. Our roles are fixed.
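The mechanism described here is, at its core, similarity-based recommendation. The following is a minimal sketch in Python of that logic, with invented user names and items, and not any particular platform’s actual algorithm: tastes are reduced to sets, people are matched by overlap, and the nearest neighbour’s desires are handed back as one’s own:

```python
# Hypothetical taste profiles: each person reduced to a set of liked items.
profiles = {
    "ana": {"film_a", "film_b", "film_c"},
    "ben": {"film_a", "film_b", "film_d"},
    "cho": {"film_x", "film_y"},
}

def similarity(a: set, b: set) -> float:
    """Jaccard similarity: overlap of tastes relative to their union."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(user: str) -> set:
    """Hand the user whatever their most similar neighbour likes that they don't."""
    me = profiles[user]
    neighbour = max(
        (other for other in profiles if other != user),
        key=lambda other: similarity(me, profiles[other]),
    )
    return profiles[neighbour] - me

print(recommend("ana"))  # {'film_d'}: ben's desire becomes ana's desire
```

However crude, this is the shape of the feedback loop: the model can only return what people “similar to me” have already done, so the prefabricated set of traits closes in on itself.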
2_1_2_Archetypes of Identity
It is up to each person to recognize his or her true preferences.
— Isabel Briggs Myers
To fully explain what the feedback of fiction onto identity entails, and how the conflation of identity and personality intensified this dynamic, we need to take a closer look at the popularisation of personality classification—in the early days of psychoanalysis a rather niche area of expertise, but one with many followers and advocates nowadays.
Carl Gustav Jung, the founder of analytical psychology, described a fundamental differentiation between people in his 1921 treatise on psychological types. He saw and described a tendency of character towards two general groups with two main attitudes: extroversion and introversion. The two types are so radically different, Jung argues, that even the uninitiated in psychological matters can comprehend them: “Who does not know those taciturn, impenetrable, often shy natures, who form such a vivid contrast to these other open, sociable, serene maybe, or at least friendly and accessible characters, who are on good terms with all the world, or, even when disagreeing with it, still hold a relation to it by which they and it are mutually affected?”
Naturally, at first, one is inclined to regard such differences as mere individual idiosyncrasies. But anyone with the opportunity of gaining a fundamental knowledge of many men will soon discover that such a far-reaching contrast does not merely concern the individual case, but is a question of typical attitudes, with a universality far greater than a limited psychological experience would at first assume. In reality […] it is a question of a fundamental opposition; at times clear and at times obscure, but always emerging whenever we are dealing with individuals whose personality is in any way pronounced. Such men are found not only among the educated classes, but in every rank of society; with equal distinctness, therefore, our types can be demonstrated among labourers and peasants as among the most differentiated members of a nation. Furthermore, these types over-ride the distinctions of sex, since one finds the same contrasts amongst women of all classes. Such a universal distribution could hardly arise at the instigation of consciousness, i.e. as the result of a conscious and deliberate choice of attitude. If this were the case, a definite level of society, linked together by a similar education and environment and, therefore, correspondingly localized, would surely have a majority representation of such an attitude. But the actual facts are just the reverse, for the types have, apparently, quite a random distribution. In the same family one child is introverted, and another extraverted.
Jung frames these types of personality as being differentiated by nature, since they seem to operate at the level of consciousness, unaffected by environment or cultural heritage. He states that “the attitude-type regarded as a general phenomenon having an apparent random distribution, can be no affair of conscious judgment or intention, its existence must be due to some unconscious instinctive cause.” He therefore concludes that “the contrast of types, as a universal psychological phenomenon, must in some way or other have its biological precursor.” The universality of his theory is striking: everyone, regardless of their sex, class or upbringing, can be classified as being either an extrovert or an introvert.
While my summary of Jung’s view on personality might not do his work justice—his work rests largely on empirical evidence, yet his interpretation of this evidence is hardly ever assumed to be absolute—it serves as an introduction to what succeeded his writing: Jung’s work was adapted, reinterpreted, and elaborated to classify personality into tables or ranks, and then evolved into a method to define identity, rather than solely personality. The Myers-Briggs Type Indicator is probably the most famous of these adaptations.
After thoroughly studying Jung’s theory, writer Katharine Cook Briggs and her daughter Isabel Briggs Myers formulated a practical application of psychological type. Briggs and Myers began creating their indicator during World War II in the belief that the knowledge of personality preferences would help women entering the industrial workforce for the first time to identify the sort of wartime jobs that would be the “most comfortable and effective” for them.
It is fashionable to say that the individual is unique. Each is the product of his or her own heredity and environment and, therefore, is different from everyone else. From a practical standpoint, however, the doctrine of uniqueness is not useful without an exhaustive case study of every person to be educated or counseled or understood. Yet we cannot safely assume that other people’s minds work on the same principles as our own. All too often, others with whom we come in contact do not reason as we reason, or do not value the things we value, or are not interested in what interests us. The merit of the theory presented here is that it enables us to expect specific personality differences in particular people and to cope with the people and the differences in a constructive way. Briefly, the theory is that much seemingly chance variation in human behavior is not due to chance; it is in fact the logical result of a few basic, observable differences in mental functioning.
Briggs and Myers popularised the indicator, concocting comprehensible and widely applicable tests and setting up a sort of quality-control label, formalising their initial research with a corporation that provided detailed discourse on interviewees, patients, and clients worldwide. The Myers-Briggs brand is now an authority in the field of personality assessment, and it is difficult to get into any corporate profession without ever coming in contact with the indicator tests or results.
The philosophy at the heart of the Myers-Briggs Company is completely in line with that of computational thinking: the indicator is a system that enables the comparison of apples with oranges. Psychometrics establishes the parameters with which personalities are categorised, and subsequently renders total the variations in such personalities. No one’s personality falls outside of these categories; everyone fits inside one of sixteen boxes. You can be an ISTJ (Introversion, Sensing, Thinking, Judging), which makes you:
Quiet, serious, earn success by thoroughness and dependability. Practical, matter-of-fact, realistic, and responsible. Decide logically what should be done and work toward it steadily, regardless of distractions. Take pleasure in making everything orderly and organized – their work, their home, their life. Value traditions and loyalty.
Or an ENFP (Extraversion, Intuition, Feeling, Perceiving):
Warmly enthusiastic and imaginative. See life as full of possibilities. Make connections between events and information very quickly, and confidently proceed based on the patterns they see. Want a lot of affirmation from others, and readily give appreciation and support. Spontaneous and flexible, often rely on their ability to improvise and their verbal fluency.
But, according to the indicator, you can be nothing in between. And while the allocation of traits to types has been proven arbitrary—the types vague and general enough to apply to anyone—this has not slowed down the Myers-Briggs Indicator’s popularity. The stability and certainty offered by the indicator is intoxicating. The test first presents you with a clear explanation of who you are, and then you become that explanation. Your personality type determines your role. And every action you thereafter take, emotion you feel, or interaction you take part in gives you evidence of the type’s accuracy. “Oh, I’m such an ENFP!” goes a common refrain.
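The indicator’s totalising logic is simple enough to state in a few lines of code. Here is a minimal sketch in Python, with an invented scoring scheme for illustration (only the four-letter convention itself follows the published MBTI scheme): four binary axes yield exactly sixteen boxes, and even a person sitting at the midpoint of every axis is forced into one of them:

```python
from itertools import product

# The four Myers-Briggs dichotomies; each axis admits exactly two poles.
AXES = [("E", "I"), ("S", "N"), ("T", "F"), ("J", "P")]

# Every combination of poles is a "type": 2**4 = 16 boxes, nothing in between.
ALL_TYPES = ["".join(poles) for poles in product(*AXES)]
print(len(ALL_TYPES))  # 16

def classify(scores: dict) -> str:
    """Collapse graded scores (0.0 to 1.0 towards the first pole) into one box."""
    return "".join(
        first if scores[first + second] >= 0.5 else second
        for first, second in AXES
    )

# A hypothetical respondent hovering near the middle on every axis
# still receives a single, definite four-letter identity.
print(classify({"EI": 0.51, "SN": 0.49, "TF": 0.50, "JP": 0.52}))  # ENTJ
```

The continuous variation disappears at the moment of classification; the box, not the score, is what feeds back into identity.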
Psychological types, first envisioned as a means of understanding a person’s motives and desires, stripped of fictionalisations, are here integrated back into normalisation. They become part of a collective story again, fed back to the fictions, where people demand others to reveal their personality and keep it fixed to create an identity. Judgement of character is, then, an identity check.
This goes against Jung’s original understanding of identity, or what he calls “persona.” The collective story, referred to by Jung as the “collective psyche,” implies the existence of an individual role, one that is taken up amid societal life.
This arbitrary segment of collective psyche — often fashioned with considerable pains — I have called the persona. The term persona is really a very appropriate expression for this, for originally it meant the mask once worn by actors to indicate the role they played.
When we analyze the persona we strip off the mask, and discover that what seemed to be individual is at bottom collective; in other words, that the persona was only a mask of the collective psyche. Fundamentally the persona is nothing real: it is a compromise between individual and society as to what a man should appear to be. He takes a name, earns a title, exercises a function, he is this or that. In a certain sense all this is real, yet in relation to the essential individuality of the person concerned it is only a secondary reality, a compromise formation, in making which others often have a greater share than he. The persona is a semblance, a two-dimensional reality, to give it a nickname.
Still holding onto the idea of the true self, the subconscious personality, Jung recognises that it is especially difficult to rid a person of persona. Individuation, the becoming of one’s self or self-realisation, takes on mythical proportions for Jung. Persona, as an indicator of the role, is considered something unreal or at least secondary—a mask we should cast off to discover who we really are.
The role has always been part of who we are—even if it switches all the time, even if it’s unstable, even if it’s constantly influenced by the exterior. Our excessive search for our true selves has blinded us to its importance by presenting the role itself as a traditional feature, as a normalisation. I present the role as something more: it is the necessary collective story. Without it, we are left defenceless against the forces of the unknown and infinite future. However, the contents of the story are not predefined. So how we interpret the role as such is open to discussion. But it is the conflation of these two parts—the role itself and the interpretation of the role—that renders the concept of the role ambiguous, false, or unreal and that jumpstarted an inquiry into the undoing of the role altogether, and into the origins of our incentives. The very notion that the role might lie at the basis of our incentives was abandoned.
The mask of persona is presumed to work in one direction, shielding our true selves from the pressures of the collective psyche. It is like a fortification, built to protect the fragile but immutable personality against the relentless attacks from the environment. It must keep the interior pristine. But it also prevents the true self from revealing itself. It is hidden behind the grand walls of persona. By tearing down the walls, the reality of the individual is assumed to present itself, finally arriving at a real person, and not just a fictitious character playing a role in front of a fake decor. But this mask of persona is not a wall at all; it is very much part of the true self, working omnidirectionally, permeable, allowing the true self to be impregnated by the fictions of the collective, imbuing it with ideas and concepts which become part of its reality. And these ideas can be lost again, replaced by others, changing the true self continuously. We can play different roles, and become those roles entirely and without repudiation. They are not all disguises or façades, but expressions of our true selves in different environments.
But this idea is particularly interesting when computation comes into play. When the destruction of our masks becomes our new mask, and when only one mask is then permitted—the true persona—a general confusion arises. When identity is fixed, we start picking identities like products from a supermarket and adapt ourselves even more to the descriptions of these personas. The true self transforms completely into the processed identities of computation, and life becomes a roleplaying game.
In the next chapter, I will therefore look further into the extremes of roleplaying—in the most computational sense of the word—to find strategies for overcoming or manipulating the personality-identity compression. In the end, our general confusion about identity and persona offers possibilities too—especially for the artist.
2_2_THE TEACHINGS OF GAMES
The role, being a product of cultural normalisation, is flexible. Its meaning is malleable and therefore corruptible. Each individual’s position in society is thus in constant evolution. In studying how the perceptions and prospects of the role change under the influence of computation and computational thinking, roleplaying games, particularly those that take place online, within a virtual world, provide helpful insights into the future’s potential societal life.
Over the past thirty years, communities of online gamers have helped to define new kinds of communities, where avatars and other symbolic representations of personal attributes are the only things that matter. Gamers can hide behind these figures, or they can use them to show their true selves. Avatars take away barriers, but they also throw up others—geographical, physical ones.
The worlds in which these personae live are streamlined by computational thinking. They exist only by the grace of computation. They are fantasy universes, with their own laws of physics. They are artistic expressions, but enabled by the logic of software engines and drivers. Roleplaying worlds are part of and play host to roleplaying games, and are therefore bound by rules. Some present themselves as open to interpretation, but they are in fact only visually open. They allow no spontaneity or ambiguity, no actual unpredictability or true randomness.
If players have to adapt to the world they enter, what does it mean for them to play this role? How do we communicate with each other? And is it possible to resist computation’s deterministic framework without completely withdrawing from it? These are some of the questions that I will try to answer in the next chapters.
2_2_1_Pick Your Player
Popularised through the second half of the twentieth century, the narrative personification of fiction has today become central to a number of subcultures. While the act of assuming a fictional persona in public interaction, rather than in theatre or opera, remained, through masquerade balls, carnivals, or fancy dress parties, constrained to the spheres of the elite from the fifteenth century on, in recent history, costumed impersonation turned into a preferred pastime for the many. Costuming play, or cosplay, finds its origins in the science-fiction and fantasy fairs of the 1930s in the US, where the act of dressing up in character, derived from the Halloween tradition, met the popularisation of genres like science fiction and fantasy. These conventions hosted the authors of critically acclaimed science-fiction stories and provided opportunities for visitors to show off their knowledge of their favourite fictional characters. Dressing up during the conventions became more common as the genre grew in acceptance and fame, eventually leading to entire subcultures of cosplayer gatherings and masquerading contests.
The practice of narrative personification—as opposed to the scripted nature of actors playing characters by the book—relies on a fictional avatar’s fleshed-out backstory and motivation and allows fans to place that character in a non-scripted or loosely scripted environment. Interaction with the random environment is then determined by the cosplayer’s interpretation of their character’s intrinsic motivation, and requires the player to improvise according to the limitations of the chosen persona. Cosplay is therefore a form of performance art: it asks more of its players than just dressing up, but does not require a stage or decor. According to Chris Kincaid, there are four aspects to cosplay that are considered vital to the practice:
Narrative – personality and story of the fictional character
Clothing – design of outfits and community surrounding this design
Play – mimicking mannerisms of characters as accurately as possible
Player – character and identity of the cosplayer
Cosplay found popularity alongside the rise of the fantasy roleplaying game Dungeons & Dragons, introduced to the world in 1974. D&D players imagined their roles, their narratives, and their play. In other words, they picked their characters for reasons not limited to physical appearance. Now, players could be anything their imagination allowed them to be. While cosplay influenced roleplaying, and vice versa, the two can be seen as separate historical strains following similar concepts of narrative, design, play and player.
In D&D, players adapt to a story told by the dungeon master, the participant who controls the environment, sets up the scenario, personifies antagonists, and manages the statistics of the game. She represents the rulebook and the referee. The rest of the players then interact with her—with both the environment she creates and with her as a player—in order to solve quests, engage in battles, explore the world, and find treasures.
While guided by the dungeon master, players are free to move as they please, design their personalities, engage in conversation, or make decisions that impact the progression of the story. They inhabit their characters, and they make every move from within the mindset of their imaginary personae. They each become, for a brief moment, someone else.
Through D&D, this kind of roleplaying became immensely popular, making the game the reference point of the genre. Millions of books presenting starting scenarios, battles, and items were sold over the nearly fifty years of the game’s existence, and its legacy has found its way into a plethora of popular-culture touchstones. It is the basis on which many spin-offs were built, including the massively multiplayer online roleplaying game (MMORPG) World of Warcraft.
World of Warcraft was released at the end of 2004 as the long-expected adaptation of the Warcraft series in the form of an MMORPG. Its popularity is without precedent. By 2008, the game boasted over ten million monthly subscriptions, and by 2014, the developer Blizzard registered more than one hundred million active accounts. Following the lore of the Warcraft universe, World of Warcraft allowed players to pick a character from a list of possible classes, modify the type to fit their desires, and with that character discover the fictional planes of Azeroth. The game’s non-linear storyline and open world, combined with its roleplaying mechanics, class-based strategies, and online encounters and competitions with many other players, became the benchmark for MMORPGs. In the game, quests need to be completed, dungeons need to be raided, and items need to be gathered in order for a player to progress through the levels. This progression in turn increases their skills, allowing the player to access new areas and acquire better gear, which then leads to new quests, and so on. The game also features auction houses where items can be traded, a currency system with copper, silver, and gold pieces, several cities that each host hundreds of players at a time, and intricate player-versus-environment (PVE) and player-versus-player (PVP) interaction. Altogether, the virtual economy, politics, and social structures made the game a small approximation of the real world.
The concept of roleplaying games is that players each embody a fictional character who improves as their story progresses, starting out with basically nothing and ending as the victorious hero. This makes the player’s first encounter with the game, deciding who or what will represent them throughout, crucial to the game’s dynamics. As every class serves its purpose alongside others to win battles, every class has its advantages and disadvantages; picking a player means picking a role within the community of the game. For World of Warcraft’s toughest challenges, a party of forty people is required. Forty players, composed from complementary classes, are therefore put together in the most balanced way possible. Healing classes are generally vulnerable to attacks, so they need to be protected by armoured classes, who are in turn supported by damage-dealing classes, making it possible for the healing classes to keep the rest alive. These large challenges, or “raids,” require very precise coordination and communication. Everyone needs to know their place, their function, and their tactics. Any individual’s diversion from the strategy ultimately results in the destruction of the group.
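Stripped of its fantasy, this composition problem is quota-filling. A minimal sketch of the logic; the 4/12/24 split between roles is an illustrative assumption, not anything prescribed by the game:

```python
# A toy raid composer: accept applicants until each role quota is filled.
# The quotas are hypothetical, chosen only to sum to a forty-player raid.
from collections import Counter

ROLE_QUOTAS = {"tank": 4, "healer": 12, "damage": 24}

def compose_raid(applicants):
    """Accept (name, role) pairs until each role's quota is filled."""
    filled = Counter()
    raid = []
    for name, role in applicants:
        if filled[role] < ROLE_QUOTAS.get(role, 0):
            raid.append((name, role))
            filled[role] += 1
    return raid, {role: quota - filled[role] for role, quota in ROLE_QUOTAS.items()}

raid, still_needed = compose_raid([("Ayla", "healer"), ("Borun", "tank"), ("Cyr", "damage")])
print(still_needed)  # {'tank': 3, 'healer': 11, 'damage': 23}
```

Every applicant is reduced to a role; whoever sits behind the character is irrelevant to the composition.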
Unlike D&D, in which a character is also picked at the start, World of Warcraft offers no real way of changing persona afterwards. The game provides opportunities for growth and customisation, but only within the bounds of the game’s architecture. In place of imagination, there is certainty. The narrative is fixed, limited to the fantasy of the game and to the backstory of the character—of which there are thousands of lookalikes with only slight variations. The massiveness of the game presents the world as a unique place, with challenges that seem like they are cut out especially for each player but, in fact, have been beaten by millions before.
Despite this, so many enjoy their time in this static world, gathering the same objects over and over again, collecting the same rewards, the same challenges, the same quests. Again and again and again, they live forever on repeat.
2_2_2_Min-Maxing
One of the most attractive features of roleplaying games is customisation. Players can build a character, both in appearance and in personality, according to their wishes. Customisation may sometimes be limited to the purchase of silly hats or the learning of funny dance moves simply to impress fellow players, but may also confer advantages in gameplay, as with, say, certain kinds of armour, or other learned skills.
Most commonly used to determine the overall power of a character is some form of an attribute system. An attribute is a piece of data, commonly called a statistic, or stat, that describes the extent to which a character possesses a specific in-born characteristic. There are physical stats, such as strength—measuring physical power and carrying capacity—or dexterity—measuring agility, balance, coordination, and reflexes. There are also mental stats, such as intelligence—measuring deductive reasoning, knowledge, memory, logic, and rationality—and charisma—measuring force of personality, persuasiveness, leadership ability, and successful planning skill. In the iconic Fallout series, a post-apocalyptic single-player RPG first published in 1997, the attributes are represented by the acronym SPECIAL: Strength, Perception, Endurance, Charisma, Intelligence, Agility, and Luck. World of Warcraft has five main stats—Strength, Agility, Intellect, Stamina, and Spirit—followed by a myriad of lesser stats, influenced by the main stats and more straightforward in their naming: Attack Power, Armour Penetration, Critical Strike Chance, Hit Chance, Weapon Skill, Spell Power, Healing Power, Magic Resistance, Defence Rating, Armour Rating, Block Rating, and so on.
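Under the hood, such an attribute system is nothing more than structured data with derived values. A minimal sketch, with conversion rates invented for illustration rather than taken from any actual game:

```python
# Primary stats as plain numbers; "lesser" stats derived from them.
# The conversion rates are invented, purely illustrative.
from dataclasses import dataclass

@dataclass
class Stats:
    strength: int
    agility: int
    intellect: int
    stamina: int
    spirit: int

    @property
    def attack_power(self) -> int:
        return self.strength * 2      # hypothetical: 1 strength = 2 attack power

    @property
    def critical_strike_chance(self) -> float:
        return self.agility / 20      # hypothetical: 20 agility = 1% crit

character = Stats(strength=85, agility=130, intellect=40, stamina=95, spirit=50)
print(character.attack_power, character.critical_strike_chance)  # 170 6.5
```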
Each stat is represented by a number that can be increased or decreased, depending on the choices the player makes. Stats are computational representations of human traits, incorporated into the game with the goal of making the roleplaying experience as tangible as possible. They are quantified characteristics, which means that they are measurements that can be optimised to meet certain requirements. Discussions on stats between players will look something like this:
At level 60, you need to achieve 440 Defense Rating to avoid being critically hit by level 63 mobs. This requires a fully leveled 300 innate Defense, and an additional 140 Defense Rating from gear. Although you still gain minimal value after 440 Defense, you need at least 440 to be considered a Raid tank.
In order to achieve victory—for instance in World of Warcraft, to which this quote refers—players can arrange for their characters’ stats to be calculated according to the game’s parameters. They can change their stats as much as they want and customise them to meet their personal demands, but in the end, their strategies come down to carefully choosing which stat to boost and which to neglect.
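The quoted threshold is itself a small piece of arithmetic. Using the numbers the player community commonly cites (a level-63 boss critically hits a level-60 character 5.6% of the time, and each point of Defense beyond the innate cap of five per level shaves 0.04% off that chance), the 440 figure falls out directly. A sketch, assuming those community numbers are accurate:

```python
# Reconstructing the 440 Defense threshold from commonly cited community
# numbers; these constants are assumptions, not official documentation.
PLAYER_LEVEL = 60
INNATE_DEFENSE = PLAYER_LEVEL * 5        # 300: the fully levelled innate skill
BOSS_CRIT_CHANCE = 5.6                   # percent, from a level-63 enemy
REDUCTION_PER_POINT = 0.04               # percent of crit removed per Defense point

defense_from_gear = BOSS_CRIT_CHANCE / REDUCTION_PER_POINT   # 140.0
print(INNATE_DEFENSE + defense_from_gear)                    # 440.0, matching the quote
```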
The same is true for another dynamic in roleplaying games: skill trees. A skill tree is the collection of abilities and special powers a character can be given. In the original World of Warcraft, every class has a different skill tree, or “talent tree,” which presents the player with around 150–160 options from which 51 have to be chosen. This means that there are a myriad of ways to spend “talent points” and complete the talent tree; a completed tree is called a “build.” And for each of the nine classes, there is a completely new set of talents to pick from. The options are seemingly endless.
In reality, there is a logic to follow for each build, for some talents are useless in the overall development of the game, and some are crucial. For most builds, there are about three or four talent points that can actually be spent “freely.” The others need to be put in the right place, or else encounters with end-game bosses will not be possible and, during player-versus-player activities, characters will not be on equal footing; the game will not be fully accessible.
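The gap between “seemingly endless” and “three or four free points” can even be quantified. A sketch that counts raw point allocations with a standard dynamic programme over a hypothetical tree; the rank caps are invented, though classic talents typically ranged from one to five ranks:

```python
# Count the ways to spend 51 talent points when each talent accepts
# 0..max_rank points. The tree below is hypothetical.
def count_builds(max_ranks, total=51):
    ways = [1] + [0] * total                  # ways[s] = allocations summing to s
    for cap in max_ranks:
        nxt = [0] * (total + 1)
        for spent, w in enumerate(ways):
            if w:
                for points in range(min(cap, total - spent) + 1):
                    nxt[spent + points] += w
        ways = nxt
    return ways[total]

hypothetical_tree = [5] * 20 + [3] * 10 + [2] * 5 + [1] * 10   # 45 talents, 150 ranks
print(count_builds(hypothetical_tree))    # astronomically many raw allocations
```

The raw count is astronomical; the number of viable builds, as the fora make clear, is tiny.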
The game’s open-endedness is merely a selling point, a flashy facade, embellished by an overwhelming complexity: thousands of quest lines, hundreds of explorable areas, innumerable creatures and critters, items and objects, mounts and professions, travel paths, portals, collectibles and pets, tokens and emblems, dungeons and battlegrounds, and seemingly unlimited ways of encountering and engaging with all these features. But once players start to get an overview of the tangle of endless possibilities, they end up falling into the fixed structures of quantification that run the show: they follow the stats, talent trees, achievements, gold, tokens and emblems, gear scores and item levels, reputation scores, and honour points, eventually aligning themselves with what the game asks. Entire internet fora and wikis are dedicated to this optimisation alone, hosting discussions and walkthroughs on how to best design a character, how to get the most from the least amount of effort. This is casually called min-maxing.
Min-maxing is the exact opposite of customisation, of free play. World of Warcraft features multiple in-game challenges that each require groups of forty to band together and face a threat. During these raids, players face powerful bosses that are difficult to subdue because they operate in ways different from more common enemies. However, precise tactics to defeat them are described at length on a multitude of websites, and these instructions need to be followed precisely if the group wants to vanquish their opponent. There are intricate explanations on how to min-max a character specifically for each fight. Third-party programmers have even made in-game modifications that warn players of each special attack and tell them what to do in each situation: where to stand, which spell to cast when, when to run away, when to hide, when to strike. As each boss has its own fixed sequence of attacks and movements, the fight can be scripted precisely, phase by phase. Potentially, a group could decide not to follow the script, but doing so will most likely lead to their quick demise. Therefore, every member of the raid needs to know the script and then execute the protocol. Moreover, the entire composition of the raid needs to be min-maxed for it to work. The script includes the best rotations—the order in which abilities are used for every class—and the best macros—automated programs that can be written inside the game to facilitate a succession of abilities. It explains which gear to wear and where to get it, what spells to use and avoid, which extras to supply or produce, and which precautions to take. The script deconstructs a fictional battle into seconds, into stacks, into quotas and prerequisites. Nothing can be left to chance. David Graeber calls this the joy of bureaucracy:
The introduction of numbers, the standardization of types of character, ability, monster, treasure, spell, the concept of ability scores and hit-points, had profound effects when one moved from the world of 6-, 8-, 12- and 20-sided dice to one of digital interfaces. Computer games could turn fantasy into an almost entirely bureaucratic procedure: accumulation of points, the raising of levels, and so on. There was a return to the command of armies. This in turn set off a move in the other direction, by introducing roleplaying back into the computer games (Elfquest, World of Warcraft...), in a constant weaving back and forth of the imperatives of poetic and bureaucratic technology. But in doing so, these games ultimately reinforce the sense that we live in a universe where accounting procedures define the very fabric of reality, where even the most absolute negation of the administered world we’re currently trapped in can only end up being yet another version of the exact same thing.
In this way, the statistics of the game become the game itself, and the management of the statistics the goal. Like fantasy-football management games (Football Manager, FIFA), city management games (SimCity, Cities: Skylines, Transport Tycoon), factory management games (Factorio, Satisfactory), social management games (The Sims, Animal Crossing), or any type of simulator game (Euro Truck Simulator, Farming Simulator, House Flipper), MMORPGs give us the impression that any type of governance can be converted into sliders and graphs, percentages and balances, right and wrong. Building a career in painting in The Sims requires you to produce five masterpieces and sell two of them. Setting up a functional copper production line in Factorio demands two copper mines for every coal mine. And in order to yield the highest profit in Farming Simulator, you need to use barley-harvesting equipment on a barley field when the timer tells you to.
These games present the world as a logical sequence of actions to be performed at just the right time. By following a script, the player is rewarded, and the best results are achieved. The appreciation of predictability is reinforced. This is, of course, true for any game. But roleplaying games are more statistically motivated, or, in the words of David Graeber, “more bureaucratic,” rather than based on the achievement of victories, adrenaline rushes, or the exploration of imagination. They reward not skill or talent, just analytical insight.
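The “script” of these games can be read quite literally as code. A minimal sketch of a priority rotation of the kind the walkthroughs prescribe; the ability names and cooldowns are invented:

```python
# On every turn, use the highest-priority ability that is off cooldown.
# Names and cooldowns are invented; the scripted shape is the point.
ROTATION = [           # (ability, cooldown in turns), highest priority first
    ("Execute", 6),
    ("Mortal Blow", 4),
    ("Strike", 1),
]

def next_ability(turn, last_used):
    for ability, cooldown in ROTATION:
        if turn - last_used.get(ability, -cooldown) >= cooldown:
            last_used[ability] = turn
            return ability
    return "wait"

last_used = {}
print([next_ability(turn, last_used) for turn in range(8)])
# ['Execute', 'Mortal Blow', 'Strike', 'Strike', 'Strike', 'Mortal Blow', 'Execute', 'Strike']
```

Played “optimally,” the character’s behaviour is fully determined; the player merely executes the lookup.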
We can find this logic feeding back into real life through procedural approaches to the everyday, where the quantification of daily routines progressively opens up avenues for streamlining and scripting. Personal health-tracking devices, like Fitbit or Strava, turn physical activities into roleplaying games. The rationale of management games—min-maxing—is imported into the routines of corporeal maintenance. Sports become the management of the body. Getting the most progress from the least amount of effort, and comparing that progression with that of other players, is textbook roleplaying.
Nutritional-assistant apps seem to act as the stat managers of roleplaying and management games, weighing every calorie, every vitamin, and every drop of fat to map out a route to perfection. The user’s health becomes dexterity or stamina, and food intake is detached from taste or pleasure. Even the flavourful parts of eating are scripted by the overflow of cooking recipes and kitchen protocols.
The internet of things, which turns the house into a series of data-registering devices—smart homes, smart assistants, domotics—transforms the routines of the household into preferences and scripts. Managing the house becomes a matter of adjusting settings. Your freezer knows what products you have and lack, and tells you what to order, or orders them for you. The bathtub fills itself when you get back from work. The lights dim to fit your mood.
Customer credits, loyalty cards, and purchase records perfect the sequence of your consumption, presenting you with the best opportunities to buy those items that best fit your personality. When all you have to do is check the boxes of a premade list of personalised goods, you become a role-player in the game of supermarket shopping.
In education, learning becomes a matter of acquiring skills; teaching becomes transferring those skills. Credits are given to all who follow the curriculum as intended, and at the end, the credits award you an achievement. This achievement then unlocks new content, which is unavailable to those players who did not successfully follow the sequence. The list of examples goes on.
I know I speak in hyperbole when directly comparing real life to the statistical nature of roleplaying games. But the logic fits, and the exaggeration only serves as an illustration. We are living the narrative that came from an abstraction of narratives: roleplaying games, as simplifications of real life, are now the sources of the expectations we harbour in real life. We choose who we become, we pick our attributes, and we min-max our lives.
The term “gamification” is often used, rather derogatorily, to denote the practice of reducing the actions of the everyday into a playful sequence that bears similarity to a game. But this is not at all what I mean by the feedback of roleplaying into real life. To call this evolution “gamification” is to do injustice to the gravity with which people engage in roleplaying. If anything, our tendency to quantify real life is more about min-maxing and the sense of achievement that flows from it than it is about making everything fun or competitive. It is very real, as real as Jung suggests persona is. To live life like a roleplaying game does not make life a game.
Yet, like in any roleplaying game, when the maximum level is reached and a character is fully optimised, a player’s interest starts to fade. The problem with manageable achievements is that, after a while, they are all achieved. Statistics for statistics’ sake tends to run a finite course. If there is no story and only numbers, only the naked architecture of computation, then progress ends. Again, roleplaying games have found a solution to this problem.
Through incremental iterations, some games can reproduce themselves. First, players are able, say, to grow from level 1 to 60, then from 1 to 70, and then from 1 to 80, and so on. New content, with higher numbers, greater statistics and more quests, is gradually added to the game. Players can harvest new tokens, gather new gems, or accumulate more currency. They can explore new territories and new parts of new stories, encounter new bosses with slightly different mechanics, equip new gear (making their old gear obsolete in the process), keep new pets, and ride new mounts. The next level is always just out of reach—always higher, greater, grander, more magnificent, and more heroic.
The YouTuber Casually Explained takes a satirical approach to this computational view of life in his video “Life as a Video Game,” which ridicules the transposition of gaming logic onto reality:
So, since the first playable characters came out the devs have been working on the human player type for what seems like 3 billion years, and I’ll admit that I haven’t even gotten through all the levels yet, but I’ve been pretty impressed so far so I want to give a critical review of the current content. Now, if this is your first time playing I definitely recommend picking one of the starting regions with lower difficulty which I put in green here. All the servers are getting pretty overpopulated right now so you can pick one of the yellow regions if the recommended ones are full, but really try not to go for the areas in red unless you really know what you’re doing or you’re just trying to speed run the game. Pretty much all areas are player vs environment zones but there are a few player vs player areas around here and if you really wanna play in a private server you can do it over here, but the admins aren’t so great and it sucks if you change your mind because after you’re there you can’t leave so I’d probably give it a miss.
Okay, at the start you have to choose a race and in the past caucasian male was definitely the most OP, but they’ve all become a lot more balanced in more recent patches at least in the easier starting regions. If you look at the race perks tab there’s still some nice privileges with this choice so that’s what I went with.
Okay so once you’ve chosen your starting region and race you have to go through a 9 month loading screen while your character spawns, and that already sounds bad, but what’s even worse is that once you’ve actually spawned you have to go through 18 levels of tutorial, the first few of which are just interactive cutscenes. Even when you get to start moving around and interacting with the environment, what kind of sucks is that the only other users you really get to interact with are the two parent players that take you through the start of the tutorial, and the trouble is that if they haven’t advanced their parenting skill tree, you end up with really screwed-up stats for the rest of the game. That’s really just down to the RNG of who you get paired up with.
So the first few levels of tutorial really aren’t so memorable, but once you make it to level 5 or 6 you’ve usually unlocked the friendship and knowledge trees, though you do have to spend around six hours a day at a learning academy to keep levelling those up.
Okay so by level 17 or 18 you basically finish the tutorial, and the first thing you actually realise when you’re done is that it had nothing to do with the rest of the game; it turns out the whole tutorial is pretty much designed to sell you the college or university DLC, which is kind of overpriced for what you get, but if you decide to go for it, it can help you get a step up on free-to-play players in the midgame.
So the midgame itself generally takes place between around level 18 and 65 where the two biggest focuses are to sustain the newly unlocked relationship-meter and generating ingame currency which is usually denoted in dollars. So first of all the relationship-meter. This is the built-in drive to find a sexual companion, spend time with them and reproduce. There’s some factions where your parent players choose a companion for you but in most of the western factions you get to choose yourself. You might think this is a way better idea but the problem is they also have to choose you, so it might sound better on paper but make sure you think out the gameplay.
Now, when it comes to generating in-game currency, the most common strategy is to trade your play time for dollars. If you look at most users during this process, while they’re technically still actual players, the majority of them are running scripts, so they’re typically AFK and pretty much indistinguishable from NPCs.
So, because it’s an MMO there aren’t really any distinct objectives, and generally sometime during the midgame you have to figure out what it is that you wanna do. Some players focus on their relationship-meter, some on getting more dollars, some on really maxing out certain skill trees, and some struggle to figure it out at all. There’s no right answer really, just like any other MMO. I think your goal is to just enjoy the time you spend playing without ruining it for other people.
Hopefully when you are a higher level you get to the point where you enjoy helping players who might be struggling. Getting to around level 65 should mark the endgame content, but in practice it is getting later and later all the time despite the level cap remaining pretty much the same. But nevertheless, what’s good here is that you generally have a pretty fulfilled relationship-meter, your important skill trees are maxed out, and you can generally spend the rest of your time exploring the map and doing the things you wanna do.
The trouble is that at this point it can be hard to actually do those things, because you start to experience a lot of bugs with your character. It’s just that, with the amount of variation in each user’s programming, you can’t expect to function perfectly after so many iterations. Around level 80 is when you start to encounter a lot of errors, until eventually the screen just turns black and your playtime is over. A lot of users wonder what actually happens at this point, but no one really knows. Some players think you respawn, some think you only get to play once, some even think you unlock a sandbox mode where you can basically just play over all the good bits on a special map. But you know, it doesn’t seem to me like there’s a big difference between before you start the game and after you finish, so don’t overthink it, and make the most of your playtime.
2_2_3_Tropes & Lores
“In this world where time is your enemy, it is my greatest ally. This grand game of life that you think you play in fact plays you. To that I say... Let the games begin!” — Victor Nefarius
Roleplaying games, both offline and online, are more than just playful versions of statistical problem-solving games. They are not pure brain trainers, like crosswords or sudoku, or sterile management games. Because they are mostly performed in groups, within a community of like-minded, content-hungry players, they rely on a narrative to give their computational containment meaning. Roleplaying games’ narratives give us the impression that the numbers these games use to operate bear importance.
The common way to do this in roleplaying games is by cultivating lore—building a greater contextual narrative that gives a sense of purpose, importance, and urgency to the player’s actions. Lore can be defined by an endless conflict or feud in the narrative, the return of an exterior threat, the downfall of a society’s heroic leaders, or the collapse of an empire creating an imbalance in power.
As a mythology to explain the raison d’être of a fictional world, lore puts in place the logical connections between the origin and the now. It is the fictionalisation of the fiction, with just the right amount of complexity to incorporate all the possible fantasies of all possible players:
Video game creators are prone to making two erroneous assumptions about what constitutes a deep narrative. The first is that volume equals depth. In the classic tradition of epic science fiction and fantasy literature, studios will craft thousands of pages of backstory, often involving many hundreds of characters and vast intergalactic wars. Sometimes it seems as though, early in a narrative meeting, one writer will say to another, “okay, let’s set this in the middle of a war that has been running for 100 years”; then their colleague replies, “No wait, how about... a thousand years?” And then everyone agrees this is exponentially deeper. It isn’t, it’s just an extra nought on the end of a conflict that, without context, pathos or human tragedy, is ultimately meaningless.
Game lore’s creators want their narratives to feel like they have been going on for thousands of years to give the impression that they could go on for another thousand. The player is dropped in the middle of endlessness, and moves from battle to battle without ever getting the impression that this battle might be the last.
The other problem is the belief that obfuscation equals depth. In the Final Fantasy and Metal Gear Solid franchises for example, the timelines, relationships and plot structures are so tortuously complex, so shielded within arcane terminology, that it’s almost impossible to engage on an emotional, empathic level. Yes, Kojima makes lots of super smart postmodern jokes and references throughout his games, but they are buried beneath narrative labyrinths that feel inaccessible, not because they’re intellectually complex, but because they don’t make a whole lot of sense. This doesn’t feel like great story-telling.
Now, I agree that the idea that complexity or volume equals depth is pervasive in gaming narratives, and that it is the result of bad storytelling assumptions. But in roleplaying, these tactics of obfuscation are strategies to render a virtual world lifelike. There is plenty of time to eventually unravel the narrative labyrinth: the game is endless, after all.
The tropes of lore seem to repeat themselves too, copying recognisable sequences of attributes specific to a setting: a Nordic-themed area includes cabins, pine trees, fjords, and longboats; an Egyptian-style dungeon is represented by hieroglyphs, slanted sandstone columns, sandy floors, and scarabs; the undead live in graveyards, carry diseases and cockroaches, and turn water green. Ice giants, lords of thunder, goddesses of life and destruction, the elements of water, air, fire, and earth, sky temples, fiery demons, spell-slinging wizards—these are each tropes connected to an expectation of a narrative and its subsequent decor. They can, as such, be completely disconnected from any human intervention and float entirely on procedure, on script, on normalisation.
Through variation, the world is made endless within its limitations. Computation recycles the bits of lore to procedurally generate the new whenever the player wishes it.
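That recycling is easy to make concrete. A toy generator that recombines the trope sets named above into “new” areas; all of the content is invented for illustration:

```python
# Recombine recognisable trope sets into procedurally "new" areas.
import random

TROPES = {
    "Nordic":   ["cabins", "pine trees", "fjords", "longboats"],
    "Egyptian": ["hieroglyphs", "sandstone columns", "sandy floors", "scarabs"],
    "Undead":   ["graveyards", "diseases", "cockroaches", "green water"],
}
THREATS = ["ice giant", "lord of thunder", "fiery demon", "spell-slinging wizard"]

def generate_area(rng=random):
    theme = rng.choice(list(TROPES))
    decor = rng.sample(TROPES[theme], k=2)
    return f"A {theme} zone of {decor[0]} and {decor[1]}, ruled by a {rng.choice(THREATS)}."

for _ in range(3):
    print(generate_area())
```

No human hand is needed once the trope sets exist; the “new” is a permutation of the already normalised.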
2_2_4_The Gathering
Still, even in the most strictly automated gameplay, there are possibilities for transgression. It is, after all, incredibly difficult, if not impossible, to banish randomness from human life entirely. Computational worlds will always be exploited by the workings of human folly. Putting many people together on an online gaming platform always breeds subversiveness in a process that can show us much about the real world.
Just as the analytical logic of roleplaying games feeds back on our expectations in everyday life, so too can the more emotional aspects. Games foster community, friendships, and ways to communicate over vast distances.
Most notorious in the gaming world are the social dynamics of World of Warcraft, which the game permits but does not preprogram. Online weddings take place throughout the numerous servers of the game, with ceremonies held either in the capacity of roleplaying characters or as in-game expressions of real-life love. Couples find beauty in the virtual world, either as an expression of their shared interest—the game—or as a way to share their connection with the rest of the community and their faraway friends. It may seem strange to hold services in the magnificent cathedral of Stormwind, the human race’s capital, or on a floating rock in the shattered realms of Nagrand, but the act is a way of hijacking a sterile, computational environment and integrating humanness.
Some players hold parties or raves, go exploring the world in groups, have picnics, or find nice spots to take pictures. There are funerals and baby showers, slam-poetry reading sessions and rap battles. Players form conga lines or strip down naked for money. On the occasion of the 2019 release of World of Warcraft Classic, a revival of the 2004 game, so many people started playing at the same time that the servers became overwhelmed. Simple gathering quests became so crowded that people queued to wait their turn. There is nothing programmed into the game that would prevent a player from skipping the queue, but the social control reached through the dynamics of the virtual world kept everyone in check.
When the COVID-19 pandemic forced schools to close, Charles Coomber, a seventh- and eighth-grade social-studies teacher, taught an entire geometry lesson on the makeshift marker board at the start of Half-Life: Alyx, a native virtual-reality game that was released, coincidentally, at the start of the crisis. Ironically, the game is set in a quarantined and highly surveilled world, infected with an alien species controlling the minds and bodies of humans. The math class eerily managed to project a possible future, not only through the dystopian narrative of the game but also by exposing the ability of game adepts to hijack the dynamics of preprogrammed digital environments and appropriate them to mirror real life.
While quarantines and lockdowns continue to globally force us to abandon physical contact and look for social connection elsewhere, our exodus to the online world has proven to be burdensome for many, as it requires the knowledge of an overwhelming amount of different applications and tools, many of them maladjusted to large masses or spontaneous use. However, online gaming communities have been adapting to this configuration of playing-apart-together for decades, employing avatars in virtual renderings of space to overcome geographic distance, building social structures within their guilds and clans as alternatives to society, and subversively bending the virtual environment to their wills by finding the loopholes in the static architecture of gameplay.
The appropriation of virtual space for things that are not preordained, however, does not always result in quirky or romantic side effects. In World of Warcraft, a player will spend a fair amount of time in the game on the collection of specific items that can be looted from the corpses of enemies or hostile creatures. The mechanics of the game determine the chance of any enemy dropping the desired item upon their death. Some items are therefore rare, and others are common. Searching for a specific item, the player may need to slay many iterations of the same type of enemy to raise the chances of finding said item; this is what is commonly called “grinding”: repeating the same action, over and over again, until the item is found.
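Underneath “grinding” sits a simple piece of probability: kills until the first drop follow a geometric distribution, so rarity translates directly into expected labour. A sketch with an illustrative one-percent drop chance:

```python
# With drop chance p per kill, the expected number of kills is 1/p, and the
# chance of at least one drop in n kills is 1 - (1 - p)**n. p is illustrative.
p = 0.01

expected_kills = 1 / p                     # 100 kills on average
n = 1
while 1 - (1 - p) ** n < 0.9:              # smallest n giving a 90% chance
    n += 1

print(expected_kills)                      # 100.0
print(n)                                   # 230: the odds, never a guarantee
```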
Because each killed enemy also helps the player’s character progress in level, grinding is also used as a term for levelling up a character through repetition. Depending on the goal, grinding can take hours or even days. This long and tedious repetition, in combination with an in-game trading system, spawned the development of a real-world black market, where real money can be spent on the grinding of others. These gamers, commonly referred to as “farmers”—as they have to roam the lands of Azeroth in order to find certain items or gold, or to level up a character—would systematically focus on slaying one particular foe, or on the search for a specific resource that appears at a regular rate all over an area defined by the game. As the systematic work of farmers began to overburden the capacity of the game (since the regeneration of items continues at the same rate, with or without farmers), this outside work in time grew to put a huge strain on the gameplay of regular players. This, of course, increased the demand for farmers’ work, as regular players’ already laborious grinding became even more laborious. This outsourcing of in-game requirements and the subsequent growth of the black market did not go unnoticed by Blizzard, the company that developed and manages World of Warcraft. Before long, the company integrated the exchange of items, coins, and characters for real-life money into the in-game store. In addition to the paid monthly subscription needed to gain access to the realms of World of Warcraft, players can now buy their progress through the game instead of working their way up by playing. The black-market exploit became the standard for play.
Another example, one that best describes the magnificent madness that can be induced when a computational world is left completely unchecked and up for grabs for anyone willing to appropriate it, is the 2b2t Minecraft server. Founded in 2010, 2b2t is considered the oldest and most infamous still-running anarchy server, meaning that nearly no rules are active and no moderation is in place; as such, no restrictions or bans are placed on any type of behaviour. Occasionally called “the worst place in Minecraft,” a place of beauty and terror, the server transforms the overall very peaceful and constructive game into a place of grand creation and even greater destruction. Because no rules apply—only the computational logic of the stacking of cubes in three dimensions reigns supreme—this world is swarmed with trolls, bots, and griefers, as well as entire online communities who try to defend the world against them and set up attempts to build or rebuild splendorous fortresses of collective endeavour. After a decade of construction and mostly destruction, of pixelated obscenities and ISIS flags, of hacks and bots automating the demolition of any resemblance of the original landscape, the centre of the world—where every new player enters—has turned into a hellish crater, a disorienting void of lava, ash, and the ruins of once-great castles. The remains are inhabited by cheating survivors—cheating has become nearly a necessity—who try to get as far away from the centre as possible in order to find untouched areas. But as the world keeps expanding endlessly, the possibility of finding peace stretches out further and further as the years pass.
Later, as I pass a floating swastika while shrugging off the latest barrage of anonymous insults […] I realize maybe 2b2t doesn’t represent the pinnacle of human ingenuity. But it still might serve a different purpose: The mapping of a collective mindscape, our virtual id, visualized and digitized for all time. The highs, the lows, the nagging voices of criticism, the thoughts we’d rather not share—who hasn’t felt all of these at some point?
In some ways, 2b2t is a more accurate depiction of humanity than I initially thought. Whether or not intended by its creators, the game gives imagery to an unrestrained stream of populist consciousness, the total summation of a certain segment of our species. 2b2t is like any other human mind: An infinitely expanding plane, filled with ideas both beautiful and terrifying, with an occasional voice on the wind making you feel like a fucking idiot.
The hardcore players of 2b2t have set up an archive of intermittent backups of the server, which allows players to retrieve lost constructions and revisit them in their current state. Of most of the temples and palaces, hardly anything but a piece of roof or a foundation can be found nowadays, but the archive shows the constant progression of the world, albeit a grim and destructive progression, a sort of digital entropy.
The computational world can be inhabited with human intent. The human touch inevitably tends to find its way into the realms of automated normalisation. No matter the degree of social austerity with which these worlds are impregnated, it is hard to rid them entirely of randomness, of exploits and appropriation by individuals or by groups. Communities tend to reconstruct themselves, whether or not they are governed, and whether or not they are destructive.
This gives us an idea about the evolution of the computational world, where statistical roleplaying feeds back into real life. When the logic of computational thinking pervades the experience of the everyday, we can remember the strategies used by exploiters and farmers, by role players and guild masters. There are those who do not play the game as intended, and so change the game to suit their needs.