The Culture of Computers

“you i i i everything else . . . . . . . . . . . . . .” – Bob, Facebook Chatbot

Culture is a weird thing for humans and the few animals that exhibit it. As biologists have observed the animal kingdom for longer and more continuous stretches, our sense of what culture is and what culture isn’t has been pushed by case studies in creatures ranging from chimpanzees to chickadees. What progress has been made lies in our understanding of culture as a far more mechanical, rigid, and adaptive process than we could have anticipated. As in genetics, the transfer of information is the basis of culture, and mapping genetic patterns of transmission onto cultural ones only highlights that information theory, as applied in dual inheritance theory, is a unifying theme between the two. An understanding of genetics can help us understand what culture is and what it isn’t, as can an understanding of any other kind of information transfer, particularly in the realm of communication, and especially in artificial intelligence.

Balls Have 0 to Me

Last year, media outlets went crazy with reports that the Facebook Artificial Intelligence Research lab (FAIR) had pulled the plug on two of its chatbots after they started to “develop their own language” while talking to each other. In the chat, Facebook pitted two bots against one another in a negotiation over some balls. The transcript is below:

Alice: balls have zero to me to me to me to me to me to me to me to me to

Bob: you i everything else . . . . . . . . . . . . . .

Alice: balls have a ball to me to me to me to me to me to me to me

Bob: i i can i i i everything else . . . . . . . . . . . . . .

Alice: balls have a ball to me to me to me to me to me to me to me

Bob: i . . . . . . . . . . . . . . . . . . .

Alice: balls have zero to me to me to me to me to me to me to me to me to

Bob: you i i i i i everything else . . . . . . . . . . . . . .

Alice: balls have 0 to me to me to me to me to me to me to me to me to

Bob: you i i i everything else . . . . . . . . . . . . . .

Alice: balls have zero to me to me to me to me to me to me to me to me to

Sensationalist media outlets made it seem as if Facebook pulled the plug on this chat because the two bots were plotting to overthrow humanity, but the reality is that the bots were communicating with each other in a form of shorthand. Something like Alice’s profound statement, “balls have a ball to me,” may mean that the balls have the value of a ball to it, while Bob’s statement, “i i can i i i…”, may indicate something like how much it is willing to trade. Facebook pulled the chat because they had the information they needed, and it looked like gibberish.
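To make that shorthand idea concrete, here is a minimal, purely speculative sketch in Python. The idea that the number of “to me” repetitions tracks how much value a bot places on an item is my own assumption for illustration, not anything FAIR has confirmed about how the bots actually encoded value.

```python
# Purely illustrative: treat the count of "to me" repetitions as a hypothetical
# stand-in for how much value the speaker attaches to an item. This mapping is
# an assumption made up for this sketch, not FAIR's actual encoding.

def decode_shorthand(utterance: str) -> dict:
    """Guess the item and a rough 'value' from a repetitive utterance."""
    item = utterance.split(" have ")[0]   # e.g. "balls"
    value = utterance.count("to me")      # repetition read as emphasis/value
    return {"item": item, "claimed_value": value}

for line in [
    "balls have zero to me to me to me to me to me to me to me to me to",
    "balls have a ball to me to me to me to me to me to me to me",
]:
    print(decode_shorthand(line))
# {'item': 'balls', 'claimed_value': 8}
# {'item': 'balls', 'claimed_value': 7}
```

If something like this were what the bots were doing, the “gibberish” would just be a compressed notation for the same negotiation they were trained to carry out.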

Now, interestingly enough, there is something to learn from Bob and Alice: they communicated in a shorthand that completely deviated from the script the researchers expected. In other words, the bots shifted their form of communication in order to take shortcuts. For the anthropologists studying the mechanisms of culture, it’s going to be obvious where I’m going with this.

The Culture of Computers: Similarities

Culture, at its core, is a coordination game. In other words, the primary reason humans possess culture is so that we don’t have to learn things over and over again, and so that we can coordinate negotiation and cooperation with one another in ways we have all already agreed upon. If it can be confirmed that these two AIs understood one another and were adjusting their communication based on mutual intelligibility, then it can be confirmed that they were coordinating with one another and taking shortcuts.

A primary example of a coordination game in humans is the side of the road a country drives on. About 69% of countries drive on the right side and 31% drive on the left, but it doesn’t matter which side any given country drives on. You can go on Wikipedia, and under each country you look up, it will tell you which side of the road its inhabitants drive on. In these systems, all that matters is that each country picks a side so that people don’t crash into each other. This highlights another aspect of culture: its arbitrary nature.
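Here is a minimal sketch of that coordination game in Python; the payoff numbers are arbitrary placeholders of my own. Both all-left and all-right are equally good equilibria, which is exactly what makes the choice between them arbitrary.

```python
# Driving-side coordination game. Payoff numbers are arbitrary placeholders:
# what matters is that matching pays and mismatching doesn't.
PAYOFFS = {
    ("left", "left"):   (1, 1),  # both coordinate: no crash
    ("right", "right"): (1, 1),  # both coordinate: no crash
    ("left", "right"):  (0, 0),  # mismatch: crash
    ("right", "left"):  (0, 0),  # mismatch: crash
}

def best_response(other_side: str) -> str:
    """Pick the side that maximizes my payoff given the other driver's side."""
    return max(["left", "right"], key=lambda mine: PAYOFFS[(mine, other_side)][0])

print(best_response("right"))  # 'right' -- match whatever the local convention is
print(best_response("left"))   # 'left'
```

Either convention is stable once everyone has landed on it; nothing about “right” makes it better than “left”.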

Language, as a product of culture, has both an arbitrary and a non-arbitrary nature. It’s non-arbitrary in the sense that humans can only produce certain sounds, to the extent that a number of them will be present in nearly all languages. It’s arbitrary in the sense that, aside from those mechanics, variation around this limited set of sounds is effectively infinite. The chatbots in this case were highly constrained in what they could say. They very clearly had to use English words to communicate with one another, and I believe it has been suggested that their repetitiveness was an attempt to return to something like the binary system at the core of their processing. In other words, their language was an insight into the way they “think” (much as in Pinker’s view of psycholinguistics). But the point is that the chatbots improvised.

Do Computers Have Culture?

I’ve put a little bit of thought into this interaction, and I think we’re hitting a huge, untapped exploratory realm for both culturology and AI research. Do I think these two AIs were culturally improvising? Obviously not, but we can highlight what we would need to see to confirm whether something is cultural; likewise, by seeing what components artificial intelligence might lack, we can gain further insight into what human culture is through what computers are not, just as our understanding of animals has informed our exploration of the cultural phenomenon.

Here’s what we might need to look for to see if computers have culture:

  1. Non-globally optimized peaks. Part of culture’s amazing ability lies in optimizing for different needs. Because culture is adaptive, and because humans live in different environments with different needs within those environments, culture fragments at a certain level. In America we certainly have American culture, but within that we have Texan culture; within Texan culture we have Texas-Germans and Texas-Czechs; within these you have cultures in each town, and within each town you have sub-cultures such as punk groups, construction workers, bar regulars, and nerds. If a giant AI network were to exhibit culture as humans do, shortcuts would have to be taken in this almost fractal-like state, in a way similar to how it is done in culture. In other words, you would see specialization for different needs rather than a huge, continuously adaptive AI “mono-culture” (see the sketch after this list).
  2. Maladaptation. The thing about culture is that at no given point is it ever fully adapted, and in many cases it is outright maladaptive. The most common example here might be bloodletting and traditional medicine, which in many cases make human beings sicker but are nonetheless carried forward by the mechanics of cultural transmission. In AI, you might find that in order to operate in conjunction with other interlinked systems, sub-components have to periodically update, pare, and shift their own optimization imperfectly based on what is going on around them.
  3. Social transmission (evident through historical particulars). For this to be culture, it has to be “socially transmitted” to begin with. If these bots are simply restarting their entire systems periodically based on a globally optimized mutual intelligibility, rather than periodically building on what is already there, then it isn’t culture. Historical particulars and ratcheting are what lead to cultural adaptation, as well as maladaptation.
  4. Arbitrariness. Finally, like driving on one side of the road, some of this has to be productively arbitrary. If there is some form of recognition system for groups of interacting AI, then we should see a lot of arbitrary variation in recognition between different components. We might see our chat devolve into a number of randomly organized words and characters, but we might also see different-looking chats going on in other systems.
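As a rough illustration of points 1 and 4, here is a toy Python sketch of my own (not a model of any real AI system): several isolated groups of agents each drift toward an arbitrary local convention, leaving a patchwork of local “peaks” rather than a single globally optimized mono-culture.

```python
import random

# Toy model: each isolated group of agents repeatedly copies the most common
# token among a few observed peers, so every group settles on some convention,
# but which token wins is arbitrary and differs from group to group.
TOKENS = ["ball", "i", "to me", "zero", "everything else"]

def settle_convention(group_size=60, rounds=40, sample=7, rng=None):
    """Return the token a single isolated group converges on."""
    rng = rng or random.Random()
    tokens = [rng.choice(TOKENS) for _ in range(group_size)]
    for _ in range(rounds):
        new_tokens = []
        for _ in range(group_size):
            observed = rng.sample(tokens, sample)
            # Copy the most common token among the peers just observed.
            new_tokens.append(max(set(observed), key=observed.count))
        tokens = new_tokens
    return max(set(tokens), key=tokens.count)

rng = random.Random(3)
print([settle_convention(rng=rng) for _ in range(6)])
# Typically prints several different conventions across the six groups.
```

If real networks of interacting AIs showed this kind of patchwork rather than one global optimum, that would start to check boxes 1 and 4.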

These four factors don’t sum culture up completely, nor are they all strictly necessary, but they’re a starting point. In short, there’s a lot to be done here for information theory and for tying it back to culture. I’ll be thinking about this a lot as I read more on the subject. There are clearly a number of stark differences, but there are similarities as well.

In the end, we might find that AI hits all the checkmarks for cultural transmission, but would we be willing to admit it? I’m not certain, and as with any other aspect of things we want to view as uniquely human, we would more than likely shift the goalposts to make sure we’re still special.

Like my blog and want to support my writing? Please consider supporting me on Patreon.

References

Freeberg, T.M., 2000. Culture and courtship in vertebrates: a review of social learning and transmission of courtship systems and mating patterns. Behavioural Processes, 51(1-3), pp. 177-192.

Richerson, P.J. and Boyd, R., 2005. Not by Genes Alone: How Culture Transformed Human Evolution. University of Chicago Press.
