Saturday, August 19, 2017

Intersectional lines of flight

In the most recent anomaly, I use the concept of international supply chains to illustrate the possibilities of intersectional analyses. It is both a joke and an illustration: a joke in that it is not a concept you would expect to see in a text on intersectionality, and an illustration in that there is no real reason why it could not be included in an intersectional analysis. One would have to make a case for including it, but that goes for every other methodological aspect as well, so it is not unique in that regard.

There are always more potential analyses than actualized ones. This is because it is easier to come up with ideas than to go through the months-long, painstaking process of gathering and processing the data. There really is nothing stopping anyone from saying "hey, we should analyze x in the light of y" - the only effort involved is to have the idea in the first place. And ideas are plentiful.

If you've read your Feyerabend, you can have ungodly amounts of fun generating ideas for potential analyses about the most counterintuitive objects from the most unexpected of angles. Indeed, if you've read your Giddens, you have seen it in action; that famous introduction sure is effective in showing how coffee is not just a beverage but also a social institution, a major economic commodity, a marker of social status, and a whole host of other things condensed (and percolated) into one singular thing. There are no real limits to how many approaches you can use - in theory and in mind.

In practice, limits abound. Some limits are related to energy - you only have so much of it. Some limits are related to genres and conventions - you are expected to follow the written and unwritten rules for how to go about things. Some limits are related to empirical applicability - some approaches simply will not work.

The first kind of limit is absolute. The second one is negotiable.

Among those who for whatever reason oppose the notion of intersectionality, it is common to make reference to the third kind of limit. "Atoms do not have genders", they might say, implying that an intersectional analysis of physics is impossible. More specifically, they imply that the objective (and thus scientific) ontic universe cannot be understood using the methods and concepts of the social sciences, and that true scientists should be left alone to pursue their important work unperturbed.

They are usually perturbed when an intersectional analysis shows that 'objectivity' is itself a gendered concept with roots in imperialist colonial practices, and thus cannot be used uncritically to convey what they want to convey. The fact that this is a successful application of intersectional analysis is shoved aside by the assertion that no, it isn't.

Thus, we find ourselves back at the second kind of limit. Genres and conventions.

If you read enough about intersectionality, you will eventually come across appeals to include animals in the overall roster of categories. In its mildest forms, this pans out as arguments to strengthen animal protection laws; if it is unethical to let humans suffer, then surely it is unethical to let other forms of life suffer, too. In more radical forms, we find militant veganism (though, to be sure, it is likely militant vegans found their way to where they are by other routes than methodological considerations). Somewhere between these positions, there is a point where it becomes unstrategic to include animals in your analysis.

It is not difficult to come up with intersectional analyses which include animals. For instance: there is a class (or, perhaps more fittingly, caste) system in place with regards to animals. Some animals (dogs, cats) are pets, and kept around the house. Some animals are slaves to be exploited to the fullest extent of their biology (mutated, deformed fowl who live their lives in dark factories). Some animals are poached for their alleged medicinal properties (tigers, elephants). Some animals are national symbols (bald eagles). I probably do not need to flesh out the differences to successfully convey that there is something to be learnt by performing an analysis along these lines. Or that international supply chains might be involved somehow.

But.

It is unstrategic to perform such analyses. They do not get funded, for one. They also do not tend to be read with a sense of delighted gratitude; more often than not they are dismissed as prattling sentimental nonsense, along with their authors. There are limits to what a serious participant in contemporary discourse can say, and it is solid strategy to be aware of these limits.

Indeed, these very limits are rewarding to perform an intersectional analysis of. I would go so far as to say it is a good idea. -

Friday, August 18, 2017

Who and what to know

A while back, I was attending a social gathering where people came, discussed for a while, and left. There was no fixed topic of discussion, nor any purpose other than the sheer getting together and talking. It was a fluid situation.

At one particular moment, those present got to talking about family relations and relatives. There were old folks present (persons in their sixties and upwards) who talked about their relatives and relations in terms of individuals. The reference points went along these lines: he was the one who was married to her, and they had that fancy car, remember? Or: remember the old man who lived on that hill back in the day - he had a nephew, who married this other person who ran that store, and so on.

For those listening in on the conversation without knowing (and thus not remembering) these particular facts or persons, this line of describing who's who will remain a work in progress. More information is required about the nature of marriages, cars, hills and other aspects of local historical memory to make sense of it all. It is a situated knowledge about a specific cast of characters, and the only way to really become someone in the know would be to stick around long enough to become situated.

After a while of establishing who's who, someone asked one of the young persons present if they knew the children of those discussed. As it turned out, they did, in a way. They knew of these persons, but had never really interacted with them in any significant fashion. The most succinct summation of the situation put it thus: oh yeah, him. He was in B, so I never talked to him.

This is a distinctly different way of relating to social relations. The B in this case refers to an administrative subdivision of school populations - 6A, 6B or 6C. These are all sixth grade, but for purposes of keeping group sizes manageable, divided into three groups. Referring to B as a known fact implies knowing these administrative subdivisions and their social implications, which is a radically different way of organizing who's who than the individual-to-individual approach outlined above.

The old folks present did not know the specific implications of the letter B. But, being old and wise, they picked up the gist that this letter somehow meant that the individuals in question did not know each other, and continued the discussion armed with this new nugget of contextual information.

The difference between young and old in this case is not subtle. In fact, it seems to be taken right out of some introductory textbook on sociology, the kind that describes the gradual expansion of bureaucracy into more and more aspects of our lives. The old ones thought in terms of individuals; the young ones in terms of administrative subdivisions. It was, in a single moment, a crystallization of modernity.

It was a strange moment, and I have pondered it ever since.

Thursday, July 27, 2017

In the mood for some discourse

The two most recent discursive anomalies share a theme. That theme is, somewhat unexpectedly, mood. Or, put another way: the way reading a particular text makes you feel, and how that feeling affects your thoughts.

In case you are reading in the future, the two anomalies in question are the ones about Hyde and Booth. Since texts are always retroactively present, you can sneak over to read them without missing a beat. Go on. These words will still be here.

Mood is an underrated concept. Sometimes it is dismissed outright, as part of the overall category of 'feelings'. At other times, it is seen as a distraction from the main point of interest, e.g. 'not being in the mood', 'being in a bad mood'. There is a tendency to see mood as something that happens beside the point, and to insist that reality happens without you while you are distracted by these irrelevant moods of yours.

Besides being both rude and bordering on gaslighting, these takes have the additional drawback of being wrong.

Booth is perhaps most explicit in his discussion of moods. One of his premises is that the reason you keep reading a particular text - a romance novel, a cartoon, a crime novel - is that you want more of whatever it is you are reading. The point is not to see if the lovers stick together, what the punchline might be or whodunnit, but to extend the present experience of reading, whatever it might be. The act of reading the text puts you into a certain (albeit at times intangible) mood, and it is this mood that fiction provides. Far from being a side point, mood is for Booth the express purpose of reading. And, by extension, writing; to create an artifact in the world that conveys the kind of mood the author is interested in conveying, and thus creating an opportunity to explore this mood - both by experiencing it through reading, and by the creative act of criticism.

If you are a podcast listener, you might have experienced a peculiar kind of sensation: that of listening to people talk about something you are utterly uninterested in, but find the discussion itself fascinating and worthwhile. This is the mood Booth writes about; the state of mind the act of partaking of something puts you in, regardless of what the subject matter happens to be.

When Booth says that books are friends, this is what he means. You can pick them off the shelves and read for a while, and be comforted by their company; they raise your mood, as friends are wont to do. His approach to criticism is this: if what you have written can provide good company, then it has merit, and writing should strive to attain such merit. To be good company.

Hyde approaches the same theme from another angle, that of rhetoric and philosophy. Moods are not just something that happens while reading, but are the guiding principle behind our thoughts and actions. If we like the places we inhabit - dwell, in his word - we will act towards them in certain ways, presumably with the intention to preserve and decorate these places. If we do not like them, the mood will be different, and our actions will follow suit. Mood is what motivates us: thus understanding mood means understanding ourselves and our place in the world.

The punk aesthetic can be understood in this light. It defines itself against the status quo and seeks to rebel against it. The point is to be something different than what is on offer by the powers that be. The fact that it is seen as ugly and vulgar by those who are attuned to the mood of the times is one of punk's express aesthetic purposes, and only adds to its appeal for those who share the sentiment.

Hyde maintains that seeing mood as guiding principle places a certain ethical responsibility on us as discursive actors in the world. When we write something, we do not simply convey a certain number of facts in a certain order and with a certain degree of accuracy - we also convey a mood. More so when engaging in public speaking, as our presence defines the mood in the room with regard to the subject matter discussed. What we say and how we say it matters, and it falls upon us to think about our impact on those who listen.

Taken together, these two variations on the theme of mood give us a foundation on which to build further thinking about critical reading and writing. At its most basic, it allows us to ask what mood a particular artifact puts us in or is written to foster. It also allows us to reflect on our own writing, and ask ourselves if we convey the appropriate mood alongside what we want to say. At its most simple, thinking about moods this way asks us to pay attention, and to act on what we see.

More indirectly, the notion of mood gives us an opening to understand why certain people like certain works or genres. There is no shortage of writers and podcasters who do little else but repackage things that have already been said elsewhere, but who add the element of mood. Being able to understand that it is this mood that draws their audience allows us to understand why they do what they do - 'they' being both audience and authors.

A benign example is why readers like the sharp wit of someone like Jane Austen; the way she depicts social interactions and relations is a very distinct kind of mood indeed. On a less pleasant note, many partake of racist media just for the sake of the mood therein: hearing someone else talk about the negroes and their decadent ways gives permission to maintain that mood and mode of thinking. Keeping mood in mind allows us to understand - and critique - these things in a more interesting way.

Closer to home, it also opens the door to understanding home decoration. The point is not simply to look good, but also to suggest a certain mood. A sidenote, to be sure, but I want to indicate the general applicability of these things.

I suspect that both works discussed above might be slightly obscure to the general reader. Booth published The Company We Keep in 1988, and Hyde's anthology on The Ethos of Rhetoric came out in 2004. I also suspect that, should you have stumbled upon these books in the wild, you might not have found them particularly interesting - they are both, in a way, intended for specialized audiences. While the point of writing discursive anomalies about a particular thing is to encourage readers to pick up these things and read for themselves, in this case the point is more to convey the general mood of these two books. To introduce you to a concept you might otherwise miss.

But, then again: that is the point of most writing about writing. -

Monday, July 24, 2017

Human-level intelligences and you

There has been much ado over the years about computers becoming as intelligent as humans. Several goals have been set up and surpassed, and with each feat of computer engineering we have learnt that intelligence is a slippery thing that requires ever more refined metrics to accurately measure. Beating a human at chess was once thought a hard thing to do, but then we built a computer that could do it - and very little else besides. It is a very narrowly defined skill being put to the test, and it turns out intelligence is not the key factor that determines victory or defeat.

Fast forward a bit, and we have computers giving trivia nerds a run for their money in Jeopardy. Turns out intelligence isn't the defining factor here either, on both sides. For computers, it's all a matter of being able to crawl through large amounts of available data fast enough to generate a sentence. For humans, it's a matter of having encountered something in the past and being able to recount it in a timely fashion. Similar tasks, indeed, but neither requires intelligence. Either the sorting algorithm is optimized enough to get the processing done on time, or it is not. Either you remember that character from that one soap opera you saw years and years ago, or you do not.

The win condition is clearly defined, but the path to fulfilling it does not require intelligence proper. It can go either way, based on what basically amounts to a coin toss, and however you want to go about defining intelligence, that probably is not it.

The question of computers becoming as intelligent as humans has ever so gradually been replaced with an understanding that computers do not have to be. In the case of chess, a specialized dumb computer gets the work done; the same goes for other tasks, with similar degrees of dumb specialization. Get the dumb computer to do it really, really well, and the job gets done.

If all you need is a hammer, build a good one.
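As an aside, the point about dumb specialization can be made concrete with a small sketch. The following Python toy is entirely hypothetical - it is not how any actual quiz-show system was built, and the corpus is made up for illustration - but it shows how a prompt can be "answered" by nothing more than keyword overlap against stored snippets. Speed and lookup, not intelligence:

def score(prompt, snippet):
    # Count how many words of the prompt appear in the stored snippet.
    return len(set(prompt.lower().split()) & set(snippet.lower().split()))

def best_answer(prompt, corpus):
    # Return the answer whose snippet shares the most words with the prompt.
    return max(corpus, key=lambda answer: score(prompt, corpus[answer]))

corpus = {
    "Deep Blue": "chess computer that defeated garry kasparov in 1997",
    "Jane Austen": "english novelist who wrote pride and prejudice",
    "Babylon 5": "space station science fiction television series",
}

print(best_answer("This computer beat Kasparov at chess", corpus))
# Prints "Deep Blue" - a correct answer, produced without anything resembling thought.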

A more interesting (and more unsettling) question is when a human becomes as intelligent as a human. This might seem somewhat tautological: 1 = 1, after all. Humans are human. But humans have this peculiar quality of being made, not born. As creatures of culture, we have to learn the proper ways to go about living, being and doing. And - more to the point - we can fail to learn these things.

Just what "these things" are is a matter of some debate, and has shifted over the years. A quick way to gauge where the standards are at any moment in time would be to look at the national curricula for the educational system of where you happen to be, and analyze what is given importance and what is not. There are always some things given more attention than others, some aspect promoted above others. And, at the core, some things are deemed to be of such importance that all citizens need to know them. Some minimum of knowledge to be had by all. Some minimum level of intelligence.

And there are always a number of citizens who do not qualify. Who are not, for any given definition of intelligence, up to it.

When does a human become as intelligent as a human?

Friday, July 14, 2017

Some words on media permanence

It is a strange thing about media artifacts that some of them age well, while others do not. Some can be forgotten for decades, only to find a new audience willing and able to engage with them. Others cannot be revived as easily, and are thus consigned to reside only in the memories of those who were there at the time.

To be sure, this applies to things that are not media artifacts, too. Things happen, and after they have happened you were either there or you were not, and your memories of the event are shaped accordingly. It is a very important aspect of the human condition.

But the point of media artifacts is that it is possible to return to them at a later date. They are supposed to have some sort of permanence - it is a key feature. Books remain as written, pictures as pictured, movies as directed. It would be a substantial design flaw if these things did not last.

Though, then again, some things do not last. Books fall apart, movies fade, hard drives crash. Entropy is not kind to supposedly eternal things. Look upon these works, ye mighty.

But. All these things aside: some media artifacts age well, and some do not. Some can be readily introduced to new audiences, while others remain indecipherable mysteries even upon close encounter. There is a difference, and it is very distinct from the question of whether or not we're trying to jam a VHS tape into a Betamax player.

This difference can be clearly seen if we contrast Deep Space Nine and Babylon 5. Despite being from roughly the same time, belonging to the same genre and sharing a non-insignificant portion of plot elements, one of these television series is instantly accessible to contemporary audiences while the other is not. Though it pains me to say, it takes a non-trivial effort on the part of those who are not nostalgically attached to Babylon 5 to view it with contemporary eyes. A certain sensibility has been lost, and the gloriously cheesy CGI effects turn into obstacles to further viewing. Surmountable obstacles, but obstacles nonetheless.

The same goes for computer games. I imagine that, should we use the Civilization series as a benchmark, there would be different cutoff points for different audiences. For my part, the first iteration is unplayable, and I suspect many of my younger peers would balk at Civilization 2. I also fear that 3 or even 4 might be too steep a learning curve for those who were not there to remember them. Not because the games are inherently impossible to play, but because the contemporary frameworks for how games are supposed to work (and how intuitive user interfaces are supposed to be) have shifted between then and now.

A certain sensibility has been lost.

It would be a mistake to label this development as either good or bad. The young ones have not destroyed theater by their use of the lyre, despite all accounts to the contrary. These changes are simply things that have happened, and have to be understood as such. Moreover, they are something to take into account as yet another generation grows up in a society overflowing with media artifacts, old and new.

Some of these artifacts will constitute shared experiences, while others will not. Such is the way of these things.

Saturday, July 8, 2017

Care for future history

These are strange times.

Since you're living in these times, the above statement is probably not a surprise to you. In fact, it might very well be the least surprising statement of our time. Especially if you happen to have a presence on twitter, and even more so if this presence is in the parts where the statement "this is not normal" is commonplace, or where a certain president makes his rounds. The two are related, in that the former refers to the latter: it is a reminder and an incantation to ensure that you do not get used to these strange new times and start to see them as normal.

These times are not normal. These times are strange.

In the future, there will doubtless be summaries and retrospectives of these times. More than likely, these will be written with academic rigor, historical nuance and critical stringency. Even more likely, all the effort put into making them so will be rendered moot by this simple counterquestion:

Surely, it wasn't that strange?

We can see this future approaching. Less strange times will come, and frames of reference will be desensitized to the strangeness of our time. In a future where it is not common for presidents to tweet at the television as if encountering the subject matter for the first time, the claim that there once was such a president will seem extraordinary.

Surely, it wasn't that strange?

It behooves us - we who live in these strange times - to leave behind cultural artifacts that underline and underscore just how strange these times were. Small nuggets of contemporaneity that give credence to the strangeness we ever so gradually come to take for granted. Give the future a clear indication that, yep, there is a before and an after, but not yet, and that we knew it.

It is the implicit challenge of our time.

Better get to it.

Sunday, May 21, 2017

Concerning the Dark Souls of US presidencies

It has been said that the current president is the Dark Souls of US presidencies. Which, to be sure, has a certain ring to it, but it lacks the virtue of truth. Let's explore the issue for a spell.

Dark Souls is a series of games built around the notion of gradual player progression. The games might seem hard at first, but if you stick with them you learn how to overcome that difficulty and become good at what the games ask you to do. The difficulty is not mechanical - the challenges do not require superhuman reflexes or superior skills to overcome - but rather psychological. By failing, again and again, the player gradually learns what needs to be learnt. The reward for this application of patience is the opportunity to excel whenever new situations arise that require the very thing just learnt. It is the player leveling up, rather than the player character.

Meanwhile, in the background of all this character development, a world and its long tragic backstory are ever so subtly unfolding. It is not a simple backstory, where this happened after that, but a series of subtle implications of social relations and emotional states of mind. Complex social processes led to cascading catastrophic outcomes which in turn sparked other social processes which -

It is a deep and complex backstory, and for the sake of brevity, it will all be ignored. Suffice to say that much of it is left unsaid, and that the player will have to piece it together from archeological fragments, old legends and features of geography.

From this description alone, you might see what I'm getting at. Gradual self-improvement through patience, slowly unfolding understanding of past events through contextual knowledge, and the characterization of subtle states of mind - none of these things are applicable to the current president, even with excessive use of shoehorns or cherrypickers.

There probably is a past president who would live up to the title of the Dark Souls of US presidencies. But that is a topic for another cycle.