Wednesday 18 June 2014

Reparations: the case against

Today I read 'The Case for Reparations' in The Atlantic. It is well-written, but far too many of its 15,000 words simply describe the injustices faced by black Americans, rather than argue the case for reparations as the only reasonable and right response to them. Here then are some of the problems that I believe inevitably undo any argument for reparations.


The impossibility of proper justification

Reparations would entail either 1) seizing money from businesses that profited from slavery, or 2) the disbursement of taxpayers' money by consenting governments. To preserve the rule of law (in case 1) or democratic accountability (in case 2), either action would have to be justified by a standard of proof that is impossible to attain in judging a 150-400 year-old crime. That is, it is a burden of proof that historical research could not realistically bear. Company X made a bundle from slavery, lost it in the Depression, and then built up a new fortune from scratch - how much does it have to pay? What about African-American Y, who has suffered misfortune and injustice totally unrelated to race and slavery? Never mind the complex calculations - how would we even gather all that data?

The problem is in making a mixed argument - seeking a political and legal end using historical proofs. The historian cannot provide evidence strong enough to justify a legal decision to seize money, or a political decision to spend taxpayers' money (though I admit the last is debatable).

Reparation is thought justifiable as a form of redress, but it is also a punishment and one for which justification is no longer attainable. We could always disregard the question of justifiable punishment, and say we must pay reparations as a moral imperative, but doing so would damage rule of law and democratic accountability - given their historically proven value as defences against slavery and inequality, this would risk being self-defeating.

Money can't buy everything

Making a society pay for the sins of its ancestors is plainly absurd - if we have to take the rap for their crimes we should also get the credit for the good they did, so I deserve a medal at the very least for great-great-grandfather Hywel's bravery at Sebastopol (he didn't exist really). The reparation argument is saved from this absurdity because it says we should pay only for those crimes that still affect our society.

But this is another example of a muddled argument. Seeking restitution for a past wrong is not the same as reforming society, and I'm not sure the one segues into the other. In fact, this move takes the argument from the frying pan into the fire. 

Consider that if the effects of crimes are monetized, the effects of good deeds should be also. So the reparation amount would have to be abated in proportion to the positive things that have been done to mitigate the legacy of slavery, e.g. the Civil Rights Act, affirmative action. As the reparationist's entire argument is based on the premise that deeds can be translated into dollars, she would have to agree that it would be unjust for US taxpayers and companies to pay the full price of the crime of slavery, because at least some deeds have been carried out to reverse its effects.

But something strikes me as very wrong here. We would have to factor these reparatory deeds into the reckoning, only so as to convert them into a reparatory monetary value - but isn't this last calculation redundant, overlooking the very real value the Civil Rights Act and affirmative action already have as reparations? As they already work towards the same end as reparations (justice for black America), why not continue this process and carry out further such actions in the future? Why bother with the extra monetary calculations?

To illustrate. Let's say a judge decided slavery warranted $100 trillion in reparations, but that the Civil Rights Act and affirmative action amounted to $5 trillion restitution each – after all, no one would deny that slaves in the 19th century lived worse lives than black Americans today. If that reduced the debt to $90 trillion, then further acts of social justice would surely, eventually, reduce the debt to $0. So on the reparationist's own terms, through social reform alone we could clear the debt of slavery, create a fairer society, and avoid entirely the problems of reparations detailed above. This would always be the better, more rational choice.

At this juncture we would surely feel that monetary reparation does not get at the heart of this matter. We intuitively want to take money out of the equation, I suggest, because paying off individuals as a solution to a social problem would be a mistake from the very start. Social dysfunction is remedied by social change (driven by e.g. law, as with the CRA), not by paying off those afflicted by it for their troubles and leaving the dysfunction in place.

So as soon as the reparationist shifts the question - from restitution for a past crime to social reform in the present - cash payment becomes the wrong answer. It just doesn't answer the kind of problem that social injustice is - if the government tomorrow said it would tackle inner-city poverty by paying damages to poor families by way of apology, we'd think it not merely odd but incoherent, logically inconsistent. It would be, I think, a category mistake (there's a further argument to be made for why it is a category mistake, an argument that I perhaps misconceive, but I'll leave it for now - I think it's something to do with the fact that it is difficult, maybe impossible, to identify consistent moral agency across a society changing over time, but necessary if a claim for reparations is to be valid). 

Complete redundancy

So slavery must be atoned for because of its enduring legacy, and to do that we must reform the society that it created. If I am right that this can really only be done by government action, education, better morality etc. then the reparationist's argument becomes indistinguishable from that of the liberal (and indeed the vast majority of reasonable people) who says that we should try to create a fairer and more fulfilling society for all its members. 

So if that's where we end up, why bother with the case for reparations at all? For all the heat and noise generated by its rhetoric, the case for reparations, rationally conceived, proposes nothing that isn't proposed by everyday, vanilla-flavoured liberalism.

This does not exhaust the pro-reparation argument, but neither does it exhaust the anti-reparation argument. There is much more to be said and I'm open to challenges.

Tuesday 17 June 2014

Against critical theory

[This post was previously titled: Philosophy to the rescue? A credo, or something like it]

It’s time I stopped prevaricating and laid out some basic beliefs. So this one’s about what I think is wrong with the humanities (and therefore us graduates schooled in them), why I think philosophy could be a remedy, and why I have misgivings about my own remedy.

(I'm avoiding giving examples for brevity's sake, but will provide if challenged. Update - see here instead for an excellent, and highly amusing, close analysis of an example of theory talk).

To lay out where I stand in all this – I studied English lit. in a theory-heavy English department joint with classics in a traditional, ‘philological’ classics department. I also did some linguistics. Afterwards came a doctorate in classics, accompanied by a side-interest in philosophy that fed into my research; finally (and coincidentally) I ended up working as an administrator at the Philosophy Faculty in Oxford.

My conclusion from this petty odyssey in academia is as follows. Critical theory is:

  1. drek
  2. junk
  3. cack
  4. and balls

Now this won’t do as an argument at all, but at least it gives you a taste of my strength of feeling on the matter. More delicately, then, here is my gripe about how we have been harmed by critical theory (a field in which, he huffed pompously and not a little defensively, I am reasonably well-versed and have achieved some success).


Ceci n'est pas un philosophe.

Leg before wicket and a fair shot


First, a large portion of recent generations of humanities graduates have been taught not only that clarity is dispensable, but also that lack of clarity is a likely sign of a greater intellect than their own at work. 

How did we ever come to this pass? What tailspin into utter, abject debasement has led us to think that understanding one another is not important?

For it seems, how you say, clear that clarity is in fact vital and precious – like the leg-before-wicket rule in cricket, it allows your opponent a clean and fair shot at the argument you are presenting as a target. Now of course, of course, ‘fairness’ is merely a metaphysical construct that can easily be shown to contain fatal internal contradictions… But if we cannot get at each other’s arguments because they are locked away in opaque or nonsensical language, what are our chances of improving them or detecting and rejecting fallacies? And if academia isn’t the business of trying to make arguments better, then what is it? 

This is to say nothing of the toxic, hierarchical notion that students should blame their own deficiencies for the bad arguments of their theoretical elders and betters. It gives a sense of what a stillborn turkey the theory industry really is – it offers an education integral to which is the student’s constant awareness that he is incapable of and unequal to what he is learning. Just as bad, an education that leaves the student feeling unable to tell whether or not an argument is sufficiently clear.

(I’m trying to keep it brief, but I’ve always thought that this degeneracy, this outrageous regressiveness in a supposedly progressive endeavour, is a result of the wider failure of the theory project: while the original theorists had sought to incorporate social-scientific method into literary studies (psychoanalysis, Marxist economics, linguistics), later theorists balked at the democratization implicit in scientific method – that is, the process in which an author’s ideas turn into paraphrasable theorems and principles and thereafter cease to belong to the author, no longer depending on his or her original formulation of them (how many physicists working right now have read Newton’s Principia, I wonder, yet how many use his ideas in some form?). Not liking this at all, the post-structuralists retreated into a viciously regressive, oracular prose that resisted paraphrase or methodization - to study Derrida you need to sit at the very feet of the hierophant himself, or near as damn it. Dead as the author may be, the theory industry does a good job of keeping him alive and well.)


The timidity of the blind


So the student of critical theory is punished into timidity, and like a bullied spouse assumes the faults must all be his or her own. It is also the timidity of the blind – being told that truth, falsity, and reason are naïve fictions to be disregarded is a bit like attending a terrifying dinner party governed by strict and totally arbitrary rules of etiquette – fumbling in the dark, with no sense of what the rules are, most would decide it’s best to say nothing at all. Which, funnily enough, is exactly what happens at most seminars.

The acolyte is stripped of her life-long tools of thought and argument (as I remember it, with a certain relish on the part of the lecturers), and given in their stead bad, shoddily expressed, and poorly reasoned maxims. Which explains those silent seminars - we all would avoid difficult questions and ambiguity if we no longer possessed means of arguing and reasoning our way to answers and resolution. Again - any academic discipline that causes its students to shun difficulty and ambiguity is not an academic discipline.

It is noticeable too how the initiate, stumbling blindly through total arbitrariness, tends to grab hold of anything apprehensible, indifferent to how the bit he has grasped belongs to the larger argument (if indeed there is one). In the absence of true understanding, and probably true reasoning too, he usually clings on to the injunctions and prohibitions of critical theory, in the manner of the medieval peasant who knows what heresy is even if he doesn't understand the theologians' arguments. This is the reason, I suggest, why the fervour with which many literature students reject authorial intention, say, is vastly greater than their actual understanding of those writers who rule it out and their putative reasons. Amidst all the babble and confusion, a retreat into a taboo mentality at least offers some security.

In fact, clinging on to whatever paltry certainties can be salvaged, and never abandoning them, is a vital survival tactic - say the wrong thing and the secret and mysterious workings of theory logic (the profound truth of which, like the words of a Sibyl, is confirmed by their obscurity) will show the student to be making the argument of an essentialist, or an intentionalist, or a capitalist, or a metaphysicalist – maybe even a Tory. Given that the danger of denunciation is ever-present and, in the absence of logic, always unpredictable, it is understandable why theorists and their students shun equivocation and exploratory argumentation in favour of reductiveness. 

Indeed, one of the most boring things about theoretical interpretations of literature is that they so often come round to the same, safe conclusion whatever the text under interpretation. This is either because the theorist advances a theory that is true but only trivially so; or because a myopically reductive vision of the world inevitably makes the things in it look blurred and samey. Or both.


The bad news


The bad news, if you ask me, is really quite bad. A significant portion of our highly educated young people have been schooled in theories that fail entirely to describe the lived experience of being a self-conscious, cognitively rich human being. To be human, the theorist tells us, is to be little more than economically determined, ideologically driven, intrinsically political trash. 

These graduates also have an aversion to reasoned argument, and an inability to argue well when they do venture it. Theory seminars, in my experience (with the exception of a course taught by cultural theorist John Frow), were rather sorry and sterile affairs, largely because argument becomes impossible when your opponent can always disclaim any intention to say something true, when any potential shared premise can be rejected as arbitrary, and when the absence of clarity means that the very subject under discussion is not clear.

They are timid in confronting differences of degree, as opposed to absolute difference, and unsure in reasoning through their own intellectual ambivalence. Fatal is any complexity, but particularly moral/political complexity, that cannot be banished by oracular assertion. Acknowledgement of ambiguity and paradox is avoided – as above, speculative reasoning can only lead to further alienation, and possibly ideological deviancy and subsequent denunciation.

Another corollary is the continuing rude health of the absurd and harmful postmodern notion that the difference between the centre and the margin is a moral one. That is, the relativism that says our moral criteria should vary according to identity. Not only is this arbitrary and unreasonable (how is what-I-am organically linked to judgement of the rightness or wrongness of my actions?), toxic (know me by what I say and think, please), and self-defeating (isn’t it racists, reactionaries, and chauvinists who reduce us to what, not who we are?), it also tends to lead us to censure unfairly and oppress those at the centre, while mindlessly indulging those on the periphery. 

The obsession with identity politics, despite its intellectual poverty, is lamentable. That something as important as gender equality is served by the thin and unthinking arguments of modern-day feminism is a howling outrage. And we can lay it right at the door of Kristeva, Irigaray, Butler et al.

Given that these ex-students constitute a large portion of our intellectual and cultural elite, the question arises – come the day our way of living is seriously challenged, do we want them to be the ones to defend it and argue its value? Would they even be capable? 

It might be objected (or more precisely, insinuated) that I’m clearly reading the Telegraph too much and making ‘England, summer of 1940’ an arbitrary focal point for all moral and political debate. But then the Second World War is already of moral and political relevance to critical theory, which after all was spawned by a generation of variously defeatist, dishonest, and downright treacherous French authors (though this is unfair to Paul de Man, the deconstructionist who spent the war penning elegant solutions to the ‘Jewish problem’, as he was in fact Belgian).


The good news


But there is good news! On the face of it, it feels callously wasteful and almost nihilistic to say that the solution lies in junking decades’ worth of research and writing. But it does lie there, and we should remember that academia is only valuable to us as long as it is able to turn on a dime and reject its mistakes. Alchemy is no more valid and grounded in truth for all the time and effort it swallowed up, and the academy’s willingness to turn its back on its mistakes is a reassuring sign of a lively resistance to dogma. So in the rejection of all those glossy tomes would be the consolation that we are not damned eternally by our errors.

If the problem with critical theory has always been that it ‘is inept philosophy applied to literature and culture’ (so Denis Dutton), is it possible that the remedy is again in philosophy, but this time philosophy done well? By ‘philosophy’ I mean, totally unapologetically, analytical philosophy – dry-as-dust, Anglo-American, ‘if p then q’ philosophy. And I emphatically do not mean the continental drek that helped get us in this mess in the first place.

Actually, by ‘philosophy’ I mean the kind of philosophy-lite that I dabble in myself and incorporate into my literary research. Here’s why I think it can stop the rot:

  • It will teach us how to argue again;
  • It will teach us to be properly reflective again;
  • It thrives on difficulty and paradox;
  • Beyond academia, it will give us the tools for moral debate needed in a post-religious society;
  • Having rigour and method, it interfaces well with science (esp. maths & physics) – the future of the humanities looks dim without some reconciliation with the scientists taking place;
  • More boldly - all academic pursuits taken to their highest level of abstraction turn into maths, physics, or philosophy. Why not give as many students as possible a head-start in making that climb if they want to try it?

I am not saying that we should make all students into analytical philosophers. Rather, if we wish to theorize what we are doing, we are better looking to philosophy than critical theory. Now here are my misgivings:

  • My idea requires the cooperation of philosophers, and they may have no wish to ride to the rescue of the rest of the humanities;
  • I almost certainly over-estimate the clarity of the analytical tradition – in fact, I know for myself that I do, and have been told as much by a philosopher working in the field. Philosophers have their own love of excessive technicality and jargon;
  • I overlook the many others working in the humanities still uninfected by theory - those practising such rigorous skills as textual criticism, archival research, close analysis, and the like. I overlook them mainly because I find them less interesting, too atomistic, and also because they have little truck with the world of ideas (I'm a classicist - experto credite);
  • The division between analytical and continental philosophy is not as hard and fast as I make out, and prominent philosophers of the former stripe are trying to reconcile the two.

And the main misgiving: it is almost certain that I am a little awestruck, and more than a little gauche in the presence of philosophers. Partly because for some time I’ve been a philosopher manqué, which became embarrassing when I started working in a philosophy faculty (fawning fan-boy). The epiphany was watching philosopher of mind Ted Honderich give a talk in Edinburgh around 2002 – the focus, mental acuity, commitment, and aggression of the Q&A that followed the talk were a revelation. They behaved as if something was actually at stake! It made it immediately obvious to me (or maybe even more obvious than before) that I had to abandon the flaccid sterility of critical theory. My original plan was to digress into linguistics, and then return to do a thesis in Eng. Lit. informed by proper rigorous method. Classics intervened, but I hope I ended up doing something not too far off the original plan.

Finally, it is only fair that I say that Clive James’ Cultural Amnesia, which I gave a bit of a slagging in a previous post, is actually excellent on the impostures of theorists, especially in the chapter on Walter Benjamin. Also, it’s probably clear that my picture of a theory-poisoned humanities undergrad resembles quite closely a certain type who writes for, comments on, and reads the Guardian. This foreshadows my next big post, which will be on the failure of the modern left.

Wednesday 11 June 2014

The Beatles, again

Actually, this post isn't about the Beatles as such, it just follows on from my earlier post about them - and the Moptops are probably a bit more grabbing as a prospect than the diffuse thought-piece that follows. 

It will be unusually brief, however.


I said before that anyone would agree that the development of the Beatles' music was extraordinary and unprecedented and the rest, whatever his or her opinion of its quality.


When I wrote that, I was skirting round a more ambitious proposal I had in mind, which is this: 

music isn't in itself well-suited to being remembered, but narratives of the development of music are.
So the most solidly canonical musicians are anchored in wider popular and cultural memory more by the overall story of how their music progressed than by the form and content of the music itself.
Acts with long and varied careers are more likely to have a historical development in their music, and this is one reason why the canon is made up of bands like the Beatles.

Let me expand this a little bit more before you heave a weary sigh.

First of all - music isn't suited to memory. A patently absurd statement, on first view, given the ease with which melodies stick in our minds. But there are many good reasons to argue that music has poor memetic qualities. It isn't very portable, in the sense that it can't easily be adduced in forms of discourse by means of paraphrase or quotation and the like. The only way to experience a song is to hear it, in its single, irreducible form - a hummed version isn't the 'same' as the original to any great extent, and excerpting is unsatisfactory.


However, five different people could come up with five completely different ways of saying 'mid-era Beatles used a range of instrumentation' and 'late-era Beatles made extensive use of arpeggios' but they would still be telling the same story, to a considerable extent.


Moreover they could do it in written or spoken form. Music is not paraphrasable, or retellable, whereas a story of music is. If a musical canon is something constructed by discussing music, then entry into the canon must be easier for music that is more easily resolved into the stuff of discourse (the stuff of discourse being e.g. narratives, arguments, etc.).


Music has relatively little value as a mnemonic - there are of course jingles ('Remember, remember, the fifth of November') but the mnemonic element here is rhyme and metre, not melody. And the same is true, it is said, for the beginnings of poetry itself - it was a way of coding important information into an easily remembered form.


Narrative is much more amenable to transmission and discussion because, I suppose, of its semantic content which transcends the particularities of form - it can be scrambled and revamped, but still mean the same thing. Because of this, then, the big beasts of the musical canon are guaranteed their place not just because their melodies etc. were memorable, but because the collective story they made up is even more memorable. Narrative and argument are the same stuff history is made of - melody and harmony are not.


I'm not saying that the stories of band members, and what they did or didn't do with groupies/TV sets/red snappers etc., are what grant canonical status - but even then, it can't hurt.


Some (really not very convincing) examples:

Beatles (see earlier post)
Rolling Stones - blues band, then go poppy, then invent their quintessential sound during the 1968-72 purple patch.
James Brown - invents funk in 1965; invents it again in 1971.
Fleetwood Mac - start off all bluesy, then become all grown-up.
Pink Floyd - all psychedelic, until Syd goes mental.
David Bowie - etc.

Is this a thing, or not?

Friday 6 June 2014

Rise of China - a paper tiger?

This one's about China - the country, not the ceramics. I'll tell you my thoughts on earthenware another day.

A successful and not entirely unintimidating business figure told me the other day that the University of Oxford is wasting its time running an MBA programme. When China starts educating its business leaders, he said, they will do it their own way - the MBA will become yesterday's news.

Coincidentally, some friends posted Facebook updates also about China, specifically the Tiananmen Square massacre:



I didn't know the massacre was suppressed so fiercely, though it's hardly a surprise. Anyway, this contrast of China's supposedly gleaming future and the reality of its recent past and its present got me to thinking - specifically, it got me to thinking 'you already have some hastily jotted scraps on the rise of China - perhaps this is a way to squeeze some life out of them?'

So, having reached to the back of the cupboard and checked the sell-by date, here is my slightly stale offering. I'm not an economist, or a Sinologist, or a historian, so it's likely to be all very dubious.

A while ago I read a piece by Robert Peston, the BBC economics editor, on the possibility of a downturn in China. It's good - I recommend it. It is troubling that the arguments for China's inevitable rise to dominant world power status come from economists - in a way, it's natural that most predictions of the political and social future are made by those social scientists who have (or imagine they have) the most developed and mathematically sophisticated heuristic tools.

In another way, it is absurd - we would reject as incomplete and hopelessly reductive any history that tried to describe a country's past in solely economic terms, because we know of the many other factors that determine events (politics, religion, social change, culture, individuals and relations between individuals). Why, then, do we give so much weight to economists' predictions of the future when we know they are giving us nowhere near the whole story about what makes stuff happen?

Maybe it's because economics has enough maths in it to make its projection credible - or, less generously, because economics has enough empirical trappings and glamour to make us briefly forget that futurology is a mug's game. As any gambler / mug knows, you just might get away with it - even woefully wrong predictions become yesterday's news and disappear into kind obscurity once the big event has come to pass.

But I'm not really interested in this question. More important - is the prediction of China's rise to superpower status flawed because it is reductively economic? Should we listen to the economists, or are they selling us snake oil?

Consider the previous two world powers, the United Kingdom and the United States, and their ascent:

  1. both were the world's foremost military nations;
  2. both led the world in technological development;
  3. both were among the world's most politically stable nations, and unquestionably the most politically innovative;
  4. both were leading trading and industrial nations.

None of these things can be said of China at the moment, or even conceivably in the near future, apart from the last. It seems beyond doubt that for some time, even the foreseeable, the US will remain stronger, more inventive, and more stable than China. China will work harder and produce more - but is this enough?

I might be committing a fallacy here: namely by imagining that there is some Anglo-Saxon 'special path' that must be followed to achieve global leadership, whereas in fact the Anglo-American virtues 1-3 are merely incidental and not conditions of superpower status - really, all that makes a country top dog is number 4, productivity and growth.

But innovation, and with it a culture that allows and creates innovation, do seem like necessary conditions, namely: 

  • enough political flexibility to adapt to the inevitable social changes (new demands and aspirations) that come with being a wealthy industrialized nation; 
  • the research and development capability to create the military advantage needed to protect overseas investment and wealth-creation. 

Even if the exemplar of the previous two global powers did not exist, I think one would still conclude that radical extension of China's power would require these attributes - as it is, she has a sclerotic one-party system that doesn't allow its citizens to use the internet unsupervised, and a military not yet able to project power (into e.g. Africa).

Turning to America, it is undeniable that, though she may be on a losing streak in various ways (economic, military), we are living in a new golden age of American innovation. China can boast nothing comparable to Silicon Valley, and we might also ask whether its current culture could produce anything even remotely similar. That is, a culture of innovation based on spontaneous creativity, daring to fail, and cheek.

Now for my final attempt to argue, perhaps tentatively, that certain non-economic factors are conditions of the global dominance that the UK and US achieved historically, and the dominance China could putatively achieve. Viz. the global leader's pre-eminent position among other nations is facilitated by relations of assent and imitation - in the first case, assent to its dominance in recognition of its achievements and the possible value of those achievements. Central to the relationship between Europe (Britain specifically) and India was the latter's sense that, though an ancient and advanced civilization, it had met a civilization more advanced than its own from which it could learn a lot - this is despite the moral backwardness of imperialism, and I certainly don't overlook the importance of brute coercion. 

In the second case, the bonds of emulation - the adoption of religious, legal, and political institutions.

I think these two principles - assent and imitation - certainly apply to the British, American, Roman, and Islamic empires. If they do not apply to the Spanish and Mongol empires, then maybe that is proof I am right - dominance grounded only or mainly in coercion doesn't last long. It needs to be at least facilitated by affective bonds of emulation and assent (even admiration).

Frankly I fail to see what China has to offer that the world will want to imitate. Appreciate, yes. Buy, certainly. But imitate? The world will want to buy its products, but this does not translate into assent to its pre-eminence. When the Arab world flared up three years ago, people in Benghazi and Cairo were more likely to clamour for democracy than for Communism, or one-party rule, or Confucianism.

Even if I have completely failed to prove, contra the economists, that China's rise is not inevitable, there is always the fallback of arguing merely for what is preferable. I know which I prefer, and I don't need an economist to tell me why.