The “Problem” of the English Discipline and the Rise and Possible Fall of Legal Positivism

Dispersing the Culture of the Rabble from the Comforts of Rule

The theme of Chapter 2, foreshadowed in Chapter 1, was that the rulers of England, faced with a series of crises – dangerously uncontrolled monarchs, bitter religious differences and powerful overseas enemies – had to construct authority out of consent. Sovereign authority – that of the Stuart monarchy – and colliding certainties – those of the Protestant sects that proliferated during the civil war period – were what had brought them to the brink of disaster, courting political disintegration and tyranny or possible foreign conquest, the last two of which might have amounted to the same outcome. Civility, politeness and agreeable disagreement, I suggested, provided for some time a way out of the crisis. This, to a degree, broke down less than a century later. Then, for more than a century, the Benthamite, pedigree view of law as merely the will of the sovereign, rather than the bearer of a content, gained, in my argument, a great deal of persuasive power. This was what literary theorists might term a shift to the authorial text, where meaning is restricted to what the writer intended (or what someone authorized declares the author to have intended). The alternative is a readerly text, where meaning is importantly connected with the context of its reception – the time, the place, the reader, etc. In 1763, the Seven Years’ War produced global power for a Britain about to encounter class and eventually gender challenges within. It was the “Annus Mirabilis” of 1759, with victories seemingly everywhere, that, whatever the inclinations of Prime Minister Lord Chatham, persuaded his successors of British invincibility, convincing them soon after that they could not only crush the American rebellion in the 1770s but simultaneously sustain a conflict with almost all of Europe. The lack of success here was, unfortunately, insufficient to discredit the authoritarian tendencies in government.

Despite what may have been the early reform intentions of the younger Pitt (Chatham’s son, although a Tory), “(d)espite the humiliation of the Treaty of Versailles”, the surrender to the colonies, “nothing better reveals the might of Britain” than “as Michel Besnier put it, she lost the war but immediately afterwards won the peace. She could in fact hardly fail to win, since she had all the trump cards”.1 Allowing for Braudel’s exaggeration, there may be some truth in the assessment. But, if we are making connections between empire, the authoritarianism encouraged by it, and the drawbacks I have associated with lawyers’ law – the conception of law without Burke’s “grandeur” – this development was not necessarily, as Sellar and Yeatman might have put it in 1066 and All That, a good thing.2 Nevertheless, the Eden Treaty of 1786 dictated the terms of trade for Europe and, despite his later embargo on the import of British goods to French-occupied Europe, it is said that Napoleon’s troops marched into Russia with buttons made in Birmingham. Great Britain was, for internal purposes, an empire in a new way at the end of the century.

I have emphasized, no doubt too starkly in order to make the point, that the externally aggressive state that emerged from the 1688 Revolution and the 1707 Act of Union was, compared with other European dynastic agglomerations (the United Provinces were another exception), internally unusual. As early as the 1640s, Colonel Rainsborough could celebrate his liberties and political equality with the conclusion that to doubt their (nevertheless contested) truth would be to doubt if one were English. The sentiment would be repeated in mid-eighteenth century America. White male Americans, at least, and no doubt many women, constructed themselves as “American” just as Englishmen seemed to have constructed themselves, at quite an early date, as “English”. It was not until very much later, according to Robb, that ordinary people in France developed a sense of Frenchness.3 It matters less, of course, that there should have been a consensus than that it should have been an issue at all. Coke, Ireton, Lilburne and the members of the Convention Parliament would all have had different reasons to doubt whether the views expressed by another justified doubt about his (usually his) Englishness, but doubt of this kind was an important component in the formation of cultural and political identity, perhaps more in the Anglophone world at this time than elsewhere. The English subject in the Lacanian mirror was, naturally, multiple and contested, but it was, from the early seventeenth century, imagined vigorously and with energetic collusion from the Scots and the Protestant Irish. The nineteenth and twentieth centuries were crucial to how the ambiguous inheritance of this past was to be received. One legacy, I have suggested, was, not coincidentally, the simultaneous authoritarianism and closure enacted by Jeremy Bentham and the Westminster sovereigntists of George III’s ministries.
The other legacy, denounced in the 1750s as effeminate, was the civility recommended by Locke and others; yet both persist like viruses which, once caught, remain in the body.

The subsequent tensions play out in unexpected fields, but with results that may, in the twenty-first century, yield grounds for optimism. A strong commitment to a polite politics, even, as many historians have shown, in riotous situations, may finally, strengthened by the connection with the new Europe, have civilized, or rendered shorter-lived, the harsh “there is no alternative” of the Thatcher years. We shall encounter equally harsh opposition to “effete” culturalism from unexpected sources, but for the moment it will suffice to notice a recent commentator:

It is no accident that Europe, one of the most powerful regions and certainly the most peaceful and prosperous in the world, comprises a community of states with a shared commitment to the European Convention on Human Rights … Both … Labour and the Conservatives are discussing the development of a British bill of rights to build on existing protections under the UK Human Rights Act and the EU Convention …4

The Victorian era in Britain is in some ways undecipherable largely, it seems to me, because its observers want answers which are, for our purposes at least, unavailable. Strange alliances, no doubt unintended, emerge. For the Victorians, the question was one of how best to manage a bewilderingly polyglot population, not merely overseas, but also, as was recognized in a way it had not been earlier, at home. In a sense, indeed, people had suddenly become not at home, and the uncertain uncanniness generated by two waves of industrial and agricultural change and by associated demands for political change left large questions about order unanswered. The Victorians’ solution was, as we saw, neatly embodied in Macaulay: education and legislation in no particular order. Repression, if left to men like Lord Liverpool (a humane man by reputation, but one who acquiesced in the infamously repressive Six Acts of 1819 in response to the peaceful protests in favor of parliamentary reform at St Peter’s Field, Manchester) or James Fitzjames Stephen, with his famously reactionary views; educing the best a person could be, or “sweetness and light”, for Coleridge and Arnold; or something banally in between for men like Robert Lowe, whose “revised code” of education, following the Royal Commission report of 1861, may deserve some comment.

I have written of both the “missionary” impulses of the middle classes in England and their parallel ambitions overseas, and of the autodidactic “improving” motives of the formally uneducated themselves, in the educational area. From before the time of John Bunyan, a contemporary of Lilburne’s, in the seventeenth century, and from a mixture of motives in harsh conditions, men and sometimes women who regarded themselves as humble energetically sought solace and understanding in their own critical grasp of what we might call high culture. Because of the influence of the Kirk on literacy and learning in Scotland it was, in my youth at least, common to find a Scottish mill-hand or collier who would quote from Burns, or Scott’s The Lady of the Lake. Children who left school at fourteen would know the dates and significance of Bannockburn and Flodden for the reflections, the imaginary of their identity as Scots as opposed to “English”.

The autodidacts and the self-improvers who labored by candlelight after a day’s heavy labor elsewhere aside (but not too much aside), it was the missionaries, as they saw their largely secular selves, and the ever-latitudinarian Christian socialists who stood in for the Kirk in England. We can find countless examples of unfortunate expressions. Is it really necessary to shame the hobnailed working man into politeness by comparing him to a clown in a boudoir? A glance at the letters of Lord Chesterfield to his son about clownish aristocrats, whose behavior the son must avoid if Great Britain is to cohere, reveals a similar (necessary?) cruelty. My own grammar school headmaster, explicitly renouncing any “missionary” role in favor of social work, writes – risking the patronizing tone of an institutionally accepted “improved” working-class boy, but also reminiscently of Smith, holding the mirror up to the “English” – that curiosity means the tolerance of uncertainty, resistance to the “insolence of office” that would instruct otherwise.

He opens, and I quote him at length:

The position of the grammar school in modern society is a crucial one. It stands at the meeting point of the elite and the popular cultures. Young men and women who wish for a higher education will normally pass through its sixth form. Increasing numbers of children, whose families have had no experience of the grammar school and little of the cultural standards which they will be expected to assimilate there, are crowding into its courses. Many of these come from that large area of society which is occupied by what used to be called lower middle and upper working classes, though it is doubtful if such terms are helpful under modern conditions, and it is certain that accurate descriptions of class characteristics becomes increasingly impossible. It is to children from homes like these that the school must transmit our cultural heritage …5

To return to Lowe: his legislation was almost exactly a century before the publication of Davies’ book, and implemented the recommendations of the Newcastle Royal Commission’s report on the education of the working class and the proper role of the government in that task. We know that he, along with the rest of the government, wanted cheapness. Whilst the debacle of the Crimean War in the 1850s had cost £78 million, an annual impost of £723,000 in educating the poor seemed to them outrageous. It had to be, and was, reduced. We can only speculate on Lowe’s other political motives, although his fierce antagonism to democracy may provide a clue. Knowledgeable and articulate people denied participation in their society would be troublemakers, as he had learned from his experience in the Assembly of New South Wales. Schools that were already government-funded were, of course, to be made cheaper. The regime was to be one of cramming and regurgitation, not of comprehension or inquiry. Teachers were to be paid (and assessed) by managers according to the rote learning of children examined annually by inspectors. How important was it that the working class be discouraged or distracted from, in Arendt’s sense, thought? Quite important, I believe. It is a mindset with which we are becoming, unfortunately, increasingly familiar in the present:

Since getting as many children as possible to pass a mechanical examination of a mechanical knowledge of the three Rs was the government’s main requirement from teachers, there was really very little need to ensure that they received a broad liberal education … there was little financial incentive to help the clever child realize his full capacities.6

Of course, as Rapple points out, Lowe’s policy was mean-spirited and dismal. It applied a utilitarian logic that presumed knowledge was potentially certain, applying Newtonian science to the universe and utilitarianism to social issues. The poor deserved their poverty, for the most part, because they were unintelligent. Lowe knew from Australia what chaos could result from the multitude’s being given a political voice. In England, at least, there was certainty, threatened but saved from questioning. In a few decades, a great deal would change, no thanks to Lowe, but drawing on the inheritance of the Scots, and of Coleridge and Arnold. They were and are all, of course, criticized, which is precisely what they wished for and encouraged.

Legal Positivism as a Plan to Institutionalize Authoritarian Culture

With the cultural phenomenon, legal positivism, and its gradual acceptance after Bentham’s Fragment in 1776 and the Indian conquests, we are returned to the gravitational field of certainty, perhaps unsurprisingly. Victorian novels and law reports are replete with accounts of financial insecurity, fraud, bankruptcy and ruin. With Conan Doyle we can map the cultural oscillation perfectly.7 Sherlock Holmes epitomizes the scientific pursuit of achievable certainty, in contrast to his bumbling foil, Watson, a medical doctor. Like Watson, Doyle was a medical doctor, no doubt as aware as his character of the uncertainties of diagnosis. Holmes was, perhaps, a comfort. Doyle’s later character, Professor Challenger, on the other hand, with his discovery of the earth as a living creature, and Doyle’s own later spiritualism could have provoked only contempt from the character, Holmes. For our purposes, then, the Victorian legacy repeats a phenomenon already encountered: a yearning for certainty, haunting the imaginary, but with the uneasy suspicion of its unattainability.

The demise of Absolutist government in Germany, Austria-Hungary and Russia after World War I, economic, social and national instabilities, the rise of totalitarian “solutions” to those problems and the response by mediocre politicians to continuing social disorder “at home”, perhaps prolonged the life of constitutional and other dogmatisms.8 “Government … by bores … by men who were barely equipped to deal with the two great issues of the time: poverty at home and the rise of fascism overseas”,9 coupled with World War II, delayed what I see as a return to something like the sentiment of liberal Whiggism in the late twentieth century with the maturing of the European Union. This return had to await the slow revolution – in the non-millenarian 1688 sense; a return to the past, or, better, to the past as it might have been – generated by Britain’s very gradual association with the new, postwar Europe, the domestic adoption of the European Bill of Rights and the resurgence of the common law. The priority of defending an obsolete imperial order at home and overseas was until recently reflected in governments’ and theorists’ approaches to law.

I remarked earlier on Arendt’s distinction between the American and French revolutions. The balance of powers in the new American Constitution, designedly an improvement of the “balanced” constitution of early eighteenth century Whig Britain, she contrasts with the French substitution, some decades later, of the general will for Bourbon Absolutism, leading the way to Robespierre, the Terror and Napoleon. The Whig constitution, resolved in the years following the English revolution, was, I suggested, under threat during the emergencies, real or imagined, at the turn of the nineteenth century, when the example of how to govern India, as the British saw it, seemed a useful one for governing the unruly at home. Because the emergencies only intermittently went away, class and gender antagonism grumbling throughout the Victorian era and beyond, into the eras of world wars, the temptation to invoke, in Britain, the strong leadership deemed appropriate in the empire remained; and strong leaders are tempted to legislate before they negotiate, or so AN Wilson writes in his sweeping histories of the Victorian and post-Victorian periods.10

Authoritarianism in Britain never approached the enormities of twentieth century European totalitarianisms, although, as I pointed out in Chapter 1, one government’s oppressive behavior cannot be excused on the grounds of its killing fewer people than the government of somewhere else. Such an excuse seems to me, at least, pointless. And whilst “human (i.e. political) rights” curbs on British executive governments have become more practically effective under the influence of a reformed Europe and the devolution of much government from Westminster to Wales – more still to Scotland – the “war on terror” appears to have enhanced executive power in the United States. This has had predictable consequences: the “extraordinary rendition” of untried prisoners to foreign countries where they can be tortured with impunity, and the creation of an “Alsatia” like Guantanamo, a place belonging to, and run by, the United States by reason of a treaty imposed upon Cuba, but in which no constitutional protection from torture is available. By a bare majority of the US Supreme Court, Guantanamo prisoners’ rights to due process were recently affirmed, but the court now seems a feeble obstacle to executive power.11 The executive branch of government in the United States, from Roosevelt through Bush Junior, has arrogated to itself enormous powers, whilst a resurgent common law in New Zealand and more so in the United Kingdom seems prepared to challenge the executive in ways that were once familiar in the United States in the years of the Warren and Burger courts.

Obviously, my book is concerned with the vicissitudes of the English subject at the interstices of legality and empire – and I have assumed the temporally contingent nature of the last two, as well as the first. A brief glance at current events in the United States – the increasingly religiously-refracted understanding of the polity has been noted – is justified to reinforce the point about the effects of empire in strengthening the executive to the extent that it takes on the character of a sovereign where it may not, as in Arendt’s assessment, have possessed it before. The powers assumed by Roosevelt in implementing the New Deal, condemned by conservatives as unconstitutional, can be seen as a benign response to a major economic catastrophe, even as necessary measures to avert more serious social disorder.

Looking at later events, however, some commentators have wondered to what extent the apparent closeness of corporate power speaking the language of Protestant evangelism to government lends weight to executive claims to act almost unilaterally, to enable the United States to pursue what it sees as, not merely human, but as God’s work.12 But God’s work is even more imbricated with this new empire than it was with the British. It seems clear that Bush the second’s administration is seen in some circles, at least, as an imperial one. Blumenthal quotes a White House aide’s explanation to Ron Suskind of the New York Times Magazine. Writes Suskind, “the aide said that guys like me were in ‘what we call the reality-based community (believing) that solutions emerge from … judicious study of discernible reality’”. But, the “faith-based school of political thought” meant that “that’s not how the world really works any more. We’re an empire now, and when we act we create our own reality”.13 A less sanguine view of empire was taken by one of Michael Moore’s correspondents in the military, serving in Iraq. He begins by defining the role of the US military as upholding and defending the Constitution of the United States. But:

This war has nothing to do with upholding and defending that constitution. The military is for defending the republic; it is not for overthrowing dictators, building “democracy” or making the world a better place for everyone else. While these may, at times, be noble goals, it cannot be the business of a free and democratic society, or else it will cease to be a republic and become an empire.14

Whilst US government officials were either defending the use of torture, redefining it to exclude conduct they endorsed, or withdrawing it to extra-jurisdictional sites, in Britain one finds among the judiciary statements that almost return us to the balanced constitution, a constitution which – correctly when Arendt wrote it – the United States had drawn from and improved. According to Cooke J, “some common law rights (such as that against torture) presumably lie so deep that even Parliament could not override them”.15 Turpin and Tomkins cite the English Master of the Rolls, Lord Woolf, arguing that judges might either find “an irrebuttable presumption” that Parliament could not have intended (a circumlocution to save yet avoid the Diceyan doctrine of parliamentary sovereignty) to oust judicial review of its legislation, or they might assert that there were limits on parliamentary sovereignty. Legislative insistence upon its supremacy would quickly cause an irresistible political demand for an entrenched bill of rights. Macaulay, reflecting on the Reform Act 1832, had long before predicted that, whilst Parliament had for some moments been the House of (some) people, the time would come when it would have more definitively to become so. His prescience about the role of the judiciary in the process was less acute, but that may reflect the poor and subservient or compliant caliber of most English judges throughout the nineteenth and twentieth centuries.

The doctrine of stare decisis16 established a self-imposed rule that the House of Lords could not reverse its earlier decisions. The sacrosanctity of parliamentary sovereignty and the Dickensian Judicature Acts 1873–75 established so many tiers of juridical appeal that no ordinary mortal would live to traverse them. The case on which Bleak House’s Jarndyce v Jarndyce was based was reputed to have taken more than a century to conclude. In the more enlightened late 1990s, under the influence of the European Union, judges in England seem to have found more connection with citizens. Lords Steyn and Hope of Craighead simply discovered limitations on legislative supremacy, depending on an examination of the merits of its content, citing the consequences of the divided sovereignty implied by the Scotland Act 1998 and the incorporation, the same year, of the European Convention on Human Rights into law in Britain by the Human Rights Act 1998: “the English principle of the absolute sovereignty of Parliament … is being qualified”17 – as, I suggested, it always was. We are, in a sense, nearly en route back to a landscape that would have been familiar in general terms to Burke: the constitution changes gradually according to the customs and expectations of its subjects. A proper understanding of law and a broader understanding of its content requires, in a sense, a return to Adam Smith’s and Blackstone’s concerns about broadening education in law.

Meanwhile, government by legislation that cannot be challenged in terms of its qualities required something like legal positivism to explain and justify it, to render wider, contextual issues of history and ethics peripheral. It also required the kind of subservient philosophy that reassures the Gradgrindian “facts” men and lends them respectability. Fortunately for a more humane view, like the medical profession, which has rejected the proposition that patients are machines with symptoms in need of repair, lawyers have recognized that there is more to their subject than was dreamed of in their (hitherto) philosophy.

The central tenets of the erstwhile dominant, and by no means yet dead, legal positivism have become familiar during the course of this book. They are that it is the pedigree of the norm that determines whether it is a legitimate component of the legal system: did the rule originate with the sovereign, or was it promulgated under the delegation of the sovereign? In the empire – in particular in India – this was, as James Fitzjames Stephen put it, “the gospel of the law”,18 which, significantly, it was the duty of the British to “teach” the natives. Politics, administration, the proper organization of government itself, depended for their coherence on the sovereignty of the Governor-General, later Viceroy, who, whilst nominally subject to control from the Queen-Empress or Parliament in London, was effectively seen as the basis of authority.

In its cultural dimension, legal positivism required and still requires, we have seen, a rigid distinction between the worlds of the moral and the legal. The standard justifications for this are, first, that (by the 1960s in the UK) many areas of predominantly, but not only, sexual morality were seen, in a phrase much used at the time, as not the law’s business. This was the premise of the Wolfenden Committee report19 which recommended that male homosexual acts committed in private should not be illegal, and nor should privately conducted acts of prostitution. The sovereign’s place in the bedroom was properly to be a limited one, in the view of the Committee. In an apparently liberal era following Wolfenden, suicide ceased to be an offense and abortion ceased to be illegal, subject to certain conditions.

However, the detachment of legality from morality – hence the elevation of sovereignty in the understanding of law – was not unchallenged. Patrick Devlin, a Law Lord, published a defense of the position that law was and ought to be a custodian of public morals. His alarm that even the kind of mild liberty of private action supported by Hart and others could be a prelude to the breakdown of ordered rationality and the encouragement of anarchy was prompted by his reading of Stephen’s trenchant beliefs on the subject. Wherever one looks at that time, the empire gazed back.20 Devlin was not, therefore, an opponent of the sovereignty doctrine, but rather of the view that what he took to be the moral sovereignty of his class – its aspirations at least – should not form a part of the law. The man on the Clapham omnibus, the man in the jury-box, the man of some property (since there was a property qualification for jury service when Devlin wrote), could be appealed to as a distinctly male-gendered believer in such of his betters’ morality as Devlin himself believed in. It was this comfortable moral deference to an imagined upper class that the Angry Young Men, the Australian Oz collective and the 1960s radicals responded to.21 If “free love”, one of the subversive elements of sixties radicalism, was the expected exploitation of women, it stimulated feminism. The man on the Clapham omnibus is no longer hermaphrodite or governed in his thinking by homosocial priorities.

Stephen, writing in the 1860s, although a strict legal positivist, a proponent of Westminster sovereignty, “was convinced that the criminal court is a school of morality, the ‘organ of the moral indignation of mankind’”. (This was his inspiration for Devlin.) India was, for Fitzjames Stephen, the lesson for Britain: good codes of law based on European morals and firm but benevolent administration. In Britain was “being erected a tyrannical democracy which will change the whole face of society … destroy(ing) all that I love and respect in our institutions”.22 Liberty, on the other hand, we recall, was for Stephen, a “system which embodies … all the bitterness and resentment that can possibly be stored in the hearts of the most disappointed, envious and ferociously revengeful members of the human race” against those whom they regard as their oppressors.23

If it was the legislative will that private homosexual acts were within the law, although Devlin did not use this example, something like it was raised by a celebrated case a few years after Wolfenden: what about the flaunting of homosexual relations, its advocacy, for example, or men holding hands or kissing in public places?24 Where was the line between law and morality in sexual matters to be drawn? And where, if the line could not be drawn here, was the boundary between efficient administration and the apocalypse to be defended?

Devlin was no doubt encouraged in a minor key to a questioning of the possibility of drawing firm lines between law and morality by a perhaps unforeseen loophole in the Street Offences Act 1959, which restricted the illegality of prostitution, like that of male homosexuality, to its public manifestation – if that could be defined; in the case of prostitution, in the form of public solicitation of clients by sex workers. In the case of Shaw v DPP,25 no women approaching men on the street offering sexual services were involved. However, Shaw had produced a publication detailing the private services that were available from identifiable women, The Ladies’ Directory. Prosecutors, as they sometimes will, resorted to the doctrine that a conspiracy to commit a lawful act can, if needs must, be a criminal offense. We are familiar with this from the famous incident of the Tolpuddle martyrs in the 1830s, where, although trade unions themselves were legal, the necessary steps of forming the union in question – affirming loyalty, etc – were deemed a criminal conspiracy, since they derogated from the legal subject’s primary oath of allegiance to the Crown.

The House of Lords in Shaw, in nostalgic mood, found, in the eighteenth century, authority for the view that the royal courts had a duty to preserve public morality, to prohibit whatever was contra bonos mores et decorum,26 and they convicted Shaw of, among other things, a conspiracy to corrupt public morals. But 1960s Britain was, as we have seen, the age of satire, and the Edwardian flavor of the prosecution in the Lady Chatterley trial had little chance of being persuasive.27

There are some curious epicycles which there is no space to pursue here beyond noticing them. For example, Coke’s quarrel with James I involved the assertion, or, as Coke would have it, the reassertion of the existence of a public sphere: if the king was subject to the law, there must exist a political regime apart from the dynasty and the royal will. In the eighteenth and nineteenth centuries, rioters, parliamentary reformers and trades unionists insisted that there were legitimate collectivities, call them, if you will, civil society, outside this designated public sphere – a sphere asserted by the political classes as legal sovereignty – that the reformers, trades unionists and others, suspected was actually a mask for the advancement of private interests. In an age of civil liberty now, perhaps, endangered, diminished in some parts of the Anglophone world, slowly growing in others, Hart and his colleagues articulated a general feeling that beyond the public and beyond the collective organization of civil society, there was a personal sphere in which activities, however disapproved of by others, should not be subject to scrutiny.

However, there is another element in the narrative of the law/morals separation if one thinks of the distinction made by Hart between pre-legal, or primitive, normative orders and those with more sophistication. Legal systems have three levels of operation. First, there are those who “recognize” rules as legal rather than rules of another kind – somewhat mystically, since, as we have seen, they are private persons prior to any regime that permits a public/private distinction, and officials prior to the possibility of any criteria of officialdom. They are, of course, the Viceroy and the Indian Civil Service, together with the merchants who control the economy using the codes. Second, there are legislatures and councils whose job it is to direct change. Third, there are those who function to enforce the norms legitimated at the other two levels.

The apparatus operates in a way analogous to the indigenous land rights issue in Australia in two ways. The colonizers did not have to pretend that human life did not exist on the continent, since there was indisputable evidence of it. They simply noted its “primitive” nature: no princes or kings; no legislatures or other agencies of deliberate change to the law as it was traced back to the absurd myths of the Dreamtime – in no way comparable, in whites’ view, to the redemption religions of the Mediterranean littoral; no bureaucracies. But after massive land theft came the moral outrage of the Australian High Court decision of 1992, Mabo v Queensland,28 which has all the tragi-comedy of a popular greetings card of the 1980s, depicting a mid-forties woman, her hand to her mouth, exclaiming, “Oh God, I forgot to have a baby”. When there was so much evidence of relationship to land among Aboriginal people, so much respect for each nation’s attachment to country and the need to seek permission to cross another’s country, could the common law courts have “forgotten” about it? Oh God, yes.

The answer lawyers once seemed satisfied with was, in effect, “yes”, but that was then. Yet when it remembered – and the courts did not remember for very long, if we think of the endless refusals to recognize, in the present, indigenous claims to a continuous connection with their land, as required by the colonists’ law – how
