Law, Science, and the Loss of Shared Reality
A shared reality requires a shared sense-making framework.
If I asked you what law and science have in common, what would you say? Our legal systems no doubt bear little resemblance to our scientific institutions, and a courtroom and a laboratory have as little in common as a barrister’s gown and a lab coat. I have what appears to be the relatively rare experience of having worn both (in their respective appropriate settings, of course). The question I have posed to you here reflects one that I have been asked several times in conversations about two seemingly very different careers: “So, you were in law but you changed into science? It must be so different?” When I respond that, yes, at face value the two fields appear entirely different, but in practice they are quite similar, this tends to take people by surprise. To be clear, any similarities were not immediately obvious to me when I started down the science path. As I became more familiar with the basis of the scientific method, my antecedent understanding of the basis for our legal systems (and by “our”, I mean common law systems) prompted the realisation that both disciplines are grounded in a shared sense-making framework. That sense-making framework, broadly, is derived from a unifying epistemology.
So, what do the law and science have in common? Consider: the departure point in science is a question to be tested; in law, it is a question to be tried. Science proceeds from a presumption that the null hypothesis is true, i.e., that there is no difference between two groups or there is no effect of exposure X on outcome Y; law proceeds from a presumption of innocence until guilt is proven in a criminal trial, or from a presumption that a claimant must demonstrate the balance of probability in their favour in civil proceedings. Thus, both science and law set the burden of proof against the claim or charge itself: it is for whoever makes the claim or brings the charge to produce sufficient evidence for a determination affirming or rejecting it. To make such a determination of the question being tried or tested, both science and law establish thresholds of evidence considered sufficient to conclude that the question has been affirmed, known as the “standard of proof”. In both fields, the standards of proof are arbitrary, whether “the balance of probabilities”, a p-value of 0.05, “beyond a reasonable doubt”, or a Cohen’s d effect size of 0.8¹. Both science and law operate with a spectrum of standards of proof that reflects the relative costs of a false-positive finding or an erroneous verdict, i.e., as the seriousness of the question being addressed increases, so does the standard of proof required to be met. While arbitrary, standards of proof are not redundant, because they reflect the most defining similarity: that both science and law are fundamentally grounded in the concept of “evidence”, i.e., facts or data that support an inference in favour of, or against, a hypothesis or legal question being tested or tried, respectively.
Perhaps you can see how, although they are practised in very different ways, there is a fundamental similarity in the nature of inquiry in both disciplines. This similarity is not a coincidence; it reflects a shared epistemic basis, grounded in empiricist and positivist epistemology, of what constitutes knowledge (or evidence) and how we can come to verify that knowledge. Each discipline poses its own set of applicative and interpretive problems that are unique to the field. For example, a longstanding debate within legal epistemology is the role of positivist legal theory in distinguishing between the law as it is and the law as it ought to be.² This distinction traces right back to Hume’s 1739 A Treatise of Human Nature³, and his empiricist contention that we cannot derive an ought from an is. The very basis of what we term “the scientific method” is an articulation of a similar principle: that empirically observable phenomena should be testable to determine whether they correspond to a verifiable reality that is, rather than ought to be. This epistemic evolution provided a revolutionary step in our understanding of knowledge and its methods of acquisition, away from structures of knowledge grounded in the assumed omnipotence of the Church and divine intervention. The heliocentric theory, for example, that the Earth and planets orbit the Sun, first proposed by Copernicus and subsequently bolstered by Galileo’s inquiries, was an attempt to verify an is that crashed against the Church’s theological assumptions of what ought to be.
In his Treatise, Hume stated:
“’Tis impossible to reason justly, without understanding perfectly the idea concerning which we reason; and ’tis impossible perfectly to understand any idea, without tracing it up to its origin, and examining that primary impression, from which it arises...the examination of the idea bestows a like clearness on all our reasoning.”
Symbolic of his empiricist reasoning, what Hume was articulating was a conceptual shift away from Christian theological assumptions of causation, in which all causes flowed directly from an omnipotent God and randomness and chance were therefore removed, towards an epistemology grounded in establishing cause through inquiry and testing, which necessitated an embrace of probability. Randomness, chance, and fate were, to varying degrees, the guiding organisational principles of pre-Christian and Classical societies. The emergence of the Christian Church thus itself provided a radical departure from these organisational conceptions of the world by placing God at the centre, which meant all actions, positive or negative, were to be causally understood as derived from God. This was reflected in Medieval attitudes to standards of proof in the form of trial by ordeal or trial by combat⁴. In the former, a popular form was to throw the accused into cold water; if they sank they were proclaimed innocent, while floating was a testament to guilt. In the latter, victory in the duel was a testament to innocence and defeat to guilt. In both cases, the result was considered to be an expression of God’s judgement. Trial by compurgation, where the accused would have others swear oaths to their character, was grounded in the concept that “God is my witness”; God was the ultimate judge of the truth of the assertion, and fear of God’s wrath bound the party to their sworn oath.
The critical characteristic of the Medieval landscape of both law and science was that facts were bound up with religion, meaning that legal and educational institutions and concepts were inherently tied to religious institutions, which influenced what would be considered a fact in the circumstances. Justice from divine intervention differs by orders of magnitude from justice determined by facts and evidence, just as the world as divinely ordered differs by orders of magnitude from the world determined by empirical observations confirmed through evidence from experiments. The 12th Century saw the beginnings of a movement away from the Medieval legal concepts of proof as Christian theology evolved its doctrines of individual agency and moral conduct, itself a quietly revolutionary epistemic development that, ironically, laid the foundation for the eventual separation of Church and State; the State would act as guarantor of individual rights while the Church would act as guarantor of the soul. And contrary to the popular portrayal of the Medieval period as “the Dark Ages” from a scientific perspective, this period was in reality defined by an embrace of scientia, in the broader sense of the Latin word: knowledge and learning. Ecclesiastical orders embraced investigation of the natural world, albeit as they understood it at the time, to reveal the divine order and thus enhance spiritual wisdom.⁵
The seeds were thus sown for the gradual emergence of an empiricist and positivist approach to knowledge and knowledge acquisition, although the fruits of the tree would not be particularly visible, at least scientifically, until the Enlightenment. Scientists do, however, tend to be dismayed to learn that concepts of evidence and proof did not originate with the Enlightenment, but that the roots of evidence, proof, and standards of proof lie deep in legal epistemology dating back to the early Medieval Church. The importance of establishing facts through evidence may be gleaned from the legal maxim, per factum cognoscitur ius: “by means of a fact, we recognise (know) the law”. A slight alteration of the wording could, to the scientist, resemble the scientific process: “by means of an experiment, we know the fact”. Thus, properly understood, the Enlightenment was itself an outcome, rather than an origin event, where the seeds sown over preceding centuries had grown sufficiently to challenge the intellectual hegemony of theological doctrine. Fundamental to this challenge was a shift in the axiom of intellectual inquiry, away from the intellectual arrogance of assumed knowledge in the Church and the intellectual servility of the congregation in the face of such presumed omnipotence, towards the intellectual humility engendered by an epistemic framework grounded in evidence and the determination of fact.
Such an approach has not been confined to the realm of law or science. The most rigorous historians operate within the same framework, understanding that the accurate portrayal of the past requires a commitment to evidence, verifiable facts, and epistemic humility in the face of what we can, and cannot, know about the past. Of course, the basis of what there is to be known and how we can come to know it is not solely confined to this epistemic framework. Other epistemic positions exist on the continuum; idealist/subjectivist (i.e., reality is a construct) and rationalist (i.e., innate knowledge and a priori reason as a source of knowledge) epistemologies have attempted to provide alternative approaches. They are, however, rendered incapable by their very own operational definitions of producing the same level of rigour in whatever discipline they are applied to. Subjectivism may be useful to carve out a tenured career in some pseudo-academic “something studies” field, but when it subsumes other areas of the humanities, such as English literature or history, it catabolises these fields into moral relativism and cynicism, where irrelevant identitarian criteria supplant the pursuit of truth and fact⁶. When subjectivist epistemology subsumes science, as it now has, it fosters the postmodernisation of science, where scientific findings are disconnected from methodology and reduced to a language game, and where the merits of scientific findings are subjugated to the social cause to which they relate.⁷ And that is to say nothing of the use of rationalist justifications for tyrannical social causes, with the “cult of reason” of the French Revolution as one notable historical example.
The crucial point here is this: when we view our societies and our contemporary socio-cultural ennui through the prism of our political system of democracy, or the cultural prism of “the Western canon”, or the scientific method, or the rule of law, we are missing their essential roots. They are the products of a particular intellectual framework that allowed such ideas and systems to be conceived in the first place, and to evolve over time. But they are not in themselves the genesis. Understanding this is critical for making sense of the crisis of confidence in our social order and institutions that we are currently living through. When viewed from the perspective of the increasingly obsolete classical Left-Right political divide, it is characterised as a “culture war”. But it is neither a “war” nor is it about culture; at its core, it constitutes a breakdown of epistemology, of a shared framework within which to seek sense and truth. Issues to debate and negotiate in our evolving project of societies built around respect for individual freedoms, and the competing tensions that such a project entails, have always existed. The salient difference was that previously the broad contours of the issue and its attendant facts would be known; the debate would then play out over how best to interpret and act upon those facts. Now, on any given topic, cultural, political, legal, or otherwise, two (or more) entirely different and often irreconcilable sets of “facts” exist, and there are no definable contours for the issue at hand, only a blurred jumble of incoherence and ideology. People are just shouting past each other about entirely different constructions of reality.
In this respect, our crumbling order runs deeper than the pejorative “culture war” view, because the latter is a byproduct of the fundamental issue, which is that the foundations of knowledge that underpin our social order have been uprooted. The “culture war” view, when it attempts to assign blame according to the traditional Left-Right political divide, misses how totalising this epistemic breakdown is. The embrace of inane relativism and subjectivism may be more visible on the Left/liberal side of the spectrum, particularly because academia is dominated by those on the Left/liberal side. But the contemporary Right has also fractured, between the traditional small-‘c’ conservatives who think we can just reassert the good ol’ days of fact-based issue resolution, the reactionary relativists with their noxious conspiratorial post-truth nativism, and the “post-liberal” Right lost in fantasising over “the Sacred” and the Bronze Age. The “culture war” view also misses that there is a unifying characteristic to both sides of the barricades, which is the rationalist tendency both exhibit: the desire to enforce their respective preconceived ideals of how society should be, to elevate the cause over the rights of the individual, and to leverage institutions to remake society in the image of their idealistic social order. Robespierre must be grinning in his grave.
This is why what the “culture war” really represents is another shift in the guiding epistemology of society, one in which a disarrayed mesh of rationalist and subjectivist worldviews has displaced the preceding framework grounded in the pursuit of truth and verifiable, falsifiable facts. The preceding epistemic order is not returning, either, because the environment in which it emerged no longer exists. The contemporary environment is defined and shaped by technology, with the consequences that tech entails for our very conceptions of reality itself, the fragmentation of communications, and the inability of our institutions, relics of the pre-tech era, to keep up with the pace of change.
The respective evolutions of the law and science serve, at least to me, as an example of the crucial importance of a shared epistemic framework as the fundamental basis for societies to develop, however imperfectly, in a forward-moving iterative process grounded in seeking to verify reality through establishing facts and evidence. Both disciplines serve society in entirely different ways, through different means, yet may be characterised by a common method of inquiry. But we are no longer in the business of verifying reality; society has moved on to the business of creating reality, with multiplicities on offer. The political manifestation of this is simply the attempt to create reality and remake society in the image of the creators. And in the absence of any shared reality, there is no longer any need for a shared sense-making apparatus, certainly not one grounded in empiricist and positivist knowledge frameworks. That, too, is a relic of a bygone age.
It was a good run, but it had to close out sometime.
1. The p-value is compared against a threshold to decide whether a hypothesis test rejects the null hypothesis, with a threshold of 0.05 meaning we accept a 5% probability of rejecting the null hypothesis when there is in fact no difference between groups/variables. Cohen’s d is a measure of effect size, with the arbitrary industry standard of 0.2, 0.5, and 0.8 considered small, medium, and large effect sizes, respectively. The smaller the p-value, the stronger the evidence against the null hypothesis; for example, both p < 0.05 and p < 0.0001 are ‘statistically significant’, but the latter is much stronger evidence against the null hypothesis, and thus in favour of the alternative hypothesis. With hypothesis testing, you will hear language like “failed to reject the null hypothesis”, which sounds odd, but is statistically and scientifically the correct way to phrase a p-value of >0.05. This is because failing to reject the null hypothesis does not mean that it is true that there is no difference between groups, only that the test did not show it to be false. With the different legal standards of proof for a criminal trial or civil proceedings, a similar principle applies, e.g., a verdict of “not guilty” does not necessarily mean that the defendant did not commit the crime, only that evidence of sufficient probative value was not produced to satisfy the burden of proof. The important point in both contexts is that the concept of “proof”, i.e., of sufficiency of evidence, is not absolute, but exists on a continuum where different levels of evidence may satisfy different questions and purposes.
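For readers who want to see these two quantities in practice, here is a minimal illustrative sketch in Python (using NumPy and SciPy). The two groups, their means, and their sample sizes are entirely hypothetical, invented purely to show how a p-value and a Cohen’s d are computed and compared against the conventional thresholds described above:

```python
import numpy as np
from scipy import stats

# Hypothetical outcome measurements for two groups (illustrative data only)
rng = np.random.default_rng(42)
group_a = rng.normal(loc=100, scale=15, size=50)  # e.g., an unexposed group
group_b = rng.normal(loc=110, scale=15, size=50)  # e.g., an exposed group

# Two-sample t-test: the p-value quantifies evidence against the null
# hypothesis (the null being "no difference between the group means")
t_stat, p_value = stats.ttest_ind(group_a, group_b)

# Cohen's d: the difference in means scaled by the pooled standard deviation
n_a, n_b = len(group_a), len(group_b)
pooled_sd = np.sqrt(((n_a - 1) * group_a.var(ddof=1) +
                     (n_b - 1) * group_b.var(ddof=1)) / (n_a + n_b - 2))
cohens_d = (group_b.mean() - group_a.mean()) / pooled_sd

print(f"p-value: {p_value:.4f}")     # compared against the 0.05 threshold
print(f"Cohen's d: {cohens_d:.2f}")  # ~0.2 small, ~0.5 medium, ~0.8 large
if p_value < 0.05:
    print("Reject the null hypothesis at the 0.05 level")
else:
    print("Fail to reject the null hypothesis")
```

With the hypothetical numbers above, the test would typically reject the null hypothesis; change the group means or sample sizes and both the p-value and the effect size shift, which is precisely why these thresholds function as (arbitrary) standards of proof rather than absolute verdicts on reality.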
2. Hart HLA. Positivism and the Separation of Law and Morals. Harvard Law Review. 1958;71(4):593-629.
3. Hume D. A Treatise of Human Nature. 2nd ed. Selby-Bigge LA, editor. Vols. I–III. Oxford: Clarendon Press; 1739. 1–736.
4. Ho HL. The Legitimacy of Medieval Proof. Journal of Law and Religion. 2003;19(2):259.
5. Falk S. The Light Ages: A Medieval Journey of Discovery. London: Random House; 2021.
6. In the humanities, this is often evident in the absurd concepts of “epistemic justice” and “other ways of knowing”, which conflate the existence of other ways of understanding the world with the veracity of those beliefs. Like many Progressive shibboleths, these concepts are ultimately patronising to, and infantilising of, the peoples and beliefs to which they are applied.
7. This issue would warrant an essay in its own right, but one example may suffice for now: the question of racial bias in police shootings. An analysis published in the prestigious Proceedings of the National Academy of Sciences showed that Black Americans were statistically no more likely to be shot by White officers, i.e., police shootings were influenced by factors beyond race. The authors retracted the paper because of the political implications of the findings in the aftermath of the George Floyd incident, when the prevailing social narrative that America was gripped by an “epidemic” of racially-motivated police shootings had taken hold. Thus, the findings were subjugated to the social cause, and to the prevailing narrative related to that cause.