December 2008


The invention and rapid expansion of the internet and World Wide Web over the last several decades stand as a watershed moment in human history, transforming human communication and, ipso facto, leaving their impress on virtually every area of culture in the industrialized world. There can be little question, then, that human knowledge itself (or more precisely, the way we popularly understand human knowledge) will change in profound and irreversible ways, many of which will not become apparent until long after they have occurred. We need only look back upon other such watershed moments in human communication to see that this is true.

One such moment in the West occurred in ancient Greece in the 8th century B.C., when writing was rediscovered. This event was obviously of monumental significance for Greek and Western culture. Among other things, the subsequent spread of writing throughout ancient Greece brought about a shift of prominence from one form of speech to another, from mythos to logos, thus sounding a death knell for the oral culture in which the epic poets and their bards flourished. Gradually, knowledge became less and less a matter of what was collectively preserved and reiterated through memory and oral recitation in a communal space. As a result, the legends, sagas, and myths that depended upon formulaic speech, poetic innovation, and public audiences for their lifeblood no longer stood as living insights into the cosmic order, the nature of the gods, or the origins of society, but became artifacts of the past (preserved in textual form, to be sure). Since the writing of texts facilitated the careful pursuit of inquiry (historia) and the meticulous construction of accounts (logoi), knowledge became more and more a matter of theory, argumentation, and analysis, as well as of the disciplines that developed from and depended on these forms of speech and thought. Thus, the Golden Age of Greece, an “age of reason” that saw tremendous intellectual achievements, arguably could not have taken place without the development of writing.

Another such watershed moment in human communication occurred with the invention of the movable-type press by Gutenberg in the 15th century, an invention that Mark Twain once called the “greatest event in the history of the world.” The printing press enabled highbrow texts and ideas to circulate among the masses, with the result that the language of learning eventually shifted from Latin to the vernacular languages, the places of learning shifted from monasteries and scriptoria to universities, libraries, and presses, and the communities of learning shifted from the feudal aristocracies and clergy to scholars and even to ordinary folk. The Renaissance, Reformation, Scientific Revolution, and later the Enlightenment: these are but a few of the major developments in human culture and knowledge that have been traced to the printing press. Indeed, the scientific method itself, as well as the philosophical schools of empiricism and rationalism, could not have developed without the emergence of new criteria for truth and knowledge, all of which can be plausibly tied to the legacy of the printing press. A new “age of reason,” we might say, was brought about by this second transformation in human communication.

And so we turn to the legacy of the internet, about which it would be foolhardy to make determinate proclamations at this early date (it may take hundreds of years to gain the necessary perspective). We can, however, pose some questions. For example, how will the development and proliferation of electronic communication change the way we conceive of knowledge? There is no question that it has enabled us to transmit, store, and retrieve knowledge more efficiently, but what kinds of knowledge? What kinds of knowledge flourish with the globalization of Google, Wikipedia, electronic databases, search engines, and weblogs? Is it merely “information”? If so, has the popular appreciation for reasoned argumentation and analysis been fundamentally diminished by the explosion of information and the proliferation of opinions in electronic media? Indeed, what forms of learning will be left behind, as printed monographs, books, and newspapers arguably fall by the wayside?

In a related fashion, we might inquire: who will be the new learned? Will those with prodigious memories become even less important now that Google is but a click away? Will those capable of reasoned argument or careful empirical observation become obsolete in the face of those who can “process” information more efficiently? Though it hardly seems conceivable, the rapid proliferation of electronic communities of learning suggests that the silicon tower will one day supplant the ivory tower as the locus of intellectual discourse. If so, we can only wonder about the security of our knowledge once its surety is guaranteed not by human memory, nor by the scroll or printed page, but by computer chips and bytes.

Of course, there is no question that these developments in human communication bring great blessings to humankind. One wonders, however, whether all these drastic changes in the appearance of knowledge (what it is, how we learn it, who it is that knows, and where it gets transmitted) change the epistemic fundamentals: the mind’s relation to the world and the importance of face-to-face human contact in the transmission of knowledge. As for me, though I become increasingly dependent upon electronic media and on-line communities for the development of my own thought, I find that part of me cannot help longing for what has been left behind: for the days of archaic Greece, when story-telling was a meaningful community (and educational) experience, or the days of Medieval Europe, when reasoned argumentation guided by faith was seen by all to be a worthy exercise of learning. And I wonder what good things we are in the process of leaving behind now.

In his Essays on the Intellectual Powers of Man, Thomas Reid (1710-1796) gives this advice: “Let us accustom ourselves to try every opinion by the touchstone of fact and experience. What can fairly be deduced from the facts duly observed or sufficiently attested, is genuine and pure; it is the voice of God, and no fiction of human imagination.” It is hard to recall a stronger endorsement of empiricism than this in the whole history of philosophy: to identify the facts duly observed, and what is inferred from them, with the voice of God. Has Reid gone too far? Or should philosophers follow him?

Reid’s assumption, when he recommends that we test every opinion “by the touchstone of fact and experience,” is that we can in fact acquire reliable experience of the world. He clearly thinks that by starting with due observation and sufficient attention to the testimony of others we can reach philosophical knowledge that is “genuine and pure.” This is something that has been doubted from time to time in the history of philosophy, ever since the days of the ancient Sophists. The doubt involves a suspicion that what we consider to be facts, the purported foundation of philosophical and scientific knowledge, may in fact be relative to the observer and hence not objective or real facts at all. In Hellenistic times, the Pyrrhonian Skeptics held that one should suspend judgment in all cases, because it is impossible to know whether a proposition or its contradictory is true. In essence, they reasoned as follows: people disagree, therefore nobody knows anything. Obviously, though, if the Sophists and Skeptics are right, Reid is wrong.

Another way of disagreeing with Reid would be to say that we actually do have genuine and pure knowledge, but that it does not come from experience. This view is opposite to that of the Sophists and Skeptics in the sense that, rather than trading on doubt, it trades on the desire for absolute certainty. Philosophers who put reason and logic before experience are inclined to hold that whatever reason discerns as being deduced from fundamental principles must be true, because it is deductively certain, like mathematics. Thus, by relying on deductive logic, Berkeley proved that there is no such thing as matter,[1] and the British Idealists held, for instance, that “the Absolute enters into, but is itself incapable of, evolution and progress.”[2] The fact that these conclusions fly in the face of common sense is thought not to count against them, because they are shown to follow strictly, not from facts, but from indubitable principles. The rationalist program requires precisely that we subordinate our ordinary understanding of things to the results of logical inquiry. It is as though we were to say: these are my theories; so much the worse for the facts! Again, if Berkeley, the British Idealists, and various other rationalists are right, then Reid is wrong.

What should we think? It is perhaps helpful to remember that philosophy in the West began with the Greek confidence that the world we live in is intelligible. The kosmos (an ordered whole) and physis (nature) are accessible to nous (intelligence) and to logos (reasoned speech) physei, that is, by nature. Thus philosophia, the love of wisdom, is not an unrequited love or a doomed enterprise in our tradition. On the contrary, it is Plato’s “upward path,”[3] the path we choose when we reject abject skepticism, extreme rationalism, and other paths that deny the natural capacity of the embodied human mind to understand. As human beings, rational animals, our mode of understanding is empirical; that is to say, we learn by the use of sense perception and emotion together with intelligence. Admittedly, this involves our subjectivity, and perhaps even prejudice. However, in Gadamer’s striking phrase, “there are legitimate prejudices,”[4] namely those authoritative pre-judgments that actually aid understanding. I think this is close to what Reid meant when he recommended that we rely on the touchstone of fact and experience. He regards it as a justifiably confident reliance, like the loving trust a child places in its parents and teachers. Not that experience (and parents and teachers) can never be wrong, but rather that, when we fairly consider “the facts duly observed or sufficiently attested” as well as what follows from them, we become able to judge of those experiences (and parents and teachers) and to discern when their deliverances are right and when they need to be taken with a grain of salt or even rejected altogether. This is the philosophical attitude par excellence, and Reid is not out on a limb with it at all. Rather, if we want to be philosophers too, we should follow him and heed the voice of Truth as it speaks to us in well-considered experience. For, “through the infinity of the universe, the mind which contemplates it achieves some share in infinity.”[5] To depart from this faith is to embark on a different path altogether.


[1] See Three Dialogues Between Hylas and Philonous. Berkeley is normally classed among empiricists, but his reliance on logic at the expense of common sense surely places him among extreme rationalists.

[2] F.H. Bradley, as quoted in A.J. Ayer’s Language, Truth and Logic, Chapter 1, “The Elimination of Metaphysics.”

[3] Republic, Book X, last few lines.

[4] Hans-Georg Gadamer, Truth and Method, Part II, Section II, Chapter 1 (B), “Prejudices as Conditions of Understanding.”

[5] Bertrand Russell, The Problems of Philosophy, Chapter XV, “The Value of Philosophy.”