
<div> <p> </p> <div> <div><div><i><em>This essay was first published in Volume 10 of Sacred Web, Winter 2002, pp. 49-71.</em></i> Since its original publication, more scientific evidence has emerged to support Dr. Wolfgang Smith’s traditional theory of vertical causality. For a more recent essay offering a scientific and metaphysical critique of the theoretical aspects of Charles Darwin’s dogma and his transformist theories of evolution, see M. Ali Lakhani’s essay, <i><em>Darwinism: A Critique</em></i>, from <i><em>Sacred Web</em></i>, Volume 46 (Winter 2020) at this link: <a href="https://www.sacredweb.com/science/darwinism-a-critique/">https://www.sacredweb.com/science/darwinism-a-critique/</a></div></div> <p><i>"Now, whatever lacks intelligence cannot move towards an end, unless it be directed by some being endowed with knowledge and intelligence, as the arrow is shot to its mark by the archer." — Saint Thomas Aquinas</i></p><p><i> (Summa Theologiae 1.2.3)</i></p><p></p> <h3>Necessity, Chance, Design</h3><p>From time immemorial, mankind has understood events or objects as the result of necessity, or of chance, or of design. These three basic categories of explanation appear to be native to the human mind, and as such they constitute what may be termed pre-philosophical notions. To be sure, there has been an ongoing effort on the part of philosophers to clarify these conceptions, and integrate them into a coherent account of causality. The simplest approach, perhaps, is to deny both chance and design, as the Greek atomists have done, and thus to suppose that all things occur by force of necessity, as Leucippus declared. Other schools, while still denying chance, have acknowledged design as a principle of causality not reducible to necessity; such was the case, for example, in the Stoic philosophy, which stipulated a kind of providential action or <em>pronoia</em> emanating from the World-Reason known as the Logos. Yet other schools of thought acknowledge both chance and necessity as irreducible principles, but deny design; and it is of interest to note that today many leading scientists espouse that position.</p><p>Among the three basic categories of causation, the most puzzling, perhaps, is the notion of chance. What confuses the issue, first of all, is the mistaken but commonly held belief that chance and necessity are irreconcilable: that it is a question simply of “either/or.” But clearly, to say that the toss of a coin yields heads or tails “by chance” is not to claim that the outcome has no cause, or is not in fact determined by its cause. Whether or not the toss of a coin is in some ultimate sense deterministic is a separate issue; what counts is that the event is, in any case, random or contingent in a suitably relative sense. Thus one finds that even in the heyday of classical physics, when the operations of Nature were deemed to be fully deterministic, statistical methods based upon the idea of chance could be successfully applied in various domains; for instance, in the kinetic theory of gases. The random distribution of gas molecules, and of their velocities, within a statistical ensemble, thus, does not contradict the supposition that the trajectory of each molecule is fully determined by a causal law. And I would point out in passing that this accords with Aristotle's idea that chance has to do with the coincidence of causally determined sequences of events, a scenario which occurs when two classical particles collide.
Outside the domain of physical science, moreover, the notion of chance has always played an essential role. Courts of law, for example, distinguish regularly between accidental and non-accidental happenings, and insurance companies treat catastrophic events as random variables with some given probability distribution. It is definitely meaningful, therefore, to speak of contingency and likelihood, regardless of whether Nature proves ultimately to be deterministic, or whether the outcome of every process is known beforehand to God. After all, we judge of things on the basis of <em>our</em> knowledge, and from <em>our</em> point of view; and as that knowledge, or that point of view, changes, so do our judgments relating to causality.</p><p>As concerns traditional doctrine, I contend that no orthodox school has been averse to the notion of chance. I have in fact argued on traditional ground that there can be no such thing as a fully deterministic universe, and that contingency constitutes indeed a necessary complement of determination: that it represents in fact the <em>yin</em>-side of the coin.[[1]] It is needful that there be contingency as well as law. The concept of a clockwork universe, it turns out, is fatally flawed; and it is perhaps surprising that this discovery was made in recent times, not by poets or mystics, but by mathematical physicists, no less: by individuals who, of all people, are dedicated to the ideals of rigour and exactitude, and are thus predisposed towards the side of law. If such as these have reached the conclusion that contingency is necessary, after all, to the economy of Nature, this finding carries weight.</p><p>In light of these considerations one may combine necessity and chance into a single category under the title of natural causation, which then stands in contrast to design. In contrast, but not in opposition; for it is indeed an essential characteristic of traditional cosmology to admit both modes of causation: the divine, if you will, as well as the natural. It would thus be as contrary to the wisdom of tradition to maintain, on the one hand, that events and objects in the natural world are caused exclusively by divine action — as certain religious extremists have claimed — as it would be to maintain that the universe is governed simply by natural causes, as the philosophy of naturalism insists. Traditional cosmology, it can be said, acknowledges two seemingly opposed principles: the primacy of divine action, namely, and the efficacy of natural causes. I should remark, perhaps, that while I would not dispute Etienne Gilson's claim to the effect that the harmonization of these two principles has found its consummation in the Thomistic philosophy, I cannot accept his contention that Platonism, in particular, denies altogether the efficacy of natural causes, and thus affirms a radical extrinsicism.[[2]] To be sure, the philosophies of Plato and Aquinas represent different points of view, and it may be true that the efficacy of natural causes is affirmed more prominently in the teachings of the Angelic Doctor; but even so, I find Gilson's charge of “radical extrinsicism” to be misplaced. The point, perhaps, is that one can never understand a traditional philosophy which one does not approach with reverence.
Suffice it to reiterate: <em>all</em> traditional cosmology, I contend, respects in the final count both the primacy of divine action and the efficacy of natural causes.</p><p>It has been likewise recognized, however, that manifest design cannot be attributed to natural causation. This is what St. Thomas Aquinas contends in his fifth proof for the existence of God; and let us note that the argument is validated, not by some abstract logic, but indeed on metaphysical ground, and thus on the basis of an intellectual perception. With the advent of modern times, however, the perennial “argument from design” has come under attack. First came deism, the “absentee landlord” philosophy which in effect exiled God from the universe; and this has led, by stages and degrees, to the full-fledged naturalism which came into vogue during the nineteenth century. As a dictionary of philosophy puts it: “Naturalism holds that the universe requires no supernatural cause and government, but is self-existent, self-operating, and self-directing; that the world-process is not teleological and anthropocentric, but purposeless and deterministic, except for possible tychistic events.” Among the philosophical and theological movements which opposed this position, it was the school of British natural theology that centered its counterattack on the argument from design. “During the 17th and 18th centuries,” we are told by Vergilius Ferm, “there were attempts to set up a 'natural religion' to which men might easily give their assent and to offset the extravagant claims of the supernaturalists and their harsh charges against their doubters. The classical attempt to make out a case for the sweet reasonableness of divine purpose at work in the world was given by Paley in his Natural Theology, published in 1802.” Despite the fact, however, that this “natural religion” may have held its attraction for many an English gentleman, one finds that it fell woefully short of a tenable doctrine, due to the fact that it had in part assimilated the ambient naturalism which it wished to combat. In a word, British natural theology was a compromise solution, an eclectic doctrine that was bound to fall. And no one, it seems, knew better how to expose and capitalize on its weakness than Darwin himself: “I cannot persuade myself,” he wrote, “that a beneficent and omnipotent God would have designedly created the<em> Ichneumonidae</em> [parasitic wasp] with the express intention of their feeding within the living bodies of Caterpillars.”[[3]] True enough: there can be no answer to Darwin's objection, nor to the allied argument from dysfunction, without committing in some way to the doctrine of Original Sin and the resultant Fall — something which British natural theology, in its “sweet reasonableness,” neglected to do. And so it came about, in the wake of Darwin's theory, that this “natural theology,” which had enjoyed the approbation of an intellectual elite, succumbed eventually to the assault of naturalism.</p><blockquote>CSI cannot be generated by any natural process, be it deterministic, random, or some combination of the two, as in so-called evolutionary algorithms. Thus was born a science termed design theory. 
… In the final count, it is hard to argue with a mathematical theorem.</blockquote><p>We must not however lose sight of the fact that there is substance and indeed validity in Paley's conception of a “watchmaker God.” If we walk through a field, as Paley invites us to do, and discover a watch lying on the ground, we may indeed conclude that this object is not the result of natural causes. It was not a natural law, nor a blind concatenation of accidental happenings, that artfully fashioned and assembled the parts of the watch to the end of keeping time. We are all perhaps familiar with a book entitled <em>The Blind Watchmaker</em>, in which the Oxford zoologist Richard Dawkins proposes to refute Paley's claim; meanwhile, however, it turns out that the ancient argument has been recently revived on a scientific plane. The movement was sparked by Michael Behe, a molecular biologist, who introduced the concept of irreducible complexity, and argued convincingly that no natural process can give rise to structures which are in fact irreducibly complex. In <em>Darwin's Black Box</em>, published in 1996, Behe took his case before the general public and lucidly explained his position. His treatise, filled with captivating accounts of discoveries from the world of molecular biology, has become widely known, and has engendered serious debate in scientific circles. Behe's book, however, was only the beginning: the opening salvo, one might say, of a scientific counterattack, this time, against the prevailing naturalism. The decisive breakthrough was achieved two years later by a mathematician and philosopher named William Dembski, in a treatise entitled <em>The Design Inference</em>. Dembski had asked himself the question whether design can perhaps be recognized by means of some signature, some criterion which can be defined in strictly mathematical terms. The resultant theory, it turns out, not only generalizes Behe's concept of irreducible complexity, but puts the question of a “design inference” on a mathematical — and hence rigorous — basis. What Dembski discovered is that a signature or criterion of design can indeed be given in terms of a probabilistic notion of specified complexity, or equivalently, in terms of an information-theoretic concept of complex specified information or CSI. The decisive result is a conservation theorem for CSI, which affirms in effect that CSI cannot be generated by any natural process, be it deterministic, random, or some combination of the two, as in so-called evolutionary algorithms. Thus was born a science termed design theory, also known as the theory of intelligent design or ID. The movement has of course drawn criticism from various segments of the scientific establishment, above all from the Darwinist contingent; but, in the final count, it is hard to argue with a mathematical theorem.</p><p><em>In the following section, I propose to present what will hopefully be a readable introduction to the theory of intelligent design. I should perhaps point out that I presuppose little in the way of mathematical background. Nonetheless, the “non-mathematical” reader may, if he or she so wishes, omit parts of this somewhat mathematical interlude, which is to say that the sequel is comprehensible on its own.</em></p><figure><img src="https://storage.ghost.io/c/26/b5/26b5a426-214e-4897-b490-fa1deaa9b7ae/content/images/2026/02/9A669F6A-4216-44E9-8018-DDA3DE56A0C8-COLLAGE.jpeg" alt="" width="2000"><figcaption><span>William A. Dembski (left) and Michael J.
Behe</span></figcaption></figure><h3>The Mathematical Theory of Intelligent Design</h3><p>Let us begin with the concept of irreducible complexity: “By irreducibly complex,” writes Behe, “I mean a single system composed of several well-matched parts that contribute to the basic function, wherein the removal of any one of the parts causes the system to effectively cease functioning.”[[4]] The definition is evidently framed “with malice aforethought” to guarantee that no Darwinist process can ever give rise to an irreducibly complex structure; for as Darwin himself observed: “If it could be demonstrated that any complex organ existed, which could not possibly have been formed by numerous, successive, slight modifications, my theory would absolutely break down.” To which however he added: “But I can find out no such case.”[[5]] The logic of Behe's argument appears to be impeccable: if a structure requires a number of “well-matched parts” before it can be functional, this precludes the possibility that “numerous, successive, slight modifications” could have been successively selected on the basis of function. The viability of Darwin's theory, therefore, does indeed hinge on the question whether one can “find such a case.” Now, one of the most impressive and frequently cited examples of irreducible complexity is the so-called bacterial flagellum: a molecular device whose function it is to propel the bacterium through its watery environment upwards along a nutritional gradient. The device consists of an acid-powered rotary engine (replete with a rotor, a stator, O-rings, bushings, and a drive shaft) plus the actual flagellum, a kind of molecular rotary paddle. On account of the disorienting effect of Brownian motion, the flagellum must rotate at angular velocities on the order of 10,000 rpm, and must be able to reverse direction within one hundredth of a second. Moreover, to be functional, the device obviously requires auxiliary structures for detection and control, as well as for the production, storage, and distribution of the requisite fuel. What confronts us here, quite clearly, is a feat of nanotechnology that staggers the imagination; and needless to say, no one has yet proposed so much as the vaguest outline of a Darwinist scenario that might account for the production of these structures.</p><p>Yet, even so, Behe's argument remains incomplete. The conclusion that no Darwinist process could have produced the bacterial flagellum does appear to be inescapable; and yet the argument falls shy of a rigorous proof. Now, the standard strategy, in the physical sciences, for proving that a closed system, operating under the action of natural causes, cannot attain a certain state, involves the use of an invariant satisfying a conservation law. It matters not whether the invariant is an energy, for example, which must remain constant, or a quantity, such as entropy, which cannot decrease (or increase); in either case, the conservation law rules out states that would violate that law. And let us note that this argument is perfectly rigorous, and does not require that we check out all possible scenarios which might conceivably bring the system into the disputed state. Getting back to Behe's contention, it is by means of this strategy, using CSI as his invariant, that William Dembski is able to refute the Darwinist claim in the case of structures such as the bacterial flagellum.</p><p>The basic idea of Dembski's theory is simple enough. Let us suppose that an archer is shooting arrows at a wall.
To conclude that a given shot cannot be attributed to chance — in other words, to effect a design inference — one evidently needs to prescribe a target or bulls-eye which sufficiently reduces the likelihood of an accidental hit. What is essential is that the target can be specified without reference to the actual shot; it would not do, for example, to shoot the arrow first, and then paint a bulls-eye centered upon the point where the arrow hit. What stands at issue, however, has nothing to do with a temporal sequence of events: it does not in fact matter whether the target is given before or after the arrow is shot! What counts, as I have said, is that the target can be specified without reference to the shot in question. In Dembski's terminology, the target must be “detachable” in an appropriate sense. Consider a scenario in which the keys of a typewriter are struck in succession. If the resultant sequence of characters spells out, let us say, a series of grammatical and coherent English sentences, we conclude that this event cannot be ascribed to chance. An exceedingly unlikely and indeed “detachable” target has been struck, which however was specified after the event. In general, the specification of a target requires both knowledge and intelligence; one might mention the example of cryptanalysis, in which specification is achieved through the discovery of a code. What at first appeared to be a random sequence of characters proves thus to be the result of intelligent agency. The fact is that it takes intelligence to detect intelligent design.</p><p>I would like to emphasize that it is impossible to rule out the hypothesis of chance simply on the basis of low probability. If a sequence of 1’s and 0’s is generated by tossing a fair coin 1000 times, the possibility that the resultant bit string will contain not a single 0, let us say, can indeed be ruled out. Yet, if one does actually toss a coin 1000 times, one produces a bit string having exactly the same probability as the first: 1 in 2<sup>1000</sup>, to be precise. Why, then, can the first sequence (the one containing no 0’s) be ruled out, while the second can indeed occur? The reason is that the first conforms to a pattern or rule which can be defined independently; it is a question, once again, of a “detachable” target which itself has low probability. In the case of the first sequence, the prescription ‘no 0’s’ itself defines such a target: the subset, namely, containing the given bit string and no other. But this is precisely what cannot be done in the case of the second bit string (the one produced by tossing a coin 1000 times): it is virtually certain in that case that no detachable target of low probability has been hit. It is possible, of course, to come up with a description by reading off the sequence itself; but that description or pattern (if such it may be called) will turn out not to be detachable. To read a description off the event is like painting a bulls-eye around the point where an arrow has struck: such a description, of course, proves nothing. The discovery of a <em>detachable</em> pattern of sufficiently low probability, on the other hand, proves a great deal: it proves in fact that the event in question cannot be attributed to chance.
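<p>The point can be put concretely in a few lines of code. The following sketch (a simple illustration, no part of Dembski's own apparatus) contrasts the two bit strings just discussed: both are equally improbable, and only the first hits a target that can be specified without looking at the outcome.</p>
<pre><code>import random

n = 1000  # number of coin tosses

# The string containing no 0's: a target specified by the rule
# "all heads," which makes no reference to any particular outcome.
all_ones = "1" * n

# A string actually produced by tossing a fair coin n times.
tossed = "".join(random.choice("01") for _ in range(n))

# Both elementary outcomes have probability 2**-n, i.e. an information
# content of exactly n bits; low probability alone distinguishes nothing.
print("information content of either string:", n, "bits")

# What differs is the target. "No 0's" is a detachable pattern of
# probability 2**-n (it contains a single string); a pattern read off
# the tossed string after the fact is merely a painted bulls-eye.
print("does the tossed string hit the 'no 0's' target?", tossed == all_ones)
</code></pre>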
What rules out chance, thus, is not low probability alone, but low probability in conjunction with a detachable target: this winning combination is what Dembski terms the complex specification criterion.</p><p>The general mathematical structure within which design theory operates is as follows: One is given a reference class of possibilities Ω, together with a probability measure P which assigns to each (measurable) subset of Ω a real number between 0 and 1. Given an elementary event E (represented by a point in Ω), a specification of E is a subset T of Ω containing E, which is “detachable” in a sense to be defined. The pair (T, E) is then said to constitute a specified event. It is to be noted that a specified event has two components: a conceptual component T, one can say, and a physical component E. It constitutes a twofold entity, thus, a thing that combines, so to speak, two worlds. And therein, let me add, lies the power and indeed the genius of Dembski's theory: where others have dealt with events, Dembski deals with specified events, a categorically different kind of thing. Let us suppose, now, that one is able to associate an invariant J with each specified event, which cannot, say, increase under the operation of natural causes; as we have previously noted, such a conservation law could validate a theory of design. To obtain a suitable invariant, Dembski replaces the probability measure P by a corresponding information measure I, defined by the equation</p><p>I = −<em>log₂</em> P</p><p>According to this formula, the information contained in a bit string of length n is just n. In general, for an event A in Ω, I(A) represents by definition the information content of A as measured in bits. It is to be noted that P and I are inverse measures: the smaller P, the larger I will be; indeed, as P tends to zero, I goes to infinity. In mathematical parlance, I is thus a measure of complexity. Dembski goes on to define the information content of a specified event (T, E) to be I(T): what counts is the conceptual component of the specified event. Finally, the information contained in a specified event is what Dembski terms specified information, and this is what he takes initially as his invariant.</p><p>It turns out, first of all, that specified information is strictly conserved under the action of a deterministic process. As one would expect, the proof hinges upon the fact that a deterministic process can be represented by a function that has the totality of initial states as its domain, and the resultant states at a later time <em>t</em> for its range. Let us suppose, then, that Ω is a reference class of possibilities, and that (T, E) is a specified event in Ω. If E is causally determined by an event E<sub>0</sub>, there exists then a reference space Ω<sub>0</sub> containing E<sub>0</sub> and a function <em>f</em> from Ω<sub>0</sub> to Ω, such that <em>f</em>(E<sub>0</sub>) = E. It is to be noted that subsets of Ω “pull back” under <em>f</em>, which is to say that <em>f</em> induces a function <em>f</em> ⁻¹ from the powerset of Ω to the powerset of Ω<sub>0</sub>. One may therefore define a subset T<sub>0</sub> of Ω<sub>0</sub> as the inverse image of T; and as might be expected, it turns out that (T<sub>0</sub>, E<sub>0</sub>) constitutes again a specified event. Let it be noted, further, that a probability measure P<sub>0</sub> on Ω<sub>0</sub> induces a probability measure P = P<sub>0</sub> ∘ <em>f</em> ⁻¹ on Ω, such that P<sub>0</sub>(T<sub>0</sub>) = P(T).
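<p>A toy calculation may make the pullback construction concrete (it is offered only as an illustration; the reference class and the map are chosen for simplicity, with Ω₀ taken equal to Ω): let Ω be the eight bit strings of length 3 under the uniform measure, and let the deterministic process be the map that flips every bit.</p>
<pre><code>from math import log2

# Reference class Ω: the 8 bit strings of length 3, each of probability 1/8.
omega = [format(i, "03b") for i in range(8)]

def P(A):          # probability of a subset of Ω
    return len(A) / len(omega)

def I(A):          # information in bits: I(A) = -log2 P(A)
    return -log2(P(A))

def f(s):          # a deterministic process: here, flip every bit
    return "".join("1" if c == "0" else "0" for c in s)

T  = {s for s in omega if s[0] == "1"}   # a target: strings beginning with 1
T0 = {s for s in omega if f(s) in T}     # its pullback, the inverse image of T

print(I(T), I(T0))   # 1.0 1.0 -- the pulled-back target carries the same information
</code></pre>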
It follows that the specified events (T<sub>0</sub>, E<sub>0</sub>) and (T, E) carry precisely the same amount of specified information.</p><blockquote>The proverbial monkey pounding on a typewriter can perhaps produce a few bits worth of English prose, but not the text of <em>Hamlet</em>.</blockquote><p>Surprisingly, this conclusion could have been foreseen without recourse to mathematical analysis; as Dembski has put it: “What laws cannot do is produce contingency, and without contingency they cannot generate information. ...”[[6]] The point was made four decades earlier by Leon Brillouin, when he wrote that “a machine does not create any new information, but it performs a very valuable transformation of known information”; and it appears that as far back as 1836, the poet and amateur scientist Edgar Allan Poe had said much the same.[[7]]</p><p>But what about random processes? As we have seen, a random process can generate arbitrarily large amounts of information (due to the fact that events of arbitrarily small probability can indeed occur), and it can even generate small amounts of specified information; what it cannot do, according to the complex specification criterion, is to generate specified information in large amounts: that is the crucial point. The proverbial monkey pounding on a typewriter can perhaps produce a few bits worth of English prose, but not the text of <em>Hamlet</em>. There must be a cutoff, even though its exact location cannot be ascertained. It can thus be stated that there exists a universal complexity bound, or UCB, beyond which specified information cannot be increased by a random process taking place within the bounds of physical space and time. Dembski, for good reason, takes his UCB to be 500 bits, which is “playing it safe”: no monkey could remotely do that well — not in a billion years! What no random process can generate turns out to be complex specified information, or CSI: specified information, namely, in excess of the UCB.[[8]]</p><p>Having thus disposed of deterministic as well as random processes, it remains to consider an arbitrary combination of the two. Now, any such combination can be modeled by a so-called stochastic process. In place of a function <em>f</em>(x) of a single variable, one has now a function <em>f</em>(x, ω) of two variables, in which ω represents the random component of the process. The trick is to break the problem into two parts: one first permits ω to “occur,” which transforms the original function <em>f</em> into a function <em>f</em><sub>ω</sub> of the single variable x, in which ω serves as a fixed parameter. That function, however, can then be handled as in the deterministic case. One is able by this means to conclude that the total process cannot increase specified information by more than the UCB.[[8]]</p></div><p>What is the significance of this conservation law? Clearly, it shows that CSI cannot be generated by any natural process. In particular, the vast amounts of CSI existing within the DNA of a living cell could not have been produced through the Darwinist scenario of random variations acted upon by natural selection. Neo-Darwinists such as Manfred Eigen, having recognized the origin of information as the major problem of contemporary biology, have been searching for an evolutionary algorithm that can do the job, without realizing that this possibility can in fact be ruled out on theoretical grounds.
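<p>A rough sense of the magnitudes at stake may be conveyed by the following back-of-the-envelope sketch; the character and base counts are approximate, and serve only to illustrate the disparity between the 500-bit bound and the quantities of specified information here in question.</p>
<pre><code>from math import log2

UCB = 500  # Dembski's universal complexity bound, in bits

# Specified information needed to hit a given text typed from a
# 27-character alphabet (26 letters plus the space): k * log2(27) bits.
def text_bits(k):
    return k * log2(27)

print(round(text_bits(3)))        # ~14 bits: a three-letter word, within a patient monkey's reach
print(round(text_bits(130_000)))  # ~618,000 bits: a text roughly the length of Hamlet
print(2 * 4_000_000)              # 8,000,000 bits: capacity of a genome of four million bases, at 2 bits per base
print(UCB)                        # 500 bits: the bound no random process can exceed
</code></pre>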
Every evolutionary algorithm, no matter how ingeniously conceived, is a stochastic process, and as such it falls under Dembski's interdict. Only two possibilities remain: either the universe must have been replete with CSI from the first moment of its existence — a supposition which hardly accords with the hypothesis of a “Big Bang” origin — or else it cannot be conceived as a closed system operating under the action of natural causes.</p><p>It appears that CSI, and more generally, information in the mathematical sense, has a certain ontological status; it was Norbert Wiener, I believe, who first pointed out that, in addition to mass and energy, the universe comprises “information” as a basic ingredient. It is in any case to be noted that the course of science, from about 1900 onwards, has tended towards that recognition. The trend began with Boltzmann's statistical mechanics, which demonstrated that the notions of contingency and probability play a vital role in the economy of Nature. Contingency and probability, however, add up to information, as we have seen; and I might mention that the statistical definition of entropy, as given by Boltzmann, is actually formulated in information-theoretic terms. It was by way of Boltzmann's statistical approach to blackbody radiation, in particular, that Max Planck was able to discover the so-called quantum of action, nowadays known as Planck's constant, a discovery which inaugurated the quantum era.[[9]]</p><p>With the advent of quantum mechanics, moreover, it became apparent that probabilities and information are not only useful conceptions, but prove indeed to be necessary on the most basic level of physical theory. It is true that Einstein, for one, refused to accept this conclusion; yet every argument which he advanced to counter the quantum-mechanical indeterminism proved ultimately to be ineffectual. More often than not, it actually contributed insights which only served in the end to bolster the quantum-mechanical position, as was the case with the well-known Einstein-Podolsky-Rosen thought experiment, which was later carried out, only to confirm the quantum-mechanical predictions. Moreover, when David Bohm, after prolonged discussions with Einstein, did finally succeed in constructing what appeared formally to be a deterministic quantum theory, he did so at the cost of introducing what he termed “active in-formation” as a basic principle; but as Dembski was quick to point out, Bohm's “active information” is but a special case of CSI. It appears that Bohm was able to dispense with indeterminacy on the level of particles only by admitting contingency in the form of CSI; one way or another, it seems, contingency — and thus information — is bound to enter the picture. Meanwhile, CSI has made a second appearance as a foundational concept of physics, this time in the form of the so-called Fisher information from which Roy Frieden claims to derive all the basic laws.[[10]]</p><p>From another direction entirely, recent advances in communication and computer technology have given rise to a number of mathematical sciences in which information, normally in the form of CSI, plays a leading role. Information theory itself, as a mathematical discipline, was inaugurated in 1948 by Claude Shannon, an electrical engineer concerned with communication problems. But unquestionably the most significant encounter with information has taken place in the domain of molecular biology, which has brought to light what may be termed the primacy of CSI in the biosphere.
The fact that vast quantities of specified information, recorded in a four-letter alphabet, reside within every living cell, and that each species derives, as it were, from a text known as its genetic code, suggests that life has indeed an informational basis.</p><p>Given the crucial role of CSI in both physics and biology, it behooves us now to reflect further upon that notion, beginning with the mathematical concept of information as such. The danger, when it comes to the latter concept, is that we are prone to read far more into the term than it is meant to signify; the word “information,” after all, has obviously been in use for a very long time before Shannon gave it a technical sense. That sense is in fact quite bare: it boils down to the actualization of a contingency in a mathematical space Ω of viable possibilities, endowed with a probability measure P. If I flip a coin <em>n</em> times, I have produced information: <em>n</em> bits worth, to be exact. And even now, as I am striking the keys of my typewriter, I am producing Shannon information. I am also, however, generating <em>semantic</em> information, which is something else entirely, something which no mathematical theory can encompass, for the obvious reason that semantic information is not a quantitative thing. There is an ontological discrepancy, thus, between semantic and Shannon information, not unlike the ontological hiatus between the corporeal and the physical domains; and as in the case of a corporeal object X and its associated physical object SX, it does happen that every item of semantic information is associated with a corresponding item of Shannon information, which serves, so to speak, as its material base. The latter is what remains, one can say, after all meaning, all non-quantitative content, has been cast out or “bracketed.” This accords, once again, with René Guénon's point that “quantity itself, to which they [the moderns] strive to reduce everything, when considered from their own special point of view, is no more than the ‘residue’ of an existence emptied of everything that constitutes its essence.”[[11]]</p><blockquote>No other scientific finding, I believe, has been as profoundly reflective of theological truth as the discovery of what may be termed the informational basis of life.</blockquote><p>Having thus distinguished between semantic and Shannon information, I would like to point out that the semantic component constitutes an example of specification, an instance of a “detachable target.”[[12]] To be sure, the example of semantic information is highly special, which is to say that specification can arise in a thousand other ways. Think of a bit string in which 1's and 0's alternate, or in which they represent a sequence of prime numbers in binary notation; or again, think of a bit string of length <em>n</em> which is “algorithmically compressible” in the sense that it can be generated by a computer algorithm of “length” less than <em>n</em> (a notion which can indeed be defined): all these are examples of specification. However, despite its immensely special nature, I will note in passing that semantic specification enjoys a symbolic primacy in the natural domain.
If it be the case that God “spoke” the world into being, as Scripture declares, such design as it carries must derive ultimately from a divine idea or <em>logos</em>, which may by analogy be termed a “word.” And I would add that nowhere in the natural world is the linguistic character of specification more clearly in evidence than in the genetic code of an organism, which constitutes a text, as I have noted before, recorded in a four-letter alphabet. The genetic code, then, is a <em>written</em> text, imprinted on DNA; yet one may conclude on theological ground that this written text derives indeed from a spoken word, the kind to which Christ alludes when He testifies that the words He speaks “<em>are spirit and life</em>.” (<em>John</em> 6.63) No other scientific finding, I believe, has been as profoundly reflective of theological truth as the discovery of what may be termed the informational basis of life.</p><h3>Horizontal and Vertical Causality</h3><p>Following these rather cursory considerations relating to the concept of CSI, I propose to reflect in some depth upon the nature of causality. From the outset we have distinguished between necessity, chance, and design, and have combined the first two modes under the heading of natural causation. On the strength of Dembski's theorem we may now conclude that CSI, wheresoever it may be found, must be attributed to an alternative mode of causation answering to the notion of design. I now contend that this alternative mode is none other than what I have elsewhere termed vertical causation, a mode characterized by its atemporality.[[13]] Inasmuch as vertical causation acts “above time,” or instantaneously, as one can also say, it differs fundamentally from natural causation, which is inherently temporal, and could therefore be characterized (in metaphysical parlance) as “horizontal.” One arrives thus at a dichotomy which needs now to be carefully examined and clarified.</p><p>The prime example of vertical causation is unquestionably the creative act of God; for as St. Augustine says: “Beyond all doubt, God created the world, not in time, but with time.” One needs however to realize that this creative act extends in a sense to God's providence, the first effect of which is what theologians term “conservation.” As Gilson points out, this effect “is, in some way, but the continuance of the creative act.” We need not attempt to classify God's action upon the world; suffice it to say that God acts ever above time, and thus “vertically.” That vertical causation, moreover, is the cause, not only of time, but also of the actions and processes that transpire in time. Yet these actions and processes have an efficacy of their own: such is the miracle of God's creation. God is intimately present, not only in the substance of all beings, but in their operations as well; and yet, as Gilson has beautifully said, “the intimacy of the assistance He gives leaves their efficacy intact.” It follows that God's vertical causality is complemented by a causality which operates in time; and this is what we have termed horizontal causation: it is the kind, obviously, with which science is concerned. A word of caution, however, needs to be interposed: To say that every act of natural causality is horizontal is not to imply that every act of horizontal causality is natural.
There may conceivably be temporal processes which are neither deterministic, nor random, nor yet stochastic, a fact which implies that the concept of horizontal causation is wider than that of natural causes. We shall have occasion to return to this point presently.</p><blockquote>Could it be that vertical causation is in fact the hallmark of intelligence? This appears indeed to be the case.</blockquote><p>Having distinguished between a vertical causality, which is proper to God, and a horizontal causality which operates by way of a temporal sequence, we need to ask ourselves whether this dichotomy amounts simply to the traditional distinction between primary and secondary causation. The answer, clearly, is that it does not; for it happens that there exist second or created causes which likewise act above time, and thus vertically. What stands at issue is a higher degree of participation in the divine causality, one which pre-eminently reflects the action of God Himself. There are two prime examples of this higher mode of secondary causation: the causality, namely, emanating from the angelic realm, and in second place, action derived from human intelligence. Could it be that vertical causation is in fact the hallmark of intelligence? This appears indeed to be the case; to act “above time” is apparently the prerogative of an intellectual nature, a being endowed with intellect and free will. Vertical causation, thus, is none other than intelligent causation, whereas natural causes may be characterized by comparison as “blind.”</p><p>This brings us at last to a question which has been lurking in the background from the start: the matter, namely, of human art, in the widest sense of that term. To the modern mind, at least, the most obvious and incontrovertible instances of design are those resulting, not from an act of God, but from the action of a human artisan. Not even the most committed Darwinist would deny the presence of design in the case of Paley's watch; but whereas he accepts the notion of design in the sphere of human artefacts, he considers it “naively anthropomorphic” to extend this notion to the biological domain. We are in fact surrounded on all sides by CSI deriving from intelligent human action, and long before Dembski appeared on the scene, it was clear to everyone that this CSI is not in fact the result of a natural occurrence, but was put there by an intelligent agent. Who, for example, when he comes upon a collection of stones on a hillside spelling out the word “Welcome,” would imagine that these stones were deposited by a flood or an avalanche? In a thousand ways, all of us have been engaged, since early childhood, in the business of inferring design; and whether we know it or not, these inferences are invariably based upon specification. Dembski's theory, thus, applies in the first place to the domain of human art.</p><p>Let us then consider the production of artefacts, from primitive crafts to modern industry. Is it not obvious, one might say, that the artefact is invariably produced by means of a temporal process? In a sense this is of course true. I do not deny the necessity of temporal process; what I deny is its sufficiency. My contention is twofold: first, that the critical factor — the <em>sine qua non</em> of human art — is an act of intelligence; and secondly, that such an act is not reducible to a temporal process. Few, I suppose, would object to the first claim; it is the second that troubles us.
The difficulty stems from the fact that we tend to temporalize the act of intelligence by identifying cognition with thought. We take it for granted that cognition occurs <em>within</em> thought — within a psychosomatic and temporal process — whereas in fact thought is only a means, a movement, if you will, in quest of cognition. To put it in traditional terms: cognition is an intellective as opposed to a psychosomatic act. As Aquinas observes: “The activity of the body has nothing in common with the activity of the intellect.”[[14]] And this in itself suggests strongly that intellectual activity does not take place “in time.” The crucial point, however, is that intellectual activity <em>cannot</em> take place in time, for the simple reason that temporal dispersion is incompatible with cognition: we cannot know “bit by bit,” because to know is necessarily to know one thing. This conclusion cannot be obviated, moreover, by adducing memory as a means of <em>presentifying</em> the past; the fact remains that the cognitive act must be “instantaneous,” and hence supra-temporal: for indeed, the moment or instant is not a part of time. The intellect, therefore, whether conceived as a “third principle,” or (Thomistically) as a power of the soul, must be inherently supra-temporal as well.</p><p>Getting back to the production of artefacts, the following has now become clear: If intellectual agency is indeed a <em>sine qua non</em> of human making, it is <em>ipso facto</em> impossible to reduce the production of the artefact to a temporal process. It is true, of course, that mechanized manufacture <em>is</em> a temporal process; but one must not forget that the machinery involved in this process carries design, which is transmitted to the resultant product. As Dembski's analysis shows, a deterministic process, and thus a function, may indeed transmit CSI, but cannot produce it (not even to the extent of a single bit!). A manufactured artefact, thus, no less than one produced by a human artisan, presupposes an act of vertical causation.</p><p>Human making is allied to God's creative act by virtue of the fact that it likewise entails an atemporal mode of causation. It can be said that the human artist imitates the divine, and “participates” to some degree in God's creative agency. “Art imitates Nature [in the sense of <em>natura naturans</em>] in her manner of operation,” says Aquinas; and elsewhere he specifies that the human artist works “through a word conceived in his intellect,” and thus in imitation of the Holy Trinity.[[15]] It is no small thing, therefore, that transpires in even the humblest instance of human making, which is indeed worlds removed from a mere temporal process. No wonder this difference can be detected in the artefact by way of a distinctive signature indicative of design.</p><p>A further clarification needs to be made. Having characterized vertical causation by the fact that it is atemporal in its mode of operation, we must bear in mind that it may nonetheless be temporal in its effect. A violinist, for example, does indeed act “above time” on the plane of intellect, and yet the music he plays is produced by a movement of his bow. 
Once again a temporal and thus horizontal process enters the picture; but clearly, it is not a natural process: it cannot be, since it derives from an intellectual act.[[16]] One needs therefore to distinguish (as I have intimated previously) between two kinds of temporal process or horizontal causation: the kind that derives from natural causes, and the kind that springs from intelligent agency. It is the violinist, acting as an intelligent agent, who first apprehends the music — Dembski's “detachable pattern” — on the plane of intellect, and then, by an act of his free will, conveys that pattern to the world of sense by way of a temporal process, an action of horizontal causality.[[17]]</p><p>A few words on the subject of “free will” are called for at this point. One sees from the example of the human artisan that intelligent action is not — and cannot be — the result of a natural process. The cause of such action cannot therefore be identified on the natural plane, that is to say, in the external world, and may consequently be characterized by default as “internal.” But is that not indeed what we normally mean when we speak of “free will”? Now, if by “freedom of the will” we understand an exemption from external causality, then the preceding reflections do in fact establish that freedom in the context of human art. The decisive fact, however, is that intelligent action is “free” by virtue of an intimate participation in the freedom of God Himself. And this divine freedom, to be sure, is infinitely more than a mere exemption from external constraint; after all, it is by virtue of this very freedom that God created the world, and thus “external constraint” itself. Freedom, therefore, has primarily a positive connotation: it has to do with creativity, with the expression of truth and beauty, and also with “play” — with what Hindu tradition terms <em>lila</em>.</p><blockquote>What disturbs the Einsteinians, it appears, is not merely the breakdown of determinism, but indeed the collapse of natural causation: not only does God “play dice,” but what is worse, He does so “instantaneously!”</blockquote><p>One final point needs to be made: nothing obliges us to suppose that a temporal process which is not productive of design, or of CSI, can <em>ipso facto</em> be attributed to natural causation. A case in point is given by the quantum-mechanical phenomenon of state vector collapse: the radioactive disintegration of a radium atom, for instance, cannot in fact be accounted for in terms of natural causation by virtue of its irreducible discontinuity. The ancient dictum “<em>Natura non facit saltus</em>,” I claim, holds to this day; only it needs to be understood that the <em>natura</em> in question is <em>natura naturata</em> as distinguished from <em>natura naturans</em>: the “natured” as opposed to the “naturing.” I have argued elsewhere that <em>natura naturata</em> acts invariably by way of a <em>continuous</em> temporal process, in contrast to <em>natura naturans</em>, which acts “above time” and thus by vertical causation.[[18]] The action of <em>natura naturans</em> is therefore inherently instantaneous, and it is this intrinsic instantaneity, I contend, that is reflected in instances of irreducible discontinuity. The reason why state vector collapse has mystified physicists, it turns out, lies in the fact that the phenomenon cannot be attributed to natural causes.
What disturbs the Einsteinians, it appears, is not merely the breakdown of determinism, but indeed the collapse of natural causation: not only does God “play dice,” but what is worse, He does so “instantaneously!”</p><h3>Intelligent Design and Theistic Evolution</h3><p>The fact that the DNA in a living cell carries “tons of CSI” implies that Dembski's theorem has disqualified the claims of Darwinism. It has dashed the neo-Darwinist hope of finding “an algorithm, a natural law that leads to the origin of information,” as Manfred Eigen has put it.[[19]] What stands at the heart of a living organism is CSI, and no natural law, no algorithm, no stochastic process can produce that CSI. I will leave out of account the question how long it may take the scientific community at large to accept this fact and draw the consequences; certainly, if we admit what Thomas Kuhn has to say on the subject of “scientific revolutions,” this will not happen overnight. What in any case I find to be of far greater interest than the rise and fall of the Darwinist paradigm is the fact that Dembski's theory poses a fatal threat, not just to Darwinism, but to the cause of authentic religion, surprising as this may seem. To be sure, theologians for the most part are jubilant to learn that God is not superfluous after all; yet what they fail to realize, almost to a man, is that the design movement threatens to plunge us into a heresy worse than Darwinism. The problem is this: Whereas design theory has indeed disqualified the Darwinian mechanism, it has in no wise discredited the Darwinist concept of common descent, which thus remains entrenched as a scientistic dogma. But clearly, the hypothesis of a common descent which cannot be accounted for in terms of the Darwinian mechanism — nor, for that matter, in terms of natural causation as such — is tantamount to the tenet of theistic evolution. It is almost inevitable, therefore, that Dembski's discovery will be generally perceived as a scientific vindication of that tenet, a doctrine which has already swept the theological world and penetrated even into the Vatican. Thus, if theistic evolutionism was the rage long before Dembski proffered his scientific insight, just think what its status will become in the wake of his monumental discovery! Here indeed is a doctrine to “deceive even the elect.”[[20]]</p><p>The problem with the notion of common descent is that it obviates metaphysics in a domain that is incurably metaphysical. Common descent, to be sure, if there be such a thing, is something that transpires in space and time: it is something we can picture, something that answers to the demands of our ordinary understanding. Therein lies its appeal, and therein too lies its impossibility, for it happens, as every traditional school has recognized, that first origins cannot in fact be situated in space and time. There exists, for instance, a Patristic doctrine concerning first biological origins — the doctrine of <em>rationes seminales</em> elaborated by St. Augustine in <em>De Genesi ad Litteram</em> — but the teaching is irremediably metaphysical: it alludes to a vertical descent, a progression from the metacosmic Center to the cosmic periphery. But this means that the scientist is in fact “coming in” near the end of the story: confined, by the <em>modus operandi</em> of his approach, to an exclusively horizontal perspective, he misses the vertical descent which ontologically precedes all manifestation on the spatio-temporal plane.
Thus, in keeping with this horizontal perspective, the hypothesis of common descent proposes to resolve the mystery of first biological origins within the spatio-temporal domain: at the felly of the cosmic wheel, where indeed first origins can never occur. Moreover, to bring God into the picture, as the theistic evolutionists have done, does not alter this fact, this principial impossibility: it only compounds bad science with bad theology.</p><blockquote>The “origin of information,” thus, which neo-Darwinists are seeking in an evolutionary algorithm, is actually to be found in the <em>logos spermatikos</em> that came into being in the single instant of creation, when “God created the heaven and the earth.”</blockquote><p>One forgets that authentic evolution is indeed an unfolding, as we learn from the Latin verb <em>evolvere</em> (<em>e</em> + <em>volvere</em>, “to roll out”); it is thus an outbound kind of movement. Where there is an outside, however, there must also be an inside, an interior; and let me hasten to add that we must not psychologize this “inside”: the <em>bona fide</em> interior of an organism is not a matter of “consciousness” but constitutes the ground from which every component of the organism, including consciousness itself, is derived. The integral organism — like the integral cosmos — may thus be conceived in terms of a symbolic circle, whose center represents its “innermost” point, the true <em>ratio seminale</em>[[21]], of which the visible creature, situated in space and time, is but the outward manifestation or “unfolding.” It is moreover of interest to note that the Greek equivalent of the term <em>ratio seminale</em> is <em>logos spermatikos</em>: the “seed word,” if you will, whose marred reflection, as I have suggested previously, can indeed be discerned in the genetic code. The “origin of information,” thus, which neo-Darwinists are seeking in an evolutionary algorithm, is actually to be found in the <em>logos spermatikos</em> that came into being in the single instant of creation, when “God created the heaven and the earth.” From that point of origin there began a “vertical descent,” an evolution in the true sense of the term, in which however the essence of the organism, its <em>ratio seminale</em>, remains unchanged. Here, then, is the crux of the matter: to comprehend this metaphysical truth — even as “through a glass, darkly” — is to perceive at once the fallacy, not just of Darwinism, but of theistic evolutionism as well. But the latter doctrine, as I have indicated before, is the worse of the two: for whereas Darwinism as such offends by an unwarranted extrapolation while remaining otherwise faithful to the scientific point of view, theistic evolutionism betrays the theological outlook itself and thereby gives rise to a wholesale corruption of sacred doctrine. The point needs to be made, in particular, that where there is no vertical descent, there can be no vertical ascent either. A thing which has its first origin in space and time will have its last ending in space and time as well; such a thing is bound to perish, bound to disappear like a riven cloud. But this is not the case with things that have being, and thus an essence and an act-of-being. Only a cosmology, therefore, which enshrines the dimension of verticality can support a religious outlook and allow a doctrine of human immortality.
Within the confines imposed by a horizontal cosmology, the claim of religion becomes a sham, or at best a consoling fiction.</p><p>[[1]]: Wolfgang Smith, <em>The Quantum Enigma</em> (Peru, IL: Sherwood Sugden, 1995), 85-97</p><p>[[2]]: Etienne Gilson, <em>The Christian Philosophy of St. Thomas Aquinas</em> (University of Notre Dame Press, 1994), 185</p><p>[[3]]: Francis Darwin, ed., <em>The Life and Letters of Charles Darwin</em>, vol. 2, 303-12</p><p>[[4]]: Michael Behe, <em>Darwin's Black Box</em> (New York: The Free Press, 1996), 39</p><p>[[5]]: Charles Darwin, <em>On the Origin of Species</em> (Harvard University Press, 1964), 189</p><p>[[6]]: William A. Dembski, <em>No Free Lunch</em> (New York: Rowman & Littlefield, 2002), 155. It should perhaps be mentioned that the term “no free lunch” has in recent years acquired a technical sense: it refers to a class of mathematical theorems concerning so-called evolutionary algorithms, proved in the late 90's. It turns out that the problem-solving capacity of such algorithms is severely limited. Dembski's theory can be seen as a generalization of these “no free lunch” theorems</p><p>[[7]]: See Leon Brillouin, <em>Science and Information Theory</em>, 2nd ed. (New York: Academic Press, 1962), 267-69</p><p>[[8]]: In point of fact, the proof is not quite as simple as my summary suggests: for example, one must get around the difficulty that the functions <em>f</em>(ω) and <em>f</em>(x) may carry CSI, and thus inject CSI into the process. For a full discussion, see William A. Dembski, <em>No Free Lunch</em>, op. cit., 149-166</p><p>[[9]]: Having discovered the correct radiation formula by empirical means, Planck derived its physical significance by way of a statistical analysis. As he put it in his Nobel Prize address of 1920, “After some weeks of the most intense work of my life, light began to appear to me and unexpected views revealed themselves in the distance”</p><p>[[10]]: I have reported on this work in ‘Eddington and the Primacy of the Corporeal,’ <em>Sophia</em>, vol. 6, no. 2, 2000, 5-38</p><p>[[11]]: René Guénon, <em>The Reign of Quantity</em> (London: Luzac, 1953), 13</p><p>[[12]]: As the reader may have observed, I have not entered into the mathematical formulation of detachability. Suffice it to say, the condition hinges upon the statistical notion of rejection functions, a technical concept with which we need not concern ourselves in the present summary</p><p>[[13]]: Wolfgang Smith, <em>The Quantum Enigma</em> (Peru, IL: Sherwood Sugden, 1995), 106-107</p><p>[[14]]: Opusculum, De unitate intellectus contra Averroistas, iii; quoted by Joseph Rickaby in <em>Of God and His Creatures</em> (Westminster, MD: Carroll Press, 1950), p. 127n</p><p>[[15]]: Thomas Aquinas, <em>Summa Theologiae</em>, 1.117.1 & 1.45.6</p><p>[[16]]: The question arises whether this conclusion can also be reached on the basis of Dembski’s theorem, given that the process in question (the movement of the bow) carries CSI. The problem is this: Conceivably the violinist does no more than transmit CSI, stored presumably in his brain, and thus operates in the manner of a machine. The fact is that Dembski's analysis does not preclude that possibility</p><p>[[17]]: I do not wish to suggest that the intellectual act occurs “before” the musical idea is conveyed to the world of sense: my contention is that the intellectual act has ontological as opposed to temporal priority. The intellectual act can in fact have no temporal priority, seeing that it is atemporal.
It belongs to a realm where there is no “before” or “after,” hard as it may be for us to grasp this point. To do so, it seems, we require a metaphysical symbol, a kind of mental icon: think of a circle, for instance, the circumference of which represents the realm of time, of temporal sequence. Inasmuch as the center of the circle is equidistant from all points on the circumference, it is equally “present” to each of these points, regardless of its position within the temporal sequence. What is needed in order to grasp the idea of atemporality is thus an extra dimension: and this is indeed “the dimension of verticality.” I might add that it is the loss of that dimension — the inability, in other words, to understand the meaning of metaphysical symbolism — that most accurately characterizes the modern mind</p><p>[[18]]: Wolfgang Smith, <em>The Quantum Enigma</em> (Peru, IL: Sherwood Sugden, 1995), 95-97</p><p>[[19]]: Manfred Eigen, <em>Steps Towards Life: A Perspective on Evolution</em> (Oxford University Press, 1992), 12</p><p>[[20]]: The most compelling exponent of theistic evolutionism, it is safe to say, is none other than Teilhard de Chardin, whom many regard as a veritable prophet. On this subject I refer to my monograph, <em>Teilhardism and the New Religion</em> (Rockford, IL: TAN Books, 1988)</p><p>[[21]]: According to Thomistic ontology, the <em>ratio seminale</em> of an organism, strictly speaking, is itself “exterior” to its act-of-being, which constitutes, one may say, the existential basis of that <em>ratio seminale</em> itself, and thus indeed the “innermost point” of the organism: the true center of our symbolic circle</p> </div>
<p><a href="https://www.sacredweb.com/volume-10/intelligent-design-and-vertical-causality/" target="_blank">- Enlace a artículo -</a></p>
<p>Más info en https://ift.tt/FgtxKWV / Tfno. & WA 607725547 Centro MENADEL (Frasco Martín) Psicología Clínica y Tradicional en Mijas.
#Menadel #Psicología #Clínica #Tradicional #MijasPueblo</p>
<p>*No suscribimos necesariamente las opiniones o artículos aquí compartidos. No todo es lo que parece.</p>