Peerless Reviewings and Lesser Diatribes


Whenever someone writes that “increasing the level of vibrations” is the royal road to better health -- something one sees all over the energy medicine literature -- one has to become immediately skeptical. The writer does not distinguish between speeding up the frequency of oscillations, extending (?) the wavelength, increasing the energy density of the waves, amplifying the waveform, and the other things “increasing the level of vibrations” could possibly refer to. Say this wording is just a way of speaking on a popular level, so as not to overwhelm the average reader: the writer knows all this and is merely simplifying. If so, then he is greatly misinforming the reader, because if the machine the writer is trying to sell with his discourse could actually “increase the level of vibrations”, whatever exactly that may mean, then that machine very likely would kill the reader. Since the machine does not appear to be killing a lot of readers, the only reasonable conclusion to draw -- absent the opportunity to make a comprehensive study of it without a major financial investment -- is that the machine likely is not doing much of anything to the reader's vibrations.
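
Separating the quantities this phrase conflates takes nothing more than the standard wave relations. A minimal numerical sketch -- plain textbook physics, implying nothing about any particular machine -- showing that frequency, wavelength, photon energy, and amplitude are independent knobs, not a single “level”:

```python
# Illustrative only: the distinct quantities lumped under "level of vibrations".
# Standard relations: c = wavelength * frequency (EM waves), E = h * f (photon energy).
c = 2.998e8        # speed of light, m/s
h = 6.626e-34      # Planck's constant, J*s

def wavelength(f_hz):
    """Wavelength (m) of an EM wave of frequency f_hz."""
    return c / f_hz

def photon_energy(f_hz):
    """Energy (J) of a single photon at frequency f_hz."""
    return h * f_hz

f1 = 1.0e6         # 1 MHz
f2 = 2.0e6         # "doubling the vibrations", read as frequency

# Doubling frequency halves wavelength and doubles photon energy...
assert abs(wavelength(f1) / wavelength(f2) - 2.0) < 1e-9
assert abs(photon_energy(f2) / photon_energy(f1) - 2.0) < 1e-9

# ...but says nothing about amplitude: the intensity of a classical wave
# scales with amplitude squared, a separate knob entirely.
def relative_intensity(amplitude):
    return amplitude ** 2

assert relative_intensity(2.0) == 4.0 * relative_intensity(1.0)
```

Doubling the frequency halves the wavelength and doubles the photon energy, while amplitude (hence intensity) can be varied without touching either -- four different things a vendor could mean, each with different implications.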

Just scanning material on the Rife frequencies is enough to see that “increasing the level of vibrations” is absolutely the wrong way to look at it. What needs to be done to the “vibrations” to improve health approaches, quite literally, infinite complexity: increasing, decreasing, various forms of sideways in every which way but loose. Each organic circumstance is virtually unique, given the extraordinary interactivity of all “involved” processes -- the meaning of “involved” being largely a matter of how one chooses to define it, by donning the particular make and model of blinders one selects from those available.

A very important point not often made, but there to be discovered in the scientific literature: the most-biologically-active modes of vibration are known to be, not electrical or optical, but acoustic (which is exactly what our 1979 superconductant DNA model describes). The acoustic modes spoken of in this model are not the usual phonon-generating stereochemical structural DNA modes (torsional, compressional, and the two transverse bending motions), disruptions of which constitute steric hindrances. In our model, it is as if the free electrons in the plasma gas core about which the helices wrap themselves are sealed in a “plastic bag” (it is somewhat useful, but only somewhat, to draw a comparison with the structure of a photo-acoustic spectrometer, with its “plastic bag” component). Ambient electromagnetic radiation impinges upon the “plastic bag” and the interior is heated in rhythmic pulses, as in a greenhouse over a period of days and nights, only here the heating-cooling frequency is much faster. With the energy input from the radiation, free electrons in the ionized plasma gas begin to form parcels which oscillate in temperature (not in space -- a distinction that was a major issue during the peer review process for the paper), causing them to expand and contract. The trains of temperature oscillations and expansions-contractions rapidly get in-step, like troops of goose-stepping Nazi soldiers. The synoptic energy input and the frictional heat dissipation are brought into perfect synchrony with each other. The “goose-stepping” on a “rope bridge” leads to formation of solitary pressure waves, solitons, where one part of the wave is identity-transparent with another (properly described only with m-valued logics), so that the solitons are self-driven, self-organizing, self-sustaining, and persist indefinitely (i.e., are superconductant). The superconductance is in the gas core, not along the backbones or steps of the molecule.
Electrons are not moving from place to place, only the messenger particles of the electromagnetic field, i.e., the trapped photons moving within the “plastic bag” containing the plasma gas core (an elastic plasma bottle like that sought for nuclear fusion, hot or cold). These pressure-wave solitons rhythmically beat on the “plastic bag” from the inside causing it to vibrate (almost like the pulsations coursing along the walls of a tornadic vortex). This vibration sets in motion through the structured intracellular water and cytoskeleton microtubules a coherent (i.e., in-step) acoustic wave (like in a LASER beam -- light amplification by stimulated emission of radiation) emanating in pulse-code fashion (carrying by wave-component modulation the complete genetic message) from the molecule to locations where it interacts with many cellular structures. In this case, however, what is amplified is not light, but sound. Sonic LASERs, sonic holograms (Musculpt). However, this process is actually much more complex than that, because the waves are not merely normal sound; they are acoustically-modified gravity-wave modes that radiate well beyond the cell membrane, amplifying with height. This is apparent from the equations describing the process, but what is actually implied nobody presently knows (or so it would appear). Some energy medicine machines are based on the thesis that the DNA molecule radiates photons, light, the messenger particles of the electromagnetic field -- not acoustic waves, let alone acoustically-modified gravity-wave modes. Some people have apparently taken photographs of this optical radiation, photos which adorn booklets and books. From our perspective on the microphysics of superconductant DNA, it is hard to know what to make of these photos. 
So, just in this one case -- DNA's vibrational interactions -- there is involved light, sound, gravity (short-range?), electromagnetic properties, compression waves, pressure waves, temperature oscillations, damping, amplification, threshold behaviors, resonance, critical Curie states, and on and on -- variations in any of the “involved” factors having significant biological effects relative to clocks, immunity, recording of chronic stress, autogenic brain discharge of these stress recordings, and the full range of factors involved therewith. Some health problems related to such factors may require “increasing the level of vibrations”, but many others may require “decreasing the level of vibrations” -- or alterations of a large number of other things engaged with the “level of vibrations”. So, it is not hard to be skeptical about any treatment modality which externally impresses “vibrations” upon the body. And given the complexity of the myriad active factors, it is equally not hard to be skeptical about diagnostic machines based upon supposed measurements of “vibrations”.
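
The “getting in-step” of energy input and frictional dissipation described above is, at bottom, the generic behavior of any driven, damped oscillator: after transients decay, the motion locks to the driving frequency and input power balances dissipation. A minimal sketch of that entrainment -- illustrative textbook physics only, not a model of DNA:

```python
import math

# Illustrative sketch: a driven, damped oscillator settles into a steady
# state in which energy input and frictional dissipation balance and the
# motion locks to the DRIVING frequency rather than the natural one.
def simulate(omega0, gamma, omega_drive, t_end, dt=1e-4):
    """Integrate x'' + 2*gamma*x' + omega0^2 * x = cos(omega_drive * t)."""
    x, v, t = 0.0, 0.0, 0.0
    xs = []
    while t < t_end:
        a = math.cos(omega_drive * t) - 2.0 * gamma * v - omega0**2 * x
        v += a * dt          # semi-implicit Euler integration
        x += v * dt
        xs.append((t, x))
        t += dt
    return xs

# Natural frequency 5 rad/s, drive at 3 rad/s, modest damping.
traj = simulate(omega0=5.0, gamma=0.5, omega_drive=3.0, t_end=40.0)

# After transients die out, successive rising zero-crossings are separated
# by the drive period 2*pi/3, not the natural period 2*pi/5: entrainment.
late = [(t, x) for t, x in traj if t > 30.0]
crossings = [t2 for (t1, x1), (t2, x2) in zip(late, late[1:]) if x1 < 0 <= x2]
periods = [b - a for a, b in zip(crossings, crossings[1:])]
drive_period = 2.0 * math.pi / 3.0
assert len(periods) >= 2
assert all(abs(p - drive_period) < 0.05 for p in periods)
```

The same lock-in occurs for any drive frequency and any (positive) damping; which of the many “involved” frequencies gets entrained, and at what threshold, is exactly the kind of detail a one-knob “level of vibrations” erases.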


Thanks for the copy of John Upledger's book Cell Talk (North Atlantic, 2003), essaying an osteopathic orientation I generally sympathize with, certain exceptions taken. Okay, let's look at this “plastic bag” issue, which, in the DNA case, “surrounds” the superconductant free-electron gas core and is actually, to our way of thinking, a temporal-operator, temporal-curl and fractal-dimension, fractal-entrapment discontinuity, not the holy grail of a leak-proof “bag” or “bottle”. Four years ago, I spent six weeks studying Gilbert Ling's technical unpublished monograph, Life at the Cell and Below-Cell Level: The Hidden History of a Fundamental Revolution in Biology, which gives a comprehensive account of his work. It is an exceptional essay in the history of science. In contextualizing Ling's work, the monograph also details the history of experiments -- from the 1600s until virtually the present -- of importance to the idea that cells have membranes and also to the membrane-pump hypothesis. Upledger gives the prevailing consensus account of how the cell works, with its structurally distinct membranes, osmosis pumps, ion channels, and so on -- with the addition of attributing consciousness and decision-making power to the cell, cytoskeleton, and macromolecules (one response to the enormous complexity and staggering timing issues that traditional belief amongst chemists held to be “organized” by random, heat-driven thermodynamic motion on the basis of statistical probabilities -- a belief few people can any longer quite fully credit). Ling argues that the typical cell does not have a cell membrane (a true membrane occurring only in mature plant cells, for reasons of plant structure -- the cell type predominantly studied in the early period of cell biology, when the membrane-pump theory came to dominate the field), is not filled with a free-solute solution, and that there is no such thing as a membrane pump.
I am presently reading Ling again to see if I can understand this impasse within the field of cell biology. A couple of historical quotes provided by Ling from those who do not embrace the membrane-pump theory (p. 5): “Cells do not possess a covering membrane chemically different from the protoplasm” and “The name, cell, is a misnomer, because cells do not, in general, have the form of hollow chambers, as the name suggests, but are typically solid bodies.” Secondary accounts of Ling's work, with few exceptions, inevitably have little to do with what he actually has to say. Moreover, when non-genetic-mutation theories of cancer are given in the alternative medicine literature, frequently Otto Warburg's discovery that cancer cells are deficient in oxygen is cited, but almost never is any real account given of Albert Szent-Györgyi's related later observation that cancer cells have “less water structure”. Ling, of course, takes note of this: his seminal 1951 paper on the fixed charge hypothesis states what is tantamount to the inverse of this idea as regards healthy cells -- an idea Szent-Györgyi arrived at independently, and presented differently, in 1957 relative to cancer cells, one that incorporates Warburg's discovery within a much larger theoretical framework than Warburg's own, a framework quite contrary to the consensus account of cell function. Given that MRI technology is based on Ling's mathematical models of cell function, and that this tends strongly to substantiate his ideas, which do not support the membrane-pump hypothesis, it is a pretty big conundrum to try to understand the issues involved -- and what, most fundamentally, motivates the two opposing groups of cell biologists.
The notions of “free radicals” (electron donors) and “antioxidants” (electron acceptors) are deeply involved with these issues -- also going back to the 1940s and '50s with Linus Pauling's ideas and Szent-Györgyi's isolation of vitamin C (an “antioxidant” -- a term I don't remember as ever having seen in any of the six books by Szent-Györgyi I studied back in the 1970s and unsuccessfully tried to interest Cornell biochemists in at the time). I cannot understand how these terms relate to “charge transfer complexes” and “electron transport chains” (malfunctions of which are at the core of Szent-Györgyi's theory of cancer), as “free radicals” and “antioxidants” are relative terms, where one (donor) becomes the other (acceptor) as electron transport proceeds -- belying much that has been published in the popular literature on the subjects of oxidation-reduction and oxidative phosphorylation, at least as I can understand it.

Truly, I do not know what to make of the idea that elementary particles with non-simple identity and non-local properties in any meaningful way move from point A to point B. The very notion of such movement makes no sense to me. What could such movement accomplish that is not already accomplished by the facts of non-simple identity and non-locality?

I feel that the following statement by Upledger, given on page 52, may contain a big hint (relative to the biophysics of superconductant DNA):

It could also be the nucleus with its DNA intellect communicating either with the cytoskeleton or the cell's membrane or cytoplasm that initiates morphological change.

Not only do I have the suspicion that a plasma bottle is no more an actual bottle than a cell membrane is an actual sac-type membrane, but I also bet that this “DNA intellect” is not 1T2-Boolean but m-logically-valued, and that the “communication” involved does not actually require movement from point A to point B. What may be required is alteration of temporal-spin (in the ensemble, temporal curl), which, to our way of thinking, is the quantum equivalent of the thermodynamic notion of temperature (this was the issue lying behind the 1978 peer-review exchange prior to publication of the superconductant DNA paper). But contemporary physicists and chemists, all these years later, still buying into the conventional probabilistic interpretation of quantum mechanics -- however much they might play the game of considering alternative non-m-logically-valued interpretations -- have no idea what temporal curl might be or how it relates to the random motion traditionally regarded as explanatory of temperature. Unabashedly embrace non-simple identity and non-locality and there is no fundamental notion in physics that remains untainted. Roger Penrose, for example, rather than look closely at temporal spin in relation to twistors, spinors, and complex angular momentum cascade, instead tries to get his head together by surveying everything known about the whole universe: The Road to Reality: A Complete Guide to the Laws of the Universe (Jonathan Cape, 2004). Temporal curl and m-valued logics -- periodically drawn to Penrose's attention over the past 30 years -- apparently are not on the road to reality and therefore are not mentioned in the book.


A little more on this issue about existence or non-existence of cell membranes which I have been struggling with. It is really a very good example of why I strongly dislike popular science writing, which I consider the worst form of black propaganda, extremely destructive, and reflecting accurately the moral posture of people who write or publish popular science.

Look at page 50 of Upledger's book Cell Talk -- one of the exceptions taken. It provides a diagram of a phospholipid bilayer cell membrane with protein channels for osmotic transport. The discussion in the text is written as if this were fact, not inference within a given theoretical construct. Indeed, it is written in such a way that most readers will tacitly assume that these membrane structures can be seen through a microscope (and if the reader thinks about it, he will likely conclude: seen through an electron microscope). This Cell Talk explanation is pretty standard biology textbook stuff. When a further step is taken, that is, when simple CONCRETE visual metaphors are used to help the average reader “understand” -- the basic stylistic technique all publishers of popular science books require -- then the falsifying distortions are even greater. This is one reason why I am interested in the ABSTRACT metareference of a Musculpt that would evolve through use in the fashion of a natural language.

Even Howard Bloom's fairly comprehensive introduction to Gilbert Ling's thought in Global Brain (John Wiley, 2000) does not clearly present the no-membrane thesis. He gingerly skirts the issue, undoubtedly because a forthright treatment would undercut his arguments in support of extreme individualism and raw competition. Ling's monograph runs through technical description of experiment after experiment related to the issues involved with the question of the existence or non-existence of semi-permeable cell membranes, making periodic summarizing statements, a couple of which I will quote (p. 12):

Modern electron microscopy, in contrast to light microscopy, can see a 100 A-thick cell membrane [reference here to a book by another author published in 1990 by Academic Press: Sjostrand's Deducing Function from Structure, Vol. 1: A Different View of Membranes] when the cell preparation has been properly fixed and stained with electron-dense uranium and lead, for example. Using this technique I. L. Cameron made electron microscopic pictures of the cut edges of frog muscle cells both immediately and some time after amputation. The EM plates showed no membrane regeneration at the cut ends.

In summary, there is a diffusion barrier at the surface of the cell, which offers resistance to the passage of some substances at least. But the barrier is a relative one and not one of absolute permeability or impermeability. Cells as a rule do not behave like a perfect osmometer. The central tenet that only impermeant solute at high concentration causes sustained cell shrinkage has been unequivocally disproved -- throwing in doubt membrane theory's classical explanations for the mechanism of cell permeability, cell volume control, as well as solute distribution. [Long discussion is provided of experiments beginning in the 1850s on NaCl uptake or non-uptake by cells. Until the 1930s, when radioactive tracer techniques were developed, it was generally concluded that the cell does not uptake NaCl. With the new tracer techniques, however, that was shown not to be the case and ionic theories of resting and action potential emerged, followed by the sodium-pump theory, although leading journals were still publishing articles into the 1940s predicated on the thesis that the cell is impermeant to Na cation.] Nor do frog muscle cells regenerate instantly or more slowly a new membrane at the cut surface of the (gel-like) muscle protoplasm… Out of the four physiological manifestations, which the membrane theory had at one time been able to explain, only the electrical potential remains intact at this point and it will be the subject of the following subsection.

It is in this electrical-potential area, of course, that Ling's biggest contributions to cell biology came. Now, more specifically about the phospholipid bilayer cell membrane described in Cell Talk. Again, concluding remarks by Ling following ten pages of detailed descriptions of experiments (p. 67):

A continuous lipid or phospholipid bilayer cannot produce a bulk-phase limited diffusion of water as we have demonstrated. [Note: there is no direct way to “see” these membrane constituents, with scanning microscopy or even with x-ray diffraction crystallography, so everything known is by inference from experiments upon variables theoretically relating structure and function.] The established bulk-phase limited diffusion of water offers one refutation of the lipid layer or phospholipid bilayer theory of cell membranes.

With lipid layer and phospholipid bilayer out of contention, the only other building material of the living cell which can form a continuous layer (and yet act as a quasi-impermeable barrier to virtually all solutes dissolved in water, but not to water itself) can only be (polarized-oriented) water itself. More evidence refuting the lipid or phospholipid membrane model and supporting the water membrane model follows.

This is where the quantum chemistry details of Ling's theory (first formulated in the early-1950s) begin to be explicated in the monograph. Water “bounding” water at the “edge” of the cell. Ling's major experimental and theoretical predecessors during the period mid-1930s to early-1950s (and up until 1982 when Troshin died) were predominantly Soviet scientists of “The Leningrad School” of cytology. Notions of heavy water, bound water, structured water, “ice nine”, plasma-bottle containers for nuclear fusion (hot, later cold) all appeared on the scene during this period. The idea we have long been arguing, by direct implication, is that all of these (exemplified, perhaps, in tornadoes) involve, most fundamentally, temporal-operator and fractal-spatial-dimension-induced discontinuities (mathematically-involutive formation of m-logically-valued identity-transparent limited spacetime domains, a kind of splitting and fractal entrapment into superposed multiple worlds of this one and only universe). It is interesting to note in this context that the product CELLFOOD, a solute-rich solution made using the “water splitting technology employed in the H-bomb's fusion trigger”, was created in the mid-1950s by Everett Storey, an expert on deuterium and heavy water as used in building of the hydrogen bomb. I would be using this stuff regularly, but it is simply well out of my price range.

So, the whole idea of a semi-permeable membrane sac establishing simple-identity, individuality, of the smallest living organisms is actually a theoretical construct the initial formulations of which were made soon after compound microscopes were independently created by Galileo Galilei and Janssen in the first decade of the 1600s as the Cartesian-Newtonian-Westphalian worldview construct was gestating: cell as little nation-state, little Leviathan (one of those concrete visual metaphors). If, as appears the case, the supposed molecular structure of the semi-permeable membrane is an ideologically-conferred property, how much of the rest of molecular biology is similarly conferred -- given that virtually all of contemporary cell physiology (in anthropomorphic metaphor to assembly-line manufacturing infrastructure and mega-urban-region microdynamics), including that of brain cells, is in one way or another related to aspects of the semi-permeable membrane's conferred properties? Note also that the type of consciousness Upledger attributes to the cell and its constituents, in developing an alternative to statistical thermodynamic explanations of organization, is the individualized human consciousness writ small to the cell and its constituents, which is consistent with the Cartesian-Newtonian-Westphalian worldview construct (that statistical mechanics and later statistical thermodynamics began to undermine in the period leading up to the Franco-Prussian war). And, clearly, the warfare we see around us today is a direct (metaphorical projective) expression of the continued psychological forcing underlying contention over these unresolved scientific issues -- a forcing that will continue until the issues are resolved. Upledger would likely justify the attribution of individualistic cellular and molecular consciousness via, for instance, a cytoskeleton which engages in Boolean computations as Hameroff has described. 
Hameroff and Karl Pribram have also described quantum optical coherence in the microtubules of the cytoskeleton. Prior to WWII, the contents of this same debate were focused upon the consciousness of “animal societies” relative to the entelechy of the state (an integral aspect of the development of national socialism in both Germany and Japan).

I believe that much of the detail in the currently prevailing ideas is ideologically leveraged (for the nation-state system and its supraordinal agglomerations and against animistic identity transparency). In Ling's case, his political orientation and use of The Great Wall of China metaphor for fixed charge dynamics subtly reveal some of this leverage, it seems to me: of two cultures, of two minds never fully articulated and integrated. I do not think that ATP as cardinal adsorbent and free-energy provider is adequate explanation for the critical state cooperative phase transition Ling describes as establishing the structured-water properties of protoplasm. Ling himself speaks of the necessity for unidirectional stirring so that the linear (not globular) protein chains about which intracellular water becomes layered can be brought into parallel formation, and so the cation sunflowers can all simultaneously turn their heads toward the cardinal adsorbent (adaptation of Bloom's metaphor), thus becoming polarized. Anyone who has made Jell-O knows you have to stir, but are Jell-O and living protoplasm the same thing? Ling is also not interested in new isolation-flotation tank media based on these notions, even though direct experiential exploration of the bottle-nosed dolphin's sonic-visioning perceptual environment would be greatly enhanced thereby, thus providing new insights into acoustico-optical facilitation of long-range inter-neuronal coherence and its associated subjective states. The tendency to quenching of the excited atomic states involved in the critical-cooperative phase transition, as Szent-Györgyi intensively studied in the laboratory (and recorded particularly in Introduction to a Submolecular Biology), is not easily overcome. Ling also does not have a three-dimensional model of the multilayer polarization structure his theory hypothesizes. 
He maintains that the problem of creating this model is so mathematically complex that it is presently beyond our capacities. Yet, he is not interested in mathematical similarities between Sakharov's multi-sheet model of the universe and the notion of multilayer polarization, nor is he interested in superconductant DNA-generated coherent waves as source of unidirectional stirring (even though prior to his prime-of-life suicide, Freeman Cope, Ling's associate, was a leading theoretician of biological superconduction). Adequate account of the required unidirectional stirring is not given, and apparently not sought. These coherent waves would introduce free energy derived from the ambient radiational field into the energetics of the cardinal-adsorbent trigger to cooperative phase transition, possibly just what is required to insure that there will be no quenching of the involved excited states. Pribram never provided an adequate account of origins of the reference beam for his neural hologram hypothesis, nor was he interested in superconductant DNA-generated coherent waves as being this reference beam when the idea was presented to him in 1979. Likely he would also not be interested in these coherent waves relative to quantum coherence in cytoskeletal microtubules. The inherent incompatibility, in my judgment, between cytoskeletal quantum optical coherence and Boolean 1T2-logical processing is not addressed by Pribram and Hameroff. This problem is avoided in Pribram's case, apparently, by adoption of David Bohm's hidden variables interpretation of quantum mechanics. I don't exactly know about Hameroff. Unbridled embrace of quantum coherence would involve an m-logically-valued interpretation of Schrödinger's wave-function, but that would mean that the cytoskeleton is not a Boolean processor and that the cell could not manifest an individualist consciousness. The cytoskeleton would be an m-logically-valued processor. 
It would also mean that the mathematical problems in creating a multi-sheeted m-logically-valued Hilbertian reference space for the biological organism would have many aspects in common with the mathematical problems in creating a three-dimensional model of multilayer polarization and of the plasma-containing “bottle” which the cell clearly is. Physical substrate for the non-decomposed m-logically-valued reference space: life as we don't know it? All of these ideas are rooted in quantum-level processes, yet the MRI machine that grew out of Ling's ideas is stated to employ a non-invasive, non-injurious diagnostic technique -- and fMRI is widely employed these days for non-medical-treatment purposes. That machine may not be invasive on the cellular level, or even on the molecular level, but, clearly, on the atomic and quantum levels it is extraordinarily invasive -- invasive upon the exact scale levels at which Ling's association-induction hypothesis is formulated and presented as regulating essential biological functions, indeed, as regulating life-and-death issues for the cell. And given the degree to which the MRI is involved with the issue of quantum spin, one has to suspect involvement with temporal curl and maintenance of biological limited spacetime domains.
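
As for the scale at which MRI engages the body: the machine resonantly drives nuclear spins at the Larmor frequency, f = (gamma / 2*pi) * B, about 42.577 MHz per tesla for the proton. A back-of-envelope sketch -- standard NMR physics, nothing specific to Ling's models:

```python
# Standard NMR/MRI relation: Larmor (precession) frequency f = gamma_bar * B,
# where gamma_bar for the proton (1H) is ~42.577 MHz per tesla.
GAMMA_BAR_1H = 42.577e6   # Hz per tesla: proton gyromagnetic ratio / (2*pi)

def larmor_frequency(b_tesla):
    """Resonant RF frequency (Hz) at which proton spins are driven in a field of b_tesla."""
    return GAMMA_BAR_1H * b_tesla

# A clinical 1.5 T scanner drives spins at roughly 64 MHz, a 3 T scanner
# at roughly 128 MHz -- radiofrequency energy delivered directly at the
# nuclear-spin level, whatever the tissue-level safety profile.
assert abs(larmor_frequency(1.5) - 63.87e6) < 0.1e6
assert abs(larmor_frequency(3.0) - 127.73e6) < 0.1e6
```

“Non-invasive”, in other words, is a statement about the cellular and molecular levels; at the spin level the intervention is resonant and direct.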

Long before going to Vietnam as a lowly enlisted man, while an adolescent military brat of a career officer, I learned that the intelligent person does not take a leadership position in a bad war -- or that person turns down the assignment given and resigns the commission previously bestowed. The human species itself has long since become a bad war. What we witness today is emergence of endgame whilst the smug go about their idiot business believing the fight is about this or that when, actually, the subject matter discussed above is far closer than this or that to origins of that state of bad war the human species is.


Well, what can I say? All of these issues are touched upon in MOON. They could not be rigorously investigated after 1979 because of the intense resistance to the ideas that had developed, and whatever else was transpiring in the involved areas of research. One of the major issues, of course, is what exactly is going on between the free electrons -- of the superconducting gas core about which the DNA helices uncoil and coil during radiation-induced replication -- and the water of the intra-nuclear or intra-mitochondrial complex coacervate phase within which the molecule is situated. Is the coacervate “pushed out” of the topologically-convoluted cylindrical core (the genetic molecule's very own instantaneously appearing and dissolving microtubule hole in spacetime) during formation of electron parcels as the critical state for helix-coil transition is reached -- leaving some kind of vacuum state momentarily occupied by the parcels? Could the molecule tap zero-point energy during replication, or are there nonlinear-temporal critical states (“overtone” series of the Curie temperature) as “locks” on multiple trans-grammar levels of genetic encipherment (non-linear-temporal semantics of m-logically-valued relative-states [operator-time being an expression of the logical operators of involutory decomposition] with linear-present and linear-future values of environmental variables), such that only under one or more of these hypercritical, trans-Curie states can zero-point-energy tap transpire?
((An aside: Madame Curie, being a woman, coming from the part of Europe she did with its contemporaneously emergent peculiar m-valued logics, deeply experienced in transference dynamics with grad students as she was, was the only one on the scene with real insight into the woman-man psychodynamics of the appearance of seed idea for Special Relativity, and the subsequent contemporaneous prevailing and persistent misinterpretation of Schrödinger's wave-function.)) Does the water become or cease being gravity-free salt water, oxygenated water, activated and energized water, hydration water, occlusion water, Quellungswasser or Schwellungswasser, non-solvent water, structured water, chemically-bound water, ionized water, electrostatically-plasmahauted water, water in oriented adsorption films, heavy water? Could “hot” processes appear (to linear-time-bound probes) “cold” (or cold, hot) by virtue of the temporal curl involved in hypercritical phase transitions to generic (no distinguishing fluid from conductor) super-states at hyper-temperatures (hyper-temperature being measure of nonlinear-temporal spin)? Is some kind of “plasma jet” or “electron gun” involved in genesis of the solitons beating upon the “plastic-bag” plasma-bottle “wall” (really an involutory-decomposition temporal-inversion and fractal-dimension trap)? How much does “second sound” enter into this -- given that the coherent waves generated are acoustically-modified and appear to involve some sort of gravitational component (the involved super-state transposing the electromagnetic to/from the gravitational)? There are a lot of questions one could ask. In the mid-to-late-70s, we thought of a few of them. Resistance to the ideas was interesting in and of itself, and I set about investigating the larger implications of this resistance. The investigation inevitably led to conclusions like those conversationally essayed by one or another character in MOON. A suggestive quote from MOON (Vol. 1, p. 194):

DR. BELKNAP: But the idea of an electrodynamic double helix is too simple to be true, physically, not to mention the idea of its being an engine to drive a comet. I can see it now: ADACHI ANNOUNCES DISCOVERY OF COMETARY THRUSTERS. Magnetoplasmadynamics! Plasma jets instead of seawater jets. He's probably working under a special projects grant from the Japanese Foundation for Shipbuilding. [Throws his head back and laughs robustly.] The whole thing was put to rest long ago. Schrödinger's wave equation dissolves the helix into the vapor of a probability distribution. Synge had an article on that in the Festschrift volume put together for Professor Wheeler.

The cast freezes, as if locked in a time eddy. As the image of a comet appears on the video screen and slowly gets larger and larger, moving directly toward the audience, the ensemble -- accelerando, pianissimo rising to crescendo -- initiates a coloratura cantabile by the lead soprano of the chorus: ringing glissandos on the celesta; mounting pulses on the drums; screeching quarter-tone slides on the violins; high-pitched fleeting whistles on the flutes; deep ebbing moans, sforzando, on the French horns. Lead soprano alone raises her voice in an extended series of octave-leaped bird calls, sustaining always at the higher register before sliding abruptly to the lower. The comet turns, revealing its double-helical wake vortices, and the full chorus repeatedly intones HaiYaaaaaRaaaaa… HaiYaaaaaRaaaaa in resonant pulses that push the voices outside the boundaries of their normal registers. As the comet recedes, there is general diminuendo, interrupted only by lilting cries uttered con dolore by the lead soprano.

Another (Vol. 2, pp. 304-5):

“Interesting… Lakshmi is going to do one of Takemitsu's percussion pieces?”

Lakshmi laughed raucous as a whirligig. “That's a bit beyond me at the moment, I'm afraid. I do emulate Yoshihara Sumire, though. Her performance of 'Munari by Munari' just blew my mind… Have you listened to the piece?”

“A few times,” replied Derek. “It's available on record in the States. I found myself most affected by the section which sounds like rain drops falling on the still surface of a pond.”

“Ah, but the real listener,” Lakshmi proclaimed with a gentle smile, “hears the unstruck sound, anahata, between the drops… A flower does not talk, you know,” she added, quoting an abbot of Nanzenji.

“Intrinsic hearing?”

“Whatever else does the percussionist try to imitate but Basho's frog? Munari does mean void… nothing. You must know that it is from the void, mu, that the auditory space, ma, is birthed.”

Hmmmm, he thought. The void decomposes as a mathematical involute by means of vacuum fluctuations to constitute spacetime as an actor. How much the Japanese have forgotten the original mathematical meaning of their Shinto cosmogenesis myth! in distorting it into a means of social control to service stratification, and later to drive the formation of a nation-state. Schrödinger's wave equation describes the decomposition of mu into ma. Just as in ancient days a priesthood intentionally misinterpreted the cosmogenetic involute as a hierarchy, so, in modern times, was the wave-function dissimulated… and for the same reason! To sustain privilege and the psychological states associated with it.

“Moment form,” chirped Ilse.

“What?” Derek asked.

“Listen to Stockhausen's 'Ylem' sometime.”…

“Yes, the silence heard,” said Derek, reflectively.

“But which silence?” asked Kenji.

Oh God! Derek inwardly exclaimed, as an electrical surge hit him in the solar plexus and a pin-wheel light-yantra flashed momentarily in his visual field. Bright-light state.

Attend to the interface! Guna entity transforming its unit of space. Nothing is a something you do not know.

Zero silence; first silence; second silence: the words appeared unsolicited in his awareness. If hyper-temperature is a measure of the spin moment of operator-time (quantized as a twistor), then the absolute zero of normal temperature for a given domain structure (the configuration drawn by the oscillating center of mass of a translating pi-electron, for instance) is a function of the spin moment. Turn the temporal-spin rheostat (with complex angular momentum cascade) and some absolute zero exists at every degree Kelvin. Zero-point energy is everywhere! Background radiation is everytime! Thermodynamics? anyone. Problems understanding the high-temperature super-state? Zero sound; first sound; second sound. What we have been calling two-fluids is actually the interpenetration of two temporal densities. Sound of no-sound! Silence is a cacophony of sound-types for those individuals entering time warps at the requisite baud-rate of consciousness. Velocity is one distinguishing feature. NOW! ask me what carrying a mantra is all about: any 'sound' is a 'silence' for some chronotopology; transforming a sound into a silence is to modulate the temporal density. Think of the physical phenomena involved in this modulation! Even Dirac would succumb to Zitterbewegung, I'd bet… Or the stimulated enhancement of neuronal coherence which occurs in those rare instants of musical rendezvous achieved in group improvisation: running a Coltrane lick on DNA's quantum wave harmonic (the real modus operandi of musical tone-color therapy). I'll wager La Monte Young has never known how truly genetic his music is! along with that of the whales, of course.

Now, I ask you (to rephrase Schrödinger's famous question, leaving the notions of homeopathic potency and its figurative inverse [edema] unmentioned): What, really, is colloid-rich coacervate-phase protoplasm?


Your skepticism is appreciated, as well as the criticism that I Mixmaster what should not be conflated: science and politics. Trouble is, the tossing together of science and politics (more accurately, political expression of psychological dread) in ways not generally acknowledged began well before I was born. What people find unacceptable is not the mixing of science and politics, but having that mixing pointed out to them as an omnipresent aspect of the scientific method employed throughout the history of science. Science has been a socializer of cognition par excellence. Question is: Is it intrinsically a socializer? In the present context of discussion it is interesting to note that in the English-speaking world, during the run-up to World War II, colloid chemistry was chased to ground and exterminated, while it continued to thrive in the German-speaking world. As Gilbert Ling points out in his monograph, the English-language attack on colloid chemistry in the period 1930 to '36 was an attack on the bound-water notion of protoplasm and a defense of the membrane-enclosed dilute solution of free-solutes notion of cell function. The chemists' metaphorical attack on national socialism and defense of freedom-loving democracy? The Journal of Colloidal Chemistry died with the war. Was this just pure happenstance? English-speaking chemists like Nobel laureate A. V. Hill were defending biological individualism with their membrane-centric theories, while German-speaking chemists, who kept their Kolloid-Beihefte journals intact, defended animistic identity transparency with their membrane-absent bound-water theories. Cellular participation mystique! The connection between scientific interpretation of data (separated from “noise”) and issues of psycho-socio-political ideology exhibited during this period may not have been fully conscious, but there is no believable denial of subliminal near-awareness.
One of the bibliographic annotations to MOON which has received considerable criticism is relevant here (Vol. 2, p. 751):

Birkhoff, G. and J. von Neumann. “The Logic of Quantum Mechanics”, Ann. of Mathematics, 37, 1936. (One truly must wonder at the extraordinary lengths the mind is willing to go in order to avoid looking the multivalue straight in the face. Here, it is recognized that quantum logic has some relation to projective geometry, but where is the Riemann surface stack? Though some doubt is cast on the utility of Hilbert space, where is the recognition that every point in the referencing phase-space is multivalued, that translation across a single-valued sheet is projected as a static lattice to the multivalued referencing function space? How strange that they should invoke the concept of a logical 'lattice' but not view it as a true point-set topology! Similarly, they use involutory relations and the concept of skew-fields, but the multivalue screaming in the background is completely ignored. Dropping distributive and/or commutative laws for two-valued propositions, indeed! Can laughter be suppressed? These missing recognitions are not ignorance speaking; they are expressions of psychological dread. And this paper, summarizing a decade [at least!] of collective psychoneurotic posturing, was published just three years before the first actions were taken in the inevitable avalanche of consequences!)

I would point out that a three-dimensional model of the multilayer polarization structure of intracellular water (as physical substrate of the non-decomposed m-logically-valued reference-space/frequency-domain for biological organism) will certainly require resurrecting Birkhoff's and von Neumann's leveraged “missing recognitions”. But that is not about to transpire in the political climate of the first decade of the 21st century, an atmospheric inversion in cognitive climes far more intense than that prevailing during the 1930s as catalyst-precipitator of World War II.


Given that the Cantorian universe is at the root of fractal nesting and Poincaré was no great friend to the Cantorian persuasion, I will quote your comment at length before going on to the Poincaré conjecture, which is highly relevant:

Here's a perspective on the existence/non-existence of cell membranes, taken from my studies of atmospheric dynamics. The reason the Americans and Russians took to testing nuclear devices in the stratosphere (following the ground-based tests in the rush to create these weapons that led to an understanding that the A- and H-bombs released prodigious amounts of radioactivity into the troposphere), was that the prevailing definition of the tropopause stated it was a dynamical barrier to the mixing of properties that had come to distinguish the stratosphere from the troposphere: e.g., high vs. low values of potential vorticity, the same for ozone and the reverse for H2O, and then, with the advent of testing, values of strontium-90 that quickly grew to orders of magnitude larger in the stratosphere vis-à-vis the troposphere.

Then came the unexpected increase of Sr-90 in precipitation falling to earth that led to Malvina Reynolds's song “What Have They Done to the Rain?”. Obviously, there was something more going on than the textbook explanation that chemical, dynamical, and other distinguishing properties of either regime were leached across the dynamical barrier by slow diffusion. The explanation lay with the folding and breaking of the tropopause boundary that came to distinguish large-scale cyclogenesis that was most intense during the meteorologically-active spring and fall months, driven by the hyper pole-equatorial temperature gradient when viewed on the hemispheric scale. Once frontogenesis became rooted through and into the stratosphere, the downward transport of air out of this stable regime could then be tapped by convective-scale processes representing precipitation production at its maximum efficiency. It was this very rapid exchange of properties that Ed Danielsen's Atomic Energy Commission grants focused upon, that proved slow diffusion was nothing compared to the two-way transport occasioned during each and every tropopause rupture. The statistically-significant jump in birth defects that fanned out from the New Mexico and Nevada proving grounds was no fluke, just as the “hot” rainwater and snowmelt was no fluke. The tropopause was actually a porous medium that exchanged potential vorticity, Sr-90, O3, and H2O on numerous occasions. The result was a bilateral stratospheric test ban treaty which, ironically, led the French and eventually the Chinese to test their devices on Pacific atolls or over the Gobi Desert, while the Americans and Russians commenced underground testing.

Back to cell membranes. We know that the acceleration which initiates a tropopause fold and then rupture owes its origins to quantum generation. In our view, the classical backdrop of static space and time is superseded by a process requiring fractal geometry to explicate. (This would be why F=ma is a subset of relativity, the resting state of quantum-relativistic physics: Newton's postulate can be derived from relativity theory, but not the other way around.) Does this mean that each cell is its own limited spacetime domain, the “membrane” in question remaining intact until the relativistic-quantum process erupts?

I'll have more to add about the holograms generated during the universal quantum process, where, as Roger Penrose proposed when conceiving of twistor theory, it would appear the quantum exchange of information parallels the need for spacetime dynamics, the appearance of static space decoupled from linear-time being as unreal as the flat earth or the illusion that the stars, the planets, and the Sun all orbit around the Earth. We may have had it all wrong, insofar as we thought quantum-relativity was fundamental to understanding the subatomic and atomic worlds of matter, ditto astrophysics and cosmology, leaving single-valued space and time intact in the molar scales which include atmospheric and all of the Earth sciences.

One observation which could be made concerning what you say about the tropopause (phase) boundary and atmospheric cascade dynamics is that prevailing orientations in the mathematical field of topology have little real application to actual physics, regardless of how much contemporary physicists might like to think so. Against the rules of homotopy theory, nature pokes holes in surfaces, cuts, twists, pastes, forms singularities. Indeed, that's what quantum-relativity physics is actually all about, no matter how much the community of official physicists might not like that fact in their search for structural invariants (as opposed to, say, search for functional invariants). Singularities, for instance, periodically form in equivalent potential temperature surfaces, their appearance diagnosing onset of complex angular momentum cascade dynamics. That's infinite temperature relative to defining absolute limiting values of the variables establishing identity-transparent (meaning in quantum relative-state) atmospheric limited spacetime domain “cellular laminations”.

In order to illustrate the contrary orientation (what one might easily regard as the contemporary physicist's physics analogue of a global monoculture predilection), as exemplified in prevailing topology and physics and growing naturally out of Poincaré's attitude toward Cantorian mathematics, I quote from the recent Scientific American article (“The Shapes of Space” by Graham Collins, July, 2004) on the probable proof of the Poincaré conjecture by Russian mathematician Grigori Perelman (p. 76):

How might we try to geometrize a manifold -- that is, give it a uniform curvature throughout? One way is to start with some arbitrary geometry, perhaps like an eggshell shape with various lumps and indentations, and then smooth out all the irregularities. Hamilton began such a program in the early 1990s, using an equation called the Ricci flow (named after mathematician Gregorio Ricci-Curbastro), which has some similarities to the equation that governs the flow of heat. In a body with hot and cold spots, heat naturally flows from the warmer regions to the cooler ones, until the temperature is uniform everywhere. The Ricci flow equation has a similar effect on curvature, morphing a manifold to even out all the bumps and hollows. If you began with an egg, it would gradually become perfectly spherical.

Of course, in a nature not “built up of”, not “recursively emergent”, but “involutionally decomposed from”, the Platonic “perfect sphere” would have logical and ontological precedence over lumps, indentations, and other irregularities (L2I's) -- rather than the L2I's having temporal precedence over the perfect shapes of space. Continuing the quotation:

Hamilton's analysis ran into a stumbling block: in certain situations the Ricci flow would cause a manifold to pinch down to a point. (This is one way that the Ricci flow differs from heat flow. The places that are pinched are like points that manage to acquire infinite temperature.) One example was when the manifold had a dumbbell shape, like two spheres connected by a thin neck. The spheres would grow, in effect drawing material from the neck, which would taper to a point in the middle… When the manifold is pinched in this way, it is called singular… a way around this stumbling block had to wait for Perelman.

You are saying that nature didn't have to wait for Perelman, that acoustically-modified gravity-wave modes performed the surgery long before Perelman became a doctor. Or, maybe, no surgery was needed; what was needed was nature's very own singularity, however schizophrenogenic that might be for a single-valued mindset living in an m-logically-valued universe. Equivalent potential temperature surfaces periodically “pinch down” in just the way described. In cascade modeling of tornado genesis, however, there is no attempt to conceal the singularity; appearance of the singularity becomes an initializer of complex angular momentum cascade. Continuing the account (p. 77):

In his paper, Perelman added a new term to the Ricci flow equation. The modified equation did not eliminate the troubles with singularities, but it enabled Perelman to carry the analysis much further. With the dumbbell singularities he showed that “surgery” could be performed: snip the thin tube [my observation: somehow it has been consensually decided this does not violate the usual topological rule against cutting, pasting, or poking holes when, by-stretching/compacting-alone, one establishes the homotopy of a given manifold of whatever genus; Collins' article does not explain this variance from the usual] on each side of the incipient pinch and seal off the open tube on each dumbbell ball with a spherical cap. Then the Ricci flow could be continued with the surgically altered manifold until the next pinch, for which the same procedure could be applied…

Note how this is a partial analogue for the “pinch off” of a spatial dimension (as the special-relativity limiting velocity for the given limited spacetime domain is approached) and twist into imaginary dimension (i.e., temporal operation on space as linear-time stops at this limiting velocity) under complex angular momentum exchange in tornado genesis -- except, what Perelman is doing is concealing the process transpiring in nature, as will later become more apparent. Continuing the quote (same page):

When the Ricci flow and the surgery are applied to all possible 3-manifolds, [me again: a 3-sphere, for instance, is not like a ball in 3-space, but like a hyper-ball in 4-space, the kind of “ball” temporal curl creates] any manifold that is as “simple” as a 3-sphere (technically, that has the same homotopy as a 3-sphere) necessarily ends up with the same uniform geometry as a 3-sphere. The result means that topologically, the manifold in question is a 3-sphere. Rephrasing that, the 3-sphere is unique.

Beyond proving Poincaré's conjecture, Perelman's research is important for the innovative techniques of analysis it has introduced… The Ricci flow used by Hamilton and Perelman is related to something called the renormalization group, which specifies how interactions change in strength depending on the energy of collision…

Increasing the collision energy is equivalent to studying the force at a shorter distance scale. The renormalization group is therefore like a microscope with a magnification that can be turned up or down to examine a process at finer or coarser detail [while, I would observe, unlike cascade theory, concealing the laminated cellular properties of spacetime and the hyper-quantum-relativistic dynamics those in-relative-state identity-transparent laminations invoke, including horrifying removal of all possible classical limits]. Similarly, the Ricci flow is like a microscope for looking at a manifold at a chosen magnification…

I would note that changing magnification, i.e., magnitude of nested-grid space and frequency-window time steps, was one of the generative ideas for the cascade theory of tornado genesis. As far as contemporary physicists are concerned, however, space is superspaced, made “cellular”, only at the Planck length (that length being a single-valued universal constant). Planck's length is not m-valued, let alone m-logically-valued (though contemporary string theory contemplates it perhaps being a “field”), and there are no identity-transparent cellular laminations to spacetime except in the limit of the most small -- well out of the realms that could possibly have implications for human social, political, or economic life. The good doctors' renormalization surgeries see to that. The cascade theory of tornado genesis ran into the same cognitive inversion in atmospheric science Ling's association-induction hypothesis ran into in cell biology. No “in principle” difference between the two cases.
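The heat-flow analogy for the Ricci flow quoted above from Collins' article can be put in concrete terms with a toy numerical sketch of my own (an illustration of the analogy only, not of the Ricci flow equation itself): a simple diffusion step applied to a ring of temperatures evens out hot and cold spots, just as the Ricci flow is said to even out a manifold's bumps and hollows.

```python
# Toy illustration of the heat-flow analogy: diffusion on a periodic
# 1D grid smooths "lumps and indentations" the way the quoted passage
# says the Ricci flow smooths curvature. (My own sketch, not Hamilton's
# or Perelman's mathematics.)
def heat_flow_step(u, alpha=0.1):
    """One explicit finite-difference step of du/dt = alpha * d2u/dx2
    on a periodic grid (dx = dt = 1; alpha <= 0.5 keeps it stable)."""
    n = len(u)
    return [u[i] + alpha * (u[(i - 1) % n] - 2 * u[i] + u[(i + 1) % n])
            for i in range(n)]

def spread(u):
    """Max minus min: a crude measure of how 'bumpy' the profile is."""
    return max(u) - min(u)

# An "eggshell with lumps": a ring of temperatures with a hot and a cold spot.
profile = [0.0] * 32
profile[5], profile[20] = 10.0, -6.0

history = [spread(profile)]
for _ in range(200):
    profile = heat_flow_step(profile)
    history.append(spread(profile))

# Because each new value is a convex combination of its neighbors,
# the spread can never increase: bumps and hollows even out.
assert all(a >= b for a, b in zip(history, history[1:]))
print(f"initial spread {history[0]:.2f} -> final spread {history[-1]:.4f}")
```

Note that nothing in this smooth picture "pinches down to a point"; the singular behavior Hamilton ran into is precisely what a plain diffusion analogy cannot capture, which is the point at issue in the passage.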


Thanks for the article (“If you fund it, they will come”, by Maggie Fox, Reuters, October 24, 2004) on the research agenda of the Howard Hughes Medical Institute (HHMI), but I can assure you that they would not be interested in our ideas. They are dumping hundreds of millions, eventually billions, of dollars into shoring up the failed paradigm in cell (as opposed, for example, to sub-molecular, sub-atomic, and sub-spacetime) biology for all the obvious (if to them subliminal) politically-motivated reasons. To quote Ms. Fox's account of the problem definitions being set for HHMI's Ashburn, Virginia research campus:

While biologists have a rough idea of what goes on in a cell, current scans all record the action indirectly, by measuring glucose uptake, for instance.

What if you could take a picture of a brain cell at the very moment it recorded a thought?

They are dead set on visually seeing sight, so to speak, as a way of demonstrating to themselves validity of what they have come to doubt has validity. Superconductivity-conversioned photo-acoustic/electromagnetic-gravitational processes subject to Heisenberg (whose indeterminacy hides the m-logically-valued candle placed under a basket by probabilistic interpretation) have to be non-interventionally “seen”, according to HHMI's aspirations, so that a discrete cell in non-quantum-relative-state can be unequivocally demonstrated, for all the world to “see”, as engaging in individualistic 1T2-logical ratiocination. Can you imagine 11 billion dollars spawned by an idiosyncrat like Howard Hughes being placed on some other sort of politically-leveraged bet? If he who funds chooses the “it” to be funded, the intelligent “they” will not come. Better no “it” gets done than the wrong “it”.


Thank you very much for finding and sending me the reference to the paper by Karl Simanonok (light.simanonok.com). This is an excellent and thought provoking article about which I can make a number of observations. First of all, I was very stimulated by Simanonok's discussion of the role of collagen in intercellular communications and morphogenesis, and its connection with tumor formation when intercellular collagen connections break down. I also noted that he references a paper on the LE (lupus erythematosus) cell in his bibliographic list. My thoughts about DNA, which along with those of Doug Paine eventually evolved into a mathematical description of the genetic molecule as possessing a superconductant core of pi-electron parcels which generate coherent waves in response to impinging radiation, took a major leap into detail when I began reading into the literature on systemic lupus erythematosus (SLE), an autoimmune collagen disease. I began reading about this disease in 1972 when a close friend of mine came down with it. Fulmination periods are often triggered by UV radiation exposure, and it is recommended that the patient stay out of direct sunlight. Russian research in the 50s and 60s demonstrated that SLE outbreaks are highly correlated with the sunspot cycle. There is also an inherited component to etiology of the disease. Years later, it became apparent that this friend lived on the other side of a reservoir in northern Virginia from a military camp heavily involved in development and testing of anti-personnel beam weapons. This was announced in the local papers after the camp was shut down and turned into an ecological park. Obviously, I have no way of knowing whether or not this was actually a factor in onset of my friend's case of SLE. These weapons were also, at the time, being tested by the Navy in the Chesapeake Bay, as reported by the Washington Post when local citizen groups protested. 
I read these articles when published and my thought was influenced by them. The notion of inherited frequency anomalies (as miasms that can be triggered by non-ionizing radiation exposure) grew in context of contemplating all this. My first letter to Wolfgang Luthe, in 1972, which eventually led to the invitation to present a paper to the Autogenic Therapy symposium in Kyoto and subsequently to actual writing of our superconductant DNA paper, was about autogenic brain discharges as adjunctive therapy in the treatment of SLE, the idea being that these brain discharges could be instrumental in normalizing altered biologically-active intra-organismic frequency parameters of intra- and inter-cellular radiation exchange processes. It just so happened that unpublished clinical research being conducted by T. Abe at Tokyo University Medical Center at the time was independently confirming that such brain discharges do appear to help improve the condition of SLE patients -- hence explanation of the source of Luthe's interest in my ideas. As SLE develops, autoantibodies appear (antibodies to self-antigens). This is not specific to SLE. Later, anti-DNA antibodies appear. This is also not specific to SLE. Only when cell morphology changes and the characteristic LE cell appears (a modified collagen cell) can a firm diagnosis of SLE be made. This march -- given the role UV plays in pathogenesis of the disease -- is suggestive of consequences of prolonged or repeated exposure to critical parameters of non-ionizing environmental radiation. The ideas Doug Paine and I developed about the correlated frequency aspects of immune signifiers and biological clocks emerged in context of contemplating these matters. 
The ideas I expressed to you a decade ago about radiation triggering inherited frequency anomalies, thus allowing HIV initially to get through human anti-viral immunity and establish itself in the human organism sufficiently for AIDS later to propagate by the currently attributed means of transmission, arose in this context of thought. They were first stated in conversation, as depicted in MOON, at the Kyoto medical congress in 1977 relative to Epstein-Barr virus and mononucleosis. Were frequency anomalies inheritable as miasms, this may be part of an explanation as to why geographically-removed but genetically-related population groups, such as those in Central Africa and the Caribbean, were amongst the first to manifest a high incidence of AIDS in early stages of its epidemiologic evolution.

For me, the most provocative statement in Simanonok's paper is:

In light's timeless frame of reference, a volume of our time-bounded spacetime containing dynamic holographic patterns of endogenous light within a brain overlaps all the other light that ever did or ever will cross that volume. That volume of spacetime is the volume from which the Bleb of God is formed…

I can easily view this formulation as an independent re-statement of our notion of a Limited Spacetime Domain (i.e., a “Bleb of God”) as explicated in our papers of 1977 and 1980, “Toward a General Theory of Process” and “Some Preliminary Considerations Toward Development of a Mathematical Model of the Autogenic Brain Discharge as Spontaneous Localization in Quantum Measurement”. Times sure have changed! I can assure you that use of terminology like “Bleb of God” would have precluded conference presentation and publication in the late-70s, as our use of terms like “Limited Spacetime Domain” and “temporal curl” was then sufficient to divert interest.

All along I have been trying to follow Roger Penrose's involvement in the theory of microtubules of cytoskeleton (developed on analogy with fiber optics) and have been a bit mystified as to how the premier theoretician of quantum-gravity has resolutely kept the involved cytoskeletal concepts well away from General Relativity. The above-given formulation from Simanonok, who embraces Penrose's orientation, is a case in point: the involved notion of timeless light is specific to Special Relativity. According to Special Relativity, time, relative to the reference frame of the moving “object”, stops as that “object” attains the absolute limiting velocity, the speed of light. Also according to Special Relativity, length of the “object” in the direction of movement of the “object” as it approaches the speed of light contracts until, at the absolute limiting velocity, that “object”-length becomes measure zero. Under General Relativity, however, the spacetime the “object” traverses does not exist independent of the “object” traversing it. The traversing “object” is a highly topologically convoluted curvature configuration of the empty spacetime being traversed. Therefore, in terms of General Relativity, when “object”-length in the direction of movement reaches measure zero, from perspective of the reference frame of the “object” traversing at the speed of light, the dimensions required for measurement in the direction of movement of the spacetime continuum through which the “object” is traversing also reach whole integer measure zero: in other words, not only is the time dimension lost, this loss of time involves a fractalization of the space. Spacetime itself is topologically transformed by “object”-movement at the absolute limiting velocity. The “object” in question here is a quantum-“object”; which is to say an “object” that “exists” at quantum length-scales, length-scales not existing in whole integer dimensions in the direction of movement at the speed of light.
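The Special Relativity behavior invoked above, time stopping and length in the direction of motion contracting to zero at the limiting velocity, is just the standard Lorentz factor taken to its limit. A minimal numerical sketch (textbook formulas only; nothing here is drawn from our papers' extended formalism):

```python
import math

def lorentz_gamma(v, c=1.0):
    """Standard Lorentz factor: gamma = 1 / sqrt(1 - v^2/c^2)."""
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

def contracted_length(rest_length, v, c=1.0):
    """Length measured in the direction of motion: L = L0 / gamma."""
    return rest_length / lorentz_gamma(v, c)

def dilated_interval(proper_time, v, c=1.0):
    """Coordinate time elapsed per unit of proper time: t = gamma * tau."""
    return proper_time * lorentz_gamma(v, c)

# As v -> c, measured length in the direction of motion -> 0, and the
# coordinate time corresponding to one unit of proper time -> infinity
# ("time stops" in the moving frame, relative to the rest frame).
for v in (0.5, 0.9, 0.99, 0.999999):
    print(f"v = {v}c: L = {contracted_length(1.0, v):.6f}, "
          f"t per unit proper time = {dilated_interval(1.0, v):.1f}")
```

At the limit v = c itself the formulas are singular (division by zero), which is exactly where the passage hands the question off from Special to General Relativity.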

Where do those length-scales in the direction of movement go when they disappear at the speed of light? But actually this is a backwards question, a question addressed from the perspective of Special Relativity to General Relativity. Given that Special Relativity is a special case of General Relativity, the correct question would be addressed from the perspective of General Relativity to Special Relativity. Where do the whole integer dimensions required for measurement in the direction of movement come from when subluminal velocities are obtained? Absent whole integer dimensions, measurement is on a Koch curve, so, not only would there be Heisenberg uncertainty, but also the more accurately any length is measured the more it ceases to exist: it evaporates into the holes between the Cantorian dust forming the Koch curve. The whole integer dimensions unfold from David Bohm's enfolded implicate order. This unfolding of the enfolded, however, cannot be adequately accounted for simply on the basis of those topological transforms occurring at the absolute limiting velocity, the speed of light. In the “Toward a General Theory of Process” paper, we describe topological transforms occurring at the absolute limiting acceleration and also at the absolute limiting time rate of change of acceleration. When these three absolute limits are “simultaneously” met in the reference frame of a traversing quantum-“object” -- not, that is, between reference frames -- a Limited Spacetime Domain, i.e., a Bleb of God, comes into being, a spacetime domain through which objects of measure non-zero may traverse. By virtue of the General Relativity equivalence of acceleration and gravitation (gravitation in General Relativity being curvature of spacetime geometry), the absolute limiting acceleration and the absolute limiting time rate of change of acceleration must correspond to two distinct types of topological operation. 
We argue, speaking from General Relativity to Special Relativity, that the absolute limiting time rate of change of acceleration corresponds to the unfolding of the possibility of change from non-orientability to orientability, that the absolute limiting acceleration corresponds to the unfolding of the possibility of change of connectivity; and that the absolute limiting velocity corresponds to the unfolding of the possibility of appearance of whole integer dimensions over a laminated, Cantorian, fractal spacetime.
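The fractal geometry invoked above can be pinned down with standard numbers (a routine computation, not specific to the paper under discussion). The Koch curve and the Cantor dust have the similarity dimensions

```latex
D_{\mathrm{Koch}} = \frac{\log 4}{\log 3} \approx 1.262, \qquad
D_{\mathrm{Cantor}} = \frac{\log 2}{\log 3} \approx 0.631,
```

and the Cantor set's one-dimensional measure vanishes, since the total length of its covering intervals, (2/3)^n, goes to zero as n grows. That is the precise sense in which a length measured ever more finely on such a support “evaporates into the holes”.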

Now this formulation is rather confusing, because we ostensibly have topological operations being performed upon the geometry of a ponderable spacetime continuum “before” such a continuum exists. Only once subluminal velocities are obtained does time exist. Where does time come from in this scheme? Indeed, where does a ponderable spacetime come from? Since the traversing “object” is a quantum-“object”, our attention is directed to the Schrödinger wave equation. The Schrödinger wave equation is a linear equation; it therefore cannot synoptically describe “objects” moving at the absolute limiting acceleration (a nonlinear variable) and the absolute limiting time rate of change of acceleration (a third-order variable). We therefore add two non-linear orders to the Schrödinger wave equation so as to make such synoptic description possible. But this still does not bring us any closer to an answer to the question: “Where does a ponderable spacetime come from?” In this scheme, operations on the geometry of the spacetime continuum ostensibly exist “before” that spacetime exists. What could the meaning of this “before” possibly be? Where in the Schrödinger wave equation might we find some guidance? The time evolution group? No. That concerns “after” spacetime exists. What about the wave-function itself (which now involves nested third, second, and first order variables)?
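For orientation, the unmodified Schrödinger equation referred to here is the linear equation (the explicit form of the two added nonlinear orders is not reproduced, since the source does not state it):

```latex
i\hbar\,\frac{\partial \psi}{\partial t}
  = -\frac{\hbar^{2}}{2m}\,\nabla^{2}\psi + V\psi
```

Linearity means that any superposition a psi_1 + b psi_2 of solutions is again a solution, and nothing in the equation involves acceleration or its time rate of change; that is the gap the added orders are meant to fill.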

Is there something about these variables as they appear in the Schrödinger wave-function that is different from these same variables as they appear in Newton's laws of motion? Yes, very definitely. In Newton's laws of motion, these variables are single-valued; in Schrödinger's wave-function, they are multi-valued. Can the multiple values of these dynamical variables as they appear in the Schrödinger wave-function tell us something about the aforementioned “before” we are interested in the meaning of? Is there some relationship between the multiple values of the dynamical variables and the topological operations that occur at absolute limiting values of these same variables? Of course there is.

The “before” we are after the understanding of is related to getting classes of specificity out of non-specificity. How do you get classes of specificity out of non-specificity? Multi-valued wave-functions simplify the involved identity-tag business very greatly. Systemic integration has to do with the general properties of systems, which are non-specific and can be represented by invariants of permissible transformations (transformation is the geometrical way of saying algebraic function). The nest of non-specific invariants prerequisite to systemic integration in a given class of systems can be physically embodied as a class of superposed frequencies. By decomposition, the frequency-tagged structures participating in the given class can be identified. Therefore, again by decomposition, those frequency-tagged structures not participating in the given class can also be identified. In order to accomplish this, a Fourier-transform operator on a frequency space is required -- an operator with a composite wave-function embodying invariants prerequisite to systemic integration in a wide class of systems. But function is an algebraic way of saying geometrical transformation! So the given topological transform may have a wave-functional analogue physically embodied as a frequency tag. And likewise for the total transformational prerequisites of systemic integration in a given class of systems. The invariants of the frequency space itself define the regime by which specific processes are integrated spatially and temporally! Were this the case, the overall structure of the total value array of a universal wave-function would be the set-theoretical equivalent of the algebraic function descriptive of the invariants of permissible transformations under General Relativity -- in the present scheme, a General Relativity with three orders of permissible transformations.
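The decomposition step can be illustrated with an ordinary Fourier transform (a deliberately toy sketch; the frequency tags and structure names below are invented for illustration, not taken from the source):

```python
import numpy as np

# Hypothetical frequency tags (Hz) assigned to structures in a class of systems
tags = {"structure_A": 5.0, "structure_B": 13.0, "structure_C": 29.0}

fs = 200.0                       # sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)  # two seconds of signal
# Superpose the tags of the participating structures only (A and C here)
signal = np.sin(2 * np.pi * tags["structure_A"] * t) \
       + np.sin(2 * np.pi * tags["structure_C"] * t)

# Decompose, then test each tag for presence of a spectral peak
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
threshold = 0.5 * spectrum.max()
present = {name for name, f in tags.items()
           if spectrum[np.argmin(np.abs(freqs - f))] > threshold}
absent = set(tags) - present
print(sorted(present))  # ['structure_A', 'structure_C']
print(sorted(absent))   # ['structure_B']
```

Participation and non-participation both fall out of the same decomposition, which is the point the paragraph makes.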

But this still does not tell us much about the meaning of our “before”. What is the fundamental nature of the abovementioned Fourier-transform operator, which in the present case involves three nested orders of operation? Consider how one might go about structuring the full set of values associated with the involved universal wave-function. The most straightforward way to structure the multiple values of a multi-valued function is by placing those values on a multi-sheeted Riemann surface and evaluating the manifest numerical relation-structures under multi-valued logics -- which appears to be tantamount to what Alexander Karpenko has recently done at Moscow State University in discovering functional relationships between m-valued logics and prime numbers. Superstring theory and loop quantum gravity obviously have to do with the geometrical-transformation correlate of this set-theoretical and algebraic-function representation. In this scheme, the fundamental nature of the Fourier-transform operator in question would be that of a logical operator. And since time unfolds from the enfolded implicate domain under the three orders of this logical operator, the “before” we are after the meaning of is a matter of logical precedence, not temporal precedence. Nonlinear precursors (as operator-time) to linear-time unfold whole integer dimensional space by means of imposing operational logics. This is what John A. Wheeler once called the “pregeometry”. Direct awareness of timelessness involves placing the consciousness in the pregeometry by assimilating the involved m-logically-valued operational logics. This assimilation undoubtedly minimally requires brain-directed autogenic discharges and the associated processes Simanonok describes in his paper.
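As the minimal concrete case of placing the multiple values of an m-valued function on separate sheets (the m-valued-logic machinery itself is not sketched here), the three sheet-indexed values of the complex cube root:

```python
import cmath
import math

def nth_roots(z, n):
    """Return the n values of z**(1/n); the index k labels the Riemann sheet."""
    r = abs(z) ** (1.0 / n)
    theta = cmath.phase(z)
    return [r * cmath.exp(1j * (theta + 2 * math.pi * k) / n) for k in range(n)]

roots = nth_roots(8 + 0j, 3)
# Cubing any of them recovers z = 8: the single-valued direction forgets the sheet
assert all(abs(w ** 3 - 8) < 1e-9 for w in roots)
print([(round(w.real, 3), round(w.imag, 3)) for w in roots])
# [(2.0, 0.0), (-1.0, 1.732), (-1.0, -1.732)]
```

The single-valued inverse collapses the sheet index; a multi-sheeted Riemann surface is exactly the device that keeps all n values in play at once.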

There are many ways to go with this overall conception, and much detail that could be added, but I think this gives some impression of how the perspective Simanonok elaborates differs from my own.


I note from Samantha Power's recent review (“A Hero of Our Time”, The New York Review of Books, November 18, 2004) of Romeo Dallaire's soon-to-appear book (Shake Hands with the Devil: The Failure of Humanity in Rwanda) that he has yet to run across the body of literature associated with autogenic brain discharges. It is unfortunate that Wolfgang Luthe no longer practices Autogenic Therapy in Montreal.


Enjoyed your reflections on Simanonok. There definitely do appear to be multiple synchronicities about all this. I hypothesize that there is a universal natural “language” of light and sound, sensible and insensible to normal percept and propriocept, embodied in abstract forms, a component of which is the basis of the neural code. If this is so, an aspect of doing with archetypes what you propose would involve finding an art-science way to “grow” this language in consciousness through use. Technological aids may facilitate this process.

Perhaps it would be of value to point out that Simanonok's ideas about the cilia lining the ventricles and about the ventricles being resonant cavities for light are a bit of a reincarnation of an idea Itzhak Bentov came up with in the late-70s. Bentov died very prematurely in an airplane crash at O'Hare International. He was a medical instruments inventor and technician who wrote a popular book entitled Stalking the Wild Pendulum. One of the ideas this book contained was that the ventricles are resonant acoustic cavities and that the various resonant states attained therein are involved with what is called Kundalini. I don't remember details, but the march of Kundalini's “spiritual phenomenologies” was associated by Bentov with the spread of induced beat frequencies across the cortex. What is most interesting to me is that the two ideas -- those of Simanonok and Bentov -- are similar: one based on light; the other, on sound.

Back in the 70s, I noticed a correspondence between the physical structure of the then recently developed photoacoustic spectrometer and the processes described in our superconductant DNA model. A very important point not often made, but there to be discovered in the scientific literature: the most-biologically-active modes of vibration thus far experimentally demonstrated are known to be, not electrical or optical, but acoustic (which is exactly what our 1979 superconductant DNA model describes). The acoustic modes spoken of in this model are not the usual phonon-generating stereochemical structural DNA modes, disruptions of which constitute steric hindrances: torsional, compressional, and two transverse bending motions. In our model, it is as if the free electrons in the plasma gas core about which the helices wrap themselves are sealed in a “plastic bag” (it is somewhat useful, but only somewhat, to draw a comparison with the structure of the photo-acoustic spectrometer, with its “plastic bag” component). Ambient electromagnetic radiation impinges upon the “plastic bag” and the interior is heated in rhythmic pulses, like in a greenhouse over a period of days and nights, only here the heating-cooling frequency is much faster. With the energy input from the radiation, free electrons in the ionized plasma gas begin to form parcels which oscillate in temperature (not in space, an idea that was a major issue during the peer review process for the paper), causing them to expand and contract. The trains of temperature oscillations and expansions-contractions rapidly get in-step, like troops of goose-stepping Nazi soldiers. The synoptic energy input and the frictional heat dissipation are brought into perfect synchrony with each other (this being the necessary and sufficient condition for superconductivity). 
The “goose-stepping” on a “rope bridge” leads to formation of solitary pressure waves, solitons, where one part of the wave is identity-transparent with another (properly described, I believe, only with m-valued logics), so that the solitons are self-driven, self-organizing, self-sustaining, and persist indefinitely (i.e., are superconductant). The superconductance is in the gas core, not along the backbones or steps of the molecule's “rope bridge”. Electrons are not moving from place to place, only the messenger particles of the electromagnetic field, i.e., the trapped photons moving within the “plastic bag” containing the plasma gas core (an elastic plasma bottle like that sought for nuclear fusion, hot or cold). These pressure-wave solitons rhythmically beat on the “plastic bag” from the inside causing it to vibrate (almost like the pulsations coursing along the walls of a tornadic vortex). This vibration sets in motion through the structured intracellular water and cytoskeleton microtubules a coherent (i.e., in-step) acoustic wave (like in a LASER beam -- light amplification by stimulated emission of radiation) emanating in pulse-code fashion (carrying by quadripolar wave-component modulation the complete genetic message) from the molecule to locations where it interacts with many cellular structures. In this case, however, what is amplified is not light, but sound. Sonic LASERs, sonic holograms (Musculpt). However, this process is actually much more complex than that, because the waves are not merely normal sound; they are, we believe, acoustically-modified gravity-wave modes that radiate well beyond the cell membrane, amplifying with height. This is suggested by certain terms in the equations describing the process, but what is actually implied nobody presently knows (or so it would appear). 
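The defining property claimed for these solitons, that they propagate self-sustainingly without change of shape, can be checked against the simplest textbook case, the one-soliton solution of the KdV equation (offered only as an analogue; the pressure-wave solitons of the DNA model are not claimed to obey KdV):

```python
import numpy as np

def kdv_soliton(x, t, c=4.0):
    """One-soliton solution of the KdV equation u_t + 6 u u_x + u_xxx = 0:
    a sech^2 pulse travelling at speed c with amplitude c/2."""
    return (c / 2.0) / np.cosh(0.5 * np.sqrt(c) * (x - c * t)) ** 2

x = np.linspace(-20.0, 20.0, 2001)   # grid spacing 0.02
u0 = kdv_soliton(x, t=0.0)
u1 = kdv_soliton(x, t=1.0)           # one time unit later

# The later profile is the earlier one translated by c*t = 4, shape unchanged
shift = int(round(4.0 / (x[1] - x[0])))
assert np.allclose(u0[:-shift], u1[shift:], atol=1e-12)
print("shape preserved; peak amplitude:", u0.max())  # 2.0
```

The nonlinear steepening and the dispersive spreading cancel exactly, which is the sense in which such a wave is self-driven and persists indefinitely.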
Some energy medicine machines are based on the thesis that the DNA molecule radiates photons, light, the messenger particles of the electromagnetic field -- not acoustic waves, let alone acoustically-modified gravity-wave modes. Some people have apparently taken photographs of this optical radiation using photomultipliers, photos which adorn booklets and books. From our perspective on the microphysics of superconductant DNA, it is hard to know what to make of these photos. So, just in this one case -- DNA's vibrational interactions -- there is involved light, sound, gravity (short-range?), electromagnetic properties, compression waves, pressure waves, temperature oscillations, damping, amplification, threshold behaviors, resonance, critical Curie states, and on and on -- variations in any of the “involved” factors having significant biological effects relative to clocks, immunity, recording of chronic stress, autogenic brain discharge of these stress recordings, and the full range of factors involved therewith.

Let's look at this “plastic bag” issue. The “bag”, in the DNA case, “surrounds” the superconductant free-electron gas core and is actually, to our way of thinking, a temporal-operator, temporal-curl and fractal-dimension, fractal-entrapment discontinuity induced by relativistic velocities, accelerations, and time rates of change of acceleration -- not actually the holy grail of a leak-proof “bag” or “bottle”. Were this “plastic bag” a fractal boundary with fractal entrapment capabilities, it would be not so much a dimensional “trap” or “sink” as a dimensional analogue of the enveloping helical “rope bridge” (the nucleotide-pair “steps” functioning something like a screen grid in a beam power tube relative to the messenger-particle photons of the free-electron parcel gas, or like the way electrons spontaneously polarize their spin states when hitting voltage-strained sections of a semiconductor); the fractal dimensions of this “rope bridge plastic bag” boundary would act somewhat like the dynodes of a photomultiplier tube, which eject electrons when struck by electrons ejected from a photocathode: transpositions of similar principles occurring at molecular and submolecular scales of motion. But the superconductant DNA model implies that the coherent waves radiated by the molecule are acoustically-modified gravity-wave modes. It is interesting to note here that pulses of laser light have been used to align orientations of electrons. Recently a Berkeley physicist, Raymond Chiao, in a paper entitled “Superconductors as quantum transducers and antennas for gravitational and electromagnetic radiation”, has provided a theoretical description of how a superconductor can transduce electromagnetic radiation to gravitational waves and vice versa. I would note here, in relation to archetypes, that Jung's and Pauli's formulations about the structure of the collective unconscious were viewed by them as being analogous to Einstein's General Theory, his theory of gravitation.

So much of currently evolving technology seems relevant. If the sugar-phosphate helices are analogous to superheterodyne antennas, there is a frequency funnel that beats down the radiation impinging upon the DNA molecule, beats it down to the resonant frequency of the molecule. This beating-down process may induce something like the Feshbach resonances recently used to create a Fermion condensate (identity transparency between Fermions, as opposed to that between Bosons). Such processes may be involved in formation of the pi-electron parcels of the free-electron gas core of the molecule. Bose-Einstein condensates and Fermion condensates have been created in the lab only at temperatures very near absolute zero. Creation of these condensates involves manipulating the quantum spin of the involved atoms, separating out atoms of different spin and energy states, and thus progressively lowering the temperature until condensation transpires. Quantum spin, which relates to the atom's behavior in a magnetic field, is not rotation in the normal sense; the atom does not rotate in a continuous fashion as would be required for analogy with a spinning top. The quantum spin state relates to the angular measure of magnetic pole tilt and precession: the axis precesses in a circle under magnetic force. Changes in tilt occur by quantum jump, not continuously; thus, they are numbered. Remove the magnetic field and the tilt angle remains intact. New magnetic RAM chips are based on these principles. Quantum processors, as currently envisioned, will use these differing tilt angles as phase digits in computing; the more precisely the phase angles can be measured, the more information density is available. The recent Scientific American (September 2004) issue devoted to Einstein has some good stuff on this in it.
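The “beats down” language is ordinary superheterodyne mixing: multiplying two oscillations yields their sum and difference frequencies, so a high input frequency can be funneled down to a low beat frequency. A sketch with invented numbers:

```python
import numpy as np

fs = 10_000.0                      # sampling rate, Hz
t = np.arange(0.0, 1.0, 1.0 / fs)
f_in, f_local = 1100.0, 1000.0     # illustrative input and local-oscillator frequencies

# Heterodyne mixing: sin(a)*sin(b) = [cos(a-b) - cos(a+b)] / 2
mixed = np.sin(2 * np.pi * f_in * t) * np.sin(2 * np.pi * f_local * t)

spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(t), 1.0 / fs)
peaks = sorted(float(f) for f in freqs[np.argsort(spectrum)[-2:]])
print(peaks)  # [100.0, 2100.0] -- the difference (beat) and sum frequencies
```

A filter keeping only the difference component completes the “funnel” from 1100 Hz down to 100 Hz.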

Condensates are being used to create gravitational detectors, called atom interferometers, which measure acceleration and rotation. If the DNA molecule radiates acoustically-modified gravity-wave modes, biological atom interferometers may exist in the cell. Standing waves of laser light act like a grating to diffract atoms of the condensate. Originally, the wavelength and phase of atoms in the condensate are the same; with accelerated movement along different paths they diverge. The interferometer measures induced divergence of acceleration (i.e., gravity) via wavelength and phase change of the diffracted atoms.
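For scale, the phase shift read out by a standard light-pulse atom interferometer in a uniform gravitational field is (a textbook relation, not from the source under discussion):

```latex
\Delta\varphi = k_{\mathrm{eff}}\, g\, T^{2}
```

where k_eff is the effective wave number of the laser grating and T the interval between pulses; the quadratic dependence on T is what makes these devices such sensitive accelerometers.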

One big issue here is temperature. It is well to remember that up until Einstein's 1905 paper on Brownian motion, the notion that heat is random motion of molecules was controversial. WHAT IS HEAT AT SUBMOLECULAR SCALES OF MOTION? In our superconductant DNA paper, we proposed that free electrons in the ionized plasma gas begin to form parcels which oscillate in temperature (not in space), causing them to expand and contract. This idea was a major issue during the peer review process for the paper. But times have changed! These days the notion of submolecular temperature doesn't cause an eyelash to bat. One place to see this is in the field of radar auroras. I quote from “Radar Aurora”:

Type 4 echoes are relatively rare, short lived (from several seconds to several minutes) and variable, and are observed during strongly driven conditions of ion acoustic wave generation (e.g., Haldoupis et al., 1991, where the main properties of these echoes are discussed together with the shortcomings of the present theories). They are related to electron temperature enhancements around an altitude of 110 km. These enhancements are not connected with auroral particle precipitation, since they are much too large (e.g., from 300 K to 1500 K), and because no correlation with electron densities can be found (Wickwar et al., 1981). Heat conduction from above can be eliminated just from the fact that Te maximizes in the middle of the E-region, and since Ti < Te, the ion population cannot heat the electrons. The correlation with higher-altitude ion temperature indicates a relationship with Joule heating. This relationship must be indirect, since the Joule heating of electrons is negligibly small. One possibility is some kind of plasma instability driven by strong convection electric field, like vd = ve-vi dependent modified two-stream (MTSI, Farley-Buneman) instability. This instability is also confined to the region around 110 km, where we find large Hall currents in combination with as low a collision frequency as possible (Schlegel and St.-Maurice, 1981). Electron heating rate due to wave heating is modeled, e.g., by Robinson (1986), Q = Ne me n* (vd-c)^2 where n* is the anomalous or effective collision frequency (due to scattering of the electrons by the unstable waves; the term electron-plasmon collision frequency is used by Jones et al., 1991 [plasmons are pseudo-particles representing the waves]), and c is the ion acoustic velocity. The equation is analogous to the expression for the Joule (frictional) heating rate, except that the electron-neutral collision frequency is replaced by n* and the neutral gas velocity by c. 
In a paper by Machida and Goertz (1988) this type of heating is actually called “anomalous resistive heating”.
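The quoted heating-rate expression can be evaluated with illustrative order-of-magnitude E-region values (the numbers below are invented for the sketch, not taken from the cited papers):

```python
# Q = N_e * m_e * nu_star * (v_d - c)^2, the anomalous (wave) heating rate
m_e = 9.109e-31          # electron mass, kg
N_e = 1.0e11             # electron number density, m^-3
nu_star = 1.0e4          # anomalous (effective) collision frequency, s^-1
v_d = 900.0              # electron drift speed, m/s
c_s = 400.0              # ion acoustic speed, m/s

Q = N_e * m_e * nu_star * (v_d - c_s) ** 2   # volumetric heating rate, W m^-3
print(f"Q = {Q:.2e} W/m^3")  # Q = 2.28e-10 W/m^3
```

Tiny in absolute terms, but acting on the minuscule heat capacity of the electron gas it suffices for the 300 K to 1500 K enhancements the quote describes.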

One implication of this is that, for any given regime, temperature is random motion of that regime's sub-scale constituents: molecules for a gas, atoms for a molecule, subatomic particles for an atom. But if that is so, what about the temperature of quarks or whatever the smallest constituents of matter are found to be? Our idea was that “submolecular” temperature is a measure of “temporal spin resonance”. The greater the resonance, the lower the “submolecular” temperature. Think about it. According to the equivalence principle, gravity is equivalent to acceleration, a given accelerational gradient being equivalent to a spacetime curvature configuration. If the gravitational field is a property of spacetime geometry called curvature, what property of spacetime is the electromagnetic field? Riemann said that “charge is lines of force trapped in the topology of space”. What comes “before” (the logical “before” of my last communication), lines of force or topology of space? If absolute limiting time rate of change of acceleration comes (logically) “before” absolute limiting acceleration which comes (logically) “before” absolute limiting velocity, and topological transforms come with each absolute limit, then topology of space comes (logically) “before” lines of force. Time rate of change of acceleration, acceleration, and velocity are each time-factor variables; at limiting values of these time factors topological transforms occur. Positive and negative charges are at opposite ends of a topological twist at or near the Planck scale, a twist in the form of a mini-wormhole. But are they simply the ends of the wormhole or something else? Gravitational lines of force -- be they long-range or short-range -- are curvature gradients. Topologically speaking, what are electromagnetic lines of force?
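The conventional identification of temperature with random motion, the baseline the paragraph above pushes past, is quantified by equipartition; a standard kinetic-theory computation:

```python
import math

k_B = 1.380649e-23         # Boltzmann constant, J/K
m_N2 = 28.0 * 1.6605e-27   # mass of one N2 molecule, kg

def v_rms(T, m):
    """Root-mean-square speed from equipartition: (3/2) k_B T = (1/2) m <v^2>."""
    return math.sqrt(3.0 * k_B * T / m)

print(round(v_rms(300.0, m_N2)))  # 517 (m/s) for nitrogen near room temperature
```

The question raised above is precisely whether any such statistical reading of temperature survives below the molecular scale.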

We must remember that the space of the pregeometry is not ponderable space. Logical “befores” are relative to the m-logically-valued reference space of the pregeometry, not relative to the ponderable space of spacetime. In the above, we have time not as a dimension but as a logical and topological operator. The two nonlinear orders of time are precursors of the sensed linear-time of ponderable spacetime. Time operates, in the absolute limit of time-factor variables, to logically and “later” topologically transform the m-logically-valued reference space such that ponderable spacetime is decomposed from the m-logically-valued reference space. Electromagnetic lines of force appear (logically) “after” time operates on the reference space such that orientability emerges out of non-orientability; this occurs with the limiting time rate of change of acceleration. Electromagnetic lines of force appear (logically) “after” time operates on the reference space such that changes of connectivity “become” possible; this occurs with the limiting acceleration. Following this scheme, we further speculate that electromagnetic lines of force appear (logically) “after” time operates on the reference space such that fractalization of the reference space into laminated limited spacetime domains transpires; this occurs with the limiting velocity. Only with sub-limiting time rates of change of acceleration, sub-limiting accelerations, and subluminal velocities does ponderable spacetime fully emerge.

What then, topologically speaking, are electromagnetic lines of force? Ask the question algebraically first. If transformation is a geometrical way of saying algebraic function, then it is easy to see how the Regge calculus was discovered. The Regge calculus takes a spacetime curvature configuration (solution to Einstein's gravitational field equations) and translates it into an n-dimensional lattice equivalent. The Regge calculus and Einstein's field equations are dealing with ponderable spacetime. But the m-logically-valued pregeometrical reference space is not only n-dimensional, it involves m-valued functions under m-valued logical operators. A stack of multi-sheeted Riemann surfaces, as universal covering surface, with Gödel numbers mapped upon it, is required to represent this reference space. The multiple sheets and multiple surfaces are bridged in multiple ways. One candidate for the algebraic equivalent of electromagnetic lines of force is the bridges between the sheets at the branch points of the involved functions, these bridges logically corresponding to a class of Gödel-numbered propositions. I would call these bridges cross-temporal bridges because they result from the “action” of logical operators, the effects of which when decomposed into ponderable spacetime give rise to time-like properties (connection through time, antecedent-consequent relations). Were this the case, “submolecular” temperature would be a measure of alignment/non-alignment of these cross-temporal bridges, in other words, a measure of temporal spin resonance. “Spin”, because the tilt of magnetic phase is what physicists currently call quantum spin, and because these cross-temporal bridges would be the pregeometric logico-algebraic equivalent of magnetic orientation as it appears in ponderable spacetime. 
Speaking from the perspective of that ponderable spacetime, it is our thesis that onset of high temperature superconductivity is deeply involved with this temporal ordering: bringing clocks within clocks within clocks into phase alignment by bringing energy input and heat dissipation into perfect synchrony. Speaking from the perspective of the pregeometric m-logically-valued reference space, however, the super state would be the base state from which all material processes are decomposed. Exploration of these notions in detail, as they apply to neural code and so on, may provide considerable guidance as to how to think of the archetypes-in-themselves.


What do you expect? The ideas in the “Toward a General Theory of Process” paper were brought together just as string theory was starting to emerge. It is clear from the march of his theoretical formulations that Roger Penrose didn't even regard the basic ideas eventually producing string theory as part of a potential “world picture” until the early 1970s -- just before the ideas in the “general process” paper began to come in upon us. But I will suggest that nothing fundamental found today in string theory is not also found in some form in this paper: including dark matter, dark energy, higher orders of time. Whereas there is quite a bit in this paper not yet fundamentally in string theory. The “general process” ideas emerged in context of working with a computer model of a relativistic notion of tornado genesis (designated “cascade theory”) in the process of transiting to a quantum-relativistic notion of tornado genesis wherein acoustically-modified gravity-wave modes played large, while simultaneously, and by analogy, there developed the notion of DNA having a superconductant pi-electron gas core, where, again, acoustically-modified gravity-wave modes were explicit. For me, personally, two early-morning bright-light-state critical-insights arrived as schematic visualizations summarized in my awareness with words: “hypertemperature is temporal-spin resonance” and “the twistor is the quantization of the temporal curl!”. These both came in late 1975. In 1973, if I correctly remember the account given me, Doug Paine, the professor with whom I was collaborating at Cornell, and with whom I co-authored the “general process” paper, had cornered John A. Wheeler after one of his talks and tried to discuss with him the gravity-wave-mode aspects of the cascade theory of tornado genesis. After listening for a while, Wheeler was dismissive. So, it was left to me to write Wheeler about my “critical insights”. 
Having no formal training in mathematics and physics, and relying primarily upon my capacities for what I call my “inner Musculpt”, I decided to describe to Wheeler the schematic appearing in my bright-light states, rather than attempt to go into the physics I was learning from Doug Paine concerning acoustically-modified gravity-wave modes. As I remember, I did not give the associated verbal statements in this account written for Wheeler. Only early in the following year did I discover that aspects of this schematic were intuitive ways of understanding the import of what is called a Riemann surface. Wheeler responded with the suggestion that I contact Roger Penrose who, he said, was rigorously pursuing similar notions. I already knew this, of course, as the second verbal formulation indicates. The literature at this time on twistors was exclusively highly technical and it took me a while to realize that the Riemann surface was an underlying concept. But Musculpt-wise I understood this very well on a general principle level, though the twistor jargon and notation were an obstacle to comprehending details. My life circumstances never provided me the opportunity to master this jargon and notation.

One thing I struggled with over the ensuing years was why Roger Penrose felt no intuitive resonance with the ideas I sent to him -- beyond the fact that the notion of a “classical limit” (which he ostensibly rejected) prevented consideration of the idea that the involved principles have application to tornadoes and DNA. I rejected then, and continue to reject, the interpretation that his lack of resonance is simply because my ideas are insufficiently technically stated. I searched for a pattern in the drift of his thought that was sufficiently divergent from that of my own so as to account for the lack of resonance. One thing a person will do when they are stumped is review everything they know about the problem. This seems to be a possible primary motivation for Penrose writing The Road to Reality (Cape, 2004). Given my quest for a pattern, I could almost flatter myself that, on some level, this book was written for me -- or others like me. There are several interrelated areas of very significant divergence I now have a better appreciation of the import of: (1) the absence in Penrose's thought of a purely mathematical account of the presence of symmetries in mathematics, without recourse to physical theory; (2) absence of m-valued logics in Penrose's thought; (3) Penrose's notion that spacetime is constructed from twistors, not that twistors are mathematical involutes. I believe that symmetries identified in the various fields of mathematics derive from the self-referential properties of m-valued logics and that twistors involute under m-logically-valued operations to spacetime, the final stage of this involution constellating the broken symmetries identified in physics, all of which, apparently, are associated in one way or another with the Second Law of Thermodynamics (this law being definable only by recourse to the notion of linear-time).

I “discovered” my understanding of the Riemann surface (this understanding first encountered intuitively with the bright-light schematic) while trying to comprehend “the meaning” of Emil Post's m-valued logics. Penrose does not mention m-valued logics in his book. Though m-valued functions are implicitly discussed throughout his book, and explicitly discussed early on, they are not focused upon sufficiently per se to warrant an entry in the index. He does not discuss m-valued functions relative to Abel having proved for the first time the unsolvability in radicals of the general equation of the 5th degree. My understanding of this came from searching out and reading all of Eric Temple Bell's books in the early 1970s. Bell gives several brief accounts of the Riemann surface in his books, but I did not understand the significance of this surface when first reading about it. I passed rapidly over the descriptions and it lay there in my subliminal mind. Only after struggling with the hypernumber arithmetics of the notion of complex angular momentum cascade in tornado genesis, and following the appearance of the bright-light schematic, did “understanding” start to dawn. I could then see that Doug Paine's notion of “temporal curl”, as he used it in atmospheric cascade theory, was tantamount to treating angular momentum as complex angular momentum, only the involved transforms were time-driven, not merely space-like -- and that this was the direct result of treating spatial contraction at the absolute limiting velocity of Special Relativity as having m-valued properties. Hence, the idea of time as a topological operator on space. Accepting this implied looking at all the fundamental physical constants as likely being m-valued. If so, are these values, say of the absolute limiting velocity, in natural log distribution? How to make sense of this? That's why m-valued logics seemed so necessary. 
Temporal operation as logical operation; the space operated upon being the m-logically-valued reference space (Hilbert space under m-valued logics). Aha! “The twistor is the quantization of the temporal curl!” But I want to try to give a thoroughly accurate account of the origins of my engagement with these notions, which was simply my attempt to make sense to me of Doug Paine's ideas.
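The multivaluedness at issue in the Riemann surface can be made concrete with the complex logarithm, the simplest of the m-valued functions. A minimal numerical sketch (standard complex analysis only, nothing m-logically-valued about it): infinitely many values, spaced 2πi apart, all exponentiate back to the same point, which is exactly the bookkeeping the sheets of the Riemann surface perform.

```python
import cmath

# The complex logarithm is infinitely many-valued: every value
# ln|z| + i(arg z + 2*pi*k), k an integer, exponentiates back to z.
z = 1j
principal = cmath.log(z)  # the principal branch
branches = [principal + 2j * cmath.pi * k for k in range(-2, 3)]

for w in branches:
    assert abs(cmath.exp(w) - z) < 1e-12  # all five branches recover z
```

Each branch corresponds to one sheet; moving continuously once around the origin carries one value into the next, so the “cut” Penrose complains of below is indeed an arbitrary mutilation.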

Being intensely exposed to animistic identity as a child living in rural Japan, and then being plunked back into mid-1950s America, I sustained considerable identity disequilibrium. Trying to understand what had happened to me became a life-long pursuit. At a certain point, I understood that I had not so much directly experienced two different identities as two very different ideas of what identity is. This realization did not help a lot because I could not formulate what that realization actually meant. This was not something you could talk about with others, for as soon as the attempt was made they treated you as having mental problems. I learned to keep my mouth shut and became a voracious reader. The first big clue came at age 18 in 1963 when I read the appendix to Volume One of J. G. Bennett's book The Dramatic Universe, wherein he gave an account of “skew-parallelism” by way of explicating his notion of “diversely identical skew cubes”. This was done in context of explicating his stab at a unified field theory on a five-dimensional manifold. I understood none of that, but immediately knew beyond all doubt that the idea of “diversely identical skew cubes” was a mathematical analogue of the animistic state of identity I had experienced as a 9-, 10-, and 11-year-old child. At this same time, in late-1963, I was researching and writing a paper for Abdul Aziz Said, entitled “The Predicament of Existentialism”, wherein I discussed Gödel undecidability and Heisenberg indeterminacy relative to two propositions: the Platonist notion that “Essence precedes existence” and the Existentialist notion that “Existence precedes essence”. I found myself, at the conclusion of the paper, challenging the very idea of “precedes” -- logically, ontologically, temporally. Juxtaposition of this challenge with Bennett's notion of “skew-parallelism” led to a bright-light realization I eventually encapsulated as: “Indeterminacy is the tip of the iceberg of skew-parallelism”. 
When I later ran across Emil Post's 1921 paper on m-valued logics, I immediately knew beyond all doubt that these were the logics behind “diversely identical skew cubes”. I first read of fiber bundle arithmetics in Science News during the mid-70s. I don't remember whether this was before reading Post or afterwards, but I immediately knew these arithmetics had somehow to be related to “skew parallelism”. It was much later that I learned the “Clifford parallels” are actually skew. I seriously doubt that Bennett's 1956 notion of “skew parallelism” was based on Clifford parallels, or he certainly would have mentioned that somewhere -- though I may be wrong on this.

The twistor has time going not only forwards and backwards, but also many different degrees of sideways. Since, by virtue of Doug Paine's notion of “temporal curl” in angular momentum cascade leading to tornado genesis, I was “visualizing” time as a topological operator that takes space into imaginary dimensions, I tended to view this “sideways” of time as related to simultaneously existing skew-parallels. Temporal curl in tornado genesis is about the tilting into imaginary dimensions of twisting axes of spin. J. G. Bennett briefly discussed what he called “pencils of skew-parallels”, which I came to associate with fiber bundles. This way of thinking got very interesting once we began entertaining the notion that DNA has a superconductant pi-electron gas core at physiological temperatures that produces coherent wave output. The modeling of this came directly out of the double-helical flow-pattern trajectories of air-parcel feeder bands in cascade theory of tornado genesis. The electron parcels of BCS theory of superconductivity come about by the virtual particle exchanges permitted by Heisenberg indeterminacy. But what if this indeterminacy is just a shadow of the temporal operators responsible for skew-parallelism? Position and momentum riding diversely identical skew-parallels? What would that imply about lines of force and equipotential surfaces? Would there “emerge” something like “skew-perpendicularity” of lines of force and equipotential surfaces? Riemann's charge creation as “lines of force trapped in the topology of space”? And if so, wouldn't these skew-perpendiculars relate to the m-values of the Schrödinger wave-function? The exclamatory statement “The twistor is the quantization of the temporal curl!” came in the context of contemplating these notions. Are there higher order abstractions of temperature for an electron gas? 
What is the analogue of atmospheric potential temperature and atmospheric equivalent potential temperature relative to a pi-electron gas? Could these have to do with skew-parallels of the tilt of the axis of electron spin? And what are these quanta of spin-axis tilt if not the temporal curl resulting from the logical operations on the reference space (Hilbert space under m-valued logics)? “Hypertemperature is temporal-spin resonance” across the topologically-contorted equipotential surface. Temporal curl is fully “viewable” in the atmospheric cascade process only when the model is not filtered for accelerations and time rates of change of acceleration. Axes of spin tilt into imaginary dimensions under accelerations and time rates of change of acceleration. Do away with the filtering of such rates of rates and rates of rates of rates (David Bohm's “clocks within clocks within clocks”) and the roles of acoustically-modified gravity-wave modes become apparent as what we, in the mid-70s, called “connective mass-energy” and “configurational mass-energy” (two categories of “darkness”, that darkness being associated with “sideways” time -- remembering, of course, that temporal “sideways”, “backwards”, and even “forwards” are inadequate conceptual “lags” for description of logical operations on Hilbert space under m-valued logics). Hence, gravity might have something to do with collapse of the Schrödinger wave-function, if and only if that wave-function were interpreted in terms of m-valued logics. And if the fundamental constants are m-valued, there is no “classical limit” preventing these notions from being relevant to the idea that electromagnetic fields are converted to gravitational fields, and vice versa, by superconductant processes associated with the tornado and DNA -- which is exactly what we described in the mid-to-late-70s. 
See, for instance, the 1978 paper entitled “The Discovery of a Superconductant Exchange of Hydrothermodynamic Properties Associated with a Limited Domain of the Atmosphere” and our 1979 superconductant DNA paper.

However, if one has looked directly at the Fallacy of Contradiction in the coupled propositions of Platonism and Existentialism -- “Essence precedes existence” and “Existence precedes essence” -- a bone of contention between, say, a Penrose and a Hawking, and chosen to challenge the notion “precedes”, then one looks for something more fundamental in logic than “truth-value” upon which to base one's understanding of the orders of self-reference that the self-referential propositions of m-valued logics are an expression of. A “pregeometry as calculus of propositions” would be m-logically-valued. I already had the answer, of course, as a 9-, 10-, and 11-year-old child: different ideas about what identity is. Order of logical-value corresponds to degree of identity transparency. The greater the number of superposed values of the Schrödinger wave-function, the greater the degree of identity transparency, the greater the degree of Bennett's “diverse identical-ness”. The greater the number of skew-parallels in a “pencil”, the greater the non-locality. Identity transparency on Hilbert space under m-valued logics -- decomposed under the logical operations of involutory twistors understood as quantizations of temporal curl -- is translation, motion, propagation. The infinite number of different solutions the M-theory version of string theory posits are not possible solutions in Hilbert space under m-valued logics; they are actual solutions. In order to sort those solutions out according to the order of logical-value from which they come, a prerequisite to description of decompositional involutes of twistors, Hilbert space has to be reconstructed under m-valued logics. This would likely involve -- à la Alexander Karpenko's demonstration of functionals between prime numbers and m-valued logics -- a prime-number and Gödel-number mapping on stacks of stacked Riemann surfaces. Universe as universal covering surface. 
Fiber bundles, strings, bands, branes: they are all there in Hilbert space under m-valued logics.
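Classical Gödel numbering -- the “base” prime-power encoding presupposed by any such prime-number mapping -- can be sketched in a few lines. This is only the standard construction; Karpenko's functionals and the proposed m-logically-valued extension are not attempted here.

```python
# Classical Goedel numbering: a finite sequence (s1, s2, ..., sn)
# is encoded as the single integer p1**s1 * p2**s2 * ... * pn**sn,
# where p1, p2, ... are the primes 2, 3, 5, ...

def first_primes(n):
    """First n primes by trial division (adequate for small n)."""
    found = []
    candidate = 2
    while len(found) < n:
        if all(candidate % p for p in found):
            found.append(candidate)
        candidate += 1
    return found

def godel_encode(seq):
    g = 1
    for p, s in zip(first_primes(len(seq)), seq):
        g *= p ** s
    return g

assert godel_encode([1, 2, 3]) == 2 * 3**2 * 5**3  # == 2250
```

By unique factorization the sequence is recoverable from the single number, which is what lets one proposition-number stand for a whole structured expression.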

But, I must now raise the question as to why a self-professed Platonist like Roger Penrose would have twistors constructing spacetime, rather than operating as decompositional involutes. And why a strong advocate of self-reference, as was John A. Wheeler during the 1970s, never contemplated a “pregeometry as calculus of propositions” being m-logically-valued. And this will be unpleasant -- for us all. My bizarre notions of the origins of world war had their origins in my experience as a child in Japan, of course. These notions began to get concrete in 1963 as I worked on the paper for Abdul Said. Let me quote from Roger Penrose's recent book and juxtapose that quotation with something I wrote long ago. From Chapter 8, “Riemann surfaces and complex mappings” (The Road to Reality, p. 136):

Before Riemann introduced the notion of what is now called a “Riemann surface”, mathematicians had been at odds about how to treat these so-called “many-valued functions”, of which the logarithm is one of the simplest examples. In order to be rigorous, many had felt the need to regard these functions in a way that I would personally consider distasteful. (Incidentally [I would say not so incidentally], this was still the way I was taught to regard them myself while at university, despite this being nearly a century after Riemann's epoch-making paper on the subject.) In particular, the domain of the logarithm function would be “cut” in some arbitrary way, by a line out from the origin to infinity. To my way of thinking this was a brutal mutilation of a sublime mathematical structure.

And yet, on page 138, Penrose says:

In the case of Riemann surfaces, the manifold (i.e., the Riemann surface itself) is glued together from various patches of the complex plane corresponding to the different “sheets” that go to make up the entire surface. As above, we may end up with a few “holes” in the form of some individual points missing, coming from the branch points of finite order, but these missing points can always be unambiguously replaced, as above. For branch points of infinite order, on the other hand, things can be more complicated, and no such simple statement can be made… There is an infinite order branch point at the origin and also at infinity -- but, curiously, we find that the entire spiral ramp is equivalent just to a sphere with a single missing point, and this point can be unambiguously replaced so as to yield simply a sphere.

Now from my article, “Echo of the Mockingbird: Why Postwar Historiography is Anti-Historiographic”:

In 1826, the mathematician Niels Henrik Abel introduced a notion in his famous theorem on transcendental functions which has reverberated throughout subsequent history: the multiformity opposed to the uniformity Gauss dealt with in functions of a complex variable. This multiformity arises from the fact that solutions to general equations of the 5th degree and higher cannot be solved in a finite number of steps; their solutions are nonalgebraic. Discussions of this can become extremely complex, but the basic idea is quite straightforward. Multiformity is the result of multivaluedness: the circumstance where the identity of an element is associated with more than one value. The simplest case is a two-valued function where the value of a variable can be both positive and negative: plus and minus 2, for instance. In the general case, the number of values involved is infinite, n-valued. Abel’s ideas were developed by Jacobi, Weierstrass, Riemann, and others, eventually finding their way into physics, most notably in the multivalued functions associated with Schrödinger’s wave equation.

The notion that identity can have more than one value was not welcome. It was resisted. Ways were sought to get around it. Implications of the idea were not followed out cleanly. As an example of what transpired, there is the Riemann surface. Multivalued functions have many branches into which their value arrays fall. The branches cut across one another at critical values called branch points. By analogy, one could say that the strength characterized by a function lies not in the mass it represents -- as the geodesic dome so well illustrates in structural engineering -- but in the number of its branch points, which is a measure of its multiformity. And the degree of multiformity is a statement about the type of identity state involved. During the 1850s, Riemann found a way to represent all this with a multi-sheeted hypersurface where the sheets are connected to one another at the branch points of the function. He was thus able to follow the march of the function continuously through all of its values by moving from sheet to sheet over the hypersurface. The branches having been made relatively invisible, over the following decades, multivalued functions came to be treated as if they were somehow equivalent to single-valued functions. After the Riemann surface, it was much easier to forget that there is a fundamental difference between the notions of identity associated with multiformity and uniformity. Single-valued identity implies that an element or entity is singular, selfsame, self-identical, the-same-as-itself -- and that there is absolute distinction, an identity opaqueness, between separate entities. Multivaluedness, on the other hand, implies multiple selves, interior dissociation, collective properties, hidden aspects, group behaviors -- and that distinctions between entities cannot be absolute, that there is identity transparency.
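The “simplest case” of the passage just quoted generalizes directly: the n-th root is an n-valued function, its values spaced evenly around a circle and meeting at the branch point. A quick numerical check of the two-valued case (standard complex arithmetic, purely illustrative):

```python
import cmath

def nth_roots(z, n):
    """All n values w with w**n == z (the n-valued n-th root)."""
    r = abs(z) ** (1.0 / n)
    theta = cmath.phase(z)
    return [r * cmath.exp(1j * (theta + 2 * cmath.pi * k) / n)
            for k in range(n)]

# the simplest case from the passage: the square root of 4 is two-valued
roots = nth_roots(4, 2)
assert sorted(round(w.real) for w in roots) == [-2, 2]
```

Tracking which of the n values one is “on” as z moves around the branch point is precisely the job Riemann's multi-sheeted surface was invented to do.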

There are many places on the MOON website, other than in “Echo of the Mockingbird”, where my bizarre notions as to origins of world war are explicated. I invite you to read them, even if this is likely to be unpleasant.


Good point. Actually, I don't often try to say what I mean. This is because there are always m-ways of thinking about it, whatever “it” may be. Generally, I just speak in the context of the discourse, otherwise there is the necessity for “nesting” whatever is said -- and since the nest is always deeper than I can imagine, it would be tiresome always to preface every statement. I am therefore not particularly good at running talk, as I have to “sort” amongst those m-ways appearing Musculptly to awareness before deciding what to say in the given context, and could easily give way to paraphasia. Staying “in the talk”, like “in the body”, requires maintenance of an associational “tension” that is a kind of “pushing away” of “stuff”. For this reason, alone, would I advocate developing Musculpt as mathematical notation (and would I not devote much time to preprogramming myself with contemporary forms of notation, be they mathematical or musical, as this would only short-circuit my “inner Musculpt”). It is correct that at one place I say I believe the universal physical constants are m-valued and at another that I believe those constants to be m-logically-valued. You point out that a constant is by definition some numerical value and therefore cannot be either of these, particularly the latter. I say: the involved definition depends on which m-way you wish to think about “it”. Saying that the universal physical constants are m-valued means that these constants are functions of the defining properties of the limited spacetime domains, or sets thereof, to which they relate (these defining properties, of course, establish the partitionings of a given “use”, and this is where the measurement problem comes in with a vengeance: Swedenborg's discourse on the “forms of uses” is very useful here). 
Under m-valued logics interpreted, not in relation to truth-value, but in relation to identity transparency, however, the situation would appear to be much more interesting: each of the m-values of a given constant would be non-self-identical values of the single Gödel number the universal constant signifies. In this interpretation of m-valued logics, not only is the Law of Excluded Middle dispensed with, so is the Law of Contradiction (m-times over). This “base” Gödel number would be that mapped on the universal covering surface, while the involutes of that “base” would be the non-self-identical values of that very same number as they appear on the decomposed collection of Riemann surfaces and the decomposed sheets of those decomposed Riemann surfaces. Challenging the notion “precedes” -- logically, ontologically, and temporally -- means, in this context, embracing non-orientability between universal covering surface and decomposition involute, means, that is, embracing something like a polytope Klein bottle. But I could “nest” again, and say that the mind-body/observer-object[system] aspects of the measurement problem demand another fundamental non-orientability involving counter-hypernumbers, counterspace, and counter-temporal operations (which is not a reference to mere time reversal): this is the thesis addressed in the (G. Spencer) Brownian wave statement appendix to the 1980 paper written for Wolfgang Luthe and Roland Fisher entitled “Some Preliminary Considerations Toward Development of a Mathematical Model of the Autogenic Brain Discharge as Spontaneous Localization in Quantum Measurement”. Remembering, of course, that this thesis involves intra-neuronal DNA molecules having superconductant cores, with all the implied processes. M-valued logics, to this way of thinking, posit non-self-identical numbers. I can, therefore, under the involved assumptions, maintain that the universal physical constants are not only m-valued, but also m-logically-valued. 
There is also the question of the order of m: finite, denumerably infinite, the Cantorian continuum, and so on (with the additional non-orientabilities these may involve). The “nests” are always deeper than can be imagined.


Well, I can offer a poor analogy. In MOON, the following is asserted (Vol. 2, p. 668):

The Multivalued Reference Space axiOOOMMitized (from the point of view of two-valued logic):

Here, the fundaments of non-self-identical m-logically-valued number are viewed from the perspective of 2-valued logic. This is somewhat analogous to viewing relations in hyperbolic geometry from a Euclidian perspective. Discussing the latter, Roger Penrose (The Road to Reality, p. 34) draws attention to the fact that M. C. Escher's Circle Limit I illustrates the involved relations. To the Euclidian observer, at approach to Escher's limit circle the depicted figures become evermore closely packed, yet from the perspective of the figures themselves on the hyperbolic space there is no appearance of packing. This is what the experience of identity transparency is like, the experience of non-simple identity, of Bennett's "diverse identical-ness", of nonlocality. Sitting in a room with a group of friends, relaxing into the resonance, if one truly lets go of always-there ergotropic self-associational tension (transiting toward zero action-potential on the electromyograph, particularly of the extraocular and laryngeal muscles), suddenly one recognizes that there has been a subtle “shift”, a timewarp has formed. A magic circle has been drawn around the group. If someone suddenly gets up and leaves the room, it is as if a cusp catastrophe has occurred, as if one has sustained a near-lethal existential threat. One feels spatially closer to the others than normal, while simultaneously more in one's own presence. One knows the spatial distances have not changed, but percepts come as if the space is collapsing, as if the “other” were imploding into oneself. Whole-body petite mort. One is affectively leaving being-for-the-self and entering being-in-itself. The defining characteristics of what I call the “feeling-space” are undergoing transformation as access to m-logically-valued processing develops (long-range phase coherence between superconductant intraneuronal DNA molecules within the brain and between brains). 
The spatial collapse and identity-transparent “close packing” in the feeling-space transpires as a result of the fact that the ego-sphere is retaining its hold on 2-valued logic, while other faculties are increasingly accessing higher orders of m-valued logic. As I say, this is likely only a poor analogy, but an analogy that may provide some insight into the phenomenologies associated with animistic modes of comprehension. There may be things to learn from this useful for consideration of the "measurement problem" in quantum mechanics and what occurs at relativistic absolute limiting velocities, accelerations, and time rates of change of acceleration. And it may offer a different notion about how to understand “superposition”, “spontaneous fusion and localization” in the Schrödinger wave-function.
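The Escher analogy can be made quantitative in the Poincaré disk model of hyperbolic geometry: a Möbius self-map of the disk (a hyperbolic isometry) pushes a pair of points toward the rim, collapsing their Euclidean separation while leaving their intrinsic hyperbolic distance exactly unchanged. A minimal sketch (standard hyperbolic geometry only; the identity-transparency reading is, of course, the analogy):

```python
import math

def hyp_dist(z1, z2):
    """Hyperbolic distance between two points of the Poincare disk."""
    return 2 * math.atanh(abs(z1 - z2) / abs(1 - z1.conjugate() * z2))

def push_toward_rim(z, a=0.9):
    """A Moebius self-map of the disk: a hyperbolic isometry."""
    return (z + a) / (1 + a * z)

z1, z2 = 0 + 0j, 0.5 + 0j
w1, w2 = push_toward_rim(z1), push_toward_rim(z2)

# Euclidean separation collapses near the rim (the "packing")...
assert abs(w1 - w2) < 0.2 * abs(z1 - z2)
# ...while the intrinsic hyperbolic distance is preserved
assert abs(hyp_dist(z1, z2) - hyp_dist(w1, w2)) < 1e-12
```

The Euclidian observer sees the figures crowding; measured in their own metric, nothing has moved.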


Okay. I can respond with the following. If Heisenberg indeterminacy is the tip of the iceberg of skew-parallelism, then the anthropic principle is the tip of the iceberg of the quantum measurement problem. We start with Derek's assertion in MOON (Vol. 2, p. 594) which modifies the Playfair axiom, to wit, “For any straight line and for any point not on the line, there is an infinite number of straight lines through the point parallel to the line” or, alternatively, relative to the “hypothesis of the acute angle”, there is an infinite number of acute angles through which the involved lines do not meet. This would be the basis of J. G. Bennett's “diversely identical skew-cubes”. The limit pencil of skew-parallels (involving a denumerable transfinite set) carries the manifold through the twist into non-orientability (Riemann-surface map on Möbius strip, on Klein bottle). Mapping of the mapping of the complex plane on these topological surfaces requires hypercomplex numbers. “Number the numbering!”, as the Third Voice keeps demanding in MOON. Numerically speaking, counter-hypercomplex numbers are required to map the “back” of the strip, the “inside” of the Klein bottle. Counter-hypercomplex numbers are entered through minus-zero and minus-infinity, more accurately, through division by these quantities. 1/zero = infinity. -1/zero = -infinity. 1/-zero = -counterinfinity. -1/-zero = counterinfinity. The inverse of these is given for division by infinity and minus-infinity. Such division is required to map the twist. If you use only the standard definition 1/zero = infinity, and not the others, you stay in the complex domain and map the twist there, yielding Möbius strip and Riemann sphere. 
You get the Riemann sphere if you stay in the complex domain (thus obtaining a treatment of m-valuedness as if it were equivalent to single-valuedness: the objective of this whole line of mathematical developments); the Klein bottle, if you go to the hypercomplex domain with the expanded rules for division by zero and infinity. When the Euler-Riemann zeta function is viewed in this hypercomplex context (beyond what Riemann considered), its single-valued properties are lost and the functionals generated correlate with orders of m-valued logics -- or so it would seem. The Riemann hypothesis is speculation about one feature of functional form when consideration is restricted to the complex domain (which speculation may or may not be “proven” by recourse to 2-valued logic); more importantly, one would speculate about functional form in the hypercomplex domain relative to m-valued logics. “Proven” is put in quotation marks because, one must realize, placing Gödel within the context of m-valued logics, “proof” is relative and goes on and on and on from one order of logical-value to the next. Derek confronts this in MOON, under upbraiding from the Third Voice, thus (Vol. 2, p. 669):

Don't cradle your head! This is no occasion for catatonic response. Four-eighteen. Four-eighteen. Breathe! Walk and breathe. Four-eighteen. Anapana-sati. The satin-flow essence which you breathe. Kayanupassana. Vedananupassana. Cittanupassana. Dhammanupassana. Insight meditation. It's constellated now; it will KILL you unless you live it. Oh, how many diseases it knows how to approximate!

There is also division by counterquantities, which surely involves limit pencils at higher orders of infinity and relates to the properties of topological autopoiesis and self-organization (as asserted in MOON, Vol. 2, p. 669). As far as I know, this is not the way Charles Musés began his work on hypercomplex and counter-hypercomplex numbers, such as the square-root of plus-one (which is not equal to plus-one). But, clearly, he did realize that counter-hypercomplex numbers are required as a basis for explorations of the measurement problem in quantum mechanics. We were attempting to think our way into these sorts of things in the late-'70s when the circumstances permitting such work fell apart. Joseph Bridger was a great stimulus here. M-valued functions and, hence, m-valued points (i.e., “distinct” points on the manifold are the same point) are directly involved; hence, also, non-self-identical numbers (specifying “diversely identical” skew-figures). Now, if there are functionals related to m-valued logics, as Alexander Karpenko's work has demonstrated relative to prime numbers, then numbered Gödel numbers (“Number the numbering!”) map on the involved manifold. Gödel numbers designate logical propositions. Numbered Gödel numbers designate m-logically-valued propositions. If the fundamental physical constants are m-valued and m-logically-valued, as earlier asserted, then the particular values of these constants relative to any given universe of discourse are determined by the order of logical-value chosen through which to view the manifold. This choice sets the terms of measurement in the universe of discourse. Choose the 2-logically-valued case, wherein the manifold is viewed from a Euclidian perspective, conformal or otherwise, and a set of unique values for the universal physical constants magically appears: anthropic principle.
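The complex-domain baseline of the foregoing -- a single added point at infinity closing the plane into the Riemann sphere -- is standard and easily exhibited via inverse stereographic projection, toward whose north pole all distant points crowd. (The expanded division rules and the counter-hypercomplex extension discussed above have, so far as I know, no comparably standard numerical model; this sketch covers only the orthodox part.)

```python
def to_riemann_sphere(x, y):
    """Inverse stereographic projection of the complex point x + iy
    onto the unit sphere; the north pole (0, 0, 1) is the single
    added point at infinity that closes the plane into a sphere."""
    d = 1.0 + x * x + y * y
    return (2 * x / d, 2 * y / d, (x * x + y * y - 1) / d)

# points far from the origin, in every direction, crowd toward
# the one point at infinity
for far in ((1e6, 0.0), (0.0, -1e6), (1e6, 1e6)):
    X, Y, Z = to_riemann_sphere(*far)
    assert abs(X) + abs(Y) < 1e-5 and abs(Z - 1) < 1e-5
```

That every direction of escape lands on the same single point is the sense in which 1/zero = infinity suffices, and m-valuedness gets treated as if single-valued, so long as one stays in the complex domain.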


Sorry, I was insufficiently clear. I am not talking about a Möbius strip or a Klein bottle. I am talking about collections of Riemann surfaces in the form of a Möbius strip or a Klein bottle (illustrative only, as an aid to visualization), where a given multi-sheeted Riemann surface could be said to represent the fractal map of a given scale level of an integral dimension of Hilbert space under m-valued logics. It is with regard to such a notion that I imagine counter-hypercomplex numbers are required to map the “back side” or the “inside”, which has to do with “observer state” as opposed to “object system”, these two ontological categories being distinguished only by the “strange” twist of non-orientability: in far enough is out; out far enough is in. A multiplicity of multiple nestings here. Logical operations on this “manifold” devolve to time-like properties (“viewable” as changes of representation space). This is the sense in which I take time to be the set of all topological operations on the m-logically-valued reference space, and the notion of consciousness offered in MOON is directly related to this “sense”. The devolution of such operations is from hypercomplex to complex to real to integral. Only at the complex level do they induce 90-degree twists. Logically and ontologically “before” 90-degree twists, the involved operations on the reference space are fully graded, “discrete/smooth” to use the term employed by Derek in MOON (Vol. 2, pp. 292-3). At the hypercomplex level, time's operational precursors run in all “directions”, all “phases”. Clifford algebras may be sufficient to handle any given “pencil of skew-parallels”, but, as Yoshio speculates in MOON (Vol. 2, p. 338), probably, Grassmann algebras are required to handle collections of bundled pencils of skew-parallels. The manipulation of such bundles of pencils is what temporal operations accomplish: temporal precedence devolves out of the ontological precedence implicit in logical precedence. 
At transfinite orders of m-logically-valued propositions, time-like properties have not “yet begun” to devolve. At those orders of logical processing ponderability does not exist (no “ponderable” or “ponderer”; indeed, probably “eight no's”, one for each of Cayley's 8-tuples), being-in-itself holds to itself relative to division algebras, as identity transparency is superintegrated; it has not “yet begun” devolution into becoming (which involves the progressive loss of orders of logical-value). Any induced change of a representation space at a given order of logical-value is “always” there as a logical proposition, not at the next higher order of logical-value, but on the “base-state” of the m-logically-valued reference space. Hence, speaking in terms of ultimates, it is permitted to doubt the notion of precedence relations -- logically, ontologically, temporally. Because of the ultimate non-orientability of observer-state and object-system, penetration of the involved nests of nests can be accomplished (from the “position” of ponderability) by cultivation of a radical empiricism either inwardly or outwardly directed. The notion might thereby be contemplated that the “base-state of Tzog-Chen” is “the base-state of the m-logically-valued reference space” -- which is capable of “…a graded self-formulation and modification of itself…”. This quote comes from the best introduction to this contemplation I have run across, Sir John Woodroffe's The World as Power (p. vii, Ganesh, 1974 edition of the tracts written in the period immediately following WWI). I studied this book in the mid-'70s as awareness of these notions evolved in my presence. It is not clear, without a whole university at disposal, how else this could be described. Collaborating with my wife, I made my best effort in MOON.


Skipping the subjective stuff, for which I must refer you to MOON, I can give some sort of account. My personal engagement with the notion of time as a topological operator had something of a dramatic origin. This was due to the fact that this notion, according to my understanding, is implicit in the calculus. In taking Introductory Calculus, I could not accept the idea of a limit, as taught in class. After class, I kept raising questions about it with the instructor. Finally, this instructor told me not to ask fundamental questions, as this would cause me to lose interest in class material. After considering this observation, I found that I agreed with it, and I thus dropped the class. I never signed up for another mathematics class, but continued to ask questions -- of myself. It seemed to me that the very idea of a limit intrinsically involves time dilation, for otherwise, in demonstrably pragmatic terms, conversion of an infinite sequence or an infinitesimal sequence to an infinite set cannot transpire. Only in a timewarp could a non-dissociated mathematician actually accomplish such a conversion and thus really take the derivative, really perform an integration… Well, some subjective stuff is inescapable, I guess. One has only to think of John Nash. Quoting from MOON (Vol. 2, p. 288):

The hysteric multiple-personality's “loss of time” through sequential cycling of split-off autonomous complexes has a curious parallel in the quantum theory: thermodynamic constraints apparently dictate that the various branches of the wave function cannot be “aware” of one another; the branches decompose through collapse sequentially in linear-time, but can have no cross-temporal bridges. In voluntary dissociation, however, no time is lost and the partly autonomous complexes are “stacked” and mutually interactive through time-slow-down. Is hysteria a dissociation locked into a linear-time-bound-closed-system where there is no free energy exchange of the multivalue? In voluntary dissociation, do relativistic factors intervene to modify the thermodynamic constraints in such a fashion as to “cross-time” the wave functional branches of each partly autonomous complex?… Maybe it is all a phasing problem in the hypertime. Could thermodynamic constraints of the 2nd Law simply be “hyper-temporal dysphasia”?

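For orientation, the “idea of a limit” at issue in the classroom episode recounted above is, in its standard textbook form, the limit definition of the derivative and of the definite integral (standard notation only; nothing here is specific to MOON):

```latex
f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h},
\qquad
\int_a^b f(x)\,dx = \lim_{n \to \infty} \sum_{k=1}^{n} f\!\left(x_k\right)\Delta x .
```

In both definitions an infinite process is treated as yielding a single completed value, which is precisely the “conversion” the preceding discussion claims cannot transpire outside a timewarp.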
As comprehension of such matters comes in spiral fashion, nothing is presented contiguously or in syllogistic order in MOON. From Volume 1 this time (p. 733):

Husserl's epoché extends even to Descartes' famous dictum! Given that the sense of being-in-itself increases as identification with one's thoughts is dissolved, Descartes' “Cogito ergo sum” must be more precisely stated: I identify with my thoughts, therefore I claim separate-self-existence absolutely.

I recognized this issue in the calculus as an analogue of the basic issue in contention behind the Axiom of Choice. Moreover, from the point of view of limits in derivatives and anti-derivatives, motion is change of space, not change in space. Measures of motion are inversely related to areas and volumes. In the limit of a measure, change of the representation space transpires. Topological operators on the representation space operate only in the limit. In an indirect fashion, both Special and General Relativity are implicit in the idea of a limit in the calculus. At Cornell, we began to look at something along these lines in relation to the processes involved in tornado genesis and the superconductant core of DNA. This can be seen in the handwritten doodle-records of four typical conversations (the only such records retained) of the period, provided at the end of the “General Process” paper. That domain of discourse was entered upon by considering two logarithmic spirals as an idealized heuristic model of tornado genesis (one spiral representing the “downward” cascade; the other, the “upward” cascade), intersection points being those points at which acoustically-modified gravity-wave modes are released. Tornadoes are regarded as “pathological phenomena” not only because of their destructive capabilities, but also because, subliminally, they evoke fear of schizophrenia (the multivalue). We felt this heuristic model should be expressible solely in terms of the numbers explicit and implicit in Euler's famous equation: e^(iπ) + 1 = 0 (π, i, e, 1, and 0 being explicit; infinity being implicit in e, π, and i). Differentiation is, essentially, power series expansion, wherein the multiple values of a coefficient (constant) are inversely proportional to the sequence of factorials. These multiple values of the constant are involved in definition of the nested spacetime steps of the cascade model.
The value of e is arrived at in a factorial form similar to that of the sequence of derivative coefficients. The phase (angle of rotation) of a complex number appears in its natural logarithm to base e. It is, perhaps, not too hard to see how one might “jump” from such considerations to the notion that “universal constants are m-valued” and that “time is a rotational operator on the representation space” and that “The twistor is the quantization of the temporal curl!”.
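The quantities the preceding paragraph invokes can be set down in standard notation (this is only a gloss of the textbook forms alluded to, not the m-valued reading the text proposes):

```latex
e^{i\pi} + 1 = 0, \qquad
e = \sum_{n=0}^{\infty} \frac{1}{n!}, \qquad
f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}\,(x-a)^{n}, \qquad
\ln z = \ln\lvert z\rvert + i\theta \quad \text{for } z = \lvert z\rvert e^{i\theta}.
```

The Taylor coefficients carry the factor 1/n! -- the inverse proportionality to the sequence of factorials mentioned above -- and the phase θ enters through the natural logarithm of the complex number.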


I think that if you read Penrose (The Road to Reality), taking note of how, at every stage in the development of the underlying higher mathematics, the treatment of zero and infinity had to be “delicately” handled, you will see that everything they did to get around the glitches in quantum theory was analogously done in atmospheric science. For instance, h-bar was introduced by Dirac to fix the scaling of momentum, for the same reason Charney scaled the primitive equation set of atmospheric science to get rid of acoustically-modified gravity-wave modes, the signature of the m-logically-valued. Scaling of momentum establishes the classical limit, whereas h is actually both m-valued and m-logically-valued. Fixing the scaling of momentum takes “out of view” the fact of skew-parallelism underlying indeterminacy: this scaling places position and momentum on separate skew-parallels, when actually they both are on all skew-parallels always. What changes (in taking the sum to the limit as an integral, rather than embracing the Axiom of Choice) is the order of logical-value of the observer-state, which imposes decompositional operations as temporal curl. They do not see this, of course, and doing the scaling ultimately pushes the m-logical-valuedness into universes of a multiverse, such that the actual properties of multi-valued identity can be avoided here on Earth.
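The “scaling of momentum” remark can be anchored in the textbook relations (a standard-notation gloss, stating only the conventional reading, not the m-valued one argued for above):

```latex
[\hat{x}, \hat{p}] = i\hbar, \qquad
\Delta x\,\Delta p \ge \frac{\hbar}{2}, \qquad
\text{classical limit: } \hbar \to 0 .
```

In the conventional account, letting ℏ → 0 recovers classical mechanics, which is the sense in which fixing the scaling of momentum “establishes the classical limit.”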

