The Krasnow Institute for Advanced Study, George Mason University


Self-Representing Information Processing
and the
Relationship of the Material to the Mental

Kathryn Blackmond Laskey*
Department of Systems Engineering and Operations Research
George Mason University

As a working definition of consciousness, I propose the following: A conscious organism represents its environment, and possibly itself, to itself, and uses this representation to engage in adaptive behavior with respect to its environment. A representation is a mapping between two systems, the representing system and the represented system, such that properties of the representing system map onto corresponding properties of the represented system. A major reason for the evolutionary success of humans is our striking ability to use representations to predict the effects of actions, to communicate these predictions to each other, and then to implement only those actions predicted to have desirable effects.
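To make the notion of a representing system concrete, here is a minimal sketch in Python (my own illustration, not part of the talk): a toy environment, an internal model whose state maps onto the environment's state, and an agent that predicts the effect of each candidate action and implements only those actions predicted to move it toward a goal. All class and variable names are hypothetical.

# Minimal sketch (illustrative, not from the talk): a representing system whose
# states map onto states of a represented environment, used to predict the
# effects of actions and to implement only those predicted to be desirable.

class Environment:
    """The represented system: a one-dimensional world with a goal position."""
    def __init__(self, position=0, goal=3):
        self.position = position
        self.goal = goal

    def apply(self, action):        # action is -1 or +1
        self.position += action


class Representation:
    """The representing system: an internal model mirroring the environment."""
    def __init__(self, believed_position, believed_goal):
        self.position = believed_position
        self.goal = believed_goal

    def predict(self, action):
        """Predict the effect of an action without acting in the world."""
        return self.position + action

    def desirable(self, predicted_position):
        """An outcome is desirable if it moves the believed state toward the goal."""
        return abs(self.goal - predicted_position) < abs(self.goal - self.position)


env = Environment(position=0, goal=3)
model = Representation(believed_position=0, believed_goal=3)

for _ in range(5):
    # Consider candidate actions; implement only one predicted to help.
    for action in (-1, +1):
        if model.desirable(model.predict(action)):
            env.apply(action)                        # act on the represented system
            model.position = model.predict(action)   # update the representing system
            break

print(env.position)  # reaches the goal position, 3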

I represent consciousness by saying that some part of the internal state of a conscious organism (the locus of which appears to be the brain) consists of a representation of its environment. It is important to remember that this representation of consciousness is not what consciousness is. Non-conscious systems (e.g., books, computers) can also represent other systems. One crucial difference between conscious and non-conscious representation systems is the phrase "to itself" in the above working definition: a conscious system is the consumer of its own representations, whereas we are the consumers of the non-conscious representations we construct. We are beginning to learn the rudiments of how brains do their representing, and to apply this understanding to build more sophisticated non-conscious representations. As yet, however, we have no theory of the properties of brains that cause their states to "have meaning" to an organism.

Even in the absence of a theory of meaning, we can study the relationship between representation and reality by building models of the representation process. John von Neumann appreciated that a full understanding of measurement in quantum mechanics required an integrated model encompassing both system and observer. This talk describes such an integrated model. The theory of probability as belief dynamics provides a way to model both physical and representational aspects of a conscious system in a unified way. To do this, we must abandon the nineteenth century positivist view that probability is appropriate only for modeling inherently random processes, and move back to the older view of probability as a representation of a rational agent's degrees of belief about propositions about which the agent is uncertain. The common mathematics underlying sequential decision theory and the dynamics of physical systems is used to construct a theory of the evolution of a conscious system's representation of itself and its environment. In this theory, the "physical" (observed features in the system's representation language) and the "representational" (probability distributions for time-space evolution of these features, including one-step ahead predictions) are complementary variables, as are position and momentum in quantum mechanics.
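As a toy illustration of probability as an agent's degrees of belief (my own example, using a standard Beta-Bernoulli model rather than anything specific to the talk), the sketch below maintains a belief distribution over an uncertain proposition, issues a one-step-ahead prediction before each observation, and updates the belief by Bayes' rule afterward.

# Toy sketch (illustrative only): degrees of belief about an unknown Bernoulli
# parameter, with one-step-ahead predictions made before each observation and a
# conjugate Bayesian update afterward.

def one_step_ahead_prediction(alpha, beta):
    """Predictive probability that the next observation is 1 under a
    Beta(alpha, beta) belief about the unknown parameter."""
    return alpha / (alpha + beta)

def update_belief(alpha, beta, observation):
    """Bayesian (conjugate) update of the belief after observing 0 or 1."""
    return (alpha + observation, beta + (1 - observation))

alpha, beta = 1.0, 1.0          # uniform prior: complete uncertainty
observations = [1, 1, 0, 1, 1]  # hypothetical data stream

for y in observations:
    p = one_step_ahead_prediction(alpha, beta)
    print(f"predict P(next = 1) = {p:.2f}, then observe {y}")
    alpha, beta = update_belief(alpha, beta, y)

print(f"posterior mean after the data: {alpha / (alpha + beta):.2f}")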

This talk describes how cross-fertilization between statistical physics and machine learning has led to efficient algorithms for highly complex and heretofore intractable learning and optimization problems, presents a general framework for the dynamics of self-representing information processing systems that is compatible with current models of our physical universe, and describes how integrating quantum mechanics with evolutionary theories of learning and multiple-actor decision making fills complementary gaps in current theories of computing, computational psychology, and physics. The result is a unified ontology for science that integrates the material and mental aspects of reality and provides a fully adequate foundation for scientific understanding across the physical, biological, and social sciences. Conscious experience, learning, and decision making play a central and fundamental role in this ontology, in contrast to their epiphenomenal role in the classical ontology. The talk concludes with speculations on potential observational tests of the proposed framework, and on the sociological and technological implications of a unified scientific understanding of the physical and mental aspects of nature.
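One familiar example of the statistical-physics-to-machine-learning cross-fertilization mentioned above (chosen by me for illustration; the talk does not single it out) is simulated annealing, which applies the Metropolis acceptance rule and a cooling schedule from statistical mechanics to a generic optimization problem. The objective function below is hypothetical.

# Illustrative only: simulated annealing, a standard statistical-physics-inspired
# optimization method (Metropolis acceptance with a decreasing "temperature").
import math
import random

def energy(x):
    """A rugged one-dimensional objective with many local minima."""
    return x * x + 10.0 * math.sin(3.0 * x)

def simulated_annealing(x0, steps=10_000, t_start=5.0, t_end=0.01):
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    for k in range(steps):
        t = t_start * (t_end / t_start) ** (k / steps)  # geometric cooling schedule
        candidate = x + random.gauss(0.0, 0.5)           # propose a local move
        e_new = energy(candidate)
        # Metropolis rule: always accept improvements; accept worse moves with
        # probability exp(-dE / T), which shrinks as the system cools.
        if e_new < e or random.random() < math.exp((e - e_new) / t):
            x, e = candidate, e_new
            if e < best_e:
                best_x, best_e = x, e
    return best_x, best_e

random.seed(0)
print(simulated_annealing(x0=8.0))  # approximate global minimizer and its energy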

*This research was sponsored in part by a Career Development Fellowship from the Krasnow Institute at George Mason University

