Showing posts with label HTDE. Show all posts

August 31, 2014

Gödel's Revenge: All-Encompassing Formalisms vs. Incomplete Formalisms

This content is cross-posted to Tumbld Thoughts. A loosely-formed story in two parts about the pros and cons of predicting the outcomes of, and otherwise controlling, complex sociocultural systems. Kurt Gödel is sitting in the afterlife cafe right now, scoffing but also watching with great interest.



I. It's an All-encompassing, Self-regulation, Charlie Brown!


Here is a video [1] by the complexity theorist Dirk Helbing about the possibility of a self-regulating society. Essentially, combining big data with the principles of complexity would allow us to solve previously intractable problems [2]. This includes more effective management of everything from massively parallel collective behaviors to very rare events.


But controlling how big data is used can keep us from getting into trouble as well. Writing at the Gigaom blog, Derrick Harris argues that the potentially catastrophic effects of AI taking over society (the downside of the singularity) can be avoided by keeping key data away from such systems [3]. In this case, even hyper-complex AI systems based on deep learning can become positively self-regulating.

NOTES:

[2] For a cursory review of algorithmic regulation, please see: Morozov, E.   The rise of data and the death of politics. The Guardian, July 19 (2014).

For a discussion as to why governmental regulation is a wicked problem and how algorithmic approaches might be inherently unworkable, please see: McCormick, T.   A brief exchange with Tim O’Reilly about “algorithmic regulation”. Tim McCormick blog, February 15 (2014).

[3] Harris, D.   When data become dangerous: why Elon Musk is right and wrong about AI. Gigaom blog, August 4 (2014).


II. Arguing Past Each Other Using Mathematical Formalisms


Here are a few papers on argumentation, game theory, and culture. My notes are below each set of citations. A short but dense reading list, and a good one nonetheless.

Brandenburger, A. and Keisler, H.J.   An Impossibility Theorem on Beliefs in Games. Studia Logica, 84(2), 211-240 (2006).

* shows that any two-player game is embedded in a system of reflexive, meta-cognitive beliefs. Players not only model payoffs that maximize their utility, but also model the beliefs of the other player. The resulting "belief model" cannot be completely self-consistent: beliefs about beliefs have holes which serve as sources of logical incompleteness.

What is Russell's Paradox? Scientific American, August 17 (1998).

* introduction to a logical paradox which can be resolved by distinguishing between sets and sets that describe sets using a hierarchical classification method. This paradox is the basis for the Brandenburger and Keisler paper.
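For the record, here is the paradox in symbols (the standard set-theoretic formulation, independent of the game-theoretic construction above):

```latex
% Russell's paradox in one line: define R as the set of all sets that are not
% members of themselves.
R = \{\, x \mid x \notin x \,\}
% Then asking whether R belongs to itself is contradictory either way:
R \in R \iff R \notin R
% Hierarchical (type-theoretic) resolutions forbid a set from being defined by a
% condition that quantifies over the level it belongs to, dissolving the paradox.
```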

Mercier, H. and Sperber, D.   Why do humans reason? Arguments for an argumentative theory. Behavioral and Brain Sciences, 34, 57-111 (2011).


Oaksford, M.   Normativity, interpretation, and Bayesian models. Frontiers in Psychology, 5, 332 (2014).

* a new-ish take on culture and cognition called argumentation theory. Rather than reasoning to maximize individual utility, reasoning is done to maximize argumentative context. This includes decision-making that optimizes ideational consistency. This theory predicts phenomena such as epistemic closure, and might be thought of as a postmodern version of rational agent theory.

There also seems to be an underlying connection between the "holes" in a culturally-specific argument and the phenomenon of conceptual blending, but that is a topic for a future post.

April 6, 2014

Fireside Science: The Structure and Theory of Theories

This content is being cross-posted to Fireside Science. This post represents a first-pass approximation (and is perhaps a confounded, naive theory in itself). Hope you find it educational at the very least.


Are all theories equal? In an age where creationism is making its way into the school curriculum (under the guise of intelligent design) and forms of denialism and conspiracy theory are becoming mainstream, this is an important question. While classic philosophy of science and logical positivist approaches simply assume that the best theories evolve through the scientific process, living in an era of postmodernism, multiculturalism, and the democratization of information demands that we think about this in a new way.

Sense-making as Layers of Information
By taking cues from theoretical artificial intelligence and contemporary examples, we can revise the theory of theories. Indeed, we live in interesting times. But what is a theory --  and why do people like to say it's "just a theory" when they disagree with the prevailing model? One popular view of theory is that of "sense-making" [1]: that is, theories allow us to synthesize empirical observations into a mental model that allows us to generalize without becoming overwhelmed by complexity or starting from scratch every time we need to make a predictive statement.

The process of making sense of the world by building theories. Keep this in mind as we discuss the differences between naive and informed theories. COURTESY: Figure 2 in [1b].

Yet sense-making is not the whole story, particularly when theories compete for acceptance [2]. Are all theories equal, or are some theories more rigorous than others? This question is in much the same vein as the critique of "absolute facts" in postmodern theory. To make sense of this, I propose that there are actually two kinds of theory: naive theories and informed theories. Naive theories rely on common sense, and can often do very well as heuristic guides to the world. However, they tend to fall apart when presented with counter-intuitive phenomena. This is where informed theory becomes important. Informed theories are not synonymous with scientific theories -- in fact, some ancient beliefs and folk theories can fall into this category alongside formal scientific theories. We will see the reasons for this nominal equivalence (and the non-equivalence of more naive theories) as we go through the next few paragraphs.

Naive and informed theories can be distinguished by their degree of "common sense". Normally, common sense is a value judgement. In this case, however, common sense involves a lack of information. Naive theories tend to be intuitive rather than counterintuitive, and are constructed only from immediate observations and abductive reasoning between these observations. Naive theoretical synthesis can be thought of as a series of "if-and-then" statements, as the sketch below illustrates. For example, if A and B are observed, and they can be linked through co-occurrence or some other criterion, then the link between them is judged to be plausible.
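As a caricature of this "if-and-then" synthesis, here is a minimal sketch (hypothetical observations and a made-up co-occurrence threshold): observations are linked purely by how often they appear together, with no mechanism or deduction involved.

```python
from itertools import combinations
from collections import Counter

# Hypothetical "immediate observations", one set per day (toy data).
observations = [
    {"rooster_crows", "sunrise"},
    {"rooster_crows", "sunrise", "rain"},
    {"sunrise"},
    {"rooster_crows", "sunrise"},
]

# Count how often each pair of observations co-occurs.
pair_counts = Counter()
for day in observations:
    for a, b in combinations(sorted(day), 2):
        pair_counts[(a, b)] += 1

# Naive "theory": any pair seen together often enough is judged a plausible link,
# regardless of mechanism (the rooster does not cause the sunrise).
THRESHOLD = 3  # made-up cutoff, for illustration only
naive_theory = [pair for pair, n in pair_counts.items() if n >= THRESHOLD]
print(naive_theory)  # [('rooster_crows', 'sunrise')]
```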

The role of abductive theories in organizations. COURTESY: Free Management Library.

Informed theories, on the other hand, utilize deduction and can be divided into working theories (e.g. heuristics) and deep theories that explain, predict, and control. Working theories tend to utilize inductive logic, whereas deep theories tend to rely upon deductive logic. Since deep theories are deductive, they tend to be multi-layered constructs with mechanisms and premises based on implicit assumptions [3]. As a deductive construct, a deep informed theory can lead to inference, and inference gives us a powerful way to predict outcomes that are not so intuitive. For example, phylogenetic theory allows us to infer common ancestors of extant species that may look nothing like an "average" of, or a "cross" between, their descendants.

A contingency table showing the types and examples of naive and informed theories.



              NAIVE                                             INFORMED

SHALLOW       Cults, philosophies based on simple principles    Pop-psychology and pop-science

DEEP          Conspiracy theories                               Scientific theories

Naive and informed theories can also be distinguished by their degree of complexity. As they are based on uninformed intuition, naive theories are self-evident and self-complete, perhaps too much so. Fundamentalist religious belief and denialist-based political philosophies are based on simple sets of principles and are said by some to be tightly self-referential [4]. This inflexible self-referential capacity means that these theories rely on common sense over social complexity. Conspiracy theories and denialist tendencies are deeper versions of naive theories [5], but unlike their informed counterparts, they do not rest on objective data and are particularly resistant to updating [6]. By contrast, formal theories are based on abstractions and possess incompleteness-tolerance. This is often by necessity, as we cannot observe every instance of every associated process we would like to understand.

Sometimes the deepest naive theories lead to conspiracies. I have it on the highest authority.

Theory of Ontological Theories?
This leads us to an interesting set of questions. First, are the informed theories that currently exist in many fields of inquiry inevitable outcomes? Second, why are some fields more theoretical than others, and why are theory and data more integrated in some fields but not others? Third, is the state of theory in different areas of science due to historical context, or is it a consequence of the natural laws they purport to make sense of? This last question is one of historical contingency vs. field-specific structure. To answer these three questions, we will now briefly examine five examples from various academic disciplines. Underlying many of these approaches to informed theory is an assumption: theories are a search for ontological truths rather than the product of interactions among privileged experts. This is where informed theories hold an advantage -- they can change gradually with regard to new data and hypotheses while also remaining relevant. This is an ideal in any case, so let us get to the examples:

1) Economics has an interesting relationship to theory. Formal macroeconomic theory involves two schools of thought: freshwater and saltwater. The former group favors the theories of the free market, while the latter group adheres to Keynesian principles. However, there are also adherents of political economy, who favor models of performativity over formal mathematical models. Since the financial crisis of 2008, there has been a rise of interest in alternative economic theories and associated models, perhaps serving as an example of how theories change and are supplanted over time. And, of course, a common naive theory of economics is based on confounding micro- (or household) and macro- (or national-scale) economics.

2) Physics is thought of as the gold standard of scientific theory. For example, "Einstein" is synonymous with "theory" and "genius". The successes of deep, informed theories such as relativity and quantum mechanics are well-known. Aside from explanation and prediction, physics theory also pursues logical consistency and grand unification, an enterprise that can often be separated from experimentation. As the gold standard of scientific theory, physics also provides a theoretical conduit to other disciplines, sometimes without modification. We will discuss this further in point #5.


This book [7] is a statement on so-called "bad" theories. The statement is: although string theory is structurally elegant, it is not functionally elegant like quantum gravity. But does that make quantum gravity a superior theory?

3) In neuroscience and cell biology, theories are often deemed superfluous and inherently incomplete in lieu of ever more data. This is partially due to our level of understanding relative to the complex nature of these fields. Yet many naive and informed social theories exist, despite the complexity of the social world. So what is the difference? It could be a matter of neuroscientists and cell biologists not being oriented towards theoretical thinking. This may explain why computational neuroscience and systems biology exist as fields quite independent of their biological counterparts.

4) Theoretical constructs associated with evolution by natural selection are the consensus in evolutionary biology. This wasn't always the case, however, as 19th century German embryologists and 18th century adherents to Lamarckian theory had competing ideas of how animal diversity was produced and perpetuated. However, Darwinian notions of evolution by natural selection did the best job at synthesizing previous knowledge about natural history with a formal mechanism for descent with modification. In popular culture, there has always been resistance to Darwinian evolution. Usually, these divine creation-inspired naive theories are embraced as a contrarian counterbalance to the deep, informed theory advocated by scientific authorities. In this case, theories have a social component, as Social Darwinism (a social co-option of Darwinian evolution) was popular in the 19th and early 20th centuries.

5) Because informed theories can explain invariants of the natural world, they often cross academic disciplines. Sometimes these crosses are direct. Evolutionary Psychology is one such example. Evolutionary theory can explain biological evolution, and as we are the products of evolution, the same theory should explain the evolution of the human mind. It is a simple analogical transfer, but one that has proven much harder to carry off with the same results. But sometimes theories cross into domains not because of their suitability for the problem at hand, but because they are mathematically rigorous and/or have great predictive power in their original domain. The "quantum mind" is one such example of this. Is "quantum mind" theory any better or more powerful than a naive theory about how the mind works? It is unclear. However, this co-option suggests that even the most reputable informed theories can be cultural artifacts. A real caveat emptor.

Roger Penrose et al. [8] will tell us about everything, in the spirit of physics and mathematics.

Properties of the Theory of Theories
The inherent dualisms of the theory of theories stem from deeper cognitive divisions between matter-of-fact and abstract thinking. As cultural constructs, matter-of-fact theories are much more amenable to the narrative structures that permeate folklore and pseudo-science. This does not mean that abstract theories are "better" or any more "scientific" than matter-of-fact formulations. In fact, abstract theories are more susceptible to cultural blends [9] or symbolic confabulation [10], as these short-cuts aid us in conceptual understanding.

Scientific theories tend to be abstract, informed ones, but the scientific theories that are better known to the general public have many features of naive theories. Examples of this include Newtonian physics and the Big Bang. There is a certain intuitive satisfaction from these two theories that is not offered by, say, quantum theory or Darwinian evolution [11]. This satisfaction arises from consistency with one's immediate sensory surroundings and/or existing cultural myths. Interestingly, naive (and mythical) versions of quantum theory and Darwinian evolution have arisen alongside the more formal theories. These faux-theories use their informed-theory counterparts as a narrative template to explain everything from the spiritual basis of the mind (Chopra's Nonlocality) to social inequalities (Spencer's Social Darwinism).

But what about beauty in theory? Again, this could arguably be a feature of naive theorizing. Whether it is the over-application of parsimony or an over-reliance on elegance and beauty [7], informed theories require a degree of initial convolution before such features can be incorporated into the theory. In other words, these things should not be goals in and of themselves. Rather, deep, informed theories should be robust enough to be improved upon incrementally without having to be completely replaced [12]. The beauty of parsimony and symmetry should only be considered a nice side-benefit. There is also a significant role for mental and statistical models in theory-building, but for the sake of relative simplicity I am intentionally leaving this discussion aside for now.

Tides go in, tides go out. When it's God's will, it's a short and neat proposition. When it's more complicated, then it's scientific inquiry. COURTESY: Geekosystem and High Power Rocketry blogs.

In a future post, I will move from the notion of a theory of theories to the need for an analysis of analyses. Much like the theory of theories, a deep reconsideration of analysis is also needed. This has been driven by the scientific replication crisis, the proliferation of data (e.g. infographics) on the internet, and the rise of big data (e.g. very large datasets, once again enabled by the internet). 

NOTES:
[1] Here are a few references on the cognition of sense-making, particularly as it relates to theory construction:

a) Klein, G., Moon, B. and Hoffman, R.F.   Making sense of sensemaking I: alternative perspectives. IEEE Intelligent Systems, 21(4), 70–73 (2006).

b) Pirolli, P., & Card, S.   The sensemaking process and leverage points for analyst technology as identified through cognitive task analysis. Proceedings of the International Conference on Intelligence Analysis (2005).

[2] Here are some references that will help you understand the "hows" and "whys" of theory competition, with particular relevance to what I am calling deep, informed theories:

a) Steiner, E.   Methodology of Theory-building. Educology Research Associates, Sydney (1988).

b) Kuhn, T.   The structure of scientific revolutions. University of Chicago Press (1962).

c) Arbesman, S.   The Half-life of Facts. Current Press (2012).

[3] Sometimes, naive theorists will accuse deep, informed theorists of being "stupid" or "irrelevant". This is because the theories generated do not conform to the expectations and understandings of the naive theorist.

Paul Krugman calls one such instance "the myth of the progressive economist": Krugman, P.   Stupidity in Economic Discourse 2. The Conscience of a Liberal blog, April 1 (2014).

[4] Religious fundamentalist and denialist groups also seem to theorize in a deep naive manner, using a tightly self-referential set of theoretical propositions. In these cases, however, common sense is replaced with an intersubjective (e.g. you have to be part of the group to understand) self-evidence. The associated logical extremes tend to astound people not in the "know".

a) Example from religious fundamentalism: Koerth-Baker, M.   What do Christian fundamentalists have against set theory? BoingBoing, August 7 (2012) AND Simon, S.   Special Report: Taxpayers fund creationism in the classroom. Politico Pro, March 24 (2014).

For a discussion of Nominalism (basic math) vs. Platonism (higher math) in Mathematics, please see: Franklin, J.   The Mathematical World. Aeon Magazine, April 7 (2014).

b) Example from climate change denialism: Cook, J. and Lewandowsky, S.   Recursive Fury: facts and misrepresentations. Skeptical Science blog, March 21 (2013).

[5] For one such example, please see: Roberts, D.   Conservative hostility to science predates climate science. Grist.org, August 12 (2013).

For a more comprehensive background on naive theories (in this case, the development of naive theories of physics among children) please see the following:

a) Reiner, M., Slotta, J.D., Chi, M.T.H., and Resnick, L.B.   Naive Physics Reasoning: a commitment to substance-based conceptions. Cognition and Instruction, 18(1), 1-34 (2000).

b) Vosniadou, S.   On the Nature of Naive Physics. In "Reconsidering Conceptual Change: issues in theory and practice", M. Limon and L. Mason, eds., Pgs. 61-76, Kluwer Press (2002).

For the continued naive popularity of the extramission theory of vision, please see the following:

c) Winer, G. A., Cottrell, J. E., Gregg, V., Fournier, J. S., & Bica, L. A. (2002). Fundamentally misunderstanding visual perception: Adults' beliefs in visual emissions. American Psychologist, 57, 417-424.

[6] Sometimes, theories that are denialist in tone are constructed to preserve certain desired outcomes from data that actually suggest otherwise. In other words, a narrative takes precedence over a more objective understanding. Charles Seife calls this a form of "proofiness".

For more, please see: Seife, C. Proofiness: how you're being fooled by numbers. Penguin Books (2011).

[7] Smolin, L.   The Trouble with Physics. Houghton-Mifflin (2006).

[8] Penrose, R., Shimony, A., Cartwright., N., and Hawking, S.   The large, the small, and the human mind. Cambridge University Press (1997).

[9] Fauconnier, G.   Methods and Generalizations. In "Cognitive Linguistics: foundations, scope, and methodology". T. Janssen and G. Redeker, eds, 95-128. Mouton DeGruyter (1999).

[10] Confounding is a psychological concept that identifies when ideas and deep informed theories are confused or otherwise condensed for purposes of superficial understanding or misinterpretation. In the case of creationists, such intentional confounds are often used to generate doubt and confusion around subtle and complex concepts.

a) Role of confabulation in cognition (a theory): Hecht-Nielsen, R.   Confabulation Theory. Scholarpedia, 2(3), 1763 (2007).

b) Example of intentional confounding from anti-evolutionism: Moran, L.A.   A creationist tries to understand genetic load. Sandwalk blog, April 1 (2014).

[11] By "conforming to intuitive satisfaction", I mean that Newtonian physics explains the physics of things we interact with on an everyday basis, and the Big Bang is consistent with the idea of divine creation (or creation from a singular point). This is not to say that these theories were developed because of these features, but perhaps explains their widespread popular appeal.

[12] Wholesale replacement of old deep, informed theories is explained in detail here: Kuhn, T.   Structure of Scientific Revolutions. University of Chicago Press (1962).

November 19, 2013

Fireside Science: The Inefficiency (and Information Content) of Scientific Discovery

This content has been cross-posted to Fireside Science.


In this post, I will discuss a somewhat trendy topic that needs further critical discussion. It combines a crisis in replicating experiments with the recognition that science is not a perfect or errorless pursuit. We start with a rather provocative article in the Economist called "Trouble at the Lab" [1]. The main idea: science needs serious reform in its practice, from standardization of experimental replicability to greater statistical rigor.


While there are indeed perpetual challenges posed by the successful replication of experiments and finding the right statistical analysis for a given experimental design, most of the points in this article should be taken with a grain of salt. In fact, the conclusions seem to suggest that science should be run more like a business (GOAL: most efficient allocation of resources). This article suffers from many of the same issues as the Science article featured in my last Fireside Science post. Far from being an efficient process, making scientific discoveries and uncovering the secrets of nature requires a very different set of ideals [2]. But don't just rely on my opinions. Here is a sampling of letters to the editor which followed:


The first is from Stuart Firestein, the author of "Ignorance: how it drives science", which is discussed in [2]. He argues that applying a statistician's theoretical standards to all forms of data is not realistic. While the portion of the original article [1] discussing problems with statistical analysis in most scientific papers is the strongest point made, it also rests on some controversial assumptions. 

The first involves a debate as to whether or not Null Hypothesis Significance Testing (NHST) is the best way to uncover significant relationships between variables. NHST is the use of t-tests and ANOVAs to determine significant differences between experimental conditions (e.g. treatment vs. no treatment). As an alternative, naive and other Bayesian methods have been proposed [3]. However, this still makes a number of assumptions about the scientific enterprise and the process of experimentation, to which we will return.
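To make the contrast concrete, here is a minimal sketch on simulated data (my own illustration, not the specific Bayesian standards proposed in [3]): the NHST route yields a p-value from a t-test, while a simple flat-prior Bayesian approximation reports the posterior probability that the effect is positive.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
treatment = rng.normal(loc=1.0, scale=2.0, size=30)   # toy data
control = rng.normal(loc=0.0, scale=2.0, size=30)

# NHST: a two-sample t-test asks "how surprising is this difference
# if the true effect were exactly zero?"
t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# A simple Bayesian alternative: approximate the posterior of the mean
# difference as a normal centred on the observed difference with the
# standard error as its spread (flat prior), and report the probability
# that the effect is positive rather than a yes/no significance verdict.
diff = treatment.mean() - control.mean()
se = np.sqrt(treatment.var(ddof=1) / len(treatment) +
             control.var(ddof=1) / len(control))
posterior = stats.norm(loc=diff, scale=se)
print(f"P(effect > 0 | data) = {1 - posterior.cdf(0.0):.3f}")
```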


The second letter refers to one's philosophy of science orientation. This gets a bit at the issue of scientific practice, and how the process of doing science may be misunderstood by a general audience. Interestingly, the notion of "trust, but verify" does not come from science at all, but from diplomacy/politics. Why this is assumed to also be the standard of science is odd.


The third letter will serve as a lead-in to the rest of this post. This letter suggests that the scientific method is simply not up to the task of dealing with highly complex systems and issues. The problem is one of public expectation, which I agree with in part. While experimental methods provide a way to rigorously examine hypothetical relationships between two variables, uncertainty may often swamp out that signal. I think this aspect of the critique is a bit too pessimistic, but let's keep these thoughts in mind.

A reductionist tool in a complex world

Now let's turn to what an experiment uncovers with respect to the complex system you want to understand. While experiments have great potential for control, they are essentially hyper-reductionist in scope. When you consider that most experiments test the potential effect of one variable on another, an experiment may serve no less of a heuristic function than a simple mathematical model [4]. And yet in the popular mind, empiricism (e.g. data) tends to trump conjecture (e.g. theory) [5].

Figure 1. A hypothesis of the relationship between a single experiment and a search space (e.g. nature) that contains some phenomenon of interest.

Ideally, the goal of a single experiment is to reliably uncover some phenomenon in what is usually a very large discovery space. As we can see in Figure 1, a single experiment must be designed to overlap with the phenomenon. This can be very difficult to accomplish when the problem at hand is complex and multi-dimensional (HINT: most problems are). A single experiment is also a relatively information-poor way to conduct this investigation, as shown in Figure 2. Besides treating it as a highly-controllable (if highly reduced) means to test hypotheses, an alternate way to think about an experimental design is as an n-bit register [6], as sketched below.
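To make the register analogy concrete, here is a toy sketch (my own illustration of the idea in [6], with hypothetical factor names): each bit records the setting of one binary factor, so an n-factor design addresses at most 2^n states, and any single comparison reads out only a sliver of that space.

```python
from itertools import product

# Hypothetical binary factors an experiment can set or observe (toy example).
factors = ["temperature_high", "drug_present", "light_on"]

# Treat each experimental condition as an n-bit register: one bit per factor.
conditions = list(product([0, 1], repeat=len(factors)))
print(f"{len(factors)}-bit register -> {len(conditions)} addressable conditions")

for bits in conditions:
    setting = {name: bool(bit) for name, bit in zip(factors, bits)}
    print(bits, setting)

# A single two-group experiment flips only one of these bits at a time, which is
# one reason a lone experiment is an information-poor probe of a large space.
```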

Figure 2. A single experiment may be an elegant way to uncover the secrets of nature, but how much information does it actually contain?

Now to get an idea of how such overlap works in the context of replication, we can turn to the concept of an experimental footprint (Figure 3). An experimental footprint qualitatively describes what an experiment (or its replication) uncovers relative to some phenomenon of interest. Let's take animal behavior as an example. There are many sources of variation that contribute to a specific behavior. In any one experiment, we can only observe some of the behavior, and even less of the underlying contributing factors and causes.

A footprint is also useful in terms of describing two things we often do not think about. One is the presence of hidden variables in the data. Another is the effect of uncertainty. Both depend on the variables tested and problems chosen. But just because subatomic particles yield fewer surprises than human psychology does not necessarily mean that the Psychologist is less capable than the Physicist.

Figure 3. Experimental footprint of an original experiment and its replication relative to a natural phenomenon.

The original maternal imprinting experiments conducted among geese by Konrad Lorenz serve as a good example. The original experiments were supposedly far messier [7] than the account presented in modern textbooks. What if we suddenly were to find out that replication of the original experimental template did not work in other animal species (or even among ducks anymore)? It suggests that we may need a new way to assess this (other than chalking it up to mere sloppiness).


So while lack of replication is a problem, the notion of a crisis is overblown. As we have seen in the last example, the notion of replicable results is an idealistic one. Perhaps instead of saying that the goal of experimental science is replication, we should consider a great experiment as one that reveals truths about nature. 

This may be best achieved not by the presence of homogeneity, but by a high degree of tolerance (or robustness) to changes in factors such as ecological validity. To assess the robustness of a given experiment and its replications (or variations), we can use information content to tell us whether or not a given set of non-replicable experiments actually yields information. This might be a happy medium between an anecdotal finding and a highly-repeatable experiment.


Figure 4. Is the goal of an experiment unfailingly successful replication, or a robust design that provides diverse information (e.g. successful replications, failures, and unexpected results) across replications?

Consider the case of an experimental paradigm that yields various types of results, such as the priming example from [1]. While priming is highly replicable under certain conditions (e.g. McGurk effect) [8], there is a complexity that requires taking the experimental footprint and systematic variation between experimental replications into account. 

This complexity can also be referred to as the error-tolerance of a given experiment. Generally speaking, the error-tolerance of a given set of experiments is correspondingly higher as information content (related to variability) increases. So even when replications do not pan out, they are nonetheless still informative. To maximize error-tolerance, the goal should be an experiment with a small enough footprint to be predictive, but a large enough footprint to be informative. A rough sketch of this idea follows.
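One way to cash out "information content" here (a sketch of my own, using Shannon entropy over outcome categories rather than any established standard in the replication literature) is to compare a paradigm whose replications all agree with one that returns a mix of successes, failures, and surprises.

```python
import math
from collections import Counter

def outcome_entropy(outcomes):
    """Shannon entropy (in bits) of the distribution of replication outcomes."""
    counts = Counter(outcomes)
    total = len(outcomes)
    return sum(-(n / total) * math.log2(n / total) for n in counts.values())

# Hypothetical outcome labels for two experimental paradigms.
always_replicates = ["success"] * 8
mixed_results = ["success"] * 4 + ["failure"] * 2 + ["unexpected"] * 2

print(outcome_entropy(always_replicates))  # ~0 bits: consistent, but says little about the footprint
print(outcome_entropy(mixed_results))      # 1.5 bits: the variation itself carries information
```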

In this way, experimental replication would no longer be the ultimate goal. Instead, the goal would be to achieve a sort of meta-consistency. Meta-consistency could be assessed by both the robustness and statistical power of an experimental replication. And we would be able to sleep a little better at night knowing that the line between hyper-reductionism and fraudulent science has been softened while not sacrificing the rigors of the scientific method.

NOTES:

[1] Unreliable Research: trouble at the lab. Economist, October 19 (2013).

[2] Alicea, B.   Triangulating Scientific “Truths”: an ignorant perspective. Synthetic Daisies blog, December 5 (2012).

[3] Johnson, V.E.   Revised standards for statistical evidence. PNAS, doi: 10.1073/pnas.1313476110 (2013).

[4] For more information, please see: Kaznatcheev, A.   Are all models wrong? Theory, Games, and Evolution Group blog, November 6 (2013).

[5] Note that the popular conception of what a theory is and what theories actually are (in scientific practice) constitutes two separate spheres of reality. Perhaps this is part of the reason for all the consternation.

[6] An n-bit register is a concept from computer science: a register is a place to hold information during processing. In this case, processing is analogous to exploring the search space of nature. Experimental designs are thus representations of nature that enable this register.

For a more formal definition of a register, please see: Rouse, M.   What is a register? WhatIs.com (2005).

[7] This is a personal communication, as I cannot remember the original source. The larger point here, however, is that groundbreaking science is often a trial-and-error affair. For an example (and its critique), please see: Lehrer, J.   Trials and Errors: why science is failing us. Wired, December 16 (2011).

[8] For more on the complexity of psychological priming, please see: Van den Bussche, E., Van den Noortgate, W., and Reynvoet, B.   Mechanisms of masked priming: a meta-analysis. Psychological Bulletin, 135(3), 452-477 (2009).

April 11, 2013

Richard Gordon, Transmogrifying from Virtual to Physical, Brought us Bits of Embryogenesis

I was honored to be able to bring Dr. Richard (Dick) Gordon to the Michigan State campus for a seminar on April 9 (see video on Vimeo). Currently at the Gulf Specimen Marine Laboratory (and retired from the University of Manitoba), Dick is a theoretical developmental biologist of the highest caliber [1]. He gave a talk entitled "Cause and Effect in the Interaction between Embryogenesis and the Genome" [2]. He even brought toys [3] to illustrate his theory of cellular differentiation.

Dick's virtual world avatar (Paleo Darwin) is seated in the middle picture.

Dick Gordon, master collaborator.

The theoretical model he presented suggests that differentiation waves [4] pulse through the embryo during development, setting up spatially-restricted gene expression and differentiation into distinct cellular types. According to this view, each cell's differentiation is a binary and recursive process (e.g. one "decision" point building upon another), and is contingent upon the cell's position and environment. In this sense, higher-level organization (e.g. modules) is not caused by gene expression. Rather, the gene expression changes that lead to observable phenotypic modules [5] and other patterns are caused by the extracellular environment of the developing organism.
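To make the "binary and recursive" idea tangible, here is a toy sketch of my own (hypothetical thresholds standing in for position- and context-dependence; this is not Gordon's model): each wave appends one bit to a cell's fate code, so lineages accumulate decisions like a binary tree.

```python
# Toy model: each differentiation "wave" forces a binary decision in every cell,
# contingent on the cell's position; fates are the accumulated decision strings.
def differentiate(positions, n_waves):
    fates = {pos: "" for pos in positions}
    for wave in range(n_waves):
        # Hypothetical rule: each wave splits the field at a moving threshold,
        # standing in for position- and context-dependent responses.
        threshold = (wave + 1) / (n_waves + 1)
        for pos in positions:
            fates[pos] += "1" if pos > threshold else "0"
    return fates

# Ten cells along a one-dimensional axis, three waves -> up to 2^3 fate codes.
cells = [i / 10 for i in range(10)]
for pos, code in differentiate(cells, 3).items():
    print(f"cell at {pos:.1f} -> fate {code}")
```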

An example of a Wurfel toy, taken from a slide in his talk. A fine example of Canadian innovation.

There were many profound moments in this lecture. An overarching theme of the talk was how candidate ideas (e.g. hypotheses) are tested, implemented, and critically examined in the course of doing science. One of these was the "organizer" experiments of Hans Spemann [6], in which a piece of tissue transplanted to an embryo can induce the formation of a second animal. Subsequent experiments have shown that while transplanted tissue accomplishes this, other transplanted materials (even some which are non-organic) can induce this response as well. Perhaps the effect is not due to the tissue itself, but the hydrophobicity or hydrophilicity of the materials transplanted. This might be characterized as a special case of Type I error due to incomplete experimental information [7].

Picture of Hans Spemann (inducers).

Another candidate idea presented was Alan Turing's notion of "morphogens". Morphogens are hypothetical molecules proposed by Turing to drive pattern formation in a developing organism [8]. According to the talk, morphogens are not the causal factor for morphogenesis, nor are genetic regulatory cascades. Instead, they are both driven by expansion and contraction waves that course through the embryo. These waves (which have been observed) also trigger the mechanisms of differentiation (e.g. signaling molecules and gene expression changes) in cells. A good example of the problems related to establishing causality in a complex system.
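For readers unfamiliar with the morphogen idea being weighed here, below is a minimal Gray-Scott reaction-diffusion sketch (a standard Turing-style toy system chosen for brevity; the parameters are generic defaults, and this is neither the model from the talk nor Gordon's differentiation-wave mechanism). Two diffusing, reacting chemicals are enough to generate spots and stripes from a nearly uniform starting state.

```python
import numpy as np

# Gray-Scott reaction-diffusion on a small periodic grid (toy parameters).
N, steps = 100, 5000
Du, Dv, F, k = 0.16, 0.08, 0.035, 0.065

U = np.ones((N, N))
V = np.zeros((N, N))
U[45:55, 45:55], V[45:55, 45:55] = 0.50, 0.25   # seed a small perturbation

def laplacian(Z):
    # 5-point stencil with periodic boundaries
    return (np.roll(Z, 1, 0) + np.roll(Z, -1, 0) +
            np.roll(Z, 1, 1) + np.roll(Z, -1, 1) - 4 * Z)

for _ in range(steps):
    UVV = U * V * V
    U += Du * laplacian(U) - UVV + F * (1 - U)
    V += Dv * laplacian(V) + UVV - (F + k) * V

print("V range after simulation:", V.min(), V.max())   # a patterned field emerges
```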

Picture of Alan Turing (morphogens).

Time-course (and illustration) of differentiation waves moving across an embryo from the talk.

After the talk, Dick and I discussed the possible role of differentiation wave-like activity in the process of in vitro (or perhaps even in vivo) cellular reprogramming (the controlled phenotypic transformation of a cell from one phenotype to another). Interesting stuff, and as always, you are welcome to participate in the Embryo Physics course [9], which is made possible by a fine group of people. Please contact myself or Dick if you are interested in presenting.

The scene of the crime, so to speak. Some quiet moments before my virtual lecture (Scenes from a Graphical, Parallel Biological World) given in April, 2012.


NOTES:

[1] He was originally trained in chemical physics at the University of Oregon (home of the Oregonator). See his Google Scholar profile for more information. According to their records, he has an h-index of 32 (which is quite impressive). He also has an Erdos number of 2i (long story).

Animation of the Oregonator (activator/inhibitor system). COURTESY: Scholarpedia.

[2] Here is a link to the version of this talk (.pdf slides) presented in the Embryo Physics course on March 20, 2012.

[3] One of these was a Wurfel, which is a bunch of wooden blocks joined together with an elastic string. I own one of these, and before this lecture I had no idea as to its name!

The Wurfel was used to demonstrate the configurational constraints and opportunities afforded to the genome due to a cell's biophysical and epigenetic context. For more fun (and combinatorics) with puzzles, please see the following blog post: Puzzle Cube. Paleotechnologist blog, August 31 (2011).

[4] According to his talk, these may either be calcium waves or something functionally similar. For an introduction to embryonic calcium waves (and how to image them), please see:

Gillot, I. and Whittaker, M.   Imaging Calcium Waves in Eggs and Embryos. Journal of Experimental Biology, 184, 213–219 (1993).

[5] Here is a video from Jeff Clune (University of Wyoming) demonstrating how modularity might have evolved using the software platform HyperNEAT (evolutionary neural networks). Based on the following paper:

Clune, J., Mouret, J-B., and Lipson, H.   The evolutionary origins of modularity. Proceedings of the Royal Society B, 280, 20122863 (2013).

[6] Here is a YouTube video that explains Spemann's organizer experiments in more detail.

[7] This fits very much within the scope of the Hard-to-define Events (HTDE) approach. For more information, please see the HTDE 2012 workshop website.

[8] Here are some examples of the morphogen concept (sensu Turing) modeled using the Gro programming language (from the Klavins Lab, University of Washington).

The morphogen concept was some of Turing's later work. Even though Turing is best known as a computing pioneer, his coupled reaction-diffusion model of chemical morphogenesis has become a prevailing view of how developmental morphogenesis proceeds. These ideas are also useful in the computational modeling of textures. See Turing's classic paper for more information:

Turing, A.M.   The Chemical Basis of Morphogenesis. Philosophical Transactions of the Royal Society of London, 237 (641), 37–72 (1952).

[9] While not formally a MOOC, the Embryo Physics course is an example of distributed learning. For more information, watch for the forthcoming paper:

Gordon, R.   The Second Life Embryo Physics Course. Systems Biology in Reproductive Medicine, x(x), xxx-xxx (2013).

March 26, 2013

Upcoming Second Life Lecture and Summary Talk

This is cross-posted from my micro-blog, Tumbld Thoughts:


Here are slides from a lecture [1] to be given this Wednesday (March 27) to the Embryo Physics group (in Second Life) at 2pm PST. Slides posted to Figshare. A shorter version of the talk was originally part of HTDE 2012, a workshop in association with the Artificial Life XIII conference.

Selected slides from talk


Also, I am currently on the job market. Here is a short slideshow [2] that profiles my personal research expertise and interests (and the current version of my CV, which can be found here). Please take a look at both, and comments are welcome.


NOTES:

[1] Alicea, B.   Multiscale Integration and Heuristics of Complex Physiological Phenomena. Figshare, doi: 10.6084/m9.figshare.657992 (2013).

[2] Alicea, B.   Short Job Talk. Figshare, doi:10.6084/m9.figshare.639185.

UPDATE:
The talk went well, with about six avatars in attendance (I am the Tron lightcycle avatar). Below are some images from the proceedings, with a transcript of the talk also available.


February 25, 2013

Innovation-palooza for February

Here are a couple more cross-posts (on Hard-to-Define Events and Scientific Innovation, respectively) from my micro-blog, Tumbld Thoughts.






A. Take a look at the link [1] to a slideshow I am presenting in the near future as a follow-up to the Hard-to-define Events workshop I organized last summer. It is called "If your results are unpredictable, does it make them any less true?", and focuses on applying the hard-to-define events paradigm to biology and the development of scientific theory.


B. And here is an article from Joe Nocera of the NYT called "Innovation Nation at War" [2]. It is a critical assessment of the current patent system and how it is hurting the innovation economy. Specifically, he discusses patents in terms of their economic utility, and how companies buy patent portfolios and file patent lawsuits [3] in a manner that severely violates this principle. Perhaps a better solution would be a bond system tied to ideas futures, which would provide a more immediate payoff with less legal jujitsu [4].





C. As an added bonus, here is a podcast (courtesy of the NanoNerds YouTube channel) with Brian Bergstein from MIT Technology Review on the top ten emerging technologies of 2012. The continued development of cheap nanopore DNA sequencing technology is my favorite.

NOTES:


[2] Here is a real-time update of the "Tweetscape" for this article. Interesting comments.

[3] A conflict termed the "patent wars". Interesting infographic here.

[4] The martial arts reference is figurative, but the legal maneuvering is quite real (see here for an example on the anti-commons). See diagrams above for more information. These are some ideas I have been developing, and they stand in contrast to intellectual property-oriented patents.


