January 28, 2012

Representing rare events


In this post I will discuss the occurrence of rare events [1] and how they might be represented in computational systems. Rare events are intriguing precisely because they occur infrequently. We are all familiar with rare events in nature: avalanches, rogue waves, freak storms, and developmental mutations are but a few examples. Yet while these events are rare, they are not improbable. I posted a few months back on the context of such processes (overproductive systems) and their potential relevance to biology. While rare events themselves are sparsely distributed in time, they occur against a background process filled with many events we like to call "normal". This background is related to the normal distribution: background events occur within a certain range of expected outcomes.


One way in which rare events have been addressed from a statistical standpoint is to model them as a distribution of noise. In processes where we expect no rare events, we can assume that events will unfold according to a white noise distribution [2]. For processes where rare events occur at increasingly large magnitudes, a model of colored (or 1/f) noise can be used to capture the expectation of rare events embedded in a process. So-called pink noise provides a series of rare events of near-uniform size, many orders of magnitude above the size of background events, while black noise provides a series of rare events occurring against a background of near-inactivity.
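As a rough illustration (the construction and all parameters here are my own choices, not taken from the referenced work), colored noise can be synthesized by shaping the spectrum of a white noise series; the exponent beta in a 1/f^beta spectrum tunes how large and how rare the extreme excursions are:

```python
import numpy as np

def colored_noise(n, beta, rng=None):
    """Generate a length-n series with a 1/f^beta power spectrum.
    beta = 0 -> white noise; beta = 1 -> pink noise; beta > 2 -> "black" noise,
    where rare large excursions stand out against near-inactivity."""
    rng = np.random.default_rng(rng)
    spectrum = np.fft.rfft(rng.standard_normal(n))
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]                    # avoid division by zero at DC
    spectrum *= freqs ** (-beta / 2.0)     # shape the amplitude spectrum
    series = np.fft.irfft(spectrum, n)
    return series / series.std()           # normalize to unit variance

for beta, label in [(0, "white"), (1, "pink"), (3, "black")]:
    x = colored_noise(2**12, beta, rng=42)
    # stronger spectral shaping -> larger, rarer excursions from the background
    print(label, round(float(np.abs(x).max()), 2))
```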


The comparison of extreme 1/f noise to white noise can be made using a thought experiment. Suppose you enjoy sleeping with a white noise machine running in the background. Why do you enjoy this? One reason might be that a white noise machine provides a stochastic but uniform auditory stimulus that allows you to relax enough to fall asleep. Yet suppose you wanted to wake from your slumber. Most people use an alarm clock, which presents bursts of auditory stimuli at different magnitudes, depending on your preference.
This transition from a uniform stream of sound to a series of bursts embedded in a more uniform background is the transition from common events to rare events. Not only do rare events capture your attention, but they also provide a basis for large-scale transitions in coupled processes. In this case, your alarm clock is coupled to your sensory systems, which forces you to wake up abruptly.


One can also think of a noisy distribution as the null model for the expectation and occurrence of rare events. Yet during a given natural process, rare events are expected not only to occur at a certain rate, but also to recur at a certain rate. Because rare events tend to be stochastic, they are not evenly distributed in time. This connection to recurrence suggests that such events are computable, if only we had the proper data structure for them.
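A minimal sketch of that null model (the rate and horizon are arbitrary illustrative values) is a homogeneous Poisson process, whose exponentially distributed waiting times make rare events cluster rather than arrive at even intervals:

```python
import random

def poisson_event_times(rate, t_max, seed=None):
    """Sample event times from a homogeneous Poisson process.
    Inter-event (recurrence) times are exponentially distributed, so rare
    events cluster in time rather than occurring at even intervals."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate)   # waiting time to the next event
        if t > t_max:
            return times
        times.append(t)

events = poisson_event_times(rate=0.01, t_max=10_000, seed=1)  # ~100 rare events
gaps = [b - a for a, b in zip(events, events[1:])]
# recurrence intervals vary wildly around the mean of 1/rate = 100
print(len(events), round(min(gaps), 1), round(max(gaps), 1))
```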


From the time of the difference engine to today, representing something for a computational system has involved using a discrete binary model that conforms to a logical data structure (e.g. logic gate, tree, matrix). Yet "natural" computation (or computable behavior) has been observed in systems such as chemical (B-Z and Turing) reactions [3], collective behavior, and gene expression, which may be neither discrete and binary nor explicitly logical. However, one could argue that rare and extreme events occur in such systems, and that this is a phenomenon poorly captured by contemporary hardware and software.


LEFT: an example of a B-Z reaction. RIGHT: examples of reaction-diffusion seen in so-called Turing reactions (a key mechanism in biological development).

Why aren't contemporary data structures suitable for what I am describing? It is true that much of the work on fractal geometry was made possible using standard, existing computing logic [4]. Yet most of modern artificial intelligence and machine learning is based on the notion of normalization. For example, in machine learning and pattern recognition, a series of exemplars is used to identify discrete groups and isolated objects. The basis functions used to delimit these groups are often based on the normal distribution, and their margins are generally associated with a fixed amount of variance. This leads to a high rate of true positives when classifying objects, but can lead to catastrophic failures from time to time.
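A toy comparison (my own construction, not drawn from the literature cited here) shows why the normality assumption fails: in heavy-tailed data, the largest observation can sit tens of standard deviations from the mean, far beyond any Gaussian margin:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

gaussian = rng.standard_normal(n)                    # the background a normal basis assumes
heavy = rng.lognormal(mean=0.0, sigma=2.0, size=n)   # a heavy-tailed alternative

def max_zscore(x):
    """How many sample standard deviations the largest value sits from the mean."""
    return (x.max() - x.mean()) / x.std()

print(round(float(max_zscore(gaussian)), 1))  # roughly 4-5: within reach of a Gaussian margin
print(round(float(max_zscore(heavy)), 1))     # tens of sigma: a catastrophic failure for a normal model
```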


An alternate approach is to use a genetic algorithm or artificial life representation. While these techniques are capable of generating rare and/or extreme events (through mutation and recombination), we must still understand the context of rare events. By observing systems such as rogue waves and avalanches, we can see that fluctuations are an important ingredient for generating rare events. Thus, systems that exhibit flux (undulations in the acceleration and movement of individual particles) must be abstracted to a formal data structure before commercially available computers can completely model rare events. Incorporating these features into a genetic algorithm or artificial life application would greatly help us understand natural instances of non-uniformity.
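As a hedged sketch of this idea (the mutation scheme, rates, and step sizes are all hypothetical, and selection is omitted), a trait evolving mostly by small Gaussian fluctuations, with occasional heavy-tailed mutations, produces exactly this mix of background activity and rare large jumps:

```python
import math
import random

def evolve(generations=200, seed=3):
    """Toy evolutionary trajectory: mostly small Gaussian mutations, with an
    occasional heavy-tailed (Cauchy-distributed) mutation standing in for the
    fluctuations that generate rare, large jumps. Illustrative only; this is
    not a full genetic algorithm with selection or recombination."""
    rng = random.Random(seed)
    x = 0.0                                  # a single evolving trait value
    trajectory = [x]
    for _ in range(generations):
        if rng.random() < 0.05:              # rare event: heavy-tailed jump
            step = math.tan(math.pi * (rng.random() - 0.5))  # standard Cauchy draw
        else:                                # background: small fluctuation
            step = rng.gauss(0.0, 0.1)
        x += step
        trajectory.append(x)
    return trajectory

traj = evolve()
steps = [abs(b - a) for a, b in zip(traj, traj[1:])]
# the largest single step dwarfs the typical (median) background step
print(round(max(steps), 2), round(sorted(steps)[len(steps) // 2], 2))
```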

But can this even be done using conventional computational abstractions? It is hard to say. I have thought about this problem, and remain convinced that we need fundamentally new forms of computation to truly address these features of nature. One option might be "natural" computing, which is the use of natural phenomena (such as chemical reactions or bubble formation) to execute computations [5]. Another option might be in the area of biomimetics, where the goal is to abstract functional features from biological and other natural systems for purposes of computing and developing engineering applications.

An interesting area indeed. More news as it develops.

References
[1] Rare events have been characterized by the Black Swan phenomenon and the Poisson distribution.

[2] A white noise distribution is a sequence of independent, normally-distributed (Gaussian) values; equivalently, a Gaussian process with a flat power spectrum.

[3] Adamatzky, A. (2001). Computing in Nonlinear Media and Automata Collectives. Institute of Physics, Bristol, UK.

[4] Mandelbrot, B. (1982). The Fractal Geometry of Nature. W.H. Freeman, New York.


January 24, 2012

The Evolution and Neuromechanics of very-fast movements

How are movements that are completed in under a millisecond (ms) generated and regulated, particularly when they result in forces many times the body weight of the organism in question? Much of the recent work (over the last 15-20 years) has focused on neuromuscular, sensory, and mechanical mechanisms. Some people have proposed a field called neuromechanics [1, 2] to synthesize how animals process sensory information and use it for adaptive movement. A neuromechanical approach requires a feedback loop between sensory stimuli and muscular output, mediated by neural mechanisms. One simple example is control of the gill structure in the sea slug (Aplysia). Using a series of conditioned stimuli, the gill can be made to open and close in an adaptive manner similar to the learning and memory found in more complex neural systems.
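A toy model of this kind of adaptive response (the decay and recovery constants are invented for illustration, not taken from the Aplysia literature) might look like:

```python
def habituation(stimuli, decay=0.7, recovery=0.05):
    """Toy model of Aplysia-style gill-withdrawal habituation: each repeated
    stimulus weakens the response, and the response slowly recovers during
    unstimulated intervals. Constants are hypothetical."""
    strength, responses = 1.0, []
    for stimulated in stimuli:
        if stimulated:
            responses.append(strength)
            strength *= decay                          # depression after each response
        else:
            responses.append(0.0)
            strength = min(1.0, strength + recovery)   # gradual recovery at rest
    return responses

# Ten consecutive touches: the withdrawal response shrinks with repetition.
r = habituation([True] * 10)
print([round(v, 2) for v in r])
```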

On the January 20th episode of Science Friday (an NPR show on Friday afternoons), the video clip of the week featured work from the Patek Lab at UMass-Amherst. The Patek Lab focuses on the very fast predatory and fighting movements seen in mantis shrimp (Stomatopods) and trap-jaw ants (Odontomachus). By looking into the work being done at the Patek Lab, I was introduced to an entire body of work on ultra-fast and very powerful movements. This work is cutting-edge, and provides a wealth of information on evolution, behavior, and the function of neuromuscular systems.

How does a mechanism specialized for rapid and powerful movements evolve and vary across a phylogeny? In a paper by Spagna et al. [3], a phylogeny of trap-jaw ants was constructed using observations of behaviors related to appendage usage, mechanical function, and several genetic loci. The authors tried to correlate differences in the speed and acceleration of appendage movement with morphological variation observed across all species in the genus (see Figure 1). These findings suggest a recent origin and rapid diversification for appendage mechanisms.



Figure 1: Phylogenetic relationships between species of trap-jaw ant. Taken from [3].

Why do trap-jaw ants need to produce large amounts of force relative to their body weight, and why is this ability variable across evolution? Using their mandibular appendage, trap-jaw ants produce forces 300x their own body weight [4]. This produces a movement so fast that it could not be studied until the proper high-speed motion capture equipment became available. If the ants strike the ground with their appendage, they can produce a movement called "ballistic" jumping, which results in an uncontrollable jump.

The authors of [4] refer to mandibular appendage closure as a "high-performance" behavior. In general, high-performance behaviors are associated with a single function. However, appendage closure is a multifunctional behavior, used for prey capture, fighting, and jumping to safety [4]. Perhaps more interestingly, multifunctional behaviors are related to evolutionary tradeoffs and the co-option (or exaptation) of shared and novel structures. In [3], it is suggested that size might be limited by energetic constraints but maximized by the requirements of prey capture. Yet since the force produced is many times larger than what the organism produces with any other muscle, there could be other dynamics at work here, including evolutionary arms races within species.

In Figure 2, we can see the location of the mandibular appendage in relation to the anterior portion of the ant's body. When this appendage strikes the ground, it produces the ballistic movement also seen in Figure 2. Ballistic movements are essentially bursts of muscular activity produced without much regard for control. Throwing, kicking, and saccadic eye movements are examples of ballistic movements in humans.


Figure 2. LEFT: image of mandibular appendage. RIGHT: frames from video of a jaw propulsion movement. Taken from [3].

In the study of human movement, there is a notion called the speed-accuracy tradeoff, characterized by Fitts’ Law [5]. In essence, the faster the movement, the less accurate it will be. This has much to do with the lack of muscular control for movements that generate a large amount of force in a short period of time [6]. Highly-controlled movements such as drawing or balancing require that movements be made slowly and no large fluctuations in force be introduced. This requires extensive co-regulation of muscle groups, and results in a highly complex sequence of physiological events. Very fast movements, on the other hand, are produced simply by letting a single muscle or muscle group release the maximum amount of force that it is capable of producing.
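Fitts' law can be stated compactly; the intercept and slope below are hypothetical values, since these constants are fit empirically for each task:

```python
import math

def fitts_movement_time(distance, width, a=0.05, b=0.15):
    """Fitts' law: MT = a + b * log2(2D / W), where D is movement distance
    and W is target width. The constants a and b are empirical (hypothetical
    values used here). Smaller targets or longer distances raise the index
    of difficulty and therefore slow the movement."""
    index_of_difficulty = math.log2(2 * distance / width)
    return a + b * index_of_difficulty

# The same 30 cm reach toward a coarse versus a fine target:
print(round(fitts_movement_time(30, 10), 3))   # easy target -> fast movement
print(round(fitts_movement_time(30, 0.5), 3))  # precise target -> slow movement
```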

Now that we know how muscle force production can be maximized with respect to output, we must next understand how muscle power is "amplified" in nature. From the standpoint of technology, we know how to amplify the amount of work done by a set of muscles in the human body. But how is this accomplished naturally? One way is via kinematic or mechanical linkage. In the human body, a whole-body movement such as a heave involves movement of the trunk, arms, and fingers. As one moves outward from the midline, the velocity and acceleration of a segment (e.g. humerus, forearm, or hand) become progressively larger and less uniform over time. Now consider how the mantis shrimp uses its appendage. Unlike the trap-jaw ant's mandibular appendage, the mantis shrimp appendage is used for cracking open the shells of prey. This requires massive forces to be produced and transmitted, which in turn requires a highly specialized phenotype.

Muscle power amplification in the mantis shrimp works in a similar manner. Muscle power can be thought of as the amount of potential work done by the muscle per unit time [7]. Muscle "work" is related to the amount of force generated by the muscle and the appendage it is connected to. In [8] and [9], it is suggested that extremely fast movements for which power amplification is required are achieved by reducing the duration of the movement. The anatomical linkage involved in power amplification is the integrated function of three units (Figures 3 and 4): an engine, an amplifier, and a tool [9]. In the context of mantis shrimp anatomy, the engine is represented by muscle, the amplifier by a spring, and the tool by a hammer. Consistent with the notion of kinematic linkage, the distal component of this system produces the greatest forces (and of course moves the fastest).
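The arithmetic behind power amplification is simple: since power is work divided by time, releasing the same stored work over a much shorter interval multiplies the power output. The energy and durations below are placeholders, not measured values:

```python
def average_power(work_joules, duration_s):
    """Power = work / time: the same stored work released over a shorter
    interval yields proportionally more power."""
    return work_joules / duration_s

stored = 0.05  # hypothetical elastic energy stored by the "spring" (J)

muscle_alone = average_power(stored, 0.050)    # direct contraction over ~50 ms
spring_release = average_power(stored, 0.001)  # elastic recoil over ~1 ms

# same work, 50x shorter release time -> 50x the power
print(muscle_alone, spring_release, spring_release / muscle_alone)
```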

Figure 3. LEFT, CENTER: biomechanical model depicting mantis shrimp appendage as a loaded spring. RIGHT: location of mantis shrimp appendage on body. Taken from [8].

Figure 4. LEFT: cartoon of the mantis shrimp appendage and a geometric model that approximates shape changes in anatomical structure. RIGHT: images of the mantis shrimp during behavior and after dissection/maceration. Taken from [8].

In [9], the authors consider the developmental origins of the muscle power amplifier system. To do this, they ask two questions. The first is whether or not the three components of the appendage system (muscle, merus region, and hammer) constitute three independent developmental modules. Since this is found to be the case, the second question centers on the scaling of the appendage system with respect to body size. The answer to this is yes, but with some interesting qualifications. As expected, muscle force increases in proportion to muscle shape and size. Yet there is a lack of change in shape relative to size, which differs from the isometric relationship between muscle size and body mass seen in jumping and running mammals. In addition, the merus region (the "spring" mechanism) has a selective capacity for stiffness, which results in maximal force production that scales allometrically with body size. Finally, the hammer (the "tool" mechanism) acts as a lever, which, like the human foot, provides a mechanical advantage for handling large loads and forces.
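Scaling claims like these are typically tested by fitting a power law F = c * M^b in log-log space; an exponent b that departs from the isometric prediction indicates allometry. A minimal sketch with synthetic data (the exponent and coefficients are invented, not from [9]):

```python
import math

def scaling_exponent(masses, forces):
    """Fit F = c * M^b by least squares in log-log space and return b.
    A value of b near the isometric prediction (2/3 for force vs. body mass
    under geometric similarity) suggests isometry; a different b indicates
    allometric scaling."""
    xs = [math.log(m) for m in masses]
    ys = [math.log(f) for f in forces]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # slope of the ordinary least-squares fit in log-log coordinates
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)

# Synthetic data generated with a known allometric exponent of 0.9:
masses = [1, 2, 4, 8, 16]
forces = [3.0 * m ** 0.9 for m in masses]
print(round(scaling_exponent(masses, forces), 2))  # recovers 0.9
```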

References
[1] Nishikawa, K. et al. (2007). Neuromechanics: an integrative approach for understanding motor control. Integrative and Comparative Biology, 47(1), 16-54.

[2] Enoka, R. (2008). Neuromechanics of Human Movement. Human Kinetics, Champaign, IL.

[3] Spagna, J.C. et al. (2008). Phylogeny, scaling, and the generation of extreme forces in trap-jaw ants. The Journal of Experimental Biology, 211, 2358-2368.

[4] Patek, S.N., Baio, J.E., Fisher, B.L., and Suarez, A.V. (2006). Multifunctionality and mechanical origins: ballistic jaw propulsion in trap-jaw ants. PNAS, 103(34), 12787–12792.

[5] Meyer, D.E., Smith, J.E.K., Kornblum, S., Abrams, R.A., and Wright, C.E. (1990). Speed-accuracy tradeoffs in aimed movements: toward a theory of rapid voluntary action. In M. Jeannerod (ed.), Attention and performance XIII (pp. 173–226). Hillsdale, NJ: Lawrence Erlbaum.

[6] Ifft, P.J., Lebedev, M.A., and Nicolelis, M.A.L. (2011). Cortical correlates of Fitts’ law. Frontiers in Integrative Neuroscience, 5(85), 1-16.

[7] Enoka, R.M. and Fuglevand, A.J. (2001). Motor unit physiology: some unresolved issues. Muscle & Nerve, 24(1), 4–17.

[8] Patek, S.N., Nowroozi, B.N., Baio, J.E., Caldwell, R.L., and Summers, A.P. (2007). Linkage mechanics and power amplification of the mantis shrimp’s strike. The Journal of Experimental Biology, 210, 3677-3688.

[9] Claverie, T., Chan, E., and Patek, S.N. (2010). Modularity and scaling in fast movements: power amplification in mantis shrimp. Evolution, 65(2), 443–461.







January 10, 2012

Dynamics Days 2012 Report

Last week (January 3-7) I was an attendee at Dynamics Days, a complexity conference held this year in the Inner Harbor section of Baltimore, MD and hosted by the University of Maryland-College Park. The University of Maryland has an active research group studying chaotic systems, which includes people such as Ed Ott, Wolfgang Losert, and Michelle Girvan. In fact, this year's event featured a tribute to Ed Ott in honor of his 70th birthday.

I presented a poster on my own work, which was part of a very active poster session. There were also several overarching themes shared by a number of talks and poster presentations. One theme was the use of variations on Lagrangian analysis for understanding turbulent flows. One example is the calculation of finite-size Lyapunov exponents (FSLEs) to understand the effects of scale in complex systems. Another example was the use of second-order Lagrangian systems to understand the effects of turbulence on UAVs. A third example involved incorporating a stochastic element into Lagrangian modeling through the use of an uncertainty estimator. In general, the use of Lagrangian methods to understand aggregations of particles due to flow field dynamics is currently at the cutting edge of research in engineering and physics, most notably in the form of Lagrangian Coherent Structure (LCS) analysis.


Example of a Lorenz attractor (above) and a flow field containing Lagrangian Coherent Structures (LCSs) (below).

A second theme was (of course) chaotic systems, or dynamical systems that often operate far from equilibrium. Jim Yorke reminded us that there are several definitions of chaos, depending on the type of system under analysis. The best-known form of chaos is "statistical regularity" chaos, the shorthand for which is the strange attractor. Other forms of chaos include transient chaos, broad-band power spectra, and deterministic and bounded dynamic behavior. Chaos can be represented either as oscillatory regimes embodied in attractor maps or as bifurcations and period-doubling events embodied in Hénon (horseshoe) and logistic maps. Yorke made the connection between each representation and how it relates to our understanding of chaos. He also briefly touched on an emerging application of chaos to system control, which involves a game of survival between the noise and control components of a system. In chaotic systems where the noise is greater than the control component, the dataset first needs to be sculpted, which reveals a series of safe sets that can be further sculpted to increase controllability. A related presentation focused on the difference between "fast" and "slow" dynamics, and how they work together to define time dynamics.
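The logistic map makes the distinction concrete: its Lyapunov exponent (the average log of the map's local stretching) is negative in stable periodic regimes and positive in chaotic ones. A quick numerical estimate:

```python
import math

def lyapunov_logistic(r, n=10_000, x0=0.1):
    """Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x)
    by averaging log|f'(x)| = log|r*(1-2x)| along the orbit.
    Positive -> chaos; negative -> stable periodic behavior."""
    x, total = x0, 0.0
    for _ in range(n):
        total += math.log(abs(r * (1 - 2 * x)))  # local stretching at x
        x = r * x * (1 - x)                      # iterate the map
    return total / n

print(round(lyapunov_logistic(3.2), 3))  # period-2 regime: negative exponent
print(round(lyapunov_logistic(4.0), 3))  # chaotic regime: close to ln 2 ~ 0.693
```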

Example of a Rössler attractor (above) and a logistic map (below).

A third theme was biophysics. There were a number of posters on actin scaffolding in cells and cell motility. In particular, actomyosin elements in the cell are used for rigidity sensing and can be understood as a small-scale muscular machine. There was a presentation on the study of quorum sensing in bacteria using populations of magnetic dipoles as physical analogues to biological signaling processes and spatiotemporal gradient sensing, and a related presentation on the statistical mechanics of chemotactic movements. There was also an interesting presentation by Theo Geisel on the perception of beat in music. We learned that deviations in the drumbeat during the course of a song act to "humanize" the song, and that this information can be used to build audio editing technologies. There were several other presentations on neuroscience, including new directions in modeling the dynamical chaos of sleep/wake cycles (Victoria Booth) and the modeling of fast and slow neuronal dynamics using ion concentrations. Finally, Tomas Bohr gave a presentation on plants as dynamical systems. In particular, the passive and active transport of water and sugars from the leaves to the root system can be modeled using a series of mathematical techniques, and these models predict the growth and size limitations of plants.

Example of a Hénon map, representing a horseshoe bifurcation.

A fourth theme was evolutionary game theory and the evolution of information and computation. This includes a wide range of approaches, such as autonomous Boolean networks, discontinuous percolation to model group selection, and encodings for regulatory networks. One notable presentation focused on a stochastic framework for understanding transcriptional dynamics; the major features of this model included relations between timescales and the role of oscillations, dampening, and excitability in the transcriptional process. Another talk involved using the rock-paper-scissors game to model the nonlinear hierarchical dynamics (where there is no dominant strategy) of viral infection and virulence. Jim Crutchfield's group had several posters on the nature of information and the role of entropy in biological complexity. One up-and-coming researcher by the name of Garrett Michener presented his work on encodings for regulatory networks, which combines genetic algorithms, neural networks, regulatory networks, and computational linguistics. His models produce some interesting complex behaviors.
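The rock-paper-scissors structure can be sketched with standard replicator dynamics (a generic textbook formulation, not the specific model from the talk): because each strategy beats one opponent and loses to another, strategy frequencies cycle rather than converging on a dominant strategy:

```python
def replicator_step(freqs, payoff, dt=0.01):
    """One Euler step of replicator dynamics: each strategy grows in
    proportion to how much its payoff exceeds the population average."""
    fitness = [sum(p * f for p, f in zip(row, freqs)) for row in payoff]
    avg = sum(f * w for f, w in zip(freqs, fitness))
    return [f + dt * f * (w - avg) for f, w in zip(freqs, fitness)]

# Rock-paper-scissors payoffs: each strategy beats one and loses to another,
# so there is no dominant strategy and frequencies cycle around the mixed point.
rps = [[0, -1, 1],
       [1, 0, -1],
       [-1, 1, 0]]

x = [0.5, 0.3, 0.2]
for _ in range(5000):
    x = replicator_step(x, rps)
print([round(f, 2) for f in x])  # still mixed: no strategy has taken over
```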

Example of active transport of water and sugars in a plant.

The obscure ideas that interested me most were various approaches to extreme events. While in many cases extreme events are treated as outliers, it is also known that not all data are normally distributed. As a result, we need methods for understanding values that exhibit large variation from the mean in proper context. There was a poster that focused explicitly on how to measure extreme events in a system, and a presentation that discussed the phenomenon of rogue waves in optical systems, with applications to rogue waves in other physical systems (such as the surface of the ocean). Next year's Dynamics Days (2013), obscure ideas and all, will be held in Boulder, Colorado.

January 3, 2012

OnInnovation, a site/blog/resource about "innovation"

As I post on the concept of "innovation" from time to time, I would like to share a promising-looking resource that I encountered recently. The OnInnovation site (run by the Henry Ford Museum) features a wide swath of innovators, from historic inventors such as Henry Ford and Thomas Edison to contemporary people such as Steve Wozniak (computing), Elon Musk (venture capitalist), Dean Kamen (mechanical engineering), Rosa Parks (social change), Toshiko Mori (architecture), and Martha Stewart (entrepreneur).

There are also a number of interactive features to this site, which make it a good resource for people interested in understanding the embodiment of entrepreneurship and creative thinking. One feature is a video submission site associated with the American Invents! initiative and the Maker Faire. This is a good opportunity for people without a lot of resources or financial backing to try something new or show off their ideas.

There are also materials related to STEM (science, technology, engineering, and mathematics) education. Their Innovation 101 course offers five modules (definitions of innovation, process of innovation, personal traits of innovators, keys to innovation, and intellectual property) concerning different aspects of innovating. OnInnovation also includes a blog, which is not updated often, but keeps one up to date on where the site is headed.
