May 31, 2011

Naturally Supervised Learning, new arXiv paper

Here's a paper of mine recently accepted to the arXiv, cross-listed under the cs.HC and q-bio.NC categories:

This is more work from the physical intelligence project, and it features three experiments I did when the MIND Lab was still up and running:

Here is the abstract and key points:

It will be argued that haptic and proprioceptive sensory inputs serve a supervisory function in movement production related to the control of virtual environments and human-machine interfaces. To accomplish this, an approach new to human factors called neuromechanics will be used. This involves the introduction of novel techniques and analyses which demonstrate the multifaceted and regulatory role of adaptation in interactions between humans and motion- and touch-based (e.g. manipulable) devices and interfaces.

Neuromechanics is an approach that unifies the roles of physiological function, motor performance, and environmental effects in determining human performance. In this paper, a neuromechanical perspective will be used to explain the supervisory role of environmental variation in human performance.

Three experiments are presented using two different types of virtual environment that allowed for selective perturbation. Electromyographic (EMG) and kinematic data were collected, and measures related to human performance dynamics were used to model the results.

Results and Conclusions
Results presented here provide a window into neuromechanical performance under a range of technologically mediated conditions. Both descriptive and specialized analyses were conducted: peak amplitude analysis, loop trace analysis, and the analysis of unmatched muscle power. These analyses demonstrated that force-related perturbations have myriad consequences for dynamic physiological regulation.
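The paper's actual procedures aren't reproduced here, but for readers unfamiliar with the first of those analyses: a peak amplitude analysis of an EMG trace typically amounts to rectifying the signal and taking the maximum of a smoothed envelope. The sketch below is purely illustrative; the smoothing window, sampling rate, and synthetic "burst" signal are my assumptions, not parameters from the paper.

```python
import numpy as np

def emg_peak_amplitude(emg, fs, window_ms=50):
    """Peak EMG amplitude via full-wave rectification and a
    moving-average envelope (window_ms = smoothing window).
    Note: window_ms=50 is an illustrative default, not a value
    taken from the paper."""
    rectified = np.abs(emg - np.mean(emg))    # remove DC offset, rectify
    win = max(1, int(fs * window_ms / 1000))  # window length in samples
    kernel = np.ones(win) / win
    envelope = np.convolve(rectified, kernel, mode="same")
    return float(np.max(envelope))

# Example: a synthetic 1 s recording at 1 kHz with a burst of
# activity centered at t = 0.5 s (stand-in for a real EMG trace)
fs = 1000
t = np.arange(0, 1, 1 / fs)
rng = np.random.default_rng(0)
burst = rng.normal(0, 1, t.size) * np.exp(-((t - 0.5) ** 2) / 0.01)
amp = emg_peak_amplitude(burst, fs)
print(amp)  # peak of the smoothed envelope
```

In practice, real EMG pipelines usually band-pass filter the raw signal before rectification; that step is omitted here to keep the sketch short.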

The findings presented here could be applied to the dynamical control of touch-based and movement-sensitive human-machine systems. In particular, the design of systems such as human-robot interfaces, touch-screen devices, and rehabilitative technologies could benefit from this research.

Key Points
* emerging manipulable technologies (e.g. touch- and motion-based interfaces) will ultimately feature a number of non-uniform forces and sequences of stimuli that can be simulated using virtual environments with physical intermediaries.

* the dynamics and complex relationships between simulation, the physical world, and human physiology can be better understood through the lens of neuromechanics, an approach that unifies biomechanics, neuroscience, embodied perspectives, and systems engineering.

* it was found that selective perturbation, combined with a staggered training protocol, can uncover various differences in performance, which remain to be formally classified but are suggestive of underlying cognitive and morphological regulatory mechanisms.

* these findings can be integrated with existing mobile and virtual technologies to provide a versatile, programmable tool for rehabilitative and non-medical applications.

Be sure to take a look. As always, comments are appreciated.

Maker Faire, Detroit

Here's an interesting upcoming event (happening later this summer). Unfortunately, I will not be presenting anything due to lack of spare time:

Maker Faire, Detroit 2011

Looks good, and if Adam Savage can fool around inside of a Faraday cage for the cause, it's gotta be good.

Adam Savage, Tomfoolerier*

* scene from SF Maker Faire, May 2011