May 31, 2015

Kuhnian Practice as a Logical Reformulation

Are 01110000 01100001 01110010 01100001 01100100 01101001 01100111 01101101 01100001 01110100 01101001 01100011 [1] shifts a loss, a gain, a mismatch, or an opportunity for intellectual integration and the birth of a new field?


In the Kuhnian [2] approach to empiricism, a well-known outcome observed across the history of science is the "paradigm shift". This occurs when a landmark finding shifts our pre-existing models of a given natural phenomenon. One example of this: Darwin's finches and their evolutionary history in the Galapagos. In this case, a model system confirmed previous intuitions and overturned old facts in a short period of time (hence the idea of a scientific revolution). 

During a recent lecture by W. Ford Doolittle at the Institute for Genomic Biology, I was introduced to the term "Kuhn loss" [3]. Kuhn loss refers to the loss of accumulated knowledge due to a conceptual shift in a given field. One might consider this a matter of housecleaning, or a matter of throwing out the baby with the bathwater. The context for this introduction was the debate between evolutionary genomicists [4] and the ENCODE consortium over the extent and nature of junk DNA. During the talk, Ford Doolittle presented the definitions of genome function proposed by the ENCODE consortium as a paradigm shift. The deeper intellectual history of biological function suggests that junk DNA not only exists, but that overturning it would require a substantial, multidisciplinary body of results. Thus, rather than viewing the ENCODE results [5] as a paradigm shift, we can view them as a form of intellectual loss: a loss, paradigmatic or otherwise, that leaves us with a less satisfying and less robust explanation than we previously had.

A poster of the talk. COURTESY: IGB, University of Illinois, Urbana-Champaign

Whether or not you agree with Ford Doolittle's views on function (and I am of the opinion that you should), this introduces an interesting philosophy of science (PoS) issue. In the case of biological function, the caution is against a 'negative' Kuhn loss. But Kuhn loss (in a linear view of historical progress) usually refers to the loss of knowledge associated with folk theories, or with theories based on limited observational power. In some cases, these limited observations are augmented with deeper intuitive motivations. This type of intuition-guided theory usually becomes untenable given new observations and/or information about the world. Phlogiston theory [6] illustrates this type of 'positive' Kuhn loss. Popular among 17th- and 18th-century chemists, phlogiston theory held that the physical act of combustion releases a fire-like element called phlogiston, which operates in a manner opposite to the role we now know oxygen plays in combustion and other chemical reactions. Another, less clear-cut example of 'positive' Kuhn loss involves a pre-relativity idea, aether theory, which held that an all-enveloping medium (the aether) is responsible for the propagation of light through space.

In each of these cases, what was lost? Surely the conclusions that arose from a faulty premise needed to be re-examined. A new framework also swept away inadequate concepts (such as "the aether" and "phlogiston"). But there was also a deeper set of logical structures that needed to be reformulated. In phlogiston theory, the direction of causality was essentially reversed. In aether theory, we essentially have a precursor to a more sophisticated concept (spacetime). Scientific revolutions are not all equal, and so neither is the loss that results. In some cases, Kuhn losses can be recovered and contribute to the advancement of a specific theoretical framework. Midwinter and Janssen [7] introduce us to the physicist John Van Vleck, who recovered some of the Kuhn loss incurred when quantum theory replaced its antecedent. Van Vleck did this by carrying mathematical formalisms over from the older theory of spectra into the quantum theory of susceptibilities. While this was neither a restoration nor a paradigm shift, Van Vleck improved the ability of quantum theory to make experimental predictions.

Tongue-in-cheek description of an empirically verified phlogiston theory. COURTESY: [8]

Now let us revisit the Kuhnian content of the ENCODE kerfuffle vis-à-vis this framework of positive/negative Kuhn loss and Kuhn recovery. Is this conceptual clash ultimately a chance for a gain in theoretical richness and conceptual improvement? Does the tension between computational and traditional views of biological function necessitate Kuhn loss (positive or negative)? According to the standard dialectical view [9], the answer to the former would be yes. In that case, we might expect a paradigm shift that results in an improved version of the old framework (e.g. 'positive' Kuhn loss). But perhaps there is also a cultural mismatch at play here [10] that could be informative for all studies of Kuhn loss. Since these differing perspectives come from very different intellectual and methodological traditions, we could say that any Kuhn loss would be negative, owing to a mismatch. This is a bit different from the phlogiston example: while both approaches come from a scientific view of the world, they use different sets of assumptions to arrive at a coherent framework. What is more likely, however, is that computational approaches (as new as they are to the biological domain) will fuse with older theoretical frameworks, resembling a Kuhnian recovery (the quantum/antecedent theory example) more than a loss or a gain.

It is this intellectual (and logical) reformulation that will mark the way forward in computational biology: an integrative approach (of the kind one might currently take for granted in biology) rather than reasoning through biology and computation as parallel entities. While the current state of affairs involves technology-heavy computation being used to solve theoretically-challenging biological problems, better logical integration of the theory behind computational analysis and the theory behind biological investigation might greatly improve both enterprises. This might lead to new subfields such as the computation of biology, in which computation would be more than a technical appendage. Similarly, such a synthetic subfield would view biological phenomena much more richly, albeit with the same cultural biases as previous views of life. Most importantly, this does not require a revolution. It merely requires a logical reformulation, one that could be put into motion with the right model system.


NOTES:
[1] the word "paradigmatic", translated into binary. COURTESY: Ashbox Binary Translator.
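For the curious, this translation is easy to reproduce without the web tool. A minimal Python sketch, assuming the standard 8-bit ASCII encoding:

```python
# Encode each character of the word as its 8-bit ASCII/binary code.
word = "paradigmatic"
binary = " ".join(format(ord(ch), "08b") for ch in word)
print(binary)  # 01110000 01100001 01110010 01100001 ...

# ...and decode it back, byte by byte.
decoded = "".join(chr(int(byte, 2)) for byte in binary.split())
print(decoded)  # paradigmatic
```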

[2] Kuhn, T.S.   The Structure of Scientific Revolutions. University of Chicago Press (1962).

[3] Hoyningen-Huene, P.   Reconstructing Scientific Revolutions: Thomas S. Kuhn's Philosophy of Science. University of Chicago Press (1993).

[4] Doolittle, W.F.   Is junk DNA bunk? A critique of ENCODE. PNAS, 110(14), 5294-5300 (2013).

[5] The ENCODE Project Consortium   An integrated encyclopedia of DNA elements in the human genome. Nature, 489, 57-74 (2012).

[6] Vihalemm, R.   The Kuhn-loss Thesis and the Case of Phlogiston Theory. Science Studies, 13(1), 68 (2000).

[7] Midwinter, C. and Janssen, M.   Kuhn Losses Regained: Van Vleck from Spectra to Susceptibilities. arXiv, 1205.0179 [physics.hist-ph] (2012).

[8] DrKuha   The Phlogiston: Not Quite Vindicated. Spin One Half blog, May 19 (2009).

[9] what we should expect according to dialectical materialism: adherents of two ideologies struggle for dominance, with an eventual synthesis that improves upon both of the original ideologies. Not to be confused with the "argument to moderation".

[10] for more context (the difference between a scientific revolution and a scientific integration) please see: Alicea, B.   Does the concept of paradigm shift need a rethink? Synthetic Daisies blog, December 25 (2014).

May 25, 2015

Scientific Bytes and Pieces, May 2015

Bytes and Pieces is a collection of links recently encountered from across the internet, consisting mainly of scientific essays and applications.


Asterank: A web-based interface that visualizes all of the known asteroids in our solar system and ranks them according to economic utility (e.g. their value for asteroid mining). Thus, Asterank combines physical science with an optimistic futurism. See the associated Github repository for technical details, or the query sketch below the screenshot.

Screenshot of Asterank Interface.
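If you want the numbers behind the visualization, Asterank also exposes a JSON interface. Here is a quick sketch, assuming the MongoDB-style /api/asterank endpoint and field names described in the project's API documentation:

```python
# Query Asterank's (assumed) JSON API for a few near-circular,
# high-estimated-value asteroids. The endpoint and field names are
# taken from the public API docs; treat them as assumptions.
import json
import requests

query = {"e": {"$lt": 0.1}, "price": {"$gt": 1e9}}  # eccentricity < 0.1, est. value > $1B
resp = requests.get(
    "http://www.asterank.com/api/asterank",
    params={"query": json.dumps(query), "limit": 5},
)
for rock in resp.json():
    print(rock.get("full_name"), rock.get("price"))
```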


The Scientific Method is an Idea Ready for Retirement. Despite the provocative stance, this is indeed the view of one systems-level thinker (Melanie Swan), who argues against the power of reductionist hypothesis-testing in a high-throughput, multivariate world.


Writing at the Nautil.us blog, Sam Arbesman brings us a tour of "robust yet fragile" systems. This essay explores the consequences and by-products of kludginess in complex systems. The "crawling horrors" that Arbesman refers to are small-scale errors that cause failures in systems that are otherwise error-tolerant.


Making Espresso In (Outer) Space. An Italian coffee company (Lavazza) is behind an effort to make espresso in space (e.g. under the zero-gravity conditions of the ISS). It looks like a challenge both to make and to drink enjoyably, although without gravity one does not have much of a choice. Next up on the exotic coffee wishlist: leveraging quantum foam to make yoctolattes.

The glamour and impracticality of old-fashioned space coffee.

Preparing a cup (or rather a pouch).

Not quite as advertised, but she will enjoy it!

Via Singularity Hub, we learn of Second Life founder Philip Rosedale's latest efforts: to build a virtual metaverse at the scale of planetary communities. The proposed platform (High Fidelity) would be an open-source virtual reality-based social network with a variety of potential uses. A planetary-scale metaverse will require large-scale, coordinated, three-dimensional computing resources, which means that this vision should be quite the technical challenge to realize.

High Fidelity wants you! As a technical expert and eventually a user, but still...


May 18, 2015

DevoWorm presentation, Indiana University


Last September, I gave a presentation on the DevoWorm project to the OpenWorm group. On May 20, I will be presenting another version of this talk to the Biocomplexity Institute at Indiana University.


Here is the abstract:
The nematode C. elegans provides a unique opportunity for developmental computational biology. The relatively small and invariant number of cells in the C. elegans adult (959 in hermaphrodites, 1031 in males) provides a means to build tractable representations of the entire organism. The deterministic nature of C. elegans embryogenesis itself allows for complete cell lineages to be constructed. This affords us an opportunity to approximate developmental processes without model underspecification. The unique biology of C. elegans also enables the discovery of fundamental statistical signatures that define non-regulative (mosaic) development and cellular differentiation more broadly. As the OpenWorm bioinformatics project (http://www.openworm.org/) is an attempt to emulate the whole organism (C. elegans), DevoWorm is an attempt to emulate the developmental processes that lead to the adult C. elegans. Such a meta-emulation is useful in a number of ways, from providing crucial information about development itself to providing a combinatorial source of developmental outcomes for evaluating the potential functional roles of phenotypic variation.

In this talk, we will discuss not only how emulation of C. elegans development can proceed, but also how this is relevant to a broader developmental perspective. The talk will also highlight a few examples of what can be extracted from secondary data and computational representations. One involves the extraction and characterization of uniquely informative parameters. Another is the application of the differentiation tree approach for the purpose of providing multi-axial resolution to the process of cell division and differentiation in mosaic development. When combined with models of developmental physics, our two examples could help clarify the relationships between regulative and mosaic development. These examples can be augmented through the use of both computational representations and multiple datatypes, such as gene expression, microscopy, and semantic metadata. To conclude, we will consider the limitations of developmental simulations and how they can serve as useful heuristics for enabling better cell, molecular, and computational biology.
References:  
Alicea, B., McGrew, S., Gordon, R., Larson, S., Warrington, T., and Watts, M.   DevoWorm: differentiation waves and computation in C. elegans embryogenesis. bioRxiv, http://dx.doi.org/10.1101/009993  
Alicea, B.   Now Announcing the DevoWorm project. Synthetic Daisies blog, June 3 (2014). http://syntheticdaisies.blogspot.com/2014/06/now-announcing-devoworm-project.html
Szigeti, B., Gleeson, P., Vella, M., Khayrulin, S., Palyanov, A., Hokanson, J., Currie, M., Cantarelli, M., Idili, G., and Larson, S.   OpenWorm: an open-science approach to modelling Caenorhabditis elegans. Frontiers in Computational Neuroscience, 8, 137 (2014).
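As a concrete aside, the "tractable representations" mentioned in the abstract can begin with something as simple as an explicit lineage tree. Below is a minimal Python sketch: the Node class and leaf-counting helper are illustrative inventions, but the founder-cell divisions follow the standard (Sulston) C. elegans lineage:

```python
# Toy encoding of the invariant early C. elegans lineage:
# P0 -> AB + P1; P1 -> EMS + P2; EMS -> MS + E; P2 -> C + P3; P3 -> D + P4
class Node:
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)

lineage = Node("P0", [
    Node("AB"),
    Node("P1", [
        Node("EMS", [Node("MS"), Node("E")]),
        Node("P2", [
            Node("C"),
            Node("P3", [Node("D"), Node("P4")]),
        ]),
    ]),
])

def count_leaves(node):
    """Count the terminal cells in this (partial) lineage."""
    if not node.children:
        return 1
    return sum(count_leaves(child) for child in node.children)

print(count_leaves(lineage))  # 6 founder cells at this depth
```

Because the lineage is deterministic, a complete tree of this kind enumerates every somatic cell in the adult, which is what makes whole-organism emulation tractable in the first place.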

May 12, 2015

Social Capital Meets Social Media in the Service of Peer Review

What is the proper reward for serving as a peer reviewer? Until now, the reward has been increased social capital [1] in the academic community. Yet like everything else, social media has served to quantify and formalize these relationships.

Regardless of their potential for success [2, 3], two new services have attempted to "give credit" for the act of peer reviewing. While the credit is not explicitly monetary, the idea is to formalize recognition for an often thankless task that is a vital part of the academic community.

The first of these services is Publons. Named after the "least publishable unit" (the publon), Publons allows you to formally publish and cite your peer reviews [4]. While the most prolific reviewers seem to be doing their work purely for within-site prestige, treating peer reviews like published manuscripts is an intriguing idea. Publons is also integrated with select proprietary and open-access publishers, making the service more than merely a self-contained curiosity.


The second is Academic Karma. As with Publons, peer reviews are made creditable and archivable. In addition, reviewers are unbundled from specific journals, which can be either a good thing or a bad thing depending on the context. The accounting system is linked to your ORCID account (which almost every university-based academic is likely to have), which makes the crediting system portable.
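To illustrate that portability: because the credit ledger is keyed to an ORCID iD, any service can resolve the same public record. Here is a minimal sketch; the v3.0 endpoint and JSON field names are assumptions on my part, and the iD shown is ORCID's well-known test record:

```python
# Fetch the public ORCID record behind a reviewer's credit profile.
# Endpoint and fields assume ORCID's public v3.0 REST API.
import requests

orcid_id = "0000-0002-1825-0097"  # ORCID's public test record
resp = requests.get(
    f"https://pub.orcid.org/v3.0/{orcid_id}/record",
    headers={"Accept": "application/json"},
)
name = resp.json()["person"]["name"]
print(name["given-names"]["value"], name["family-name"]["value"])
```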

UPDATE (5-19): In keeping with the theme (and in an appropriately timely manner), I was mentioned in a new PLoS ONE feature [5] as one of the many reviewers who kept PLoS ONE publishing through 2014.

NOTES:
[1] Social capital can be defined as the social benefits derived from one's social network. Units of social capital are often derived from providing public goods, gifting, or the exchange of favors. However, social capital accumulation can also be an indicator of reputation (i.e. the more social capital one holds, the greater one's reputation).

For a less-than-idyllic example from an academic context, please see: Graur, D.   Payback time for referee refusal. Nature, 505, 483 (2014).

[2] Hossenfelder, S.   Publons. Backreaction blog, April 17 (2015).

[3] Saunders, N.   Academic Karma: a case study in how not to use open data. What You're Doing is Rather Desperate blog, February 19 (2015).

[4] Van Noorden, R.   The Scientists Who Get Credit for Peer Review. Nature News, October 9 (2014).

[5] PLOS ONE 2014 Reviewer Thank You. PLoS ONE, 10(2), e0121093, doi:10.1371/journal.pone.0121093 (2015).
