NIMBioS Workshop: Information Theory and Entropy in Biological Systems


Over the next few days, I’ll be maintaining a Storify story covering information related to and coming out of the Information Theory and Entropy Workshop being sponsored by NIMBioS at the University of Tennessee, Knoxville.

For those in attendance or participating by watching the live streaming video (or even watching the video after the fact), please feel free to use the official hashtag #entropyWS, and I’ll do my best to incorporate your tweets, posts, and material into the story stream for future reference.

For journal articles and papers mentioned during the workshop, I encourage everyone to join the Mendeley.com group ITBio: Information Theory, Microbiology, Evolution, and Complexity and add them to the group’s list of papers. Think of it as a collaborative online journal club of sorts.

Those participating in the workshop are also encouraged to take a look at a growing collection of researchers and materials I maintain here. If you have materials or resources you’d like to contribute to the list, please send me an email, submit them via the suggestions/submission form, or add them in the comments section below.

Resources for Information Theory and Biology

RSS Feed for BoffoSocko posts tagged with #ITBio

 

[View the story “Information and Entropy in Biological Systems” on Storify: //storify.com/ChrisAldrich/information-and-entropy-a-nimbios-investigative-wo]

NIMBioS Workshop: Information Theory and Entropy in Biological Systems was originally published on Chris Aldrich | Boffo Socko

Announcement: Special issue of Entropy: “Information Theoretic Incentives for Cognitive Systems”


Editor’s Note: Dr. Christoph Salge asked me to cross-post the following notice from the Entropy site here. I’m sure many of the readers of the blog will find the topic to be of interest.


 


 

Dear Colleagues,

In recent years, ideas such as “life is information processing” or “information holds the key to understanding life” have become more common. However, how can information, or more formally Information Theory, increase our understanding of life, or life-like systems?

Information Theory not only has a profound mathematical basis, but also typically provides an intuitive understanding of processes such as learning, behavior, and evolution in terms of information processing.

In this special issue, we are interested in both:

  1. the information-theoretic formalization and quantification of different aspects of life, such as driving forces of learning and behavior generation, information flows between neurons, swarm members, and social agents, and information-theoretic aspects of evolution and adaptation, and
  2. the simulation and creation of life-like systems with previously identified principles and incentives.

Topics relating to artificial and natural systems:

  • information-theoretic intrinsic motivations
  • information-theoretic quantification of behavior
  • information-theoretic guidance of artificial evolution
  • information-theoretic guidance of self-organization
  • information-theoretic driving forces behind learning
  • information-theoretic driving forces behind behavior
  • information theory in swarms
  • information theory in social behavior
  • information theory in evolution
  • information theory in the brain
  • information theory in system-environment distinction
  • information theory in the perception-action loop
  • information-theoretic definitions of life

Submission

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1200 CHF (Swiss Francs).

Deadline for manuscript submissions: 28 February 2015

Special Issue Editors

Guest Editor
Dr. Christoph Salge
Adaptive Systems Research Group, University of Hertfordshire, College Lane, AL10 9AB Hatfield, UK
Website: http://homepages.stca.herts.ac.uk/~cs08abi
E-Mail: c.salge@herts.ac.uk
Phone: +44 1707 28 4490
Interests: Intrinsic Motivation (Empowerment); Self-Organization; Guided Self-Organization; Information-Theoretic Incentives for Social Interaction; Information-Theoretic Incentives for Swarms; Information Theory and Computer Game AI

Guest Editor
Dr. Georg Martius
Cognition and Neurosciences, Max Planck Institute for Mathematics in the Sciences, Inselstrasse 22, 04103 Leipzig, Germany
Website: http://www.mis.mpg.de/jjost/members/georg-martius.html
E-Mail: martius@mis.mpg.de
Phone: +49 341 9959 545
Interests: Autonomous Robots; Self-Organization; Guided Self-Organization; Information Theory; Dynamical Systems; Machine Learning; Neuroscience of Learning; Optimal Control

Guest Editor
Dr. Keyan Ghazi-Zahedi
Information Theory of Cognitive Systems, Max Planck Institute for Mathematics in the Sciences, Inselstrasse 22, 04103 Leipzig, Germany
Website: http://personal-homepages.mis.mpg.de/zahedi
E-Mail: zahedi@mis.mpg.de
Phone: +49 341 9959 535
Interests: Embodied Artificial Intelligence; Information Theory of the Sensorimotor Loop; Dynamical Systems; Cybernetics; Self-Organisation; Synaptic Plasticity; Evolutionary Robotics

Guest Editor
Dr. Daniel Polani
Department of Computer Science, University of Hertfordshire, Hatfield AL10 9AB, UK
Website: http://homepages.feis.herts.ac.uk/~comqdp1/
E-Mail: d.polani@herts.ac.uk
Interests: artificial intelligence; artificial life; information theory for intelligent information processing; sensor evolution; collective and multiagent systems


Announcement: Special issue of Entropy: “Information Theoretic Incentives for Cognitive Systems” was originally published on Chris Aldrich | Boffo Socko

Information Theory and Paleoanthropology

A few weeks ago I communicated a bit with paleoanthropologist John Hawks. I wanted to take a moment to highlight the fact that he maintains an excellent blog primarily concerning his areas of research, which include anthropology, genetics, and evolution. Even more specifically, he is one of the few people in these areas with at least a passing interest in information theory as it relates to his work. I recommend everyone take a look at his posts specific to information theory.

silhouette of John Hawks from his blog

I’ve previously written a brief review of “Major Transitions in Evolution,” the course John Hawks taught in collaboration with Anthony Martin for The Teaching Company as part of their Great Courses series of lectures. Given my interest in the MOOC revolution in higher education, I’ll also mention that Dr. Hawks has recently begun a free Coursera class entitled “Human Evolution: Past and Future”. I’m sure his current course focuses more closely on human evolution than the prior course, which dedicated only a short segment to this time period. Given Hawks’ excellent prior teaching work, I’m sure this will be of general interest to readers interested in information theory as it relates to evolution, biology, and big history.

I’d love to hear from others in the area of anthropology who are interested in applications of information theory.

 


Information Theory and Paleoanthropology was originally published on Chris Aldrich | Boffo Socko

Book Review: “Complexity: A Guided Tour” by Melanie Mitchell

Complexity: A Guided Tour by Melanie Mitchell
My rating: 5 of 5 stars

This is handily one of the best, most interesting, and (to me at least) most useful popularly written science books I’ve come across. Most popular science books bore me to tears and end up being worthwhile only for their pedantic historical backgrounds, but this one is very succinct, with some interesting viewpoints (some of which I agree with and some of which my intuition says are terribly wrong) on the overall structure presented.

For those interested in a general and easily readable high-level overview of some of the areas of research I’ve been interested in (information theory, thermodynamics, entropy, microbiology, evolution, genetics, along with computation, dynamics, chaos, complexity, genetic algorithms, cellular automata, etc.) for the past two decades, this is really a lovely and thought-provoking book.

At the start I was disappointed that there were almost no equations in the book to speak of; perhaps this is why, though I purchased it when it came out, it has been sitting on my shelf for so long. The other factor that kept me from reading it was the depth and breadth of other, more technical material I’ve read which covers the majority of the topics in the book. Ultimately I found myself not minding so much that there weren’t many supporting equations (aside from a few hidden in the notes at the end of the text), in large part because Dr. Mitchell does a fantastic job of pointing out some great subtleties within the various subjects which comprise the broader concept of complexity, subtleties one would generally take several years, and a far greater expense of time, to come to on one’s own. She provides a much stronger picture of the overall subjects covered, and this far outweighs the lack of specificity. I honestly wish I had read the book when it was released; it may have helped me to be more specific in my own research. Fortunately she brings up several areas I will need to delve into more deeply and raises several questions which will significantly inform my future work.

In general, I wish there had been more references I hadn’t already read or been aware of, but towards the end there were a handful of topics relating to fractals, chaos, computer science, and cellular automata which I have either been ignorant of or which are further down my reading lists and may need to move closer to the top. I look forward to delving into many of these shortly. As a simple example, I’ve seen Zipf’s law separately from the perspectives of information theory, linguistics, and even evolution, but this is the first time I’ve seen it related to power laws and fractals.
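
A quick gloss of my own on that connection (my own notation, not something drawn from Mitchell’s text): Zipf’s law says that the frequency f(r) of the r-th most common word falls off as a power of its rank,

    f(r) \propto r^{-\alpha}, \qquad \alpha \approx 1,

which plots as a straight line on log-log axes, since \log f(r) = -\alpha \log r + c. Rescaling the rank only rescales the frequency without changing the functional form, f(br) = b^{-\alpha} f(r), and that scale invariance is the same self-similarity that underlies both power laws and fractals.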

I definitely appreciated that Dr. Mitchell took the time to point out her own personal feelings on several topics, and even more so that she explicitly flagged them as her own gut instincts instead of mentioning them in passing as if they were provable science, which is what far too many other authors would likely have done. There are many viewpoints she takes which I certainly don’t agree with, but I suspect that’s because I’m coming at things from the viewpoint of an electrical engineer with a stronger background in information theory and microbiology, while hers is closer to that of computer science. She does mention that her undergraduate background was in mathematics, but I’m curious which areas she specifically studied, as that would give me a better understanding of her particular viewpoints.

Her final chapter, looking at some of the pros and cons of the topic(s), was very welcome, particularly in light of previous philosophic attempts like cybernetics and general systems theory, which I (also) think failed because of their lack of specificity. These caveats certainly help to place the scientific philosophy of complexity into a much larger context. I heartily agree with her viewpoint (and that of others) that a more rigorous mathematical theory needs to underpin the overall effort. I’m sure we’re all wondering “Where is our Newton?” or, to use her clever aphorism, we’re “waiting for Carnot.” (Sounds like it should be the title of a Tom Stoppard play, doesn’t it?)

I might question her brief inclusion of her own Ph.D. thesis work in the text, but it did actually provide a nice specific and self-contained example within the broader context and also helped to tie several of the chapters together.

My one slight criticism of the work would be the lack of better footnoting within the text. Though many feel that footnote numbers within the text, or footnotes at the bottom of the page, detract from the “flow” of the work, I found myself wishing she had included them here, particularly as I’m one of the few who actually cares about the footnotes and wants to know the specific references as I read. I hope that Oxford eventually publishes an e-book version with cross-linked footnotes for the benefit of others.

I can heartily recommend this book to any fan of science, but I would specifically recommend it to any undergraduate science or engineering major who is unsure of what they’d like to study and needs some interesting areas to look at. I will mention that one of the tough parts of the concept of complexity is that it is so broad and general that it encompasses over a dozen other fields of study, in each of which one could earn a Ph.D. without completely knowing the full depth of even that one field, much less the full depth of all of them. The book is so well written that I’d even recommend it to senior researchers in any of the above-mentioned fields, as it is sure to provide not only some excellent overview history of each, but also questions and thoughts that they’ll want to include in their future research in their own specific sub-areas of expertise.


God Could Have Caused Birds to Fly With Their Bones Made of Solid Gold

Salviati’s (Galileo’s voice) response to Simplicio (Pope Urban VIII)
Galileo Galilei, in Dialogue Concerning the Two Chief World Systems

 

Title Page from Galileo’s Dialogo

God Could Have Caused Birds to Fly With Their Bones Made of Solid Gold was originally published on Chris Aldrich