🔖 Human Evolution: Our Brains and Behavior by Robin Dunbar (Oxford University Press) marked as want to read.
Official release date: November 1, 2016
09/14/16: downloaded a review copy via NetGalley
The story of human evolution has fascinated us like no other: we seem to have an insatiable curiosity about who we are and where we came from. Yet studying the “stones and bones” skirts what is perhaps the most human, and most relatable, story of evolution – the social and cognitive changes that gave rise to modern humans.
In Human Evolution: Our Brains and Behavior, Robin Dunbar speaks to the human side of every reader, discussing mating, friendship, and community from an evolutionary psychology perspective. With a table of contents ranging from prehistoric times to the modern day, Human Evolution focuses on an aspect of evolution that has typically been overshadowed by the archaeological record: the biological, neurological, and genetic changes that accompanied each “transition” in the evolutionary narrative.

Dunbar’s interdisciplinary approach – informed by his background as both an anthropologist and an accomplished psychologist – brings the reader into all aspects of the evolutionary process, which he describes as a “jigsaw puzzle” that he and the reader will solve together. The book carefully maps out each stage of that process, from anatomical changes such as bipedalism and the increase in brain size, to cognitive and behavioral changes such as the ability to cook, laugh, and use language to form communities through religion and storytelling. Most importantly and interestingly, Dunbar hypothesizes the order in which these evolutionary changes occurred – conclusions reached with the “time budget model” that Dunbar himself developed.

As definitive as the “stones and bones” are for the hard dates of archaeological evidence, this book explores far more complex psychological questions that require a degree of intellectual speculation: What does it really mean to be human (as opposed to an ape), and how did we come to be that way?
Methods originally developed in information theory have found wide applicability in computational neuroscience. Beyond these original methods, there is a need to develop novel tools and approaches driven by the problems arising in neuroscience.
A number of researchers in computational/systems neuroscience and in information/communication theory are investigating problems of information representation and processing. While their goals are often the same, these researchers bring different perspectives and points of view to a common set of neuroscience problems. Because they often participate in different fora, their interaction is limited.
The goal of the workshop is to bring some of these researchers together to discuss the challenges posed by neuroscience, exchange ideas, and present their latest work.
The workshop is targeted at computational and systems neuroscientists with an interest in information-theoretic methods, as well as information/communication theorists with an interest in neuroscience.
Biologically inspired computing often draws its inspiration from neural systems. This article shows how to analyze neural systems using information theory, with the aim of obtaining constraints that help to identify the algorithms run by neural systems and the information they represent. Algorithms and representations identified this way may then guide the design of biologically inspired computing systems. The material covered includes the necessary introduction to information theory and to the estimation of information-theoretic quantities from neural recordings. We then show how to analyze the information encoded in a system about its environment, and also discuss recent methodological developments on the question of how much information each agent carries about the environment uniquely, redundantly, or synergistically together with others. Last, we introduce the framework of local information dynamics, where information processing is partitioned into component processes of information storage, transfer, and modification – locally in space and time. We close by discussing example applications of these measures to neural data and other complex systems.
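To make the estimation step above concrete, here is a minimal sketch of the simplest such estimator: the plug-in (maximum-likelihood) estimate of the mutual information between a discrete stimulus variable and a discretized neural response, computed directly from the empirical joint distribution. The function name and the toy data are my own illustration, not from the article; note also that plug-in estimates are upward-biased for small sample sizes, which is one reason the estimation literature the article introduces matters.

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in bits from paired discrete samples.

    x, y -- equal-length sequences of discrete labels (e.g. stimulus
    identities and binned spike counts)."""
    # Map arbitrary labels to integer indices.
    _, x_idx = np.unique(np.asarray(x), return_inverse=True)
    _, y_idx = np.unique(np.asarray(y), return_inverse=True)
    # Empirical joint distribution via a 2D histogram of index pairs.
    joint = np.zeros((x_idx.max() + 1, y_idx.max() + 1))
    for i, j in zip(x_idx, y_idx):
        joint[i, j] += 1
    joint /= joint.sum()
    # Marginals, kept 2D so their outer product broadcasts.
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    # Sum p(x,y) * log2( p(x,y) / (p(x) p(y)) ) over nonzero cells.
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

# Perfectly correlated binary variables carry 1 bit; independent ones, 0.
print(mutual_information([0, 1, 0, 1], [0, 1, 0, 1]))  # -> 1.0
print(mutual_information([0, 0, 1, 1], [0, 1, 0, 1]))  # -> 0.0
```

The same plug-in pattern extends (with growing bias problems) to the conditional and multivariate quantities behind transfer and redundancy measures, which is why bias-corrected estimators are the practical workhorse on real recordings.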