Archive for November, 2008

Using fMRI and a data-mining algorithm, researchers in the Netherlands have developed a method to predict what a listener is hearing based solely on fMRI data. The algorithm can make two kinds of predictions: discriminating between three Dutch vowel sounds (/a/, /i/, and /u/), and discriminating between which of three different speakers is currently talking.

Before we get any further, here are the numbers. When asked to discriminate between two different vowels, the algorithm's mean correctness levels were 0.65, 0.69, and 0.63. All of these means were well above chance (0.5), with highly significant p-values. When asked to discriminate between two different speakers (there were three total), the mean correctness levels were 0.70, 0.67, and 0.62, once again with highly significant p-values.

While this is impressive, the next part of the experiment is even more so. The researchers trained the algorithm to differentiate between only two vowels or two speakers, then tested its effectiveness in discriminating between novel stimuli: the third vowel or speaker paired with the first two. To remain efficacious, the algorithm would have to generalize across many different acoustic dimensions. Remarkably, it was nearly as effective, with vowel mean correctness values of 0.66, 0.62, and 0.62, and speaker mean correctness values of 0.62, 0.65, and 0.63.
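To see why mean correctness in the 0.6–0.7 range can still be highly significant, it helps to run an exact one-sided binomial test against chance. This is a sketch of the general idea, not the paper's actual analysis, and the trial count below is a hypothetical placeholder:

```python
import math

def p_above_chance(successes, trials, chance=0.5):
    """Exact one-sided binomial test: P(X >= successes) if guessing at `chance`."""
    return sum(math.comb(trials, k) * chance**k * (1 - chance)**(trials - k)
               for k in range(successes, trials + 1))

# Hypothetical: 65% correct over 100 two-way discrimination trials
print(f"p = {p_above_chance(65, 100):.4f}")
```

Even at only 65% accuracy, a hundred trials is enough to push the p-value well below 0.01, which is why modest-looking correctness levels can carry highly significant p-values.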

Their findings also include the observation that a representation of the vowel or speaker occurs not only in the task-specific higher levels of cognitive processing but also in “lower” auditory regions of sound processing. This lends further evidence that sound processing, like visual processing, involves abstraction even at early stages, through feedback loops and reentrant processing.

The potential uses for this technology become very interesting as the correctness levels approach 100% and as the vowel recognition extends to all sounds. That is an ambitious but worthy goal for research in this area, due to its many obvious applications, perhaps even as a lie detector.


Formisano E, De Martino F, Bonte M, Goebel R. 2008 “Who” is saying “what”? Brain-based decoding of human voice and speech. Science 322:970-973, doi:10.1126/science.1164318.


Read Full Post »

Zalc et al. published a paper earlier this year in Current Biology hypothesizing that vertebrate myelination arose in the now-extinct class of placoderms. It is a horrifically concise paper (that is a good thing), and their evidence is simple. Compared to osteostracans, a class that preceded them, placoderm fossils suggest oculomotor nerves that are ten times longer, despite having oculomotor foramina of similar diameter.

In living animals, a higher ratio of nerve length to diameter is found wherever nerves are myelinated. This makes sense, since myelin allows action potentials to propagate down nerves faster and with less energy: the potentials can “hop” from one node to the next. Without myelin, another strategy to make nerves longer is to increase axon diameter, which also increases impulse speed, although it comes at the cost of higher energy and space demands. Therefore, if placoderms are shown to have had a high ratio of nerve length to diameter compared to unmyelinated species, their nerves were most likely myelinated.

Myelin evolution is particularly interesting because some species (goldfish, for example) can regenerate their myelin when it is damaged, but at some point in our lineage this ability was lost. If we could figure out the exact mechanism by which myelin sheathing evolved, we could perhaps develop a drug that allows for myelin regeneration in humans as well.


Zalc B, Gouget D, Colman D. 2008 The origin of the myelination program in vertebrates. Current Biology 18:511-512.

Read Full Post »

The list was put together by Caltech’s Computation and Neural Systems department, and the questions are ones that they expect PhD students to be able to answer. This can serve two purposes: testing yourself to make sure that you are up to speed, or testing other people to weed out impostors! Here are some of my favorite questions:

  • Discuss the major features of at least two very different nervous systems (i.e. jellyfish, locust, lamprey, octopus, owl, rat, monkey). In what ways might the features of each system affect neural processing?
  • Describe and discuss the two principal models of individual neurons (integrate-and-fire and mean-rate neurons). What assumptions do they make about encoding? In what way are they faithful to real neurons and how do they fall short?
  • Consider a surprising visual stimulus in your animal of choice.  How quickly will information arrive at various places in the brain, and what implications does this have for neural coding?
  • What do you know about population coding in subcortical (e.g. superior colliculus) or cortical structures? Why would the nervous system use population rather than single cell coding? How would cross-correlation among cells affect this?
  • Define a “psychometric curve.” How would it look for a spatial hyperacuity task? Can you explain the performance of the system on dot-discrimination and line-alignment (Vernier) tasks in terms of the mechanisms of the retina and the primary visual cortex?
  • Discuss the role of the nonlinearity in multi-layer neural networks. Compare thresholding versus soft sigmoidal functions and bump functions in terms of network function and learning capability.
  • We can read everywhere that the brain is a “complex adaptive system”. What is meant by this vague statement?  Give at least two formal definitions of complexity.
  • A person rolls two fair dice and records the total number of points. You can ask a sequence of yes/no questions to find out this number. The answer to a question may affect your choice of the following questions. Devise and justify a strategy that achieves the minimum possible average number of questions.
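
For the integrate-and-fire question above, the model can be captured in a few lines: the membrane voltage leaks toward rest, integrates input current, and resets after crossing threshold. This is a generic leaky integrate-and-fire sketch with illustrative parameter values, not numbers from any particular preparation:

```python
def simulate_lif(input_current, dt=0.001, tau=0.02, v_rest=-0.070,
                 v_thresh=-0.050, v_reset=-0.070, resistance=1e8):
    """Leaky integrate-and-fire: leak toward rest, integrate input, spike and reset."""
    v = v_rest
    spike_times = []
    for step, i_in in enumerate(input_current):
        # Euler step of tau * dV/dt = -(V - V_rest) + R * I
        v += (-(v - v_rest) + resistance * i_in) * dt / tau
        if v >= v_thresh:
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

# One second of constant 0.3 nA input produces regular spiking
spikes = simulate_lif([0.3e-9] * 1000)
print(f"{len(spikes)} spikes in 1 s")
```

The encoding assumptions the question asks about are visible right in the code: all that matters is the integrated current, so any fine temporal structure of inputs within a membrane time constant is thrown away.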

These questions are non-trivial. The last one especially is a mind bender and a fun one at that. Perhaps I will look for the answers and blog some of them; that would be an educational experience all around.
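The dice question is information theory in disguise: each yes/no question extracts one bit, an optimal questioning strategy corresponds to a Huffman code over the eleven possible totals, and the minimum average number of questions is the expected Huffman codeword length. A quick sketch:

```python
import heapq
from fractions import Fraction

# Distribution of the total of two fair dice: totals 2..12 occur 1, 2, ..., 6, ..., 2, 1 ways
counts = {t: 6 - abs(t - 7) for t in range(2, 13)}
probs = [Fraction(c, 36) for c in counts.values()]

# Huffman's algorithm: repeatedly merge the two least likely nodes.
# The expected codeword length equals the sum of all merged-node probabilities.
heap = [(p, i) for i, p in enumerate(probs)]
heapq.heapify(heap)
expected_questions = Fraction(0)
tiebreak = len(heap)
while len(heap) > 1:
    p1, _ = heapq.heappop(heap)
    p2, _ = heapq.heappop(heap)
    expected_questions += p1 + p2
    heapq.heappush(heap, (p1 + p2, tiebreak))
    tiebreak += 1

print(expected_questions, float(expected_questions))
```

The distribution's entropy is roughly 3.27 bits, so no strategy can average fewer questions than that; the Huffman strategy comes impressively close to the floor.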

Read Full Post »

V.S. Ramachandran’s book is short and has enough mind-nuggets to make it well worth the read, although I do think that he extends some of his theories with a bit too much alacrity. Here are some of my notes:

  • Capgras syndrome is a condition in which the connection between the visual centers and the amygdala (i.e., emotions) is severed, for example by an accident. “So he looks at his mother and thinks, ‘She looks just like my mother, but if it’s my mother why don’t I feel anything toward her? No, this can’t possibly be my mother, it’s some stranger pretending to be my mother.’”
  • Chronic pain and acute pain are different phenomena. Acute pain is meant to tell you to reflexively withdraw a body part; chronic pain is meant to make you immobilize the injured part until it can heal.
  • Synesthesia is a condition in which musical notes or numbers are consistently associated with a color. It is fairly common, affecting about 1 in 200 people.
  • He has a theory of laughter that it is nature’s way of signaling that there is no danger. I don’t necessarily agree, and I don’t see why there is such a rush to reduce all laughter to one core cause.
  • There is an old and new system of visual awareness–one is unconscious and one is conscious.
  • He falls victim to some mirror neuron hype, arguing that “around 50,000 years ago, maybe the mirror neuron system became sufficiently sophisticated that there was an explosive evolution of this ability to mime complex actions.” The evidence currently does not support this theory.
  • He goes over some of his “universal rules of art”: peak shift, grouping, contrast, and isolation. This is one of the most interesting parts of the book, drawing on evidence from animal behavior.
  • “It is well known that there cannot be two overlapping patterns of neural activity simultaneously. Even though the human brain contains a hundred billion nerve cells, no two patterns may overlap.” This leads to the classic explanation of a “bottleneck” of attention.
  • Three conditions that strange phenomena must meet in order to be accepted by mainstream science: “First, it must be a demonstrably real phenomenon… it has to be reliably repeatable under controlled conditions. Second, there must be a candidate mechanism that explains the phenomenon in terms of previously known principles. And third, it has to have significant implications beyond the phenomenon itself.” Otherwise it will not be accepted; synesthesia, for example, was once dismissed as mere delirium.
  • Cotard’s syndrome: “…nothing in the world has any significance, no object or person, no tactile sensation, no sound–nothing–has emotional impact. The only way in which a patient can interpret this complete emotional desolation is to believe that he or she is dead.”
  • It seems that he believes in temporal asymmetry: “Our brains are essentially model-making machines. We need to construct useful, virtual reality simulations of the world that we can act on. Within the simulation, we need also to construct models of other people’s minds because we primates are intensely social creatures. We need to do this so that we can predict their behavior.”
  • If humans died immediately, which animal would take over the earth? Assume no nuclear fallout, the environment would be unchanged. He thinks it would be orangutans: “among the great apes, orangutans alone are reputed to display imitation of sophisticated skills… often watching the keeper and picking locks or even paddling across a river in a canoe.”

He says that his book doesn’t really start until the endnotes, which may be true, but they were sort of hard to follow. The book has a number of interesting ideas, and it has a low opportunity cost because it is short and devoid of highfalutin language. I recommend it highly; you can find it here on Amazon.

Read Full Post »

That is the report from Nature News. They note that:

The electrode is different to others used for brain–computer interfaces, most of which are fixed to the skull rather than within a specific part of the brain. This means that the electrodes can move around, making it difficult to record from the same neurons every time or to leave the electrode in place for more than a few months at a time.

The electrode used by Guenther’s team is impregnated with neurotrophic factors, which encourage neurons to grow into and around the electrode, anchoring it in place and allowing it to be recorded from for a much longer time.

As more public goodwill is seen from the results of this research, more money will probably be poured into it. The use of neurotrophic factors is interesting, and success in this area would raise serious questions about whether it could be used on healthy people as well to augment their senses or even to provide additional ones.

Read Full Post »

Does primacy trump recency?

One of the more nuanced critiques of the literature on human cognitive biases is that some of them posit conflicting effects. Probably the most glaring discrepancy is the difference between the primacy and the recency effect in hypothesis formation. Which factor is more important? Marsh and Ahn take on the challenge in their 2006 paper:

Some studies have shown a recency effect: Information that is presented later in a sequence is more heavily reflected in judgments than is information that is presented earlier. Other studies have shown a primacy effect: Early-presented information is reflected in judgment more than is later information…

Throughout this study, we have maintained the position that the primacy effect is obtained because people form a hypothesis from earlier data and underadjust this hypothesis… The primacy effect was found to be moderated by the cognitive load required by the hypothesis-testing nature of the task and by the size of the verbal working memory capacity available to process information.

Unless the subject is overloaded with information, the primacy effect is dominant. A real-world application can be found in the work of Trevon Logan, who analyzed college football votes to see which games had more of an impact on the rankings of AP voters. Consistent with the idea that primacy outweighs recency, he found that it is better to lose later in the season than earlier.

In order to be rational, you must hold back your initial opinions and see the whole story before you begin to draw conclusions. If you must choose, err on the side of weighing the later data more heavily in order to compensate for this cognitive bias.


Marsh JK, Ahn W. 2006 Order effects in contingency learning: The role of task complexity. Memory & Cognition 34:568-576. Abstract.

Read Full Post »