The Rise and Fall of Tools in Neuroscience, With a Case Study in Dyslexia

Human beings have only ever been as good as their tools. With inventions such as fire, agriculture, written language, modern medicine and computing, human civilizations have “advanced” in leaps and bounds. Each tool engenders changes – and challenges – that could barely be conceived of prior to the tool’s invention. Our understanding of the brain, and diseases of the brain, is no different. From post-mortem anatomical examinations, to single cell electrophysiology recordings, to recent advances like optogenetics (which enables researchers to control activity of neurons with light), the field has bounded forward. As each new technology becomes accessible, it is pushed to its limits in application to every conceivable question, often across multiple organisms.

Neuroscience evolves, not in isolation, but along with other biological and physical science disciplines. Projects like the Human Genome Project informed Neuroscientists’ understanding of how the brain works by making it possible to study genes mutated in common neurological disorders in animal models, and provided clues about how to treat diseases. The advances made in the field of genetics during this time paved the way for future Neuroscience technologies like optogenetics, which depends on the ability to manipulate genes with specificity.

As progress in the field allows Neuroscientists to record from and visualize neurons in vivo (meaning in a living organism) in unprecedented numbers, and to image the human brain in new and improved ways, novel techniques have emerged. This new generation of tools is not always physical; strategies for dealing with problems of clustering (e.g. grouping neurons with like activity, or distinguishing between the activity of two adjacent neurons being recorded simultaneously), categorizing (e.g. predicting whether patients belong to healthy or diseased groups based on their brain activity patterns) and codifying (e.g. developing hypotheses about how populations of networks interact to generate brain waves) data have become vitally important in Neuroscience.

Time May Change Tools, But Tools Can’t Change Time

It can be difficult to evaluate the importance of a tool, from both inside and outside the field, especially in a field as young as Neuroscience. Historically, graduate school functioned as a kind of apprenticeship, with students learning a technique from a mentor and leaving prepared to use that tool to address questions of their own. Requirements for publication in scientific journals increasingly demand multiple angles on any given question, and consequently, the scope of techniques each student must master is broadening. As the field grows in popularity and stature in a rapidly changing technological world, the “turnover” of tools – caveated below – seems to be increasing at an exponential rate.

Below are a few graphs generated using the “Advanced Search” function in PubMed (an online database of scientific articles maintained by the United States National Library of Medicine (NLM) at the National Institutes of Health (NIH)). Admittedly limited and biased by the author’s own background and knowledge of the field, they nonetheless demonstrate some of these changes in a technique’s popularity over time (see Methods for more details).

Fig. 1 – Generated by an Advanced Search for “Neuroscience” in the title field, and a search for publication date (e.g. for “1950”, start year = 1950, end year = 1951). N.B. all following graphs use “brain” as a search term.

Fig. 2 – Generated by an Advanced Search for each tool in the title field (e.g. “Anatomy”), a search for “brain” in the text, and a search for publication date (e.g. for “1950”, start year = 1950, end year = 1951). The normalization process aims to control for the inflation in the overall number of articles in the field. The stacked bars do not reflect the total number of articles; rather, the gaps indicate techniques that are not represented in this sampling.

Fig. 3 – Same data as in Fig. 2, represented in a different form.

Fig. 4 – A subset of the data represented in Fig. 2, demonstrating the popularity of “Computational” approaches relative to “Optogenetics”. N.B. Not all studies using optogenetic techniques will mention this in the title, whereas “Computational” can be applied more broadly.
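For readers who would like to reproduce or extend counts like these, similar numbers can be pulled programmatically. The sketch below is a hypothetical reconstruction using NCBI’s public E-utilities API rather than the Advanced Search web interface used for the figures, so exact counts may differ slightly; the query strings mirror the search terms described in the captions.

```python
import json
import urllib.parse
import urllib.request

# Hypothetical reconstruction: count PubMed records matching a query in a given
# year via NCBI E-utilities. The figures above were built with the Advanced
# Search web interface, so results may not match exactly.
EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_count(term, year):
    """Return the number of PubMed records matching `term` published in `year`."""
    query = f"({term}) AND {year}[pdat]"
    params = urllib.parse.urlencode({
        "db": "pubmed",
        "term": query,
        "retmode": "json",
        "retmax": 0,  # only the count is needed, not the record IDs
    })
    with urllib.request.urlopen(f"{EUTILS}?{params}") as response:
        result = json.load(response)
    return int(result["esearchresult"]["count"])

# Example: articles with "Neuroscience" in the title, per decade (cf. Fig. 1)
for year in range(1950, 2020, 10):
    print(year, pubmed_count("Neuroscience[Title]", year))
```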

This representation, while limited, serves to demonstrate the rise and fall of tools in Neuroscience. As a host of novel approaches emerges, tried and true methods slowly begin to decline in popularity. Older tools and techniques do not cease to be useful or valid simply because they are old, but as time goes on, their “low-hanging fruit” are plucked. While they remain integral to the research process and provide a foundation for many new techniques (as in the example of genetics and optogenetics), more seasoned approaches cease to be “trendy”. Neuroscience, as a field, aggregates knowledge based on the available tools, and then applies it to new approaches and questions previously impossible to ask. To illustrate this in action, we turn to a case study of a highly specific neurodevelopmental disorder: dyslexia.

Dyslexia: A Case Study

Dyslexia refers to difficulties with reading that are not explained by deficits in general intelligence. The most effective intervention is early identification followed by behavioral therapy; musical training has also been suggested to offer some benefit for the unique perceptual difficulties dyslexics experience. Dyslexia is characterized by abnormalities in the structure of the cortex and in its connectivity to other brain regions, such as the thalamus. Identifying genetic risk factors for dyslexia helped researchers converge on a proposed cause of the disorder, namely disruptions in the normal connection patterns between cortex and thalamus, as stated in this 2006 review:

“Our perspective on the state of the art in dyslexia research, as illustrated in this abridged review of the extensive literature on developmental dyslexia, is that variant function in any of a number of genes involved in cortical development, including but probably not restricted to the known candidate dyslexia susceptibility genes DYX1C1, KIAA0319, DCDC2 and ROBO1, can be responsible for subtle cortical malformations involving neuronal migration and axon growth, which in turn leads to abnormal cortico-cortical and cortico-thalamic circuits that affect sensorimotor, perceptual and cognitive processes critical for learning.”

Because of difficulties in mimicking complex and subtle human developmental disorders in animal models, even with purported genetic targets, dyslexia has largely evaded the kind of understanding provided by electrophysiological and optogenetic studies (to name a few techniques emphasized in this article), and the precise mechanism by which abnormalities in connectivity manifest as difficulties with reading is not well understood. Recently, Bayesian frameworks have been applied to the study of dyslexia in various ways: further parsing the deficits in connectivity, enabling early prediction with the goal of improved treatment, and conceptualizing dyslexia as a deficit in the brain’s ability to integrate prior examples to inform cortical processing (a framework in which the brain itself is Bayesian).

In its simplest terms, a Bayesian representation of the brain is based on the idea that the brain holds a model of reality which is constantly updated with new information – as the model is updated, its predictive ability grows in proportion to the strength of its existing (prior) knowledge. Models and machine learning algorithms based on Bayesian statistics are increasingly used to process the new types of data described in the first section. Interestingly, while the original work on Bayes’ Theorem took place in the 18th century, the technique only rose to popularity in Neuroscience in the last 25 years.
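To make the idea of Bayesian updating concrete, here is a toy sketch of Bayes’ rule applied to a simplified perceptual judgment (whether an ambiguous speech sound is /b/ or /d/). The scenario and all of the probabilities are invented purely for illustration; they are not drawn from the dyslexia literature or from any of the models mentioned above.

```python
# Toy illustration of Bayesian updating: a prior "model of reality" is revised
# as each new piece of evidence arrives. All numbers here are invented.

def update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    p_evidence = p_evidence_given_h * prior + p_evidence_given_not_h * (1 - prior)
    return p_evidence_given_h * prior / p_evidence

# Prior belief that an ambiguous sound is the phoneme /b/ rather than /d/
belief = 0.5

# Each acoustic cue is a pair: (P(cue | /b/), P(cue | /d/))
cues = [(0.8, 0.4), (0.7, 0.5), (0.9, 0.3)]

for p_given_b, p_given_d in cues:
    belief = update(belief, p_given_b, p_given_d)
    print(f"updated P(/b/) = {belief:.3f}")
```

A belief already close to 0 or 1 shifts little with each cue, while an uncertain one is pulled quickly by new evidence; this prior-dependent behavior is the kind of property the Bayesian framing of perception builds on.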

Fig. 5 – Generated by an Advanced Search for each tool in the Title/Abstract field (e.g. “Anatomy”), a search for “dyslexia” in the Title/Abstract field, and a search for publication date (e.g. for “1970”, start year = 1970, end year = 1971). The normalization process aims to control for the inflation in the overall number of articles in the field of Neuroscience.

The Tool’s the Limit

Without the genetic tools and analyses that identified abnormal thalamocortical connections as integral to dyslexia, our understanding of the disorder would be incomplete; yet identifying genetic risk factors alone could not provide much in the way of potential treatments, or specifics about which connections are most affected. Newer techniques are helping further the field in novel ways.

The development of tools for Neuroscience research is driven largely by the outputs of previous tools, whether in the sense of making use of available resources or of striving to fill an unmet need. If we take PubMed to be a reflection of the number of papers published per year in the scientific community, we can use the information stored therein to observe how trends in the field have evolved and to predict where they are headed. Each technique listed here, and many not listed here, will likely contribute to answering outstanding questions about the brain. But it is in the interaction and aggregation of the insights provided by each tool that knowledge of the brain increases. In reflection of this, the BRAIN Initiative, which has set lofty goals for the rapid advancement of the field, focuses heavily on funding new tools and techniques.

A takeaway from this analysis is that research on the brain should never be confined to the bounds of available resources, but should strive for new ways of analyzing, experimenting on and conceptualizing the brain. In our quest to understand the brain, in all its complexity, the tool’s the limit.

Methods:

Numerical values were determined using the search function in PubMed followed by calculations (coming soon). Normalization was achieved by generating, for each of the listed years, the ratio of the number of publications returned by a search for “brain” in the text to the number of publications returned by the same search in the baseline year (1950 or 1970).
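A minimal sketch of this normalization, assuming each technique’s raw per-year count is then divided by that ratio (the counts below are placeholders, not the actual data behind the figures):

```python
# Minimal sketch of the normalization described above. The "brain" counts per
# year measure the overall growth of the field; each technique's raw count is
# divided by that year's inflation ratio relative to the baseline year.
# All numbers are placeholders, not the actual data behind the figures.
brain_counts = {1950: 1200, 1980: 9000, 2010: 60000}   # hypothetical hits for "brain"
technique_counts = {1950: 40, 1980: 300, 2010: 500}    # hypothetical hits for one technique

BASELINE_YEAR = 1950
for year, raw in sorted(technique_counts.items()):
    inflation = brain_counts[year] / brain_counts[BASELINE_YEAR]
    print(year, round(raw / inflation, 1))              # count per "1950-sized" literature
```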
Title photo: https://upload.wikimedia.org/wikipedia/commons/8/81/Toolbox_(7263382550).jpg
