What Do ‘Bohemian Rhapsody,’ ‘Macbeth,’ and a List of Facebook Friends All Have in Common?

New research finds that works of literature, musical pieces, and social networks have a similar underlying structure that allows them to share large amounts of information efficiently.

Examples of statistical network analysis of characters in two of Shakespeare’s tragedies. Two characters are connected by a line, or edge, if they appear in the same scene. The size of the circle representing each character, called a node, indicates how many other characters that character is connected to. The network’s density reflects how complete the graph is; a density of 100% would mean that every character is connected to every other. (Image: Martin Grandjean)

 

By Erica K. Brockmeier

To an English scholar or avid reader, the Shakespeare Canon represents some of the greatest literary works of the English language. To a network scientist, Shakespeare’s 37 plays and the 884,421 words they contain also represent a massively complex communication network. Network scientists, who employ math, physics, and computer science to study vast and interconnected systems, are tasked with using statistically rigorous approaches to understand how complex networks, like all of Shakespeare, convey information to the human brain.

New research published in Nature Physics uses tools from network science to explain how complex communication networks can efficiently convey large amounts of information to the human brain. Conducted by postdoc Christopher Lynn, graduate students Ari Kahn and Lia Papadopoulos, and professor Danielle S. Bassett, the study found that different types of networks, including those found in works of literature, musical pieces, and social connections, have a similar underlying structure that allows them to share information rapidly and efficiently.

Technically speaking, a network is simply a statistical and graphical representation of connections, known as edges, between different endpoints, called nodes. In pieces of literature, for example, a node can be a word, and an edge can connect words when they appear next to each other (“my” — “kingdom” — “for” — “a” — “horse”) or when they convey similar ideas or concepts (“yellow” — “orange” — “red”).
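To make the definition concrete, the word-adjacency idea can be sketched in a few lines of Python. This is a minimal, hypothetical illustration, not the authors’ analysis code: it builds an undirected network from consecutive words and computes the graph’s density, the fraction of possible edges that are actually present.

```python
def word_adjacency_network(text):
    """Each distinct word is a node; an edge joins words that appear
    next to each other in the text."""
    words = text.lower().split()
    nodes = set(words)
    edges = set()
    for a, b in zip(words, words[1:]):
        if a != b:
            edges.add(tuple(sorted((a, b))))  # undirected: store sorted pair
    return nodes, edges

nodes, edges = word_adjacency_network("my kingdom for a horse")

# Density: the share of all possible node pairs that are linked
# (100% would mean every word is connected to every other word).
n, m = len(nodes), len(edges)
density = 2 * m / (n * (n - 1))
print(n, m, density)  # → 5 4 0.4
```

The same skeleton extends to other edge definitions, such as linking words that convey similar concepts rather than adjacent ones; only the loop that adds edges changes.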

The advantage of using network science to study things like languages, says Lynn, is that once relationships are defined on a small scale, researchers can use those connections to make inferences about a network’s structure on a much larger scale. “Once you define the nodes and edges, you can zoom out and start to ask about what the structure of this whole object looks like and why it has that specific structure,” says Lynn.

Building on the group’s recent study that models how the brain processes complex information, the researchers developed a new analytical framework for determining how much information a network conveys and how efficient it is in conveying that information. “In order to calculate the efficiency of the communication, you need a model of how humans receive the information,” he says.

Continue reading at Penn Today.

To Err is Human, to Learn, Divine

Researchers develop a new model for how the brain processes complex information: by striking a balance between accuracy and simplicity while making mistakes along the way.

By Erica K. Brockmeier

New research finds that the human brain detects patterns in complex networks by striking a balance between simplicity and complexity, much like how a pointillist painting can be viewed up close to see the finer details or from a distance to see its overall structure.

The human brain is a highly advanced information processor composed of more than 86 billion neurons. Humans are adept at recognizing patterns from complex networks, such as languages, without any formal instruction. Previously, cognitive scientists tried to explain this ability by depicting the brain as a highly optimized computer, but there is now discussion among neuroscientists that this model might not accurately reflect how the brain works.

Now, Penn researchers have developed a different model for how the brain interprets patterns from complex networks. Published in Nature Communications, this new model shows that the ability to detect patterns stems in part from the brain’s goal to represent things in the simplest way possible. Their model depicts the brain as constantly balancing accuracy with simplicity when making decisions. The work was conducted by physics Ph.D. student Christopher Lynn, neuroscience Ph.D. student Ari Kahn, and Danielle Bassett, J. Peter Skirkanich Professor in the departments of Bioengineering and Electrical and Systems Engineering.

This new model is built upon the idea that people make mistakes while trying to make sense of patterns, and these errors are essential to get a glimpse of the bigger picture. “If you look at a pointillist painting up close, you can correctly identify every dot. If you step back 20 feet, the details get fuzzy, but you’ll gain a better sense of the overall structure,” says Lynn.

To test their hypothesis, the researchers ran a set of experiments similar to a previous study by Kahn. That study found that when participants were shown repeating elements in a sequence, such as A-B-C-B, etc., they were automatically sensitive to certain patterns without being explicitly aware that the patterns existed. “If you experience a sequence of information, such as listening to speech, you can pick up on certain statistics between elements without being aware of what those statistics are,” says Kahn.

To understand how the brain automatically understands such complex associations within sequences, 360 study participants were shown a computer screen with five gray squares corresponding to five keys on a keyboard. As two of the five squares changed from gray to red, the participants had to strike the computer keys that corresponded to the changing squares. For the participants, the pattern of color-changing squares was random, but the sequences were actually generated using two kinds of networks.

The researchers found that the structure of the network impacted how quickly the participants could respond to the stimuli, an indication of their expectations of the underlying patterns. Responses were quicker when participants were shown sequences that were generated using a modular network compared to sequences coming from a lattice network.
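Stimulus sequences of this kind can be generated as random walks on a graph: each element is a randomly chosen neighbor of the one before it. The sketch below is illustrative only; the ring lattice and the three-cluster modular graph here are simplified stand-ins for the networks used in the experiment, and the function names are invented for this example.

```python
import random

def ring_lattice(n=15, k=2):
    """Ring lattice: each node links to its k nearest neighbors per side."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for d in range(1, k + 1):
            adj[i].add((i + d) % n)
            adj[(i + d) % n].add(i)
    return adj

def modular_graph(n_modules=3, size=5):
    """Modular graph: densely connected clusters joined by sparse links."""
    adj = {}
    for m in range(n_modules):
        members = [m * size + j for j in range(size)]
        for a in members:
            adj.setdefault(a, set()).update(b for b in members if b != a)
    for m in range(n_modules):  # one cross-link between neighboring modules
        a, b = m * size, ((m + 1) % n_modules) * size
        adj[a].add(b)
        adj[b].add(a)
    return adj

def random_walk(adj, steps, seed=0):
    """A stimulus sequence: each element is a uniformly random neighbor
    of the previous one."""
    rng = random.Random(seed)
    node = rng.choice(sorted(adj))
    seq = [node]
    for _ in range(steps - 1):
        node = rng.choice(sorted(adj[node]))
        seq.append(node)
    return seq

seq = random_walk(modular_graph(), steps=20)
```

Walks on the modular graph tend to linger inside a cluster before crossing a sparse link, while walks on the lattice drift steadily around the ring; that structural difference is what the participants’ response times turned out to track.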

Continue reading on Penn Today.

This paper was also profiled on the website Big Think.

Danielle Bassett Named AIMBE Fellow

Danielle Bassett, Ph.D.

Danielle Bassett, J. Peter Skirkanich Professor of Bioengineering, has been named an American Institute for Medical and Biological Engineering (AIMBE) Fellow.

Election to the AIMBE College of Fellows is among the highest professional distinctions accorded to a medical and biological engineer. College membership honors those who have made outstanding contributions to “engineering and medicine research, practice, or education” and to “the pioneering of new and developing fields of technology, making major advancements in traditional fields of medical and biological engineering, or developing/implementing innovative approaches to bioengineering education.”

Bassett was nominated, reviewed, and elected by peers and members of the College of Fellows for “significant contributions to the application of neural network theory for understanding both physio- and patho-physiological brain function.”

Due to COVID-19 health concerns, AIMBE’s annual meeting and induction ceremony scheduled for March 29–30, 2020, were cancelled. Under special procedures, Bassett was remotely inducted along with 156 colleagues who make up the AIMBE College of Fellows Class of 2020.

Originally posted on the Penn Engineering blog.

Listen: Danielle Bassett Uses Network Science to Find Links in Human Curiosity

Danielle Bassett, Ph.D.

Danielle Bassett, J. Peter Skirkanich Professor of Bioengineering and Electrical and Systems Engineering, is a curious scientist.

Featured on a recent episode of “Choosing to be Curious” on WERA 96.7 Radio Arlington, Bassett discussed her work in studying curiosity and the potential neural mechanisms behind it. In her work, Bassett strives to re-conceptualize curiosity itself, defining it not just as seeking new bits of information, but as striving to understand the path through which those bits are connected.

Bassett is a pioneering researcher in the field of network science and how its tools can be applied to understand the brain. Now, Bassett and her research team are using the tools of network science and complex systems theory to uncover what common styles of curiosity people share and how individual styles differ. In addition, the team is exploring if there are canonical types of curiosity among humans or if each person’s curiosity architecture is unique.

This isn’t the first time Bassett has combined the tools of disparate fields to pursue her research. For as long as she can remember, Bassett has been insatiably curious and, while she was homeschooled as a child, she often wandered from one subject to the next and let her own interest guide her path. For Bassett, studying curiosity with the tools of physics, biology, and engineering is a natural step in her research journey.

In her interview with host Lynn Borton, Bassett says:

“What took me to curiosity is the observation that there’s a problem in defining the ways in which we search for knowledge. And that perhaps the understanding of curiosity could be benefitted by a scientific and mathematical approach. And that maybe the tools and conceptions that we have in mathematics and physics and other areas of science are useful for understanding curiosity. Which most people would consider to be more in the world of the humanities than the sciences….

“Part of what I’m hoping to do is to illustrate that there are connections between disciplines that seem completely separate. Sometimes some of the best ideas in science are inspired not by a scientific result but by something else.”

To hear more about Bassett’s research on curiosity, listen to the full episode of Choosing to Be Curious.

Originally posted on the Penn Engineering blog.

Dr. Danielle Bassett and Dr. Jason Burdick Named to Highly Cited Researchers List

by Sophie Burkholder

One way to measure the success or influence of a researcher is to consider how many times they’re cited by other researchers. Every published paper requires a reference section listing relevant earlier papers, and the Web of Science Group keeps track of how many times different authors are cited over the course of a year.

Danielle Bassett, Ph.D.

In 2019, two members of the Penn Bioengineering department, Jason Burdick, Ph.D., and Danielle Bassett, Ph.D., were named Highly Cited Researchers, indicating that each of them placed within the top 1% of citations in their field based on the Web of Science index. This past year, only about 6,300 researchers were recognized with this honor, a mere 0.1% of researchers worldwide. Bassett’s lab uses knowledge, brain, and dynamic networks to understand bioengineering problems at the systems level, while Burdick’s lab focuses on advances in tissue engineering through polymer design and development.

Robert D. Bent Chair
Jason Burdick, PhD

Burdick’s and Bassett’s naming to the list of Highly Cited Researchers demonstrates that their research had an outsized influence over current work in the field of bioengineering in the last year, and that new innovations continue to be developed from foundations these two Penn researchers created. To be included among such a small percentage of researchers worldwide indicates that Bassett and Burdick are sources of great impact and influence in bioengineering advancements today.

Danielle Bassett Receives New Scholarly Chair

Danielle Bassett, Ph.D.

Danielle Bassett has been named the J. Peter Skirkanich Professor of Bioengineering.

Dr. Bassett is a Professor in the department of Bioengineering at the School of Engineering and Applied Science. She holds a Ph.D. in Physics from the University of Cambridge and completed her postdoctoral training at the University of California, Santa Barbara, before joining Penn in 2013.

Dr. Bassett has received numerous awards for her research, including an Alfred P. Sloan Research Fellowship, a MacArthur Fellowship, an Office of Naval Research Young Investigator Award, a National Science Foundation CAREER Award, and, most recently, the Erdős–Rényi Prize in Network Science. She has authored over 190 peer-reviewed publications as well as numerous book chapters and teaching materials. She is the founding director of the Penn Network Visualization Program, a combined undergraduate art internship and K-12 outreach program bridging network science and the visual arts.

Continue reading at the Penn Engineering blog.

Penn Researchers’ Model Optimizes Brain Stimulation Therapies, Improving Memory in Tests

The researchers’ model involves mapping the connections between different regions of an individual’s brain while they performed a basic memory task, then using that data to predict how electrical stimulation in one region would affect activity throughout the network. Individuals’ improved performance on the same memory task after stimulation suggests the model could eventually be generalized toward a variety of stimulation therapies.

Brain stimulation, where targeted electrical impulses are directly applied to a patient’s brain, is already an effective therapy for depression, epilepsy, Parkinson’s and other neurological disorders, but many more applications are on the horizon. Clinicians and researchers believe the technique could be used to restore or improve memory and motor function after an injury, for example, but progress is hampered by how difficult it is to predict how the entire brain will respond to stimulation at a given region.

In an effort to better personalize and optimize this type of therapy, researchers from the University of Pennsylvania’s School of Engineering and Applied Science and Perelman School of Medicine, as well as Thomas Jefferson University Hospital and the University of California, Riverside, have developed a way to model how a given patient’s brain activity will change in response to targeted stimulation.

To test the accuracy of their model, they recruited a group of study participants who were undergoing an unrelated treatment for severe epilepsy, and thus had a series of electrodes already implanted in their brains. Using each individual’s brain activity data as inputs for their model, the researchers made predictions about how to best stimulate that participant’s brain to improve their performance on a basic memory test.

The participants’ brain activity before and after stimulation suggests that the researchers’ model has meaningful predictive power and offers a first step toward generalizing this approach to a variety of stimulation therapies.

Danielle Bassett and Jennifer Stiso.

The study, published in the journal Cell Reports, was led by Danielle Bassett, J. Peter Skirkanich Professor in Penn Engineering’s Department of Bioengineering, and Jennifer Stiso, a neuroscience graduate student in Penn Medicine and a member of Bassett’s Complex Systems Lab.

Read the full post on the Penn Engineering Medium blog. Media contact: Evan Lerner.

Penn Researchers Detect Brain Differences between Fast and Slow Learners

By Lauren Salig

These 12 object-number value pairs were taught to the participants, who had to properly learn the associations to succeed in value judgement tests. The researchers investigated the differences in their brain activity patterns to see why some were faster learners than others.

Why do some people naturally excel at learning instruments, languages or technology while others take longer to pick up new knowledge? Learning requires the brain to encode information, changing its neural “wiring” and creating networks between brain regions.

In a new study, researchers at the University of Pennsylvania’s School of Engineering and Applied Science and the Max Planck Institute for Dynamics and Self-Organization in Germany looked at how brain activation patterns might affect how long it takes for new information to really stick in the brain.

Earlier research has suggested that part of what might slow down learners is over-thinking. A 2015 study led by Danielle Bassett, Eduardo D. Glandt Faculty Fellow and associate professor in the Department of Bioengineering, showed a correlation between slow learning and cognitive control: the brain’s ability to regulate itself by activating the necessary networks and inhibiting unnecessary activity. In that study, when people unnecessarily engaged parts of the brain linked to cognitive control, they were more likely to take longer to learn a simple task.

But beyond what might make an individual learn more slowly, the researchers want to know what sort of geometric patterns of brain activity make for better learning.

Evelyn Tang and Danielle Bassett

Their new study was led by Bassett and Evelyn Tang, who was an Africk Family Postdoctoral Fellow in Bassett’s Complex Systems Lab before starting at the Max Planck Institute this fall. Sharon Thompson-Schill, Christopher H. Browne Distinguished Professor and chair of Psychology, also contributed to the study.

The study was published in the journal Nature Neuroscience.

Read the full story at the Penn Engineering Medium Blog.

BE’s Danielle Bassett Profiled in Science Magazine

Danielle Bassett, PhD

Danielle Bassett, Eduardo D. Glandt Faculty Fellow and Associate Professor in the Department of Bioengineering, grew up in central Pennsylvania, where she and her 10 siblings were homeschooled. Back then, Bassett had aspirations to become a professional pianist, a dream shattered by stress fractures in her arm at age 16.

Now, Bassett is a renowned physicist and MacArthur fellow who has pushed the field of network science, which studies connections and interactions between parts of a whole, to new realms. Bassett’s research focuses on brain function, including work on how brains of people with schizophrenia are organized, how brain communication changes with learning, and how the brain is able to switch between tasks.

Kelly Servick of Science sat down with Bassett to talk through her incredible journey from child pianist to leading network scientist:

“By 17, discouraged by her parents from attending college and disheartened at her loss of skill while away from the keys, she expected that responsibilities as a housewife and mother would soon eclipse any hopes of a career. ‘I wasn’t happy with that plan,’ she says.

Instead, Bassett catapulted herself into a life of research in a largely uncharted scientific field now known as network neuroscience. A Ph.D. physicist and a MacArthur fellow by age 32, she has pioneered the use of concepts from physics and math to describe the dynamic connections in the human brain. ‘She’s now the doyenne of network science,’ says theoretical neuroscientist Karl Friston of University College London. ‘She came from a formal physics background but was … confronted with some of the deepest questions in neuroscience.’”

Continue reading about Bassett’s career path and evolving research interests at Science.

Reposted from the Penn Engineering Medium blog. Media contact: Evan Lerner.

Brain Network Control Emerges over Childhood and Adolescence


The developing human brain contains a cacophony of electrical and chemical signals from which emerge the powerful adult capacities for decision-making, strategizing, and critical thinking. These signals support the trafficking of information across brain regions, in patterns that share many similarities with traffic patterns in railway and airline transportation systems. Yet while air traffic is guided by airport control towers, and railway routes are guided by signal control rooms, it remains a mystery how the information traffic in the brain is guided and how that guidance changes as kids grow.

In part, this mystery has been complicated by the fact that, unlike transportation systems, the brain is not hooked up to external controllers. Control must happen internally. The problem becomes even more complicated when we think about the sheer number of routes that must exist in the brain to support the full range of human cognitive capabilities. Thus, the controllers would need to produce a large set of control signals or use different control strategies. Where internal controllers might be, how they produce large variations in routing, and whether those controllers and their function change with age are important open questions.

A recent paper published in Nature Communications – a product of collaboration among the Departments of Bioengineering and Electrical & Systems Engineering at the University of Pennsylvania and the Department of Psychiatry of Penn’s Perelman School of Medicine – offers some interesting answers. In their article, Danielle Bassett, Ph.D., Eduardo D. Glandt Faculty Fellow and Associate Professor in the Penn BE Department, Theodore D. Satterthwaite, M.D., Assistant Professor in the Penn Psychiatry Department, postdoctoral fellow Evelyn Tang, and their colleagues suggest that control in the human brain works in a similar way to control in man-made robotic and other mechanical systems. Specifically, controllers exist inside each human brain, each region of the brain can perform multiple types of control, and this control grows as children grow.

As part of this study, the authors applied network control theory, an emerging area of systems engineering, to explain how the pattern of connections (or network) between brain areas directly informs the brain’s control functions. For example, hubs of the brain’s information trafficking system (like Grand Central Station in New York City) show quite different capacities for and sensitivities to control than non-hubs (like Newton Station, Kansas). Applying these ideas to a large set of brain imaging data from 882 youths in the Philadelphia area between the ages of 8 and 22, the authors found that the brain’s predicted capacity for control increases over development. Older youths have a greater predicted capacity to push their brains into nearby mental states, as well as into distant mental states, indicating a greater potential for diversity of mental operations than in younger youths.
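The idea of a region’s “capacity for control” can be made concrete with a toy calculation. In the linear model commonly used in network control theory, x(t+1) = A·x(t) + B·u(t), a node’s average controllability is often summarized as the trace of the finite-horizon controllability Gramian when input enters only at that node. The sketch below is a generic illustration on a made-up five-node network, not the computation from the paper, and the function names are invented for this example.

```python
def matmul(X, Y):
    """Multiply two square matrices given as lists of lists."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def average_controllability(A, node, horizon=10):
    """Trace of the finite-horizon controllability Gramian
    W = sum_{t=0}^{T-1} A^t B B^T (A^T)^t, with B a single column
    selecting `node`. Larger values mean input at that node can push
    the system into a wider range of nearby states."""
    n = len(A)
    trace = 1.0  # t = 0 term: B B^T contributes exactly 1 to the trace
    P = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for _ in range(1, horizon):
        P = matmul(P, A)                      # P = A^t
        col = [P[i][node] for i in range(n)]  # A^t B = column `node` of A^t
        trace += sum(c * c for c in col)      # tr((A^t B)(A^t B)^T)
    return trace

# Toy weighted network: node 0 is a hub linked to four peripheral nodes.
w = 0.2
A = [[0, w, w, w, w],
     [w, 0, 0, 0, 0],
     [w, 0, 0, 0, 0],
     [w, 0, 0, 0, 0],
     [w, 0, 0, 0, 0]]

hub, leaf = average_controllability(A, 0), average_controllability(A, 1)
print(hub > leaf)  # the hub exerts more control than a peripheral node
```

Even in this crude example, the hub and the peripheral node differ in their computed capacity for control, which is the kind of contrast (Grand Central versus Newton Station) the study quantified across real brain networks.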

The investigators then asked whether the principles of network control could explain the specific manner in which connections in the brain change as youths age. They used tools from evolutionary game theory – traditionally used to study Darwinian competition and evolving populations in biology – to ‘evolve’ brain networks in silico from their 8-year-old state to their 22-year-old state. The results demonstrated that the optimization of network control is a principle that explains the observed changes in brain connectivity as youths develop over childhood and adolescence. “One of the observations that I think is particularly striking about this study,” Bassett says, “is that the principles of network controllability are sufficient to explain the observed evolution in development, suggesting that we have identified a quintessential rule of developmental rewiring.”

This research informs many possible future directions in scientific research. “Showing that network control properties evolve during adolescence also suggests that abnormalities of this developmental process could be related to cognitive deficits that are present in many neuropsychiatric disorders,” says Satterthwaite. The discovery that the brain optimizes certain network control functions over time could have important implications for better understanding of neuroplasticity, skill acquisition, and developmental psychopathology.