Twin academics Dani S. Bassett, J. Peter Skirkanich Professor and director of the Complex Systems Lab, and Perry Zurn, a professor of philosophy at American University, were recently featured as guests on the NPR radio show “Detroit Today” to discuss their new book, “Curious Minds: The Power of Connection.”
In their book, Bassett and Zurn draw on their previous research, as well as an expansive network of ideas from philosophy, history, education, and art, to explore how and why people experience curiosity and the different forms it can take.
Bassett, who holds appointments in the Departments of Bioengineering and Electrical and Systems Engineering in Penn Engineering, the Department of Physics and Astronomy in Penn Arts & Sciences, and the Departments of Neuroscience and Psychiatry in Penn’s Perelman School of Medicine, joined Zurn in speaking with “Detroit Today” producer Sam Corey about what makes people curious and how to stimulate more curiosity in our everyday lives.
According to the twin experts, curiosity is not a standalone facet of one’s personality. Bassett and Zurn’s work has shown that a person’s capacity for inquiry is closely tied to the overall state of their health.
“There’s a lot of scientific research focusing on intellectual humility and also openness to ideas,” says Bassett. “And there are really interesting relationships between someone’s openness to ideas, someone’s intellectual humility and their curiosity, and also their wellbeing or flourishing.”
From smartphones and fitness trackers to social media posts and COVID-19 cases, the past few years have seen an explosion in the amount and types of data that are generated daily. To help make sense of these large, complex datasets, the field of data science has grown, providing methodologies, tools, and perspectives across a wide range of academic disciplines.
As part of its $750 million investment in science, engineering, and medicine, the University has committed to supporting the future needs of this field. To this end, the Innovation in Data Engineering and Science (IDEAS) initiative will help Penn become a leader in developing data-driven approaches that can transform scientific discovery, engineering research, and technological innovation.
“The IDEAS initiative is game-changing for our University,” says President Amy Gutmann. “This new investment allows us to boost our interdisciplinary efforts across campus, recruit phenomenal additional team members, and generate an even more sound foundation for discovery, experimentation, and design. This initiative is a clear statement that Penn is committed to taking data science head-on.”
“One of the unique things about data science and data engineering is that it’s a very horizontal technology, one that is going to be impacting every department on campus,” says George Pappas, Electrical and Systems Engineering Department chair. “When you have a horizontal technology in a competitive area, we have to figure out specific areas where Penn can become a worldwide leader.”
To do this, IDEAS aims to recruit new faculty across three research areas: artificial intelligence (AI) to transform scientific discovery, trustworthy AI for autonomous systems, and understanding connections between the human brain and AI.
In the area of neuroscience and how the human brain is similar to AI and machine learning approaches, research from PIK Professor Konrad Kording and Dani Bassett’s Complex Systems Lab exemplifies the types of cross-disciplinary efforts that are essential for addressing complex questions. By recruiting additional faculty in this area, IDEAS will help Penn make strides in bio-inspired computing and in future life-changing discoveries that could address cognitive disorders and nervous system diseases.
Dani S. Bassett, J. Peter Skirkanich Professor in Bioengineering and in Electrical and Systems Engineering
Bassett runs the Complex Systems Lab, which tackles problems at the intersection of science, engineering, and medicine using systems-level approaches, exploring fields such as curiosity, dynamic networks in neuroscience, and psychiatric disease. They are a pioneer in the emerging field of network science, which combines mathematics, physics, biology, and systems engineering to better understand how the overall shape of connections between individual neurons influences cognitive traits.
Jason A. Burdick, Robert D. Bent Professor in Bioengineering
Burdick runs the Polymeric Biomaterials Laboratory, which develops polymer networks for fundamental and applied biomedical studies, with a specific emphasis on tissue regeneration and drug delivery. The specific targets of his research include scaffolding for cartilage regeneration, controlling stem cell differentiation through material signals, electrospinning and 3D printing for scaffold fabrication, and injectable hydrogels for therapies after a heart attack.
César de la Fuente, Presidential Assistant Professor in Bioengineering and Chemical & Biomedical Engineering in Penn Engineering and in Microbiology and Psychiatry in the Perelman School of Medicine
De la Fuente runs the Machine Biology Group which combines the power of machines and biology to prevent, detect, and treat infectious diseases. He pioneered the development of the first antibiotic designed by a computer with efficacy in animals, designed algorithms for antibiotic discovery, and invented rapid low-cost diagnostics for COVID-19 and other infections.
Carl H. June, Richard W. Vague Professor in Immunotherapy in the Perelman School of Medicine and member of the Bioengineering Graduate Group
June is the Director of the Center for Cellular Immunotherapies and the Parker Institute for Cancer Immunotherapy and runs the June Lab, which develops new forms of T cell-based therapies. June’s pioneering research in gene therapy led to FDA approval of CAR T therapy for treating acute lymphoblastic leukemia (ALL), one of the most common childhood cancers.
Vivek Shenoy, Eduardo D. Glandt President’s Distinguished Professor in Bioengineering, Mechanical Engineering and Applied Mechanics (MEAM), and in Materials Science and Engineering (MSE)
Shenoy runs the Theoretical Mechanobiology and Materials Lab which develops theoretical concepts and numerical principles for understanding engineering and biological systems. His analytical methods and multiscale modeling techniques gain insight into a myriad of problems in materials science and biomechanics.
The highly anticipated annual list identifies researchers who demonstrated significant influence in their chosen field or fields through the publication of multiple highly cited papers during the last decade. Their names are drawn from the publications that rank in the top 1% by citations for field and publication year in the Web of Science™ citation index.
Bassett and Burdick were both on the Highly Cited Researchers list in 2019 and 2020.
The methodology that determines the “who’s who” of influential researchers draws on the data and analysis performed by bibliometric experts and data scientists at the Institute for Scientific Information™ at Clarivate. It also uses the tallies to identify the countries and research institutions where these elite scientists are based.
David Pendlebury, Senior Citation Analyst at the Institute for Scientific Information at Clarivate, said: “In the race for knowledge, it is human capital that is fundamental and this list identifies and celebrates exceptional individual researchers who are having a great impact on the research community as measured by the rate at which their work is being cited by others.”
The full 2021 Highly Cited Researchers list and executive summary can be found online here.
The prestigious APS Fellowship Program signifies recognition by one’s professional peers. Each year, no more than one half of one percent of the APS membership is recognized with this distinct honor. Bassett’s election and groundbreaking work in biological physics and network science will be recognized through presentation of a certificate at the APS March Meeting.
Bassett is a pioneer in the field of network neuroscience, an emerging subfield which incorporates elements of mathematics, physics, biology and systems engineering to better understand how the overall shape of connections between individual neurons influences cognitive traits. They lead the Complex Systems lab which tackles problems at the intersection of science, engineering, and medicine using systems-level approaches, exploring fields such as curiosity, dynamic networks in neuroscience, and psychiatric disease.
Bassett recently collaborated with Penn artist-in-residence Rebecca Kamen and other scholars on an interdisciplinary art exhibit on the creative process in art and science at the Katzen Art Center at American University. They have also published research modeling different types of curiosity and exploring gender-based citation bias in neuroscience publishing.
“I’m thrilled and humbled to receive this honor from the American Physical Society,” says Bassett. “I am indebted to the many fantastic mentees, colleagues, and mentors that have made my time in science such an exciting adventure. Thank you.”
Last month, the second annual Women in Data Science (WiDS) @ Penn Conference virtually gathered nearly 500 registrants to participate in a week’s worth of academic and industry talks, live speaker Q&A sessions, and networking opportunities.
Following welcoming remarks from Erika James, Dean of the Wharton School, and Vijay Kumar, Nemirovsky Family Dean of Penn Engineering, the conference began with a keynote address from President of Microsoft US and Wharton alumna Kate Johnson.
Conference sessions continued throughout the week, featuring panels of academic data scientists from around Penn and beyond, industry leaders from IKEA Digital, Facebook and Poshmark, and lightning talks from student speakers who presented their data science research.
All of the conference’s sessions are now available on YouTube in the 2021 WiDS Conference Recap, including a talk titled “How Humans Build Models for the World” by Danielle Bassett, J. Peter Skirkanich Professor in Bioengineering and Electrical and Systems Engineering.
Curiosity has been found to play a role in our learning and emotional well-being, but due to the open-ended nature of how curiosity is actually practiced, measuring it is challenging. Psychological studies have attempted to gauge participants’ curiosity through their engagement in specific activities, such as asking questions, playing trivia games, and gossiping. However, such methods focus on quantifying a person’s curiosity rather than understanding the different ways it can be expressed.
Efforts to better understand what curiosity actually looks like for different people have underappreciated roots in the field of philosophy. Varying styles have been described with loose archetypes, like “hunter” and “busybody” — evocative, but hard to objectively measure when it comes to studying how people collect new information.
A new study led by researchers at the University of Pennsylvania’s School of Engineering and Applied Science, the Annenberg School for Communication, and the Department of Philosophy and Religion at American University uses Wikipedia browsing as a method for describing curiosity styles. Using a branch of mathematics known as graph theory, their analysis of curiosity opens doors for using it as a tool to improve learning and life satisfaction.
The interdisciplinary study, published in Nature Human Behaviour, was undertaken by Danielle Bassett, J. Peter Skirkanich Professor in Penn Engineering’s Departments of Bioengineering and Electrical and Systems Engineering; David Lydon-Staley, then a post-doctoral fellow in her lab and now an assistant professor in the Annenberg School for Communication; two members of Bassett’s Complex Systems Lab, graduate student Dale Zhou and postdoctoral fellow Ann Sizemore Blevins; and Perry Zurn, assistant professor in American University’s Department of Philosophy.
“The reason this paper exists is because of the participation of many people from different fields,” says Lydon-Staley. “Perry has been researching curiosity in novel ways that show the spectrum of curious practice and Dani has been using networks to describe form and function in many different systems. My background in human behavior allowed me to design and conduct a study linking the styles of curiosity to a measurable activity: Wikipedia searches.”
Zurn’s research on how different people express curiosity provided a framework for the study.
The nature of scientific progress is often summarized by the Isaac Newton quotation, “If I have seen further it is by standing on the shoulders of giants.” Each new study draws on dozens of earlier ones, forming a chain of knowledge stretching back to Newton and the scientific giants his work referenced.
Scientific publishing and referencing has become more formal since Newton’s time, with databases of citations allowing for sophisticated quantitative analyses of that flow of information between researchers.
The Institute for Scientific Information and the Web of Science Group provide a yearly snapshot of this flow, publishing a list of the researchers who are in the top 1 percent of their respective fields when it comes to the number of times their work has been cited.
Danielle Bassett, J. Peter Skirkanich Professor in the departments of Bioengineering and Electrical and Systems Engineering, and Jason Burdick, Robert D. Bent Professor in the department of Bioengineering, are among the 6,389 researchers named to the 2020 list.
Bassett is a pioneer in the field of network neuroscience, which incorporates elements of mathematics, physics, biology and systems engineering to better understand how the overall shape of connections between individual neurons influences cognitive traits. Burdick is an expert in tissue engineering and the design of biomaterials for regenerative medicine; by precisely tailoring the microenvironment within these materials, they can influence stem cell differentiation or trigger the release of therapeutics.
In a ‘Wired’ feature, Bassett helps explain the growing field of network neuroscience and how the form and function of the brain are connected.
Early attempts to understand how the brain works included the pseudoscience of phrenology, which theorized that various mental functions could be determined through the shape of the skull. While those theories have long been debunked, modern neuroscience has shown a kernel of truth to them: those functions are highly localized to different regions of the brain.
Now, Danielle Bassett, J. Peter Skirkanich Professor of Bioengineering and Electrical and Systems Engineering, is pioneering a new subfield that goes even deeper into the connection between the brain’s form and function: network neuroscience.
In a recent feature article in Wired, Bassett explains the concepts behind this new subfield. While prior understanding has long relied on the idea that certain areas of the brain control certain functions, Bassett and other network neuroscientists are using advances in imaging and machine learning to reveal the role the connections between those areas play.
For Bassett, one of the first indicators that these connections mattered more than previously realized was the shape of the neurons themselves.
Speaking with Wired’s Grace Huckins, Bassett says:
“Neurons are not spherical — neurons have a cell body, and then they have this long tail that allows them to connect to many other cells. You can even look at the morphology of the neuron and say, ‘Oh, well, connectivity has to matter. Otherwise, it wouldn’t look like this.’”
Read more about Bassett and the field of network neuroscience in Wired.
New research finds that works of literature, musical pieces, and social networks have a similar underlying structure that allows them to share large amounts of information efficiently.
By Erica K. Brockmeier
To an English scholar or avid reader, the Shakespeare Canon represents some of the greatest literary works of the English language. To a network scientist, Shakespeare’s 37 plays and the 884,421 words they contain also represent a massively complex communication network. Network scientists, who employ math, physics, and computer science to study vast and interconnected systems, are tasked with using statistically rigorous approaches to understand how complex networks, like all of Shakespeare, convey information to the human brain.
New research published in Nature Physics uses tools from network science to explain how complex communication networks can efficiently convey large amounts of information to the human brain. Conducted by postdoc Christopher Lynn, graduate students Ari Kahn and Lia Papadopoulos, and professor Danielle S. Bassett, the study found that different types of networks, including those found in works of literature, musical pieces, and social connections, have a similar underlying structure that allows them to share information rapidly and efficiently.
Technically speaking, a network is simply a statistical and graphical representation of connections, known as edges, between different endpoints, called nodes. In pieces of literature, for example, a node can be a word, and an edge can connect words when they appear next to each other (“my” — “kingdom” — “for” — “a” — “horse”) or when they convey similar ideas or concepts (“yellow” — “orange” — “red”).
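This construction can be made concrete in a few lines of code. The sketch below builds a word-adjacency network from a short phrase; the splitting rule and the undirected treatment of adjacency are illustrative choices for demonstration, not the study’s exact method:

```python
from collections import defaultdict

def build_word_network(text):
    """Build a word-adjacency network: each unique word is a node,
    and an edge connects words that appear next to each other."""
    words = text.lower().split()
    edges = defaultdict(set)
    for a, b in zip(words, words[1:]):
        if a != b:
            edges[a].add(b)
            edges[b].add(a)  # treat adjacency as undirected
    return edges

# Adjacent words become linked nodes
network = build_word_network("my kingdom for a horse")
print(sorted(network["kingdom"]))  # → ['for', 'my']
```

Scaling the same loop over all 884,421 words of the Shakespeare Canon would yield the kind of massive network that network scientists then analyze as a whole.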
The advantage of using network science to study things like languages, says Lynn, is that once relationships are defined on a small scale, researchers can use those connections to make inferences about a network’s structure on a much larger scale. “Once you define the nodes and edges, you can zoom out and start to ask about what the structure of this whole object looks like and why it has that specific structure,” says Lynn.
Building on the group’s recent study that models how the brain processes complex information, the researchers developed a new analytical framework for determining how much information a network conveys and how efficient it is in conveying that information. “In order to calculate the efficiency of the communication, you need a model of how humans receive the information,” he says.
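The study’s full framework also models the errors humans make when perceiving network structure, but a simpler, standard baseline helps illustrate the idea of measuring information in a network: the entropy rate of a random walk, which quantifies how many bits of surprise each step conveys to an idealized observer. This is an illustrative sketch, not the paper’s method:

```python
import math

def entropy_rate(adj):
    """Entropy rate (bits per step) of a random walk on an undirected,
    unweighted graph. The walker visits each node with stationary
    probability proportional to its degree, and each step from a node
    of degree d carries log2(d) bits of surprise."""
    degrees = {node: len(nbrs) for node, nbrs in adj.items()}
    total = sum(degrees.values())  # equals 2 * number of edges
    return sum((d / total) * math.log2(d) for d in degrees.values() if d > 0)

# A triangle: every node has degree 2, so the walker always faces
# two equally likely moves
triangle = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b"}}
print(entropy_rate(triangle))  # ≈ 1.0 bit per step
```

Networks with the same number of edges can have very different entropy rates, which is one way structure alone changes how efficiently information flows.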
Researchers develop a new model for how the brain processes complex information: by striking a balance between accuracy and simplicity while making mistakes along the way.
By Erica K. Brockmeier
The human brain is a highly advanced information processor composed of more than 86 billion neurons. Humans are adept at recognizing patterns from complex networks, such as languages, without any formal instruction. Previously, cognitive scientists tried to explain this ability by depicting the brain as a highly optimized computer, but there is now discussion among neuroscientists that this model might not accurately reflect how the brain works.
Now, Penn researchers have developed a different model for how the brain interprets patterns from complex networks. Published in Nature Communications, this new model shows that the ability to detect patterns stems in part from the brain’s goal to represent things in the simplest way possible. Their model depicts the brain as constantly balancing accuracy with simplicity when making decisions. The work was conducted by physics Ph.D. student Christopher Lynn, neuroscience Ph.D. student Ari Kahn, and Danielle Bassett, J. Peter Skirkanich Professor in the departments of Bioengineering and Electrical and Systems Engineering.
This new model is built upon the idea that people make mistakes while trying to make sense of patterns, and these errors are essential to get a glimpse of the bigger picture. “If you look at a pointillist painting up close, you can correctly identify every dot. If you step back 20 feet, the details get fuzzy, but you’ll gain a better sense of the overall structure,” says Lynn.
To test their hypothesis, the researchers ran a set of experiments similar to a previous study by Kahn. That study found that when participants were shown repeating elements in a sequence, such as A-B-C-B, etc., they were automatically sensitive to certain patterns without being explicitly aware that the patterns existed. “If you experience a sequence of information, such as listening to speech, you can pick up on certain statistics between elements without being aware of what those statistics are,” says Kahn.
To understand how the brain automatically understands such complex associations within sequences, 360 study participants were shown a computer screen with five gray squares corresponding to five keys on a keyboard. As two of the five squares changed from gray to red, the participants had to strike the computer keys that corresponded to the changing squares. To the participants, the pattern of color-changing squares appeared random, but the sequences were actually generated using two kinds of networks.
The researchers found that the structure of the network impacted how quickly the participants could respond to the stimuli, an indication of their expectations of the underlying patterns. Responses were quicker when participants were shown sequences that were generated using a modular network compared to sequences coming from a lattice network.
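A minimal sketch of how such stimulus sequences can be produced as random walks on the two kinds of graphs. The 15-node sizes, specific bridge edges, and cluster layout below are illustrative choices, not the study’s exact graphs:

```python
import random

def modular_graph():
    """15 nodes in three 5-node clusters: dense connections inside each
    cluster, with a few bridge edges linking clusters together."""
    adj = {n: set() for n in range(15)}
    for c in range(3):  # fully connect each cluster of five
        cluster = range(5 * c, 5 * c + 5)
        for i in cluster:
            for j in cluster:
                if i != j:
                    adj[i].add(j)
    for a, b in [(4, 5), (9, 10), (14, 0)]:  # bridges between clusters
        adj[a].add(b)
        adj[b].add(a)
    return adj

def ring_lattice(n=15, k=4):
    """Ring lattice: each node connects to its k nearest neighbors,
    with no cluster structure at all."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for step in range(1, k // 2 + 1):
            adj[i].add((i + step) % n)
            adj[i].add((i - step) % n)
    return adj

def walk_sequence(adj, length, seed=0):
    """Random walk on a graph: the stimulus order shown to a participant,
    where each node would map to a key-press combination."""
    rng = random.Random(seed)
    node = rng.choice(sorted(adj))
    seq = [node]
    for _ in range(length - 1):
        node = rng.choice(sorted(adj[node]))
        seq.append(node)
    return seq

seq = walk_sequence(modular_graph(), 10)
print(seq)  # consecutive stimuli are always connected in the graph
```

Both graphs look locally similar to a walker, but only the modular graph has large-scale cluster structure, which is exactly the kind of pattern participants picked up on without being aware of it.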