Statistical physics and information theory perspectives on complex systems and networks
Complex physical, biological, and sociotechnical systems often display phenomena that cannot be understood with the traditional tools of any single discipline. We describe work on developing and applying theoretical methods to understand such phenomena, drawing on statistical physics, network science, spectral graph theory, information theory, and geometry.

Financial systems, being highly stochastic and populated by agents in a complex environment, offer a unique arena in which to develop and test new ways of thinking about complexity. We develop a framework for analyzing market dynamics motivated by linear response theory, and propose a model based on agent behavior that naturally incorporates external influences. We investigate central issues such as price dynamics, the processing and incorporation of information, and how agent behavior influences stability. We find that the mean-field behavior of our model captures important aspects of return dynamics, and we identify a transition between stable and unstable regimes that depends on easily measurable model parameters. Our methods naturally connect external factors to internal market features and behaviors, and therefore address the crucial question of how system stability relates to agent behavior and external forces.

Complex systems are often interconnected heterogeneously, with subunits influencing one another in counterintuitive ways that depend on the details of their connections. Correlations are insufficient to characterize such relations: they are symmetric, for example, and so cannot discern directional relationships. Synthesizing ideas from information theory and network theory, we introduce a general tool for studying directed relations in networks. Building on transfer entropy, we propose a measure, Effective Transfer Entropy Dependency, that quantifies influence by accounting for precisely how much of a source node's influence on its targets flows through intermediates.
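The abstract does not specify the construction of the proposed measure, but its transfer-entropy ingredient is standard. As a minimal sketch (not the thesis's implementation), the following estimates the transfer entropy T_{X→Y} for discrete series with history length 1, i.e. the extra predictability of Y's next value given X's current value beyond Y's own past:

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of transfer entropy T_{X->Y} (bits, history
    length 1) between two equal-length discrete sequences x and y."""
    n = len(y) - 1
    triples = Counter((y[t + 1], y[t], x[t]) for t in range(n))
    pairs_src = Counter((y[t], x[t]) for t in range(n))
    pairs_tgt = Counter((y[t + 1], y[t]) for t in range(n))
    hist = Counter(y[t] for t in range(n))
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        # Compare p(y1 | y0, x0) with p(y1 | y0): the gain from knowing x
        cond_full = c / pairs_src[(y0, x0)]
        cond_hist = pairs_tgt[(y1, y0)] / hist[y0]
        te += (c / n) * math.log2(cond_full / cond_hist)
    return te

# Toy check: y copies x with a one-step lag, so information flows x -> y
# but not y -> x, and T_{X->Y} should dominate T_{Y->X}.
random.seed(0)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]
```

Unlike a correlation, this quantity is directional: exchanging the roles of `x` and `y` generally gives a different value, which is what makes a transfer-entropy-based dependency measure able to identify influencers.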
We apply this measure to the indices of the world's major markets, finding that it anticipates same-day correlation structure from lagged time-series data and identifies influential markets that standard correlations miss.

Graphs are essential for understanding complex systems and datasets. We present new methods for identifying important structure in graphs, based on ideas from quantum information theory, statistical mechanics, and the renormalization group. Finally, we apply information geometry and spectral geometry to study the geometric structures that arise from graphs and random graph models, and we suggest extensions and applications to important problems such as graph partitioning and machine learning.
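The thesis's spectral methods are not detailed in the abstract; as a hedged illustration of the general idea behind spectral graph partitioning, the classic approach splits a graph by the sign pattern of the Fiedler vector of its Laplacian:

```python
import numpy as np

def fiedler_partition(adj):
    """Bisect a graph via the sign pattern of the Fiedler vector: the
    eigenvector of the second-smallest eigenvalue of the Laplacian L = D - A."""
    lap = np.diag(adj.sum(axis=1)) - adj
    _, vecs = np.linalg.eigh(lap)   # eigenvalues returned in ascending order
    return vecs[:, 1] >= 0          # boolean side label for each node

# Two triangles {0,1,2} and {3,4,5} joined by the single bridge edge (2, 3):
# the sign split recovers the two communities.
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
A = np.zeros((6, 6))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
side = fiedler_partition(A)
```

The Fiedler vector minimizes a relaxation of the graph cut, so its sign pattern tends to separate weakly connected clusters; this is the kind of structure-finding problem the spectral-geometry methods above address.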