Shannon's mutual information between two random variables is a fundamental and venerable concept in information and communication theory, statistics, and beyond. What is a measure of mutual dependence among an arbitrary number of random variables? A notion of 'shared information' among multiple terminals that observe correlated random variables ...
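As background for the question posed above, the classical two-variable quantity $I(X;Y)=\sum_{x,y} p(x,y)\log_2\frac{p(x,y)}{p(x)p(y)}$ can be computed directly from a joint pmf. The sketch below is illustrative only and is not taken from the talk; the function name and the example distributions are my own choices.

```python
import math

def mutual_information(joint):
    """Shannon mutual information I(X;Y) in bits from a joint pmf.

    `joint[i][j]` holds P(X=i, Y=j); normalization is assumed, not checked.
    """
    # Marginal distributions of X (row sums) and Y (column sums).
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:  # terms with zero probability contribute nothing
                mi += p * math.log2(p / (px[i] * py[j]))
    return mi

# Perfectly correlated fair bits share one full bit of information.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))   # 1.0
# Independent fair bits share none.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

For more than two variables, no single generalization of this formula is canonical, which is precisely the gap the notion of 'shared information' aims to address.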
Narayan, Prakash (University of Maryland)
University of Minnesota, Institute for Mathematics and its Applications.