In 1948 Shannon advanced information theory (IT) as a new branch of mathematics and a powerful tool for understanding the intricacies of the communication process. Nine years later Jaynes conceived the maximum entropy principle and used it to shed much light on statistical...
Persistent link: https://www.econbiz.de/10010588670
Fisher's information measures, as adapted to a nonextensive (Tsallis) environment, are discussed. For systems of particles in a general state of motion, a lower bound to these information measures is derived with the help of a recently established upper bound to the entropy increase....
Persistent link: https://www.econbiz.de/10011057684
Fourteen years ago John Archibald Wheeler launched his “it from bit” project to try to derive physics from information theory. We advance here a conceptual framework for this Wheeler program (WP), and summarize some recent progress with reference to different aspects of the role that...
Persistent link: https://www.econbiz.de/10011064050