The talk will be suitable for a general audience.
Students are strongly encouraged to participate.
In the view of C.E. Shannon, the father of information theory, information
is a sort of substance communicated or disclosed by the occurrence of an
event in the course of a probabilistic "experiment". He defined
the quantity of information disclosed by such an event to be the logarithm
(in some base, usually two) of the reciprocal of the prior probability of
the event.
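Shannon's definition can be illustrated numerically; here is a minimal sketch (the function name and interface are ours, not Shannon's):

```python
import math

def information_content(p, base=2):
    """Quantity of information disclosed by an event with prior
    probability p, per Shannon: log_base(1 / p)."""
    if not 0 < p <= 1:
        raise ValueError("p must be in (0, 1]")
    return math.log(1 / p, base)

# A fair coin toss (p = 1/2) discloses one bit of information.
print(information_content(0.5))
# A rarer event (p = 1/8) discloses more: three bits.
print(information_content(0.125))
```

Note that the rarer the event, the more information its occurrence discloses, and an event that is certain (p = 1) discloses none.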
We'll see how Shannon apparently arrived at this definition, and look at the brilliant justification of it due to Aczél and Daróczy. We'll also look at two purely mathematical theorems, the Noiseless and Noisy Channel Coding Theorems, as philosophical evidence in favor of the validity of Shannon's approach.
Information on future (and past) Colloquia can also be found on the web at: