Information Theory and Coding 2008–09

Lecturer: Dr John Daugman
Taken by: Part II
Prerequisite courses: Probability; Mathematical Methods for Computer Science
Number of lectures: 11 + 1 (Mon, Wed, Fri at 10, Lecture Theatre 2)

Aims

The aims of this course are to introduce the principles and applications of information theory. The course will study how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; how these are used to calculate the capacity of a communication channel, with and without noise; coding schemes, including error-correcting codes; how discrete channels and measures of information generalise to their continuous forms; the Fourier perspective; and extensions to wavelets, complexity, compression, and efficient coding of audio-visual information.

Lectures

Foundations: probability, uncertainty, information. How concepts of randomness, redundancy, compressibility, noise, bandwidth, and uncertainty are related to information. Ensembles, random variables, marginal and conditional probabilities. How the metrics of information are grounded in the rules of probability.

Entropies defined, and why they are measures of information. Marginal entropy, joint entropy, conditional entropy, and the Chain Rule for entropy. Mutual information between ensembles of random variables. Why entropy is the fundamental measure of information content.

Source coding theorem; prefix, variable-, and fixed-length codes. Symbol codes. The binary symmetric channel. Capacity of a noiseless discrete channel. Error-correcting codes.

Channel types, properties, noise, and channel capacity. Perfect communication through a noisy channel. Capacity of a discrete channel as the maximum of its mutual information over all possible input distributions.
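Before moving to the continuous case, a short numerical illustration of these discrete definitions may be helpful. The Python sketch below is not part of the course materials: the joint distribution and the crossover probability are invented values for illustration only. It computes the marginal, joint, and conditional entropies and the mutual information of a toy pair of binary variables, and evaluates the binary symmetric channel capacity C = 1 - H(p).

```python
# Sketch: entropy measures from a toy joint distribution.
# All numerical values here are illustrative, not from the course notes.
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum p log2 p (0 log 0 taken as 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over two binary variables.
p_xy = {(0, 0): 0.4, (0, 1): 0.1,
        (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions obtained by summing out the other variable.
p_x = [sum(p for (x, _), p in p_xy.items() if x == xv) for xv in (0, 1)]
p_y = [sum(p for (_, y), p in p_xy.items() if y == yv) for yv in (0, 1)]

H_X  = entropy(p_x)             # marginal entropy H(X)
H_Y  = entropy(p_y)             # marginal entropy H(Y)
H_XY = entropy(p_xy.values())   # joint entropy H(X,Y)

# Chain rule: H(X,Y) = H(X) + H(Y|X), so H(Y|X) = H(X,Y) - H(X).
H_Y_given_X = H_XY - H_X
# Mutual information: I(X;Y) = H(X) + H(Y) - H(X,Y).
I_XY = H_X + H_Y - H_XY

print(f"H(X) = {H_X:.4f} bits, H(Y) = {H_Y:.4f} bits")
print(f"H(X,Y) = {H_XY:.4f} bits, H(Y|X) = {H_Y_given_X:.4f} bits")
print(f"I(X;Y) = {I_XY:.4f} bits")

# Capacity of a binary symmetric channel with crossover probability p:
# C = 1 - H(p) bits per channel use, achieved by a uniform input.
p = 0.1  # illustrative crossover probability
C = 1 - entropy([p, 1 - p])
print(f"BSC capacity at p = {p}: C = {C:.4f} bits/use")
```

Note that I(X;Y) vanishes exactly when the joint distribution factorises into its marginals, and that the uniform input in the last step is what attains the maximum of mutual information for the binary symmetric channel.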
Continuous information; density; noisy channel coding theorem. Extensions of the discrete entropies and measures to the continuous case. Signal-to-noise ratio; power spectral density. Gaussian channels. Relative significance of bandwidth and noise limitations. The Shannon rate limit and efficiency for noisy continuous channels.

Fourier series, convergence, orthogonal representation. Generalised signal expansions in vector spaces. Independence. Representation of continuous or discrete data by complex exponentials. The Fourier basis. Fourier series for periodic functions. Examples.

Useful Fourier theorems; transform pairs. Sampling; aliasing. The Fourier transform for non-periodic functions. Properties of the transform, and examples. Nyquist's Sampling Theorem derived, and the cause (and removal) of aliasing (a small numerical illustration appears below, after the exercise assignments).

Discrete Fourier transform. Fast Fourier Transform algorithms. Efficient algorithms for computing Fourier transforms of discrete data. Computational complexity. Filters, correlation, modulation, demodulation, coherence.

The quantised degrees-of-freedom in a continuous signal. Why a continuous signal of finite bandwidth and duration has a fixed number of degrees-of-freedom. Diverse illustrations of the principle that information, even in such a signal, comes in quantised, countable packets.

Gabor-Heisenberg-Weyl uncertainty relation. Optimal "logons". Unification of the time domain and the frequency domain as endpoints of a continuous deformation. The Uncertainty Principle and its optimal solution by Gabor's expansion basis of "logons". Multi-resolution wavelet codes. Extension to images, for analysis and compression.

Kolmogorov complexity and minimal description length. Definition of the algorithmic complexity of a data sequence, and its relation to the entropy of the distribution from which the data was drawn. Fractals. Minimal description length, and why this measure of complexity is not computable.

Objectives

At the end of the course students should be able to:

- calculate the information content of a random variable from its probability distribution;
- relate the joint, conditional, and marginal entropies of variables in terms of their coupled probabilities;
- define channel capacities and properties using Shannon's Theorems;
- construct efficient codes for data on imperfect communication channels;
- generalise the discrete concepts to continuous signals on continuous channels;
- understand Fourier transforms and the main ideas behind efficient algorithms for them;
- describe the information resolution and compression properties of wavelets.

Recommended book

Cover, T.M. & Thomas, J.A. (1991). Elements of Information Theory. New York: Wiley.

Classical paper by Claude Shannon (1948) for reference: "A Mathematical Theory of Communication".

Lecture Notes (.pdf) (copy of Notes with corrected typos from the printed version highlighted) (.pdf)

Learning Guide and exercise problems (.pdf)

Exercise assignments from the Learning Guide:

- 17 October 2008: Exercises 1 and 2, 14(A), 19(A), and 20(A).
- 24 October 2008: Exercises 3(A), 9(A,B); 5(A) and 8(A-D).
- 31 October 2008: Exercises 4, 5(B), 9(D,E), 10(3), 13, 20(D), and 21.
- 7 November 2008: Supplementary material from the final lecture (compressive properties of 2D Gabor wavelets, and reconstruction from them) may be found here.
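As a supplement to the sampling and aliasing material above, here is a minimal, self-contained Python sketch using NumPy's FFT routines. The sample rate and tone frequencies are illustrative choices, not values from the lecture notes. It samples one sinusoid below and one above the Nyquist frequency fs/2 and locates the strongest bin in each magnitude spectrum: the under-sampled tone reappears at the aliased frequency fs - f.

```python
# Sketch: demonstrating Nyquist's criterion and aliasing with the DFT.
# The sample rate and tone frequencies below are illustrative only.
import numpy as np

fs = 100.0             # sampling rate in Hz; Nyquist limit fs/2 = 50 Hz
n = 1000               # number of samples (10 s of signal)
t = np.arange(n) / fs  # sample instants

def dominant_frequency(f_signal):
    """Sample a unit sinusoid at frequency f_signal and return the
    frequency of the strongest bin in its one-sided magnitude spectrum."""
    x = np.sin(2 * np.pi * f_signal * t)
    spectrum = np.abs(np.fft.rfft(x))     # magnitudes for bins 0..fs/2
    freqs = np.fft.rfftfreq(n, d=1/fs)    # bin centre frequencies
    return freqs[np.argmax(spectrum)]

# 20 Hz is below the Nyquist limit, so it is recovered correctly;
# 70 Hz violates the sampling theorem and aliases to fs - 70 = 30 Hz.
print(f"20 Hz tone -> {dominant_frequency(20.0):.1f} Hz")
print(f"70 Hz tone -> {dominant_frequency(70.0):.1f} Hz (aliased)")
```

The alias arises because, at a 100 Hz sample rate, the sampled values of a 70 Hz sinusoid coincide with those of a 30 Hz one; the removal of aliasing discussed in the lectures amounts to band-limiting the signal below fs/2 before sampling.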
Some supplementary material related to this course, prepared by Dr Markus Kuhn but not covered this year, is deposited here.

Syllabus
Past exam questions