EE 8??3 - Information Theory

Effective date: Spring Semester 1999

New catalog listing: EE 8??3: Information Theory. (3). Three hour lecture. Entropy; Mutual Information; Markov Chains; Source Coding; Hypothesis Testing; Fisher Information; Rate Distortion Theory; applications to detection/estimation theory, communications, and time series models. Prerequisite: Consent of instructor.

1. DETAILED COURSE OUTLINE

LECTURE

1. Definitions of Entropy and Mutual Information (6 hours)
   A. Entropy
   B. Relative Entropy
   C. Chain Rules for Information Measures
   D. Jensen's Inequality
   E. Fano's Inequality
   F. The Asymptotic Equipartition Property

2. Markov Processes (4 hours)
   A. Markov Chains
   B. Entropy Rate
   C. Random Walk
   D. Hidden Markov Models

3. Optimal Coding (10 hours)
   A. Examples of Codes
   B. The Kraft Inequality
   C. Optimal Codes
   D. Uniquely Decodable Codes
   E. Huffman Coding
   F. Shannon-Fano Codes
   G. Arithmetic Coding
   H. Competitive Optimality of the Shannon Code
   I. Fair Coins
   J. The Entropy of English

4. Kolmogorov Complexity (3 hours)
   A. Complexity and Its Applications
   B. Occam's Razor
   C. The Kolmogorov Sufficient Statistic

5. Channel Capacity (4 hours)
   A. The Channel Coding Theorem
   B. Hamming Codes
   C. Feedback Capacity
   D. The Joint Source Channel Coding Theorem

6. Entropy and Gaussian Channels (4 hours)
   A. Discrete Entropy
   B. Relative Entropy and Mutual Information
   C. Properties of Differential Entropy
   D. The Maximum Entropy Theorem and Its Applications

7. Information Theory and Statistics (4 hours)
   A. The Law of Large Numbers
   B. The Conditional Limit Theorem
   C. Hypothesis Testing
   D. Lempel-Ziv Coding and the Cramér-Rao Inequality

8. Rate Distortion Theory (4 hours)
   A. Quantization
   B. Calculation of the Rate Distortion Function
   C. Characterization of the Rate Distortion Function
   D. Computation of the Channel Capacity

9. Inequalities in Information Theory (3 hours)
   A. Bounds on Entropy and Relative Entropy
   B. Entropy and Fisher Information
   C. Inequalities for Determinants

10. Exams (3 hours)
   A. Semester Exam No. 1 (typically after 15 lectures)
   B. Semester Exam No. 2 (typically after 30 lectures)
   C. Semester Exam No. 3 (typically after 42 lectures)

2. METHOD OF EVALUATION

Semester Exams: 75%
Final Exam: 25%

3. JUSTIFICATION

This course is being added to the Electrical and Computer Engineering curriculum as an upper-level graduate course designed to develop an in-depth theoretical understanding of information theory. Fundamental concepts involving the measure of information are introduced, and many related theorems and proofs are studied. The course is expected to develop a student's ability to think at a conceptual level and to improve the mathematical rigor of our advanced graduate students. It introduces concepts fundamental to many areas of statistical signal processing, including the applications courses offered in communications, speech, and image processing.

4. SUPPORT

This course will be taught by existing personnel in the Department of Electrical and Computer Engineering. The new workload has been considered in the departmental staffing plan. Semester-long projects utilize existing computer facilities in the department. Current library holdings are adequate to meet the needs of this course.

5. INSTRUCTOR OF RECORD

Joseph Picone

6. GRADUATE STUDENT REQUIREMENTS

None.

7. PLANNED FREQUENCY

Once every two years, in the spring semester.

8. EXAMINATION OF DUPLICATION

This new course will not duplicate any existing course in the MSU Bulletin.

9. METHOD OF INSTRUCTION CODE

???

10. PROPOSED C.I.P. NUMBER

???

11. PROPOSED 20-CHARACTER ABBREVIATION

ECE INFO THEORY

12. PROPOSED SEMESTER EFFECTIVE

Spring 1999

13. OTHER APPROPRIATE INFORMATION

Textbook: T.M. Cover and J.A. Thomas, Elements of Information Theory, Wiley-Interscience, 1991.

14. PROPOSAL CONTACT PERSON

Mike Nosser, 325-3912
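
The entropy and source-coding topics in the lecture outline (entropy, the Kraft inequality, Huffman coding) admit a compact numerical illustration. The sketch below, in Python, uses helper names of our own choosing rather than anything from the textbook: it computes the entropy of a dyadic source, derives binary Huffman code lengths, and verifies the Kraft inequality. For a dyadic pmf the average Huffman code length equals the entropy exactly.

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_code_lengths(probs):
    """Code lengths of a binary Huffman code for the given pmf."""
    # Each heap entry: (probability, tie-breaker, symbol indices in subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        # Merging two subtrees adds one bit to every symbol beneath them.
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

pmf = [0.5, 0.25, 0.125, 0.125]
H = entropy(pmf)                       # 1.75 bits
L = huffman_code_lengths(pmf)          # [1, 2, 3, 3]
avg_len = sum(p * l for p, l in zip(pmf, L))   # equals H for a dyadic pmf
kraft = sum(2 ** -l for l in L)        # <= 1 for any uniquely decodable code

print(H, avg_len, kraft)
```

The Kraft sum here is exactly 1, as expected for a complete prefix code; for non-dyadic sources the average length instead falls within one bit of the entropy.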