Course Title: Information and Coding Theory
Type of Course: Optional, Theory
Offered to: EEE
Pre-requisite Course(s): None
Entropy and Mutual Information: Entropy, joint entropy, and conditional entropy; relative entropy and mutual information; chain rules for entropy, relative entropy, and mutual information; Jensen's inequality and the log-sum inequality
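The core definitions above can be made concrete with a short computation. The following Python sketch (helper names are our own, not taken from the course texts) computes entropy from a pmf and mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) from a joint pmf joint[x][y]."""
    px = [sum(row) for row in joint]              # marginal of X
    py = [sum(col) for col in zip(*joint)]        # marginal of Y
    h_xy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - h_xy

print(entropy([0.5, 0.5]))                               # 1.0 bit for a fair coin
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0: X and Y independent
```

The uniform joint pmf factors into its marginals, so the mutual information vanishes, as the chain rules predict.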
Differential Entropy: Differential entropy and discrete entropy, joint and conditional differential entropy, properties of differential entropy, relative entropy and mutual information
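For continuous sources, the Gaussian case has the closed form h(X) = 0.5 log2(2*pi*e*sigma^2), which also illustrates a key contrast with discrete entropy: differential entropy can be negative. A minimal sketch (the function name is our own):

```python
import math

def gaussian_diff_entropy(sigma2):
    """Differential entropy h(X) = 0.5*log2(2*pi*e*sigma^2), in bits,
    of a Gaussian random variable with variance sigma2."""
    return 0.5 * math.log2(2 * math.pi * math.e * sigma2)

print(gaussian_diff_entropy(1.0))   # ~2.047 bits for unit variance
print(gaussian_diff_entropy(0.01))  # negative: a sharply concentrated density
```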
Entropy Rates of a Stochastic Process: Markov chains, entropy rate, and hidden Markov models
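For a stationary Markov chain, the entropy rate reduces to the stationary average of the per-state transition entropies, H = sum_i mu_i * H(P(.|i)). A small sketch under the assumption of a symmetric two-state chain as the example:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def markov_entropy_rate(P, mu):
    """Entropy rate H = sum_i mu_i * H(P[i]) of a stationary Markov chain
    with transition matrix P (rows sum to 1) and stationary distribution mu."""
    return sum(m * entropy(row) for m, row in zip(mu, P))

# Symmetric two-state chain, flip probability 0.1; stationary dist. is uniform.
P = [[0.9, 0.1], [0.1, 0.9]]
print(markov_entropy_rate(P, [0.5, 0.5]))  # ~0.469 bits/symbol = H(0.1)
```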
Source Coding: Kraft inequality, optimal codes, Huffman code and its optimality, Shannon-Fano-Elias coding, arithmetic coding
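Huffman's greedy merging of the two least-probable subtrees can be sketched in a few lines with a heap (a toy illustration, not a production coder); for dyadic probabilities the resulting codeword lengths satisfy the Kraft inequality with equality:

```python
import heapq

def huffman(pmf):
    """Build a binary Huffman code for {symbol: probability}; returns
    {symbol: codeword}. Toy sketch: assumes at least two symbols."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(pmf.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # two least-probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

code = huffman({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
kraft = sum(2 ** -len(w) for w in code.values())
print(code)   # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
print(kraft)  # 1.0: Kraft inequality met with equality for dyadic pmfs
```

The expected length here equals the entropy (1.75 bits), which is exactly the optimality statement for dyadic sources.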
Channel Capacity: Binary symmetric channels and properties of channel capacity, channel coding theorems, joint source and channel coding theorem
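The binary symmetric channel has the closed-form capacity C = 1 - H(p), with H the binary entropy function; a quick sketch (our own helper names):

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity C = 1 - H(p) of a binary symmetric channel with crossover p."""
    return 1.0 - h2(p)

print(bsc_capacity(0.0))  # 1.0: a noiseless channel carries one bit per use
print(bsc_capacity(0.5))  # 0.0: at p = 0.5 the output is independent of the input
```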
Channel Coding: Block coding and decoding; BCH and RS codes; convolutional coding and the Viterbi decoder; turbo codes and their decoding techniques; space-time (STBC), space-frequency (SFBC), and space-time-frequency (STFBC) block codes
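Full BCH, RS, convolutional, or turbo decoders are beyond a snippet, but the block-coding idea (adding redundancy at the encoder so the decoder can correct channel errors) can be shown in miniature with a toy rate-1/3 repetition code and majority-vote decoding:

```python
def encode_rep(bits, n=3):
    """Rate-1/n repetition code: repeat each information bit n times."""
    return [b for bit in bits for b in [bit] * n]

def decode_rep(received, n=3):
    """Majority-vote decoding; corrects up to (n-1)//2 bit errors per block."""
    return [1 if sum(received[i:i + n]) > n // 2 else 0
            for i in range(0, len(received), n)]

tx = encode_rep([1, 0, 1])  # [1,1,1, 0,0,0, 1,1,1]
tx[1] ^= 1                  # the channel flips one bit in the first block
print(decode_rep(tx))       # [1, 0, 1]: the single error is corrected
```

BCH, RS, and convolutional codes achieve the same protection at far higher rates; the repetition code only makes the encode/decode contract visible.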
Gaussian Channel: Introduction to the Gaussian channel, band-limited channels, parallel Gaussian channels, and the Gaussian channel with feedback
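The band-limited AWGN channel has Shannon capacity C = B log2(1 + SNR); a one-function sketch with the classical telephone-channel example:

```python
import math

def awgn_capacity(bandwidth_hz, snr_linear):
    """Shannon capacity C = B * log2(1 + SNR) of a band-limited AWGN channel,
    in bits/s; snr_linear is the signal-to-noise power ratio (not dB)."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# A 3 kHz telephone-grade channel at 30 dB SNR (SNR = 1000):
print(awgn_capacity(3000, 1000))  # ~29.9 kbit/s
```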
The main objective of this course is to introduce information-theoretic concepts and to develop the fundamental bounds on source coding and channel capacity.
Students will also become familiar with different source encoding and channel encoding techniques.
Prerequisite Knowledge: Basics of communication systems, random signals, and random processes.
COs | CO Statements | Corresponding POs | Learning Domain and Taxonomy Levels | Delivery Methods and Activities | Assessment Tools |
---|---|---|---|---|---|
1 | Understand entropy, mutual information, differential entropy, entropy rate, source coding, channel coding, and channel capacity. | PO(a) | C1, C2 | Lectures, Tutorials, Homeworks | Assignment, Class test, Final exam |
2 | Employ source coding and channel coding theorems to solve various communication problems. | PO(a) | C1, C2, C3, C4 | Lectures, Tutorials, Homeworks | Assignment, Class test, Final exam |
Cognitive Domain Taxonomy Levels: C1 – Knowledge; C2 – Comprehension; C3 – Application; C4 – Analysis; C5 – Synthesis; C6 – Evaluation. Affective Domain Taxonomy Levels: A1 – Receive; A2 – Respond; A3 – Value (demonstrate); A4 – Organize; A5 – Characterize. Psychomotor Domain Taxonomy Levels: P1 – Perception; P2 – Set; P3 – Guided Response; P4 – Mechanism; P5 – Complex Overt Response; P6 – Adaptation; P7 – Organization.
Program Outcomes (PO): PO(a) Engineering Knowledge, PO(b) Problem Analysis, PO(c) Design/Development of Solutions, PO(d) Investigation,
PO(e) Modern Tool Usage, PO(f) The Engineer and Society, PO(g) Environment and Sustainability, PO(h) Ethics, PO(i) Individual and Team Work,
PO(j) Communication, PO(k) Project Management and Finance, PO(l) Life-long Learning
* For details of program outcome (PO) statements, please see the departmental website or course curriculum
K1 | K2 | K3 | K4 | K5 | K6 | K7 | K8 | CP1 | CP2 | CP3 | CP4 | CP5 | CP6 | CP7 | CA1 | CA2 | CA3 | CA4 | CA5 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
√ | √ | √ | √ | √ |
Lectures | Weeks | Topics (According to syllabus) | Mapping with COs |
---|---|---|---|
1-6 | 1-2 | Entropy and Mutual Information: Entropy, joint entropy, and conditional entropy; relative entropy and mutual information; chain rules for entropy, relative entropy, and mutual information; Jensen's inequality and the log-sum inequality | CO1 CO2 |
7-12 | 3-4 | Differential Entropy: Differential entropy and discrete entropy, joint and conditional differential entropy, properties of differential entropy, relative entropy and mutual information | CO1 CO2 |
13-18 | 5-6 | Entropy Rates of a Stochastic Process: Markov chains, entropy rate, and hidden Markov models | CO1 CO2 |
19-24 | 7-8 | Source Coding: Kraft inequality, optimal codes, Huffman code and its optimality, Shannon-Fano-Elias coding, arithmetic coding | CO1 CO2 |
25-30 | 9-10 | Channel Capacity: Binary symmetric channels and properties of channel capacity, channel coding theorems, joint source and channel coding theorem | CO1 CO2 |
31-36 | 11-12 | Channel Coding: Block coding and decoding; BCH and RS codes; convolutional coding and the Viterbi decoder; turbo codes and their decoding techniques; space-time (STBC), space-frequency (SFBC), and space-time-frequency (STFBC) block codes | CO1 CO2 |
37-42 | 13-14 | Gaussian Channel: Introduction to the Gaussian channel, band-limited channels, parallel Gaussian channels, and the Gaussian channel with feedback | CO1 CO2 |
Class participation and attendance will be recorded in every class.
Four tests (quiz, assignment, viva, and presentation) will be conducted, and the best three will be counted.
A comprehensive term final examination will be held at the end of the term, following the guidelines of the Academic Council.
Assessment Component | Weight |
---|---|
Class Participation | 10% |
Continuous Assessment | 20% |
Final Examination | 70% |
Total | 100% |
Elements of Information Theory by Thomas M. Cover and Joy A. Thomas
Other Resources (Online Resources or Others, if any)