

Information Theory & Coding (信息論與編碼, English Edition)

China Water & Power Press (中國水利水電出版社)
    Author: 梁建武 (Liang Jianwu) et al.
    ISBN: 978-7-5084-5569-3
    Executive editor: 徐雯 (Xu Wen)
    Intended readers: undergraduates
    Publication date: 2008-07-01
    Format: 16mo (16開)
    Binding: paperback (gloss laminated)
    Edition: 1st edition, July 2008
    Pages: 212
    Price: ¥24.00
    Series: 21世紀高等院校規劃教材 (21st Century Planning Textbooks for Colleges and Universities)
Book Details

      This book focuses on the fundamental theory of classical information theory and seeks to connect that theory with the coding theory used in engineering practice, introducing a number of practical applications. The book is organized into seven chapters, covering the basic theory of information measurement, lossless source coding, limited-distortion source coding, channel coding and its applications, and related topics.

      The book emphasizes basic concepts and explains them in plain, accessible language. Against the backdrop of today's rapidly developing information and communication systems, it uses a generous number of examples, figures, and tables to illustrate concepts and theory, while avoiding lengthy and difficult formula proofs. To deepen the reader's understanding, each chapter ends with a suitable set of exercises.

      The book can serve as a textbook or reference for bilingual instruction of electronic information students at colleges and universities, and as a reference for practitioners in communications, telecommunications, electronics, and related fields.

      Information theory is a mathematical theory of communication. It is the science of studying information measurement and coding by the methods of mathematical statistics. It can be divided into generalized information theory and narrow information theory; the latter is also called classical information theory or probabilistic information theory.

      This book is built on the foundation of probabilistic information measures and their application to coding theorems for information sources and noisy channels. It concentrates on elaborating the basic theories, concepts, and methods. Following this guideline, many examples are given throughout the book to clarify these concepts and theorems. An overly mathematical book risks scaring away beginners, so every effort is made to use simple mathematical tools in the arguments of information theory and to guide beginners toward mastering the subject.

      It can be used as teaching material for upper-level undergraduate or graduate students in electronics departments. The book consists of seven chapters, which serve both as lively introductions and as fairly detailed references to the fascinating world of information theory.

    Chapter 1 is the introduction, a general, historically oriented overview of the basic problems this book deals with, including the concept of information, how information is measured, and the formation and development of information theory.

    Chapter 2 deals with basic concepts of information measurement. After studying this chapter you will be able to determine, for example, how much information is gained from tossing a coin and reading the result. The chapter also introduces some theorems on continuous sources.
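
    As a taste of the Chapter 2 material, here is a minimal sketch (not taken from the book; the coin probabilities are assumed for illustration) of self-information and source entropy for a coin toss:

```python
import math

def self_information(p: float) -> float:
    """Self-information I(x) = -log2 p(x), in bits."""
    return -math.log2(p)

def entropy(probs: list[float]) -> float:
    """Source entropy H(X) = -sum p(x) log2 p(x), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: each outcome carries 1 bit, and the source entropy is 1 bit per toss.
print(self_information(0.5))      # 1.0
print(entropy([0.5, 0.5]))        # 1.0

# A biased coin (assumed p(heads) = 0.9): the rare outcome is more informative,
# but on average each toss tells us less than 1 bit.
print(self_information(0.1))      # about 3.32 bits for the rare "tails"
print(entropy([0.9, 0.1]))        # about 0.469 bits per toss
```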

    Chapter 3 covers discrete source information. It introduces source models and the calculation of source entropy, and explains why there is room for compression.
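
    A small sketch of that idea, with made-up symbol probabilities: the gap between a source's entropy and the log2 of its alphabet size is the redundancy that a compressor can remove.

```python
import math

def entropy(probs):
    """H(X) in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical four-symbol discrete memoryless source.
probs = [0.5, 0.25, 0.125, 0.125]

H = entropy(probs)                 # actual information content: 1.75 bits/symbol
H_max = math.log2(len(probs))      # a plain fixed-length code spends 2 bits/symbol
redundancy = 1 - H / H_max

print(f"H = {H:.3f} bits/symbol, H_max = {H_max:.3f} bits/symbol")
print(f"redundancy = {redundancy:.3%}")   # the fraction of bits a good code can save
```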

    Chapter 4 is about channels. A channel's ability to transfer information is measured by its channel capacity. The focus is on channel models and the capacity calculation of some typical channels, both discrete and continuous. After studying this part, readers will see how information transmission behaves in practice.
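
    For instance, the capacity of a binary symmetric channel with crossover probability p is C = 1 - H(p) bits per channel use. The sketch below (not from the book; the values of p are assumptions) evaluates it:

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) = -p log2 p - (1-p) log2 (1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

# Assumed crossover probabilities, for illustration only.
for p in (0.0, 0.01, 0.1, 0.5):
    print(f"p = {p:<4}  C = {bsc_capacity(p):.4f} bits per channel use")
# A noiseless channel (p = 0) carries 1 bit per use; at p = 0.5 the output is
# independent of the input and the capacity drops to 0.
```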

    Chapter 5 deals with lossless source coding theorems. Chapter 3 shows that a source can be compressed to better suit transmission; this chapter shows how to compress it while preserving all of the information the source carries.
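
    Below is a compact sketch of Huffman coding, one of the practical lossless methods covered in Section 5.4. The symbol probabilities are invented for illustration; for this dyadic source the average codeword length happens to equal the source entropy.

```python
import heapq

def huffman_code(probs: dict[str, float]) -> dict[str, str]:
    """Build a binary Huffman code by repeatedly merging the two least probable nodes."""
    # Each heap entry: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, code1 = heapq.heappop(heap)
        p2, _, code2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in code1.items()}
        merged.update({s: "1" + c for s, c in code2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

# Hypothetical source with four symbols.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
avg_len = sum(probs[s] * len(code[s]) for s in probs)
print(code)                      # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
print(f"average length = {avg_len} bits/symbol")   # 1.75, equal to the entropy here
```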

    Chapter 6 deals with limited-distortion source coding. The previous chapters all assume distortion-free transmission, but in the real world a certain amount of distortion cannot be perceived by the human senses, so less information needs to be transmitted or stored. This chapter presents the theory and practice behind this idea.
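
    Section 6.5.1 computes R(D) for the binary symmetric source under Hamming distortion, R(D) = H(p) - H(D) for 0 <= D <= min(p, 1-p). The sketch below (values chosen only for illustration) evaluates this formula:

```python
import math

def binary_entropy(p: float) -> float:
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion_binary(p: float, D: float) -> float:
    """R(D) for a binary source with P(1) = p under Hamming distortion."""
    if D >= min(p, 1 - p):        # beyond D_max, no information needs to be sent
        return 0.0
    return binary_entropy(p) - binary_entropy(D)

# Assumed source with P(1) = 0.5; allowing more distortion lowers the required rate.
for D in (0.0, 0.05, 0.1, 0.25):
    print(f"D = {D:<5} R(D) = {rate_distortion_binary(0.5, D):.4f} bits/symbol")
```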

    Chapter 7 introduces channel coding theory. Several well-known and useful coding methods are covered, such as CRC codes and Hamming codes.
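
    To give a flavor of the chapter, here is a sketch (not the book's own presentation; the systematic generator-matrix layout is an assumption) of the (7,4) Hamming code, which encodes 4 message bits into 7 and corrects any single bit error by syndrome decoding:

```python
import numpy as np

# Generator and parity-check matrices of a systematic (7,4) Hamming code.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(msg):
    """Map 4 message bits to a 7-bit codeword."""
    return (msg @ G) % 2

def correct(received):
    """Correct a single bit error using the syndrome."""
    syndrome = (H @ received) % 2
    if syndrome.any():
        # The syndrome equals the column of H at the error position.
        for i in range(7):
            if np.array_equal(H[:, i], syndrome):
                received = received.copy()
                received[i] ^= 1
                break
    return received

msg = np.array([1, 0, 1, 1])
codeword = encode(msg)
noisy = codeword.copy()
noisy[2] ^= 1                       # flip one bit to simulate channel noise
print(codeword, noisy, correct(noisy), sep="\n")   # the corrected word matches the codeword
```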

      The content of this book is arranged from simple to complex. To strengthen the reader's ability to analyze and solve problems, questions and exercises are provided at the end of each chapter.

      This book is the harvest of the hard work and wisdom of a group of people. The contributors are Liang Jianwu (Chapters 1-4), Guo Ying (Chapter 7), Luo Xiying (Hunan University of Science and Technology, Chapter 5), Liu Junjun (Chapter 6), Long Xiaomei (review), Tian Ye (proofreading), and Zhou Yuanyuan (typesetting). In addition, He Zhibin, Li Huawei, Fu Shifeng, Tan Hailong, and Wen Zheng all contributed greatly to the writing of this book. We have tried our best to free the book of errors, but unfortunately no foolproof error-control technique exists for that, so you are welcome to point out our mistakes and give us your opinions.

     

      The authors, Central South University

    Chapter 1  Introduction 1
    Contents 1
    Before we start, something must be known 1
    1.1  What is Information? 2
    1.2  What is Information Theory? 4
    1.2.1  Origin and Development of Information Theory 4
    1.2.2  Applications and achievements of Information Theory methods 6
    1.3  Formation and Development of Information Theory 7
    Questions and Exercises 8
    Biography of Claude Elwood Shannon 8
    Chapter 2  Basic Concepts of Information Theory 11
    Contents 11
    Preparation knowledge 11
    2.1  Self-information and conditional self-information 12
    2.1.1  Self-Information 12
    2.1.2  Conditional Self-Information 14
    2.2  Mutual information and conditional mutual information 14
    2.3  Source entropy 16
    2.3.1  Introduction of entropy 16
    2.3.2  Mathematical description of source entropy 17
    2.3.3  Conditional entropy 20
    2.3.4  Joint entropy (union entropy) 20
    2.3.5  Basic properties and theorems of source entropy 21
    2.4  Average mutual information 26
    2.4.1  Definition 26
    2.4.2  Physical significance of average mutual information 27
    2.4.3  Properties of average mutual information 28
    2.5  Continuous source 38
    2.5.1  Entropy of the continuous source (also called differential entropy) 39
    2.5.2  Mutual information of the continuous random variable 44
    Questions and Exercises 44
    Additional reading materials 46
    Chapter 3  Discrete Source Information 51
    Contents 51
    3.1  Mathematical model and classification of the source 51
    3.2  The discrete source without memory 54
    3.3  Multi-symbol discrete stationary source 60
    3.4  Source entropy of a discrete stationary source and limit entropy 67
    3.5  The source redundancy and the information difference 71
    3.6  Markov information source 71
    Exercise 77
    Chapter 4  Channel and Channel Capacity 79
    Contents 79
    4.1  The model and classification of the channel 79
    4.1.1  Channel Models 79
    4.1.2  Channel classifications 80
    4.2  Channel equivocation and average mutual information 82
    4.2.1  Channel equivocation 82
    4.2.2  Average mutual information 82
    4.2.3  Properties of the mutual information function 83
    4.2.4  Relationship between entropy, channel equivocation and mutual information 86
    4.3  The discrete channel without memory and its channel capacity 88
    4.4  Channel capacity 89
    4.4.1  Concept of channel capacity 89
    4.4.2  Discrete channel without memory and its channel capacity 91
    4.4.3  Continuous channel and its channel capacity 99
    Chapter 5  Lossless source coding 106
    Contents 106
    5.1  Lossless coder 106
    5.2  Lossless source coding 110
    5.2.1  Fixed-length coding theorem 110
    5.2.2  Variable-length source coding 113
    5.3  Lossless source coding theorems 115
    5.3.1  Classification of codes and main coding methods 115
    5.3.2  Kraft theorem 116
    5.3.3  Lossless variable-length source coding theorem (Shannon's first theorem) 116
    5.4  Pragmatic examples of lossless source coding 120
    5.4.1  Huffman coding 120
    5.4.2  Shannon coding and Fano coding 128
    5.5  The Lempel-Ziv algorithm 130
    5.6  Run-Length Encoding and the PCX format 132
    Questions and Exercises 134
    Chapter 6  Limited distortion source coding 137
    Contents 137
    6.1  The starting point of limited-distortion theory 138
    6.2  Distortion measurement 140
    6.2.1  Distortion function 140
    6.2.2  Average distortion 142
    6.3  Information rate distortion function 143
    6.4  Property of R(D) 145
    6.4.1  Minimum of D and R(D) 145
    6.4.2  Dmax and R(Dmax) 151
    6.4.3  R(D) is a convex function 154
    6.4.4  R(D) is a decreasing function 154
    6.4.5  R(D) is a continuous function of D 155
    6.5  Calculation of R(D) 156
    6.5.1  Calculation of R(D) for a binary symmetric source 156
    6.5.2  Calculation of R(D) for a Gaussian source 158
    6.6  Limited distortion source encoding theorem 159
    Additional material for this chapter 161
    Questions and exercises 168
    Chapter 7  Channel Coding Theory 170
    Contents 170
    7.1  Channel coding theorem for noisy channel 170
    7.2  Introduction: the generator and parity-check matrices 174
    7.3  Syndrome decoding on q-ary symmetric channels 177
    7.4  Hamming geometry and code performance 179
    7.5  Hamming codes 180
    7.6  Cyclic code 181
    7.7  Syndrome decoding on general q-ary channels 191
    Questions and exercises 194
    Bibliography 197