PROBABILITY AND RANDOM PROCESSES FOR ELECTRICAL AND COMPUTER ENGINEERS

JOHN A. GUBNER
University of Wisconsin-Madison

The theory of probability is a powerful tool that helps electrical and computer engineers explain, model, analyze, and design the technology they develop. The text begins at the advanced undergraduate level, assuming only a modest knowledge of probability, and progresses through more complex topics mastered at the graduate level. The first five chapters cover the basics of probability and both discrete and continuous random variables. The later chapters have a more specialized coverage, including random vectors, Gaussian random vectors, random processes, Markov chains, and convergence. Describing tools and results that are used extensively in the field, this is more than a textbook: it is also a reference for researchers working in communications, signal processing, and computer network traffic analysis. With over 300 worked examples, some 800 homework problems, and sections for exam preparation, this is an essential companion for advanced undergraduate and graduate students. Further resources for this title, including solutions, are available online at www.cambridge.org/9780521864701.

John A. Gubner has been on the Faculty of Electrical and Computer Engineering at the University of Wisconsin-Madison since receiving his Ph.D. in 1988 from the University of Maryland at College Park. His research interests include ultra-wideband communications; point processes and shot noise; subspace methods in statistical processing; and information theory. A member of the IEEE, he has authored or co-authored many papers in the IEEE Transactions, including those on Information Theory, Signal Processing, and Communications.

Cambridge University Press
The Edinburgh Building, Cambridge CB2 2RU, UK
Published in the United States of America by Cambridge University Press, New York
Cambridge, New York, Melbourne, Madrid, Cape Town, Singapore, São Paulo
Information on this title: www.cambridge.org/9780521864701
© Cambridge University Press 2006
First published in print format 2006
ISBN-13 978-0-521-86470-1 hardback
ISBN-10 0-521-86470-4 hardback
ISBN-13 978-0-511-22023-4 eBook (EBL)
ISBN-10 0-511-22023-5 eBook (EBL)
This publication is in copyright. Subject to statutory exception and to the provision of relevant collective licensing agreements, no reproduction of any part may take place without the written permission of Cambridge University Press.
Cambridge University Press has no responsibility for the persistence or accuracy of URLs for external or third-party internet websites referred to in this publication, and does not guarantee that any content on such websites is, or will remain, accurate or appropriate.
www.cambridge.org

To Sue and Joe

Contents

Chapter dependencies
Preface
1 Introduction to probability
  1.1 Sample spaces, outcomes, and events
  1.2 Review of set notation
  1.3 Probability models
  1.4 Axioms and properties of probability
  1.5 Conditional probability
  1.6 Independence
  1.7 Combinatorics and probability
  Notes; Problems; Exam preparation
2 Introduction to discrete random variables
  2.1 Probabilities involving random variables
  2.2 Discrete random variables
  2.3 Multiple random variables
  2.4 Expectation
  Notes; Problems; Exam preparation
3 More about discrete random variables
  3.1 Probability generating functions
  3.2 The binomial random variable
  3.3 The weak law of large numbers
  3.4 Conditional probability
  3.5 Conditional expectation
  Notes; Problems; Exam preparation
4 Continuous random variables
  4.1 Densities and probabilities
  4.2 Expectation of a single random variable
  4.3 Transform methods
  4.4 Expectation of multiple random variables
  4.5 Probability bounds
  Notes; Problems; Exam preparation
5 Cumulative distribution functions and their applications
  5.1 Continuous random variables
  5.2 Discrete random variables
  5.3 Mixed random variables
  5.4 Functions of random variables and their cdfs
  5.5 Properties of cdfs
  5.6 The central limit theorem
  5.7 Reliability
  Notes; Problems; Exam preparation
6 Statistics
  6.1 Parameter estimators and their properties
  6.2 Histograms
  6.3 Confidence intervals for the mean – known variance
  6.4 Confidence intervals for the mean – unknown variance
  6.5 Confidence intervals for Gaussian data
  6.6 Hypothesis tests for the mean
  6.7 Regression and curve fitting
  6.8 Monte Carlo estimation
  Notes; Problems; Exam preparation
7 Bivariate random variables
  7.1 Joint and marginal probabilities
  7.2 Jointly continuous random variables
  7.3 Conditional probability and expectation
  7.4 The bivariate normal
  7.5 Extension to three or more random variables
  Notes; Problems; Exam preparation
8 Introduction to random vectors
  8.1 Review of matrix operations
  8.2 Random vectors and random matrices
  8.3 Transformations of random vectors
  8.4 Linear estimation of random vectors (Wiener filters)
  8.5 Estimation of covariance matrices
  8.6 Nonlinear estimation of random vectors
  Notes; Problems; Exam preparation
9 Gaussian random vectors
  9.1 Introduction
  9.2 Definition of the multivariate Gaussian
  9.3 Characteristic function
  9.4 Density function
  9.5 Conditional expectation and conditional probability
  9.6 Complex random variables and vectors
  Notes; Problems; Exam preparation
10 Introduction to random processes
  10.1 Definition and examples
  10.2 Characterization of random processes
  10.3 Strict-sense and wide-sense stationary processes
  10.4 WSS processes through LTI systems
  10.5 Power spectral densities for WSS processes
  10.6 Characterization of correlation functions
  10.7 The matched filter
  10.8 The Wiener filter
  10.9 The Wiener-Khinchin theorem
  10.10 Mean-square ergodic theorem for WSS processes
  10.11 Power spectral densities for non-WSS processes
  Notes; Problems; Exam preparation
11 Advanced concepts in random processes
  11.1 The Poisson process
  11.2 Renewal processes
  11.3 The Wiener process
  11.4 Specification of random processes
  Notes; Problems; Exam preparation
12 Introduction to Markov chains
  12.1 Preliminary results
  12.2 Discrete-time Markov chains
  12.3 Recurrent and transient states
  12.4 Limiting n-step transition probabilities
  12.5 Continuous-time Markov chains
  Notes; Problems; Exam preparation
13 Mean convergence and applications
  13.1 Convergence in mean of order p
  13.2 Normed vector spaces of random variables
  13.3 The Karhunen-Loève expansion
  13.4 The Wiener integral (again)
  13.5 Projections, orthogonality principle, projection theorem
  13.6 Conditional expectation and probability
  13.7 The spectral representation
  Notes; Problems; Exam preparation
14 Other modes of convergence
  14.1 Convergence in probability
  14.2 Convergence in distribution
  14.3 Almost-sure convergence
  Notes; Problems; Exam preparation
15 Self similarity and long-range dependence
  15.1 Self similarity in continuous time
  15.2 Self similarity in discrete time
  15.3 Asymptotic second-order self similarity
  15.4 Long-range dependence
  15.5 ARMA processes
  15.6 ARIMA processes
  Problems; Exam preparation
Bibliography
Index
Chapter dependencies

(Flowchart whose nodes are Chapters 1-10 and 13-15, Section 11.1 (the Poisson process), Sections 11.2-11.4 (advanced concepts in random processes), Sections 12.1-12.4 (discrete-time Markov chains), and Section 12.5 (continuous-time Markov chains); the arrows indicate which chapters depend on which.)

Preface

Intended audience

This book is a primary text for graduate-level courses in probability and random processes that are typically offered in electrical and computer engineering departments. The text starts from first principles and contains more than enough material for a two-semester sequence. The level of the text varies from advanced undergraduate to graduate as the material progresses. The principal prerequisite is the usual undergraduate electrical and computer engineering course on signals and systems, e.g., Haykin and Van Veen [25] or Oppenheim and Willsky [39] (see the Bibliography at the end of the book). However, later chapters that deal with random vectors assume some familiarity with linear algebra; e.g., determinants and matrix inverses.

How to use the book

A first course. In a course that assumes at most a modest background in probability, the core of the offering would include Chapters 1-5 and 7. These cover the basics of probability and discrete and continuous random variables. As the chapter dependencies graph on the preceding page indicates, there is considerable flexibility in the selection and ordering of additional material as the instructor sees fit.

A second course. In a course that assumes a solid background in the basics of probability and discrete and continuous random variables, the material in Chapters 1-5 and 7 can be reviewed quickly. In such a review, the instructor may want to include sections and problems marked with a ⋆, as these indicate more challenging material that might not be appropriate in a first course. Following the review, the core of the offering would include Chapters 8, 9, 10 (Sections 10.1-10.6), and Chapter 11. Additional material from Chapters 12-15 can be included to meet course goals and objectives.

Level of course offerings. In any course offering, the level can be adapted to the background of the class by omitting or including the more advanced sections, remarks, and problems that are marked with a ⋆. In addition, discussions of a highly technical nature are placed in a Notes section at the end of the chapter in which they occur. Pointers to these discussions are indicated by boldface numerical superscripts in the text. These notes can be omitted or included as the instructor sees fit.

Chapter features

- Key equations are boxed: P(A|B) := P(A ∩ B)/P(B).
- Important text passages are highlighted: Two events A and B are said to be independent if P(A ∩ B) = P(A)P(B).
- Tables of discrete random variables and of Fourier transform pairs are found inside the front cover. A table of continuous random variables is found inside the back cover.
- The index was compiled as the book was written. Hence, there are many cross-references to related information. For example, see "chi-squared random variable."
- When cumulative distribution functions or other functions are encountered that do not have a closed form, MATLAB commands are given for computing them; see "Matlab commands" in the index for a list. The use of many commands is illustrated in the examples and the problems throughout most of the text. Although some commands require the MATLAB Statistics Toolbox, alternative methods are also suggested; e.g., the use of erf and erfinv for normcdf and norminv (a brief sketch of this substitution follows this list).
- Each chapter contains a Notes section. Throughout each chapter, numerical superscripts refer to discussions in the Notes section. These notes are usually rather technical and address subtleties of the theory.
- Each chapter contains a Problems section. There are more than 800 problems throughout the book. Problems are grouped according to the section they are based on, and this is clearly indicated. This enables the student to refer to the appropriate part of the text for background relating to particular problems, and it enables the instructor to make up assignments more quickly. In chapters intended for a first course, the more challenging problems are marked with a ⋆. Problems requiring MATLAB are indicated by the label MATLAB.
- Each chapter contains an Exam preparation section. This serves as a chapter summary, drawing attention to key concepts and formulas.
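The erf-based substitution mentioned in the list above can be spelled out directly. The following is a minimal sketch, not code from the book: it relies on the standard identities Phi(x) = (1 + erf(x/sqrt(2)))/2 and Phi^{-1}(p) = sqrt(2)*erfinv(2p - 1), and the variable names are purely illustrative.

    % Standard normal cdf and quantiles via erf/erfinv, avoiding the
    % Statistics Toolbox functions normcdf and norminv (illustrative sketch).
    x = [-1.96 0 1.96];
    Phi = 0.5*(1 + erf(x/sqrt(2)));               % standard normal cdf evaluated at x

    p = [0.025 0.5 0.975];
    z = sqrt(2)*erfinv(2*p - 1);                  % standard normal quantiles Phi^{-1}(p)

    % For a Gaussian with mean m and standard deviation sigma:
    m = 1; sigma = 2;
    F = 0.5*(1 + erf((x - m)/(sigma*sqrt(2))));   % cdf of N(m, sigma^2) evaluated at x
    q = m + sigma*sqrt(2)*erfinv(2*p - 1);        % quantiles of N(m, sigma^2)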
Acknowledgements

The writing of this book has been greatly improved by the suggestions of many people.

At the University of Wisconsin-Madison, the sharp eyes of the students in my classes on probability and random processes, my research students, and my postdocs have helped me fix countless typos and improve explanations of several topics. My colleagues here have been generous with their comments and suggestions. Professor Rajeev Agrawal, now with Motorola, convinced me to treat discrete random variables before continuous random variables. Discussions with Professor Bob Barmish on robustness of rational transfer functions led to Problems 38-40 in Chapter 5. I am especially grateful to Professors Jim Bucklew, Yu Hen Hu, and Akbar Sayeed, who taught from early, unpolished versions of the manuscript.

Colleagues at other universities and students in their classes have also been generous with their support. I thank Professors Toby Berger, Edwin Chong, and Dave Neuhoff, who have used recent manuscripts in teaching classes on probability and random processes and have provided me with detailed reviews. Special thanks go to Professor Tom Denney for his multiple careful reviews of each chapter.

Since writing is a solitary process, I am grateful to be surrounded by many supportive family members. I especially thank my wife and son for their endless patience and faith in me and this book, and I thank my parents for their encouragement and help when I was preoccupied with writing.

1 Introduction to probability

Why do electrical and computer engineers need to study probability?

Probability theory provides powerful tools to explain, model, analyze, and design technology developed by electrical and computer engineers. Here are a few applications.

Signal processing. My own interest in the subject arose when I was an undergraduate taking the required course in probability for electrical engineers. We considered the situation shown in Figure 1.1. To determine the presence of an aircraft, a known radar pulse v(t) is sent out.

Figure 1.1. Block diagram of radar detection system.

If there are no objects in range of the radar, the radar's amplifiers produce only a noise waveform, denoted by Xt. If there is an object in range, the reflected radar pulse plus noise is produced. The overall goal is to decide whether the received waveform is noise only or signal plus noise. To get an idea of how difficult this can be, consider the signal plus noise waveform shown at the top in Figure 1.2. Our class addressed the subproblem of designing an optimal linear system to process the received waveform so as to make the presence of the signal more obvious. We learned that the optimal transfer function is given by the matched filter. If the signal at the top in Figure 1.2 is processed by the appropriate matched filter, we get the output shown at the bottom in Figure 1.2. You will study the matched filter in Chapter 10.
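A small simulation conveys the idea: for a known pulse in white noise, the matched filter's impulse response is a time-reversed copy of the pulse. The sketch below is illustrative only; the pulse shape, noise level, and sample counts are assumptions, and it is not meant to reproduce Figure 1.2 or the development in Chapter 10.

    % Illustrative matched-filter sketch (assumed pulse and noise parameters).
    N = 200;
    v = zeros(1, N);
    v(81:100) = sin(pi*(1:20)/20);    % assumed known pulse: a short half-sine burst
    X = randn(1, N);                  % white Gaussian noise waveform
    r = v + X;                        % received waveform: signal plus noise

    h = fliplr(v);                    % matched filter: time-reversed copy of the pulse
    y = conv(r, h, 'same');           % filter output; it peaks near the pulse location

    [~, khat] = max(y);               % estimate of where the pulse sits in the record
    % plot(r); figure; plot(y)        % compare the received waveform with the output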
Computer memories. Suppose you are designing a computer memory to hold k-bit words. To increase system reliability, you employ an error-correcting-code system. With this system, instead of storing just the k data bits, you store an additional l bits (which are functions of the data bits). When reading back the (k + l)-bit word, if
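To make the idea of redundant stored bits concrete, here is a minimal sketch using a 3x repetition code, in which each data bit is stored three times and recovered by majority vote on readback; this particular code is only an illustration and is not necessarily the scheme developed in the text.

    % Illustrative only: 3x repetition code for a k-bit memory word.
    % Each data bit is stored three times (so l = 2*k extra bits); a single
    % corrupted copy of any bit is corrected by majority vote on readback.
    k = 8;
    data = randi([0 1], 1, k);          % the k data bits
    stored = repmat(data, 3, 1);        % the (k + l) stored bits, as three copies
    stored(2, 5) = 1 - stored(2, 5);    % simulate a single bit error in the memory
    readback = sum(stored, 1) >= 2;     % majority vote over the three copies
    recovered_ok = isequal(double(readback), data);   % true: data bits recovered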
