CHAPTER EIGHT

Continuous Distributions and the Poisson Process

This chapter introduces the general concept of continuous random variables, focusing on two examples of continuous distributions: the uniform distribution and the exponential distribution. We then proceed to study the Poisson process, a continuous time counting process that is related to both the uniform and exponential distributions. We conclude this chapter with basic applications of the Poisson process in queueing theory.

8.1. Continuous Random Variables

8.1.1. Probability Distributions in R

The continuous roulette wheel in Figure 8.1 has circumference 1. We spin the wheel, and when it stops, the outcome is the clockwise distance X (computed with infinite precision) from the "0" mark to the arrow. The sample space Ω of this experiment consists of all real numbers in the range [0, 1). Assume that any point on the circumference of the disk is equally likely to face the arrow when the disk stops. What is the probability p of a given outcome x?

To answer this question, we recall that in Chapter 1 we defined a probability function to be any function that satisfies the following three requirements:

1. Pr(Ω) = 1;
2. for any event E, 0 ≤ Pr(E) ≤ 1;
3. for any (finite or enumerable) collection B of disjoint events,

   Pr(⋃_{E∈B} E) = Σ_{E∈B} Pr(E).

[Figure 8.1: A continuous roulette wheel.]

Let S(k) be a set of k distinct points in the range [0, 1), and let p be the probability that any given point in [0, 1) is the outcome of the roulette experiment. Since the probability of any event is bounded by 1,

   Pr(x ∈ S(k)) = kp ≤ 1.

We can choose any number k of distinct points in the range [0, 1), so we must have kp ≤ 1 for any integer k, which implies that p = 0. Thus, we observe that in an infinite sample space there may be possible events that have probability 0.
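The argument above can be checked empirically. The sketch below (an illustration of my own, not from the text) spins the wheel many times: a fixed single point is essentially never the exact outcome, while an interval is hit with frequency close to its length.

```python
import random

random.seed(1)
n = 1_000_000

# Spin the continuous roulette wheel n times: each outcome is a
# uniform point in [0, 1), here computed with double precision.
spins = [random.random() for _ in range(n)]

# Any fixed single point has probability 0: an exact hit
# essentially never occurs among the n spins.
exact_hits = sum(1 for x in spins if x == 0.5)

# Intervals, by contrast, get probability equal to their length.
in_interval = sum(1 for x in spins if 0.2 <= x < 0.5)

print(exact_hits)                 # almost surely 0
print(round(in_interval / n, 2))  # close to 0.3, the interval length
```

Of course, double-precision floats only approximate the "infinite precision" of the experiment, so the exact-hit count is not literally impossible, merely astronomically unlikely.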
Taking the complement of such an event, we observe that in an infinite sample space there can be events with probability 1 that do not correspond to all possible experimental outcomes, and thus there can be events with probability 1 that are, in some sense, not certain!

If the probability of each possible outcome of our experiment is 0, how do we define the probability of larger events with nonzero probability? For probability distributions over R, probabilities are assigned to intervals rather than to individual values.¹ The probability distribution of a random variable X is given by its distribution function F(x), where for any x ∈ R we define

   F(x) = Pr(X ≤ x).

We say that a random variable X is continuous if its distribution function F(x) is a continuous function of x. We will assume that our random variables are continuous throughout this chapter. In this case, we must have that Pr(X = x) = 0 for any specific value x. This further implies that Pr(X ≤ x) = Pr(X < x).

¹ A formal treatment of nondenumerably infinite probability spaces relies on measure theory and is beyond the scope of this book. We just note here that the probability function needs to be measurable on the set of events. This cannot hold in general for the family of all subsets of the sample space, but it does always hold for the Borel set of intervals.

Because

   Pr(x < X ≤ y) = Pr(X ≤ y) − Pr(X ≤ x) = F(y) − F(x),

the distribution function determines the probability of any interval. When F(x) is differentiable, its derivative f(x) = dF(x)/dx is called the density function of X, and Pr(x < X ≤ y) is the integral of f over the interval (x, y]. Similarly, the joint distribution function of two random variables X and Y is F(x, y) = Pr((X ≤ x) ∩ (Y ≤ y)), with marginal distribution functions F_X(x) = Pr(X ≤ x) and F_Y(y) = Pr(Y ≤ y).

Definition 8.2: The random variables X and Y are independent if, for all x and y,

   Pr((X ≤ x) ∩ (Y ≤ y)) = Pr(X ≤ x) Pr(Y ≤ y).

From the definition, two random variables are independent if and only if their joint distribution function is the product of their marginal distribution functions:

   F(x, y) = F_X(x) F_Y(y).

It follows from taking the derivatives with respect to x and y that, if X and Y are independent, then

   f(x, y) = f_X(x) f_Y(y),

and this condition is sufficient as well.
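Definition 8.2 can be illustrated by simulation. The sketch below (my own example, not from the text; the rate parameters and test point are arbitrary choices) draws a hypothetical pair of independent exponential variables and compares the empirical joint probability with the product of the empirical marginals:

```python
import random

random.seed(11)
n = 200_000

# A hypothetical pair of independent random variables: X and Y are
# produced by separate, unrelated draws from the generator.
xs = [random.expovariate(1.0) for _ in range(n)]
ys = [random.expovariate(2.0) for _ in range(n)]

x0, y0 = 1.0, 0.5
# Empirical joint probability Pr((X <= x0) and (Y <= y0)) ...
joint = sum(1 for x, y in zip(xs, ys) if x <= x0 and y <= y0) / n
# ... versus the product of the empirical marginals, per Definition 8.2.
fx = sum(1 for x in xs if x <= x0) / n
fy = sum(1 for y in ys if y <= y0) / n

print(round(joint, 2), round(fx * fy, 2))  # the two agree closely
```

For dependent variables (say, Y = X) the two quantities would differ; here they agree up to sampling error, as the definition requires.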
As an example, let a and b be positive constants, and consider the joint distribution function for two random variables X and Y given by

   F(x, y) = 1 − e^{-ax} − e^{-by} + e^{-(ax+by)}

over the range x, y ≥ 0. We can compute that

   F_X(x) = F(x, ∞) = 1 − e^{-ax},

and similarly F_Y(y) = 1 − e^{-by}. Alternatively, we could compute the joint density function

   f(x, y) = ∂²F(x, y)/∂x∂y = ab e^{-(ax+by)},

from which it follows that

   F_X(x) = ∫_{y=0}^{∞} ∫_{z=0}^{x} ab e^{-(az+by)} dz dy = 1 − e^{-ax}.

We obtain

   F(x, y) = (1 − e^{-ax})(1 − e^{-by}) = F_X(x) F_Y(y),

so X and Y are independent. Alternatively, working with the density functions we verify their independence by

   f_X(x) = a e^{-ax},   f_Y(y) = b e^{-by},   f(x, y) = f_X(x) f_Y(y).

Conditional probability for continuous random variables introduces a nontrivial subtlety. The natural definition,

   Pr(E | F) = Pr(E ∩ F) / Pr(F),

is suitable when Pr(F) ≠ 0. For example, a probability such as Pr(X ≤ 3 | Y ≤ y) can be computed this way whenever Pr(Y ≤ y) > 0.

8.2. The Uniform Distribution

A random variable X has the uniform distribution over an interval [a, b] if it takes on values only in that interval and the probability that it falls in any subinterval is proportional to the subinterval's length. Since individual points have probability 0, it does not matter whether the endpoints are included: whether we take [a, b], (a, b], or (a, b), the distributions are essentially the same. The probability distribution function of such an X is

   F(x) = 0                 if x ≤ a,
          (x − a)/(b − a)   if a ≤ x ≤ b,
          1                 if x ≥ b,

and its density function is

   f(x) = 1/(b − a)   if a ≤ x ≤ b,
          0           otherwise.

These are shown in Figure 8.2.
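The uniform distribution function can be checked against simulation. A minimal sketch (the endpoints a = 2, b = 5 and the test point are my own choices, not from the text):

```python
import random

random.seed(3)
a, b = 2.0, 5.0
n = 1_000_000

def F(x):
    # Distribution function of a uniform random variable on [a, b].
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

# random.uniform(a, b) draws a point uniformly from [a, b].
samples = [random.uniform(a, b) for _ in range(n)]

x = 3.5
# Empirical estimate of Pr(X <= x) versus the closed form F(x).
empirical = sum(1 for s in samples if s <= x) / n
print(round(empirical, 2), F(x))  # both near (3.5 - 2) / 3 = 0.5
```

The empirical frequency matches (x − a)/(b − a) up to sampling error, as the distribution function predicts.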
