Chapter 11: Problem 34
Suppose in Example \(11.19\) that no new customers are allowed in the system after time \(t_{0} .\) Give an efficient simulation estimator of the expected additional time after \(t_{0}\) until the system becomes empty.
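Example 11.19 is not reproduced in this excerpt, so any concrete answer requires an assumption about the model. As an illustration only, suppose it describes a single-server queue with Poisson(\(\lambda\)) arrivals and exponential(\(\mu\)) services. With no arrivals after \(t_0\) and memoryless services, the additional time to empty, given \(n\) customers present at \(t_0\), is a sum of \(n\) independent Exp(\(\mu\)) service times, with conditional mean \(n/\mu\). Returning this conditional expectation instead of simulating the drain phase is the conditioning (variance-reduction) idea such an exercise typically points toward. A minimal sketch under these assumptions:

```python
import random

def time_to_empty_estimator(lam=1.0, mu=1.5, t0=10.0):
    """One replication: simulate an assumed M/M/1 queue up to time t0,
    then return the conditional expectation of the additional time until
    the system empties, given the state at t0.

    With exponential(mu) services and no arrivals after t0, the residual
    time to empty is a sum of n(t0) i.i.d. Exp(mu) service times, whose
    conditional mean is n(t0)/mu.  Reporting this conditional mean rather
    than simulating the drain removes that phase's variance entirely.
    """
    t, n = 0.0, 0
    while True:
        # competing exponentials: next arrival vs. next departure
        rate = lam + (mu if n > 0 else 0.0)
        t += random.expovariate(rate)
        if t > t0:
            break
        if random.random() < lam / rate:
            n += 1   # arrival
        else:
            n -= 1   # departure
    return n / mu    # conditioning estimator of E[additional time to empty]
```

Averaging this over many independent replications estimates the desired expectation; the model (M/M/1) and the parameter names are assumptions, not taken from Example 11.19 itself.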
For a nonhomogeneous Poisson process with intensity function \(\lambda(t), t \geqslant 0\), where \(\int_{0}^{\infty} \lambda(t) d t=\infty\), let \(X_{1}, X_{2}, \ldots\) denote the sequence of times at which events occur. (a) Show that \(\int_{0}^{X_{1}} \lambda(t) d t\) is exponential with rate 1. (b) Show that \(\int_{X_{i-1}}^{X_{i}} \lambda(t) d t, i \geqslant 1\), are independent exponentials with rate 1, where \(X_{0}=0\). In words: independent of the past, the additional amount of hazard that must be experienced until an event occurs is exponential with rate 1.
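Part (a) can be checked empirically: generate the first event time \(X_1\) by thinning, then verify that the integrated intensity \(m(X_1)=\int_0^{X_1}\lambda(t)\,dt\) behaves like an Exp(1) variable. The particular intensity \(\lambda(t)=1+\cos t\) (bounded by 2, with \(m(t)=t+\sin t\)) is chosen purely for illustration:

```python
import math
import random

def first_event_time(lam, lam_max):
    """First event time of a nonhomogeneous Poisson process with intensity
    lam(t), generated by thinning a homogeneous rate-lam_max process:
    accept a candidate point t with probability lam(t)/lam_max."""
    t = 0.0
    while True:
        t += random.expovariate(lam_max)
        if random.random() <= lam(t) / lam_max:
            return t

# Illustrative intensity (an assumption, not from the exercise):
# lam(t) = 1 + cos(t) <= 2, with integrated intensity m(t) = t + sin(t).
lam = lambda t: 1.0 + math.cos(t)
m = lambda t: t + math.sin(t)

random.seed(1)
hazards = [m(first_event_time(lam, 2.0)) for _ in range(20000)]
mean_hazard = sum(hazards) / len(hazards)  # Exp(1) would give mean ~ 1
```

The sample mean of the integrated hazards should settle near 1, consistent with part (a).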
Consider the following procedure for randomly choosing a subset of size \(k\) from the numbers \(1,2, \ldots, n\): Fix \(p\) and generate the first \(n\) time units of a renewal process whose interarrival distribution is geometric with mean \(1 / p\); that is, \(P\{\text {interarrival time}=k\}=p(1-p)^{k-1}, k=1,2, \ldots\). Suppose events occur at times \(i_{1}\)
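The statement above is cut off in this excerpt, but the renewal mechanism it describes can still be sketched. Because geometric interarrival times make the renewal process a Bernoulli process, each instant \(1,\ldots,n\) is an event independently with probability \(p\), so the recorded event times form a uniformly random subset of \(\{1,\ldots,n\}\) of binomial\((n,p)\) size (a sketch, with the function name chosen here for illustration):

```python
import random

def geometric_renewal_events(n, p):
    """Event times, within the first n time units, of a renewal process
    with geometric(p) interarrival times.  Geometric interarrivals make
    the process Bernoulli: each time i in 1..n is an event independently
    with probability p, so the result is a uniformly random subset of
    {1, ..., n} whose size is binomial(n, p)."""
    return [i for i in range(1, n + 1) if random.random() < p]
```

The (truncated) exercise presumably turns these event times into a subset of exactly size \(k\); that step is not reproduced here.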
The Hit-Miss Method: Suppose \(g\) is bounded on \([0,1]\); for instance, suppose \(0 \leqslant g(x) \leqslant b\) for \(x \in[0,1]\). Let \(U_{1}, U_{2}\) be independent random numbers and set \(X=U_{1}, Y=b U_{2}\), so the point \((X, Y)\) is uniformly distributed in a rectangle of length 1 and height \(b\). Now set
$$
I=\left\{\begin{array}{ll}
1, & \text { if } Y<g(X) \\
0, & \text { otherwise }
\end{array}\right.
$$
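Since \(P\{Y<g(X)\}=\frac{1}{b}\int_0^1 g(x)\,dx\), the quantity \(bI\) is an unbiased estimator of the integral. A minimal sketch of the method as described above:

```python
import random

def hit_miss(g, b, n):
    """Hit-miss estimate of the integral of g over [0, 1], where
    0 <= g <= b: scatter n uniform points (X, Y) = (U1, b*U2) in the
    1-by-b rectangle and return b times the fraction landing below the
    curve y = g(x)."""
    hits = sum(b * random.random() < g(random.random()) for _ in range(n))
    return b * hits / n
```

For example, with \(g(x)=x^2\) and \(b=1\) the estimate converges to \(1/3\) as \(n\) grows.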
Let \(X_{1}, \ldots, X_{k}\) be independent with $$ P\left\{X_{i}=j\right\}=\frac{1}{n}, \quad j=1, \ldots, n, \quad i=1, \ldots, k $$ If \(D\) is the number of distinct values among \(X_{1}, \ldots, X_{k}\), show that $$ \begin{aligned} E[D] &=n\left[1-\left(\frac{n-1}{n}\right)^{k}\right] \\ & \approx k-\frac{k^{2}}{2 n} \quad \text { when } \frac{k^{2}}{n} \text { is small } \end{aligned} $$
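The exact formula follows by writing \(D\) as a sum of indicators: value \(j\) is missed by all \(k\) draws with probability \(((n-1)/n)^k\), so it appears with the complementary probability, and summing over the \(n\) values gives \(E[D]\). Both the formula and its approximation can be checked numerically (parameters below are illustrative):

```python
import random

def expected_distinct_exact(n, k):
    """E[D] = n * (1 - ((n-1)/n)**k): each of the n values appears among
    the k draws with probability 1 - ((n-1)/n)**k; sum the indicators."""
    return n * (1 - ((n - 1) / n) ** k)

def expected_distinct_approx(n, k):
    # first-order expansion, valid when k**2 / n is small
    return k - k * k / (2 * n)

# simulation check of the exact formula (illustrative parameters)
random.seed(0)
n, k, reps = 1000, 30, 5000
sim = sum(len({random.randint(1, n) for _ in range(k)})
          for _ in range(reps)) / reps
```

With \(n=1000, k=30\) (so \(k^2/n = 0.9\)), the exact value, the approximation, and the simulated mean all agree to within a few hundredths.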
Let \(X_{1}, \ldots, X_{n}\) be independent random variables with \(E\left[X_{i}\right]=\theta, \operatorname{Var}\left(X_{i}\right)=\sigma_{i}^{2}\), \(i=1, \ldots, n\), and consider estimators of \(\theta\) of the form \(\sum_{i=1}^{n} \lambda_{i} X_{i}\) where \(\sum_{i=1}^{n} \lambda_{i}=1\). Show that \(\operatorname{Var}\left(\sum_{i=1}^{n} \lambda_{i} X_{i}\right)\) is minimized when $$\lambda_{i}=\left(1 / \sigma_{i}^{2}\right) /\left(\sum_{j=1}^{n} 1 / \sigma_{j}^{2}\right), \quad i=1, \ldots, n$$ Possible Hint: If you cannot do this for general \(n\), try it first when \(n=2\).

The following two problems are concerned with the estimation of \(\int_{0}^{1} g(x) d x=E[g(U)]\), where \(U\) is uniform \((0,1)\).
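By independence, \(\operatorname{Var}\left(\sum_i \lambda_i X_i\right)=\sum_i \lambda_i^2\sigma_i^2\), and at the inverse-variance weights this minimum equals \(1/\sum_j (1/\sigma_j^2)\). A short numerical check (the \(\sigma_i\) values are illustrative):

```python
def weighted_var(lams, sigmas):
    """Variance of sum(lam_i * X_i) for independent X_i with
    Var(X_i) = sigma_i**2."""
    return sum(l * l * s * s for l, s in zip(lams, sigmas))

sigmas = [1.0, 2.0, 4.0]                    # illustrative std. deviations
inv = [1.0 / s ** 2 for s in sigmas]
opt = [w / sum(inv) for w in inv]           # inverse-variance weights
eq = [1.0 / len(sigmas)] * len(sigmas)      # equal weights, for comparison

v_opt = weighted_var(opt, sigmas)           # should equal 1 / sum(inv)
v_eq = weighted_var(eq, sigmas)             # strictly larger
```

Any other weights summing to 1, such as the equal weights above or a small perturbation of the optimum, yield a strictly larger variance, consistent with the claim.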