Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from the uniform distribution on the interval \((0, \theta)\). Show that \(\left(\prod_{i=1}^{n} X_{i}\right)^{1 / n}\) is a consistent estimator of \(\theta e^{-1}\).

Short Answer

Yes, \( \left(\prod_{i=1}^{n} X_{i}\right)^{1 / n}\) is a consistent estimator of \( \theta e^{-1}\), because for any \(\epsilon > 0\), \(\lim_{n \to \infty} P(|T_{n}-\theta e^{-1}| > \epsilon)=0 \). This convergence in probability to \(\theta e^{-1}\) is the defining property of a consistent estimator.

Step by step solution


Step 1: Define the Geometric Mean

First, define the geometric mean \( T_n \) of a sample \((X_1, X_2, \ldots, X_n)\) as follows: \(T_{n}=\left(\prod_{i=1}^{n} X_{i}\right)^{1 / n} \)

Step 2: Determine the Expected Value and Variance

In order to prove that \(T_n\) is a consistent estimator, its expected value and variance must be determined. Because the \(X_i\) are independent, \(E[T_n]=\left(E\left[X_1^{1/n}\right]\right)^{n}\). For \(X_1 \sim U(0,\theta)\), \(E\left[X_1^{1/n}\right]=\frac{1}{\theta}\int_0^\theta x^{1/n}\,dx=\frac{\theta^{1/n}}{1+1/n}\), so \(E[T_n]=\theta\left(1+\frac{1}{n}\right)^{-n} \to \theta e^{-1}\) as \(n \to \infty\); that is, \(T_n\) is asymptotically unbiased for \(\theta e^{-1}\). Similarly, \(E[T_n^2]=\theta^2\left(1+\frac{2}{n}\right)^{-n}\), so \(\operatorname{Var}[T_n]=E[T_n^2]-\left(E[T_n]\right)^2=\theta^{2}\left[\left(1+\tfrac{2}{n}\right)^{-n}-\left(1+\tfrac{1}{n}\right)^{-2n}\right] \to \theta^2 e^{-2}-\theta^2 e^{-2}=0\). An asymptotically unbiased estimator whose variance tends to zero converges in probability to its limiting mean.
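The closed-form moments above can be checked numerically. The following is a minimal Monte Carlo sketch, not part of the original solution; NumPy is assumed, and \(\theta = 2\), \(n = 50\), and the replication count are illustrative choices. It compares the simulated mean and variance of \(T_n\) against \(\theta(1+1/n)^{-n}\) and \(\theta^2\left[(1+2/n)^{-n}-(1+1/n)^{-2n}\right]\).

```python
import numpy as np

# Monte Carlo check of the moments of T_n = (prod X_i)^(1/n) for
# X_i ~ Uniform(0, theta). theta, n, and reps are illustrative choices.
rng = np.random.default_rng(1)
theta, n, reps = 2.0, 50, 100_000

x = rng.uniform(0.0, theta, size=(reps, n))
t_n = np.exp(np.log(x).mean(axis=1))      # geometric mean computed via logs

exact_mean = theta * (1 + 1 / n) ** (-n)  # E[T_n] from the derivation above
exact_var = theta**2 * ((1 + 2 / n) ** (-n) - (1 + 1 / n) ** (-2 * n))

print(t_n.mean(), exact_mean, theta / np.e)
print(t_n.var(), exact_var)
```

Computing the geometric mean through logarithms avoids underflow of the raw product \(\prod X_i\) when \(n\) is large.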

Step 3: Show Consistency

To show consistency of \(T_n\), it must be shown that \(T_n\) converges in probability to \(\theta e^{-1}\) as \(n\) tends to infinity. By definition, an estimator \(T_n\) is consistent for \(\theta e^{-1}\) if, for any \(\epsilon > 0\), \(\lim_{n \to \infty} P(|T_{n}-\theta e^{-1}| > \epsilon)=0 \). By Chebyshev's inequality, \(P\left(\left|T_{n}-E[T_n]\right|>\epsilon\right) \le \operatorname{Var}[T_n]/\epsilon^{2}\); since \(\operatorname{Var}[T_n] \to 0\) and \(E[T_n] \to \theta e^{-1}\) (Step 2), this condition holds. Equivalently, writing \(\log T_{n}=\frac{1}{n} \sum_{i=1}^{n} \log X_{i}\) and noting that \(E[\log X_{1}]=\frac{1}{\theta}\int_0^\theta \log x \, dx=\log \theta-1\), the weak law of large numbers gives \(\log T_{n} \to \log \theta-1\) in probability, and hence \(T_{n} \to \theta e^{-1}\). Either argument confirms that \(T_n\) is indeed a consistent estimator of \(\theta e^{-1}\).
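The limit in the definition can also be illustrated by simulation. Below is a sketch, not from the original solution; NumPy is assumed, and \(\theta\), \(\epsilon\), and the sample sizes are chosen purely for illustration. It estimates \(P(|T_n - \theta e^{-1}| > \epsilon)\) for increasing \(n\); consistency predicts the estimated probability shrinks toward zero.

```python
import numpy as np

# Estimate P(|T_n - theta/e| > eps) for growing n; consistency predicts
# this probability tends to 0. All constants are illustrative choices.
rng = np.random.default_rng(0)
theta, eps, reps = 2.0, 0.05, 5_000
target = theta / np.e

probs = []
for n in (10, 100, 1000):
    x = rng.uniform(0.0, theta, size=(reps, n))
    t_n = np.exp(np.log(x).mean(axis=1))  # geometric mean via logs
    probs.append(float(np.mean(np.abs(t_n - target) > eps)))

print(probs)  # decreasing in n
```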

Most popular questions from this chapter

Chapter 4

Prove that the method of moments estimator is consistent for the estimation of \(r>0\) in the gamma family $$ f(x ; r)=\begin{cases} \dfrac{e^{-x} x^{r-1}}{\Gamma(r)} & 0<x<\infty \\ 0 & \text{elsewhere.} \end{cases} $$

Chapter 4

Consider repeated observations on an \(m\)-dimensional random variable with mean \(E\left(X_{i}\right)=\mu\) and \(\operatorname{Var}\left(X_{i}\right)=\sigma^{2}\) for \(i=1,2, \ldots, m\), and \(\operatorname{Cov}\left(X_{i}, X_{j}\right)=\rho \sigma^{2}\) for \(i \neq j\). Let the \(i\)th observation be \(\left(x_{1 i}, \ldots, x_{m i}\right)\), \(i=1,2, \ldots, n\). Define $$ \begin{array}{c} \bar{X}_{i}=\frac{1}{m} \sum_{j=1}^{m} X_{j i}, \\ W_{i}=\sum_{j=1}^{m}\left(X_{j i}-\bar{X}_{i}\right)^{2}, \\ B=m \sum_{i=1}^{n}\left(\bar{X}_{i}-\bar{X}\right)^{2}, \\ W=W_{1}+\cdots+W_{n}, \end{array} $$ where \(B\) is the sum of squares between samples and \(W\) is the sum of squares within samples. 1. Prove that (i) \(W \sim(1-\rho) \sigma^{2} \chi^{2}(m n-n)\) and (ii) \(B \sim(1+(m-1) \rho) \sigma^{2} \chi^{2}(n-1)\). 2. Suppose \(\frac{(1-\rho) B}{(1+(m-1) \rho) W} \sim F_{(n-1),(m n-n)}\). Prove that when \(\rho=0\), \(\frac{W}{W+B}\) follows a beta distribution with parameters \(\frac{m n-n}{2}\) and \(\frac{n-1}{2}\).

Chapter 4

Suppose that the random sample arises from a distribution with PDF $$ f(x ; \theta)=\begin{cases} \theta x^{\theta-1} & 0<x<1 \\ 0 & \text{elsewhere.} \end{cases} $$

Chapter 4

If \(X_{1}, X_{2}\) and \(X_{3}\) are three independent random variables having the Poisson distribution with parameter \(\lambda\), show that $$ \hat{\lambda}_{1}=\frac{X_{1}+2 X_{2}+3 X_{3}}{6} $$ is an unbiased estimator of \(\lambda\). Also, compare the efficiency of \(\hat{\lambda}_{1}\) with that of the alternative estimator $$ \hat{\lambda}_{2}=\frac{X_{1}+X_{2}+X_{3}}{3}. $$

Chapter 4

Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from a normally distributed population with mean \(\mu\) and variance \(\sigma^{2}\). Let \(s^{2}\) be the sample variance. Prove that \(\frac{(n-1) s^{2}}{\sigma^{2}}\) has a \(\chi^{2}\) distribution with \(n-1\) degrees of freedom.
