Suppose that when I play mini-golf, the number of strokes before I get the ball in the hole has a geometric distribution with unknown parameter \(p\). Recall that the parameter of the geometric distribution is the probability of success on each try, and the PMF is \[P(X=k) = p(1-p)^{k},\] where \(k\) is the number of failures before the first success. Suppose that I get a hole in one. What is the likelihood function for the parameter \(p\)?
What if it takes me 3 strokes to get the ball in the hole? Then what is the likelihood function for \(p\)? What value of \(p\) maximizes this likelihood function?
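As a quick numerical check on your answer (not a substitute for the calculus), the sketch below evaluates the geometric likelihood \(L(p) = p(1-p)^k\) on a grid and locates its maximizer for the 3-stroke case (\(k = 2\) failures). The function name and grid are our own choices.

```python
import numpy as np

def geom_likelihood(p, k):
    """Likelihood of p given k failures before the first success."""
    return p * (1 - p) ** k

# Hole in one: k = 0 failures, so L(p) = p, maximized at p = 1.
# Three strokes: k = 2 failures, so L(p) = p(1-p)^2.
ps = np.linspace(0.001, 0.999, 999)
L = geom_likelihood(ps, 2)
p_hat = ps[np.argmax(L)]  # grid-search maximizer of the likelihood
print(p_hat)
```

The grid maximizer should land near the value you find by setting \(L'(p) = 0\).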
Suppose that a computer has been programmed to randomly generate a positive real number between 0 and \(N\) (uniformly distributed), but I don't know what the value of \(N\) is. If I use the program once and get an output of 15.37, then what is the likelihood function for the parameter \(N\)? Hint: Your answer should be a function whose graph has a discontinuity at one point. You should be able to draw the graph and see the discontinuity.
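To visualize your answer, the sketch below evaluates a candidate likelihood for \(N\) on a grid (assuming the density of \(\operatorname{Unif}(0,N)\) is \(1/N\) on \([0,N]\) and 0 elsewhere); plotting it should reveal the jump the hint refers to.

```python
import numpy as np

x = 15.37  # the single observed output

def unif_likelihood(N, x):
    """Likelihood of the upper bound N given one Unif(0, N) observation x."""
    return np.where(N >= x, 1.0 / N, 0.0)

N_grid = np.linspace(1, 40, 1000)
L = unif_likelihood(N_grid, x)
# L is 0 for N < 15.37, then jumps up at N = 15.37 and decreases after.
```

Plotting `L` against `N_grid` (e.g., with matplotlib) shows the discontinuity directly.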
If \(\theta\) is a parameter for a statistical model, then an unbiased estimator for \(\theta\) is a function of the observations \(\hat{\theta} = \hat{\theta}(X_1, \ldots, X_n)\) such that the expected value of \(\hat{\theta}\) is \(\theta\).
Show that if \(X \sim \operatorname{Unif}(0,N)\) is a single observation from a uniform distribution, then the estimator \(\hat{N} = 2X\) is an unbiased estimator for the parameter \(N\). That is, show that \(E(2X) = N\).
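A simulation can corroborate the computation before you write it up. The sketch below draws many \(\operatorname{Unif}(0,N)\) samples (for an arbitrary choice of \(N\)) and checks that the sample mean of \(2X\) is close to \(N\).

```python
import numpy as np

rng = np.random.default_rng(0)
N = 7.0  # arbitrary true upper bound for the check
X = rng.uniform(0, N, size=1_000_000)

# By the law of large numbers, the sample mean of 2X approximates E(2X).
print(np.mean(2 * X))  # should be close to N = 7
```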
Suppose that \(X_1, X_2 \sim \operatorname{Unif}(0,N)\) are independent random variables. It is a fact that the expected value of the maximum of \(X_1\) and \(X_2\) is \(\frac{2}{3}N\) (i.e., \(E(\max(X_1,X_2)) = \frac{2}{3}N\)) and \(E(\min(X_1,X_2)) = \frac{1}{3}N\). Use this information to find two different unbiased estimators for \(N\), one that is a function of the minimum of \(X_1\) and \(X_2\), and the other that is a function of the maximum.
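Once you have your two estimators, a simulation like the one below can check them empirically. The scaling constants here are the natural candidates implied by the given facts \(E(\max(X_1,X_2)) = \frac{2}{3}N\) and \(E(\min(X_1,X_2)) = \frac{1}{3}N\); justifying that they give unbiased estimators is the exercise.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10.0  # arbitrary true upper bound for the check
X1 = rng.uniform(0, N, size=500_000)
X2 = rng.uniform(0, N, size=500_000)

# Scale max and min so each has expected value N (candidate estimators):
est_from_max = 1.5 * np.maximum(X1, X2)  # since E(max) = (2/3)N
est_from_min = 3.0 * np.minimum(X1, X2)  # since E(min) = (1/3)N

print(np.mean(est_from_max), np.mean(est_from_min))  # both near N = 10
```

Note that although both sample means are near \(N\), the max-based estimator has visibly smaller spread, a point worth pondering when comparing estimators.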
\[\verb|glm(spiders~grain.size,data=data,family="binomial")|\]