I'm working on a probability interview question that I found on a website, and I'm having some trouble with it.
A point is uniformly distributed on a disk with radius 1.
That is, the density is f(x,y) = C, where x^2 + y^2 <= 1.
What is the probability that the distance from the origin is less than x, where 0 <= x <= 1?
Reading the question, I'm confused: is the x in "distance less than x" the same as the x-coordinate of the point (x,y), or is it a separate fixed threshold? It reads as if, for any given point (x,y) on the disk, I have to find the probability that its distance from the origin is less than its own x-coordinate.
Any help is appreciated. I'm mainly looking for the thought process to arrive at the answer.
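One way I've tried to sanity-check my thinking is a quick Monte Carlo simulation (a rough sketch, assuming the threshold x is a fixed number r rather than the point's own coordinate, and that the disk is centered at the origin). The area-ratio heuristic suggests the probability should come out near r^2:

```python
import random

def sample_disk():
    # Rejection sampling: draw from the unit square until the point
    # lands inside the unit disk; the result is uniform on the disk.
    while True:
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        if x * x + y * y <= 1:
            return x, y

def estimate_cdf(r, n=200_000):
    # Fraction of uniform disk samples whose distance from the origin is < r
    hits = sum(1 for _ in range(n)
               if sum(c * c for c in sample_disk()) < r * r)
    return hits / n

print(estimate_cdf(0.5))  # area ratio predicts about 0.25
```

If the interpretation above is right, the simulated value tracks r^2 closely, which matches comparing the area of the disk of radius r to the area of the whole unit disk.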