WEBVTT
Kind: captions
Language: en
00:00:17.800 --> 00:00:29.369
Welcome, friends, to my MOOC series of lectures
on statistical inference. This is lecture
00:00:29.369 --> 00:00:44.960
number 5. If you remember, in the last lecture
I finished with how to obtain
00:00:44.960 --> 00:01:07.229
the pdf
of a function of a random variable x ; when
00:01:07.229 --> 00:01:36.750
x is continuous with pdf fx on an interval
say a to b .
00:01:36.750 --> 00:02:06.500
And the function is monotonically
increasing or decreasing
00:02:06.500 --> 00:02:17.900
As I discussed, the monotonicity is important,
because then, for any given y equal to, say,
00:02:17.900 --> 00:02:50.110
H of x, we can uniquely determine x for a
given y.
00:02:50.110 --> 00:03:00.670
And the result was that pdf of y, if we call
it gy then what we obtained is that g y is
00:03:00.670 --> 00:03:28.250
equal to f at x into modulus of dx dy expressed
in y . So, with that result let us now look
00:03:28.250 --> 00:03:42.930
at some problems.
Suppose X is a normal variate with mean mu and
00:03:42.930 --> 00:03:52.310
sigma square; that is, X is a normal variable with
expected value equal to mu and variance
00:03:52.310 --> 00:04:11.910
of X equal to sigma square. Consider Y
equal to a plus bX; then what is the
00:04:11.910 --> 00:04:26.380
pdf of Y ?
So, we proceed in the following way: Y is
00:04:26.380 --> 00:04:39.669
equal to a plus bX; therefore, for a given value
of y
00:04:39.669 --> 00:05:01.050
we can find the corresponding x, equal to
y minus a divided by b. Without loss of generality,
00:05:01.050 --> 00:05:13.120
let b be greater than 0 .
Since x is equal to y minus a by b, therefore
00:05:13.120 --> 00:05:46.729
dx dy is equal to 1 by b; therefore, by
using the theorem, g of y is equal to f at
00:05:46.729 --> 00:06:01.270
y minus a by b, multiplied by 1 by b; that is
the dx dy, and the modulus sign is not needed
00:06:01.270 --> 00:06:12.150
because we are taking b to be positive.
Now, we know that f is the density of a
00:06:12.150 --> 00:06:22.889
normal variable with mean mu and
variance sigma square. Therefore, this term
00:06:22.889 --> 00:06:37.960
is 1 over root over 2 pi sigma into e to the
power minus 1 upon 2 sigma square into y minus a
00:06:37.960 --> 00:07:00.669
upon b minus mu whole square , multiplied
by 1 by b . This is equal to 1 over root over
00:07:00.669 --> 00:07:21.719
2 pi into b sigma, and this here becomes e to the
power minus 1 upon 2 b square sigma square
00:07:21.719 --> 00:07:40.210
into y minus a minus b mu whole square .
So, we obtain that the pdf of y is this. What
00:07:40.210 --> 00:07:53.629
can we say from here? We can see that, therefore,
y is normal with mean equal to a plus b
00:07:53.629 --> 00:08:06.379
mu and variance is equal to b square sigma
square . Therefore, we find that if we make
00:08:06.379 --> 00:08:16.689
a linear transformation of a normal variable
X, then the resulting variable also has a
00:08:16.689 --> 00:08:25.770
normal distribution with appropriate adjustments
in the mean and in the variance.
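This closure result can be checked numerically. The following is a small Python sketch, not part of the lecture; the values of mu, sigma, a, b, the sample size, and the seed are made up for illustration. It simulates Y = a + bX and compares the sample mean and variance with a + b mu and b square sigma square.

```python
import random

random.seed(0)

# Illustrative (made-up) parameters: X ~ N(mu, sigma^2), Y = a + b*X.
mu, sigma = 2.0, 3.0
a, b = 5.0, 0.5

# Simulate X and apply the linear transformation from the lecture.
xs = [random.gauss(mu, sigma) for _ in range(200_000)]
ys = [a + b * x for x in xs]

n = len(ys)
mean_y = sum(ys) / n
var_y = sum((y - mean_y) ** 2 for y in ys) / n

# Theory from the lecture: Y ~ N(a + b*mu, b^2 * sigma^2).
print(mean_y, a + b * mu)
print(var_y, b ** 2 * sigma ** 2)
```

The printed pairs should agree to within sampling error.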
00:08:25.770 --> 00:08:40.520
We can get the same result
00:08:40.520 --> 00:08:54.020
using the moment generating function also. What
is the moment generating function of a plus
00:08:54.020 --> 00:09:13.360
bX? It is equal to the expected value of e to the
power a plus bX, into t. This is equal
00:09:13.360 --> 00:09:28.430
to e to the power a t multiplied by expected
value of e to the power b X t ; is equal to
00:09:28.430 --> 00:09:37.339
e to the power a t into expected value of
e to the power x into bt .
00:09:37.339 --> 00:09:55.250
Now, we know that the MGF of a normal
variable with mean mu and variance sigma square is
00:09:55.250 --> 00:10:07.560
e to the power mu t plus half sigma square
t square . If we look at the term, we can
00:10:07.560 --> 00:10:18.440
find that it is basically the same term with
t replaced by bt, also we can see that there
00:10:18.440 --> 00:10:29.500
is a multiplier e to the power a t.
Therefore, this term we can write, by comparing
00:10:29.500 --> 00:10:45.959
with this, as e to the power a t multiplied
by e to the power mu b t plus half
00:10:45.959 --> 00:11:07.080
sigma square b square t square; which is equal
to e to the power mu b plus a, into t, plus half
00:11:07.080 --> 00:11:30.589
b square sigma square t square.
Now, we know that this is the MGF of a
00:11:30.589 --> 00:11:49.010
normal variable with mean is equal to a plus
mu b , and variance is equal to b square sigma
00:11:49.010 --> 00:12:04.620
square. Hence, by the uniqueness of the
00:12:04.620 --> 00:12:19.269
moment generating function, we can say that
Y equal to a plus bX is distributed as
00:12:19.269 --> 00:12:33.500
normal with mean a plus b mu and variance
b square sigma square . So, the same result
00:12:33.500 --> 00:12:42.639
we can get using moment generating function,
but the above theorem helps us to get it in
00:12:42.639 --> 00:13:01.209
a very simple way .
The above observation helps us in dealing
00:13:01.209 --> 00:13:23.120
with
normal mu sigma square very efficiently
00:13:23.120 --> 00:13:39.599
by transforming
x to y as y is equal to x minus mu by sigma
00:13:39.599 --> 00:13:56.639
. What is the advantage? The advantage is
expectation of Y is equal to 0 , and variance
00:13:56.639 --> 00:14:21.060
of Y is equal to 1. Therefore, from an arbitrary
normal variate with mean mu and variance sigma square, we
00:14:21.060 --> 00:14:35.540
can get the standard normal distribution by
this linear transformation.
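As a numerical illustration of this standardization (again a sketch, not from the lecture; mu, sigma, the sample size, and the seed are made up): Y = (X minus mu) by sigma should have sample mean near 0 and sample variance near 1.

```python
import random

random.seed(6)

# Illustrative (made-up) parameters.
mu, sigma = 10.0, 4.0

# Standardize X ~ N(mu, sigma^2) via Y = (X - mu) / sigma.
xs = [random.gauss(mu, sigma) for _ in range(200_000)]
ys = [(x - mu) / sigma for x in xs]

n = len(ys)
mean_y = sum(ys) / n
var_y = sum((y - mean_y) ** 2 for y in ys) / n

# Y should be approximately standard normal: mean 0, variance 1.
print(mean_y, var_y)
```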
00:14:35.540 --> 00:14:44.209
And why do we use normal 0 1? Because that makes
our life simple. Even if you look at it from the
00:14:44.209 --> 00:14:50.019
moment generating function: for the standard normal,
the moment generating function is e to the
00:14:50.019 --> 00:14:58.040
power t square by 2, but for arbitrary mu
and sigma the moment generating function becomes
00:14:58.040 --> 00:15:03.009
e to the power mu t plus half sigma square
t square. And therefore, dealing with that
00:15:03.009 --> 00:15:11.079
mathematically becomes more complicated .
Let us now look at a slightly more complicated
00:15:11.079 --> 00:15:47.870
problem. Suppose X is a normal 0 1 variate.
We want to know
00:15:47.870 --> 00:16:02.140
the distribution of X square. Note X belongs
to minus infinity to plus infinity, because
00:16:02.140 --> 00:16:09.350
it is a standard normal variable, ranging from
minus infinity to plus infinity, and all of
00:16:09.350 --> 00:16:27.069
us know that it is symmetric around 0 . What
about X square ?
00:16:27.069 --> 00:16:42.759
X square as you can see is a positive random
variable .
00:16:42.759 --> 00:16:52.379
And another problem with respect to this transformation
is that the mapping from x square to
00:16:52.379 --> 00:17:21.380
x is not unique; because, for minus a and a,
00:17:21.380 --> 00:17:30.620
both of them, the value of x square is equal
to a square. Therefore, from x square, when
00:17:30.620 --> 00:17:37.090
I go back to x, this mapping is not unique,
as I explained in my previous lecture
00:17:37.090 --> 00:17:43.570
with respect to a discrete random variable;
if you remember, I had taken minus 2, minus
00:17:43.570 --> 00:17:50.240
1, 0, 1, 2, and from there I explained that
the inverse mapping does not exist.
00:17:50.240 --> 00:18:02.750
Or in other words , we can see that in this
case y is equal to x square, the function
00:18:02.750 --> 00:18:15.210
is not monotonically increasing. And therefore,
that theorem does not hold as such. So, what
00:18:15.210 --> 00:18:34.410
do we do? We make a small adjustment.
Y equal to X square is not one-to-one; therefore,
00:18:34.410 --> 00:19:05.210
to obtain the pdf of Y, we proceed as follows. Let
g be the pdf of Y; maybe I write it as g of
00:19:05.210 --> 00:19:21.920
y, and let G y be the cumulative distribution function.
Therefore, G of y is equal to the probability
00:19:21.920 --> 00:19:31.990
that Y is less than or equal to y, which is equal to
the probability that X square is less than or equal to y
00:19:31.990 --> 00:19:43.910
. And because of the symmetry around 0,
we can find this is equal to the probability that minus
00:19:43.910 --> 00:19:51.740
root y is less than or equal to x, less than or equal
to root y.
00:19:51.740 --> 00:20:05.320
Now, since normal is symmetric , and if this
is minus root y and this is plus root y, then
00:20:05.320 --> 00:20:25.270
this probability is actually 2 times the probability
that x lies between 0 to root y . Therefore,
00:20:25.270 --> 00:20:36.660
probability minus root y less than equal to
x less than equal to root y is equal to 2
00:20:36.660 --> 00:21:02.730
times the integral from 0 to root y of f x dx. And since f is normal
0 1, this becomes 2 times the integral from 0 to root y of 1
00:21:02.730 --> 00:21:13.520
over root over 2 pi e to the power minus x
square by 2 dx .
00:21:13.520 --> 00:21:25.280
So, this is the value of G y, that is, the cumulative
distribution function for the random variable
00:21:25.280 --> 00:21:37.930
Y, which is nothing but X square. Therefore,
g y is equal to G prime y; that means I am
00:21:37.930 --> 00:21:50.610
differentiating this with respect to y. So,
we know that we get that by first replacing
00:21:50.610 --> 00:21:58.490
x with root y in this formula, multiplied
by the derivative of root y with respect to
00:21:58.490 --> 00:22:08.950
y .
So, this is 2 times 1 over root 2 pi e to
00:22:08.950 --> 00:22:23.850
the power minus root y square, which is
y, by 2, multiplied by d root y d y. Because
00:22:23.850 --> 00:22:31.340
we know that it is dx dy expressed in terms
of y and x is equal to root y. Therefore,
00:22:31.340 --> 00:22:53.640
we can write it as 1 upon 2 root y; so,
we replace this value here. Therefore, what
00:22:53.640 --> 00:23:04.070
do we get ?
We get g of y is equal to root 2 upon root
00:23:04.070 --> 00:23:18.230
pi e to the power minus y by 2 into 1 upon
2 root y, which on simplification becomes
00:23:18.230 --> 00:23:29.560
1 over root 2 pi e to the power minus y by
2 y to the power minus half .
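A quick sanity check of this derived density (an illustrative sketch, not part of the lecture; the cutoff c = 1, the sample size, and the seed are arbitrary): the integral of g over (0, c] should match the simulated probability that X square is at most c. The substitution y = u square removes the y to the power minus half singularity at 0.

```python
import math
import random

random.seed(1)

def g(y):
    # Density of Y = X^2 derived in the lecture:
    # (1 / sqrt(2*pi)) * e^{-y/2} * y^{-1/2}
    return math.exp(-y / 2) / (math.sqrt(2 * math.pi) * math.sqrt(y))

def cdf(c, steps=100_000):
    # Integrate g over (0, c] with the substitution y = u^2,
    # so the integrand becomes g(u^2) * 2u, which is bounded at 0.
    h = math.sqrt(c) / steps
    total = 0.0
    for i in range(steps):
        u = (i + 0.5) * h          # midpoint rule in u
        total += g(u * u) * 2 * u * h
    return total

# Monte Carlo estimate of P(X^2 <= 1) for X ~ N(0, 1).
n = 200_000
hits = sum(1 for _ in range(n) if random.gauss(0, 1) ** 2 <= 1.0)

print(cdf(1.0))     # analytic P(Y <= 1), should be near P(|X| <= 1)
print(hits / n)     # simulated P(Y <= 1)
```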
00:23:29.560 --> 00:23:44.280
So, we get a new type of density function
for y; where y lies in the interval 0 to infinity
00:23:44.280 --> 00:23:55.900
. Now, is it a density that is completely unknown
to us? Perhaps most of you will say yes
00:23:55.900 --> 00:24:18.280
. So, let us observe one thing: gamma of
half is equal to root over pi. I am sure
00:24:18.280 --> 00:24:24.740
you have come across this in your mathematics
course. So, I am utilizing this property,
00:24:24.740 --> 00:24:40.680
and writing this as g y is equal to half to
the power half upon root pi, which is half
00:24:40.680 --> 00:24:54.490
to the power half upon gamma half e to the
power minus half y, y to the power half minus
00:24:54.490 --> 00:25:13.360
1, for 0 less than y less than infinity.
Do you remember this density ?
00:25:13.360 --> 00:25:23.080
I am sure you can, because this is of the
form gamma lambda alpha, whose density is lambda power
00:25:23.080 --> 00:25:30.700
alpha upon gamma alpha e to the power minus
lambda x, x to the power alpha minus 1, lambda
00:25:30.700 --> 00:25:35.600
greater than 0 alpha greater than 0 and x
greater than 0 .
00:25:35.600 --> 00:25:58.060
Therefore, we can say this is actually
gamma half comma half. This gamma half half
00:25:58.060 --> 00:26:23.390
is called chi square with one degree of freedom.
Why is it called one degree of freedom?
00:26:23.390 --> 00:26:31.900
Because this chi square has come from one
normal variable; that is why there is
00:26:31.900 --> 00:26:39.490
one independent random variable x which is
giving rise to this chi square distribution.
00:26:39.490 --> 00:26:49.590
And therefore, we call it chi square with
one degree of freedom, and notationally we write chi
00:26:49.590 --> 00:27:12.910
square with one degree of freedom.
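The identification of the derived density with gamma half, half can be verified pointwise; this small sketch (not from the lecture; the grid of y values is arbitrary) uses the stdlib math.gamma, which gives gamma of half as root pi.

```python
import math

def gamma_pdf(y, lam, alpha):
    # Gamma(lambda, alpha) density as written in the lecture:
    # lambda^alpha / Gamma(alpha) * e^{-lambda*y} * y^{alpha - 1}
    return lam ** alpha / math.gamma(alpha) * math.exp(-lam * y) * y ** (alpha - 1)

def chi2_1_pdf(y):
    # Density of X^2 derived earlier: 1/sqrt(2*pi) * e^{-y/2} * y^{-1/2}
    return math.exp(-y / 2) / math.sqrt(2 * math.pi * y)

# The two expressions agree pointwise, since Gamma(1/2) = sqrt(pi).
for y in (0.1, 0.5, 1.0, 2.0, 5.0):
    assert abs(gamma_pdf(y, 0.5, 0.5) - chi2_1_pdf(y)) < 1e-12
print("Gamma(1/2, 1/2) matches the derived chi-square(1) density")
```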
Now, the question is: what happens to X 1 square
00:27:12.910 --> 00:27:44.390
plus X 2 square, where X 1 and X 2 are independent
normal 0 1? We know that X 1 square will become
00:27:44.390 --> 00:27:58.610
gamma half, half; X 2 square will become gamma half,
half.
00:27:58.610 --> 00:28:09.780
And therefore, given 2 different random variables
which are independent, we want to find the pdf
00:28:09.780 --> 00:28:26.330
of X 1 square plus X 2 square. This we can
do in many ways; let me first do it using the
00:28:26.330 --> 00:28:35.880
moment generating function. In fact, I will prove
something more general than just half half
00:28:35.880 --> 00:28:43.770
. In fact, I will prove it for the general gamma distribution
.
00:28:43.770 --> 00:28:58.160
Let X be a gamma lambda alpha, lambda greater
than 0, alpha greater than 0. Therefore, the moment
00:28:58.160 --> 00:29:09.350
generating function of X is equal to the expected
value of e to the power X t; equal to the integral from 0
00:29:09.350 --> 00:29:19.620
to infinity e to the power x t multiplied
by the pdf of x , which is lambda power alpha
00:29:19.620 --> 00:29:31.790
upon gamma alpha e to the power minus lambda
x, x to the power alpha minus 1 dx, right?
00:29:31.790 --> 00:29:52.720
This is equal to: I have combined this and this
together, multiplied by x to the power alpha
00:29:52.720 --> 00:30:04.700
minus 1 dx. And this will converge for t
less than lambda, because in order to converge,
00:30:04.700 --> 00:30:09.230
this part has to be positive. So, that along
with this minus it becomes negative .
00:30:09.230 --> 00:30:18.710
So, this will hold good for t less
than lambda. And we can easily find out this
00:30:18.710 --> 00:30:30.940
integral, because we know that if we were
integrating this part alone, it is the pdf
00:30:30.940 --> 00:30:38.860
of gamma distribution. Therefore, this integrates
to 1 , and therefore, the integration of this
00:30:38.860 --> 00:30:47.910
part has to be gamma alpha upon lambda power
alpha so that it cancels out . Therefore,
00:30:47.910 --> 00:30:56.560
by comparing we can easily write that this
part is going to be instead of gamma alpha
00:30:56.560 --> 00:31:05.280
upon lambda power alpha it is going to be
gamma alpha upon lambda minus t power alpha
00:31:05.280 --> 00:31:16.960
, which says that the MGF is equal to lambda
upon lambda minus t whole to the power alpha
00:31:16.960 --> 00:31:25.250
.
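This MGF formula can be checked by Monte Carlo; the following sketch is not from the lecture, and lambda, alpha, t, the sample size, and the seed are made-up illustrations. Note that Python's random.gammavariate takes the shape alpha and the scale 1/lambda.

```python
import math
import random

random.seed(2)

# Illustrative (made-up) parameters; t must satisfy t < lambda.
lam, alpha = 2.0, 3.0
t = 0.5

# Sample X ~ Gamma(lambda, alpha); gammavariate uses scale = 1/lambda.
xs = [random.gammavariate(alpha, 1.0 / lam) for _ in range(200_000)]

# Monte Carlo estimate of E[e^{tX}] versus the derived closed form.
mgf_mc = sum(math.exp(t * x) for x in xs) / len(xs)
mgf_theory = (lam / (lam - t)) ** alpha   # (lambda / (lambda - t))^alpha

print(mgf_mc, mgf_theory)
```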
So, this is the moment generating function
00:31:25.250 --> 00:31:49.540
for gamma lambda alpha. Therefore,
if we take 2 independent gamma random variables,
00:31:49.540 --> 00:32:17.010
gamma lambda alpha and gamma lambda beta variates.
And we want to know the pdf of their sum; so, say this
00:32:17.010 --> 00:32:32.700
is called X and this is called Y; then for X plus Y,
the MGF of X plus Y at t: we have already seen
00:32:32.700 --> 00:32:43.180
that if 2 random variables are independent,
then the MGF of their sum
00:32:43.180 --> 00:32:57.910
is product of their individual moment generating
functions .
00:32:57.910 --> 00:33:05.110
And just now we have found out that this moment
generating function is lambda upon lambda
00:33:05.110 --> 00:33:15.470
minus t whole to the power alpha . This is
similarly going to be lambda upon lambda minus
00:33:15.470 --> 00:33:23.030
t whole to the power beta . Therefore, the
product becomes lambda upon lambda minus t
00:33:23.030 --> 00:33:44.120
whole to the power alpha plus beta .
And therefore, by the uniqueness theorem,
00:33:44.120 --> 00:33:51.650
that is, by the uniqueness of the moment generating function,
we can say that X plus Y is distributed as
00:33:51.650 --> 00:33:59.380
gamma with the same lambda, but the second parameter
becomes alpha plus beta.
00:33:59.380 --> 00:34:14.679
So, what we obtain is that
00:34:14.679 --> 00:34:33.050
if X and Y are independent gamma variates
with the same lambda, but the second parameter
00:34:33.050 --> 00:34:55.410
being alpha and beta, then X plus Y is also
a gamma variate with lambda alpha plus beta
00:34:55.410 --> 00:35:07.160
. Therefore, we find an interesting result
with respect to gamma random variables: that
00:35:07.160 --> 00:35:17.120
as we keep on adding independent gamma random
variables with the same lambda, the summation
00:35:17.120 --> 00:35:23.260
of these random variables is also gamma with
the same parameter lambda, but the second
00:35:23.260 --> 00:35:30.870
parameter being the sum of the individual
parameters .
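A numerical illustration of this addition rule (a sketch, not from the lecture; lambda, alpha, beta, the sample size, and the seed are made up): the sum of independent gamma lambda, alpha and gamma lambda, beta samples should have mean (alpha plus beta)/lambda and variance (alpha plus beta)/lambda square, matching gamma lambda, alpha plus beta.

```python
import random

random.seed(3)

# Illustrative (made-up) parameters: same lambda, different shapes.
lam, alpha, beta = 2.0, 1.5, 2.5
n = 200_000

# Sum of independent Gamma(lambda, alpha) and Gamma(lambda, beta) draws
# (gammavariate takes shape and scale = 1/lambda).
s = [random.gammavariate(alpha, 1 / lam) + random.gammavariate(beta, 1 / lam)
     for _ in range(n)]

mean_s = sum(s) / n
var_s = sum((v - mean_s) ** 2 for v in s) / n

# Gamma(lambda, alpha + beta): mean (alpha+beta)/lambda, variance (alpha+beta)/lambda^2.
print(mean_s, (alpha + beta) / lam)
print(var_s, (alpha + beta) / lam ** 2)
```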
00:35:30.870 --> 00:35:54.530
The advantage now is with respect to
chi square: if X 1 and X 2 are
00:35:54.530 --> 00:36:14.490
2 independent normal 0 1 , then X 1 square
is basically a gamma variate with half and
00:36:14.490 --> 00:36:25.620
half .
X 2 square is also a gamma variate with half
00:36:25.620 --> 00:36:42.020
and half , therefore, X 1 square plus X 2
square is also a gamma variate with parameters
00:36:42.020 --> 00:36:52.940
half and half plus half is equal to gamma
with half comma one; which we write as gamma
00:36:52.940 --> 00:37:04.800
with half 2 by 2. And this is called chi square
with 2 degrees of freedom . Why 2 degrees
00:37:04.800 --> 00:37:13.330
of freedom? Because we have used 2 independent
normal 0 1 .
00:37:13.330 --> 00:37:33.340
Can we, therefore, generalize from there? We
can; in fact, if X 1, X 2, ..., Xn are independent normal
00:37:33.340 --> 00:37:51.400
0 1, then X 1 square plus X 2 square plus
... plus Xn square will be gamma with half. And the
00:37:51.400 --> 00:37:57.870
second parameter: it is half for each
one of them; therefore, when we add them up,
00:37:57.870 --> 00:38:12.210
we will get n by 2. Therefore, a gamma with half
and n by 2 is the same as a chi square distribution
00:38:12.210 --> 00:38:20.820
with degrees of freedom n.
So, the sum of squares of n independent standard normal
00:38:20.820 --> 00:38:33.590
variables is chi square with n degrees of
freedom, and we can write its pdf very simply,
00:38:33.590 --> 00:38:46.450
which is equal to half to the power n by 2 upon
gamma of n by 2, into e to the power minus half
00:38:46.450 --> 00:38:57.140
x, x to the power n by 2 minus 1, for 0 less
than x less than infinity.
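As a check of the chi square with n degrees of freedom result (an illustrative sketch, not part of the lecture; n, the number of trials, and the seed are arbitrary): the sum of squares of n independent standard normals should have mean n and variance 2n.

```python
import random

random.seed(4)

# Illustrative (made-up) settings.
n_dof, trials = 5, 100_000

# Each sample: sum of squares of n_dof independent N(0, 1) draws.
s = [sum(random.gauss(0, 1) ** 2 for _ in range(n_dof)) for _ in range(trials)]

mean_s = sum(s) / trials
var_s = sum((v - mean_s) ** 2 for v in s) / trials

# Chi-square with n degrees of freedom = Gamma(1/2, n/2): mean n, variance 2n.
print(mean_s, n_dof)
print(var_s, 2 * n_dof)
```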
00:38:57.140 --> 00:39:10.590
In this case, we could easily find the distribution of the sum
of 2 independent random variables. In general,
00:39:10.590 --> 00:39:25.040
how do we find the sum of 2 arbitrary random
variables? Or why not the weighted sum
00:39:25.040 --> 00:39:32.760
of 2 random variables, the difference between
2 random variables or even the product of
00:39:32.760 --> 00:39:41.460
2 random variables, or the division of x by y,
when y is not equal to 0; that is also a random
00:39:41.460 --> 00:39:51.680
variable. So, is there any way to find the
pdf of a function of 2 random variables?
00:39:51.680 --> 00:40:04.240
So, the following theorem helps us to find the
00:40:04.240 --> 00:40:12.880
pdf of function of 2 random variables under
certain conditions. So, I am stating the theorem,
00:40:12.880 --> 00:40:19.850
but I am not going to prove it, because the
mathematics for proving that is beyond the
00:40:19.850 --> 00:40:45.940
scope of this lecture .
So, let X and Y be 2 random variables, which we
00:40:45.940 --> 00:40:57.970
can write as the pair X Y. And therefore, I am writing it
as a 2 dimensional random variable with joint
00:40:57.970 --> 00:41:19.920
pdf f ; that is, f of x y gives the pdf at
x comma y .
00:41:19.920 --> 00:41:48.010
Now, consider 2 functions H 1 and H 2 ; such
that Z is equal to H 1 of x y. So, Z is a
00:41:48.010 --> 00:41:56.480
function of X and Y, W is a function of X
Y . Therefore, basically from XY plane we
00:41:56.480 --> 00:42:18.150
are transforming it into another plane of
z w ; such that the pair of equations z is
00:42:18.150 --> 00:42:31.390
equal to H 1 x y and w is equal to H 2 x y
can be solved uniquely .
00:42:31.390 --> 00:42:58.250
Say, x is equal to G 1 of z w , and y is equal
to G 2 of z w .
00:42:58.250 --> 00:43:05.420
So, you notice the similarity with when we
were talking about a function of a single random
00:43:05.420 --> 00:43:12.940
variable: we wanted the function to be monotonically
increasing or decreasing so that we can get
00:43:12.940 --> 00:43:23.420
the inverse uniquely. Similarly, given z and
w, we want that from the values z
00:43:23.420 --> 00:43:32.500
and w we can identify x and y uniquely.
So, this condition ensures that also.
00:43:32.500 --> 00:43:57.350
Another condition is that the partial derivatives
del x by del z, del x by del w, del y by del z,
00:43:57.350 --> 00:44:24.900
del y by del w exist and are continuous.
So, we have from XY plane a mapping to Z W
00:44:24.900 --> 00:44:31.750
plane such that it is one to one, so that
given any pair here we can uniquely determine
00:44:31.750 --> 00:44:39.750
the corresponding X Y. And also the partial
derivatives of x and y with respect to both
00:44:39.750 --> 00:44:59.680
z and w, exist and are continuous. Then
the joint pdf k z w so, we are looking at
00:44:59.680 --> 00:45:11.880
the joint probability density function of
these 2 d random variables z w is f at G 1
00:45:11.880 --> 00:45:27.000
z w, G 2 z w ; that means, we are looking
at the pdf of original random variable x y,
00:45:27.000 --> 00:45:37.290
but expressed in terms of z w multiplied by
something which is called the Jacobian. What
00:45:37.290 --> 00:45:51.119
is the Jacobian?
00:45:51.119 --> 00:46:20.120
The Jacobian is the modulus of the following determinant:
del x by del z, del x by del w, del y by del z,
00:46:20.120 --> 00:46:37.220
del y by del w. So, we compute the determinant,
and the pdf f, expressed in terms of z w, multiplied
00:46:37.220 --> 00:46:51.980
by the modulus of the determinant gives us the pdf of z
and w,
00:46:51.980 --> 00:47:03.240
the joint pdf of z and w .
What is the advantage ?
00:47:03.240 --> 00:47:28.440
The advantage is: if we want to find the pdf of a
function z equal to H of x y, then we
00:47:28.440 --> 00:47:45.490
consider another random variable w; since
z is equal to H 1, let us call
00:47:45.490 --> 00:48:06.550
it H 2 of X Y. Obtain k z w, the joint pdf, and
just now I have shown the formula for obtaining
00:48:06.550 --> 00:48:25.130
it; then integrate over w to obtain the pdf of z
.
00:48:25.130 --> 00:48:35.880
Since our interest was only in z: in order
to get its pdf, we need to find out first
00:48:35.880 --> 00:48:43.580
the joint pdf by appropriately defining w,
so that this integration becomes easier,
00:48:43.580 --> 00:48:56.040
and from there we obtain the pdf of z . Therefore,
let us now look at the same problem of summation
00:48:56.040 --> 00:49:28.800
of 2 chi square distribution .
00:49:28.800 --> 00:49:54.800
We need to find out the pdf of X 1 square
plus X 2 square . Since X 1 is normal 0 1
00:49:54.800 --> 00:50:08.740
so, we can consider X 1 along this axis, and
X 2 along the y axis so that X 1 X 2 together
00:50:08.740 --> 00:50:17.530
can cover the entire 2 d plane. Why are we
doing that? Because that guides us to
00:50:17.530 --> 00:50:21.119
take a very meaningful transformation . What
is that ?
00:50:21.119 --> 00:50:53.400
So, we transform: X is equal to R cos theta,
Y is equal to R sin theta. So, instead of
00:50:53.400 --> 00:51:03.680
X 1 and X 2, let me call them X and Y , therefore,
we are covering the entire 2 d plane therefore,
00:51:03.680 --> 00:51:13.430
minus infinity less than x less than infinity,
minus infinity less than y less than infinity
00:51:13.430 --> 00:51:22.619
and R is the radius of the transformation; therefore,
R is going to be from 0 to infinity, and theta
00:51:22.619 --> 00:51:32.540
is covering the entire plane. So, theta will
belong to 0 to 2 pi .
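The Jacobian of this polar transformation, which the lecture computes next, can be confirmed numerically by finite differences; this is a small illustrative sketch (the step size and the sample points are arbitrary), and the modulus of the determinant should come out equal to r.

```python
import math

def jacobian_det(r, theta, h=1e-6):
    # x = r*cos(theta), y = r*sin(theta); central differences for the
    # four partial derivatives, then the 2x2 determinant in modulus.
    x = lambda r_, t_: r_ * math.cos(t_)
    y = lambda r_, t_: r_ * math.sin(t_)
    dx_dr = (x(r + h, theta) - x(r - h, theta)) / (2 * h)
    dx_dt = (x(r, theta + h) - x(r, theta - h)) / (2 * h)
    dy_dr = (y(r + h, theta) - y(r - h, theta)) / (2 * h)
    dy_dt = (y(r, theta + h) - y(r, theta - h)) / (2 * h)
    return abs(dx_dr * dy_dt - dx_dt * dy_dr)

# |det J| should equal r at every point.
for r, theta in ((0.5, 0.3), (2.0, 1.2), (3.7, 5.0)):
    assert abs(jacobian_det(r, theta) - r) < 1e-4
print("det J = r confirmed numerically")
```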
00:51:32.540 --> 00:51:52.830
Therefore, the Jacobian of the transformation is:
dX dR, which is cos theta; dX d theta, which
00:51:52.830 --> 00:52:11.840
is minus R sin theta; dY dR, which is sin
theta; and dY d theta, which is R cos theta
00:52:11.840 --> 00:52:19.190
.
Therefore, the determinant of J is equal to R cos
00:52:19.190 --> 00:52:35.240
square theta plus R sin square theta is equal
to R. Therefore, the joint pdf of R theta is
00:52:35.240 --> 00:52:46.350
equal to, since x and y are independent, we
can write the joint density as the product
00:52:46.350 --> 00:52:59.740
of their individual densities;
00:52:59.740 --> 00:53:10.420
is equal to 1 over 2 pi e to the power minus
x square plus y square by 2 into R . Now this
00:53:10.420 --> 00:53:21.160
has to be expressed in terms of R theta , therefore,
this is 1 over 2 pi into e to the power minus
00:53:21.160 --> 00:53:30.370
R square by 2 into R.
Therefore, g of R is equal to: now I have to
00:53:30.370 --> 00:53:39.350
integrate out theta from 0 to 2 pi: 1 over 2 pi e to the
power minus R square by 2 into R d theta.
00:53:39.350 --> 00:53:54.080
And therefore, the 2 pi gets cancelled;
therefore, what we get is R e to the power minus R square by 2. But we need to
00:53:54.080 --> 00:54:01.100
find out the density of R square.
Because we need x square plus y square is
00:54:01.100 --> 00:54:10.540
equal to R square cos square theta plus R
square sin square theta is equal to R square
00:54:10.540 --> 00:54:16.170
.
But so far we have obtained the pdf of R; now we
00:54:16.170 --> 00:54:25.520
need to find out the pdf of R square. Therefore,
we use the transformation theorem from the beginning;
00:54:25.520 --> 00:54:36.460
let me write it in capital . Therefore, g
of R square is equal to e to the power minus
00:54:36.460 --> 00:54:54.090
R square by 2 into R multiplied by d R upon
d R square; and therefore,
00:54:54.090 --> 00:55:04.140
we write R as R square to the power half, into
e to the power minus R square by 2 and dr
00:55:04.140 --> 00:55:15.730
upon d R square is equal to 1 upon 2 R .
So, this cancels with this, and what we
00:55:15.730 --> 00:55:25.390
get is half e to the power minus R square
by 2 .
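Putting the whole derivation together, it can be checked by simulation (an illustrative sketch, not from the lecture; the sample size, the seed, and the test points are made up): for independent standard normals X and Y, the CDF of Z = X square plus Y square should match 1 minus e to the power minus z by 2.

```python
import math
import random

random.seed(5)

# For X, Y independent N(0,1), the lecture shows Z = X^2 + Y^2 has density
# (1/2) e^{-z/2}, i.e. CDF 1 - e^{-z/2} (chi-square with 2 degrees of freedom).
n = 200_000
zs = [random.gauss(0, 1) ** 2 + random.gauss(0, 1) ** 2 for _ in range(n)]

for z0 in (0.5, 1.0, 2.0, 4.0):
    empirical = sum(1 for z in zs if z <= z0) / n
    theoretical = 1 - math.exp(-z0 / 2)
    assert abs(empirical - theoretical) < 0.01
print("empirical CDF of X^2 + Y^2 matches 1 - e^{-z/2}")
```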
00:55:25.390 --> 00:55:45.290
Therefore if we write z is equal to R square,
then f at z is equal to half into e to the
00:55:45.290 --> 00:55:57.270
power minus z by 2; which is equal to
a gamma half comma 1. So, this is the result
00:55:57.270 --> 00:56:07.120
that we have already obtained using the moment
generating function. With that, I stop
00:56:07.120 --> 00:56:13.920
here. In the next class, I will be talking
about some more different types of random
00:56:13.920 --> 00:56:30.020
variables , and we will try to obtain their
pdfs in a similar way .
00:56:30.020 --> 00:56:34.110
Thank you .