WEBVTT
Kind: captions
Language: en
00:00:14.250 --> 00:00:20.320
I will continue with the central limit theorem
and its applications. This example I have
00:00:20.320 --> 00:00:26.630
taken from Sheldon Ross’s book on probability
theory. See the idea here is that, civil
00:00:26.630 --> 00:00:32.300
engineers believe that W, the amount of weight
in units of 1000 pounds at a certain span
00:00:32.300 --> 00:00:38.300
of a bridge can withstand without structural
damage resulting is normally distributed
00:00:38.300 --> 00:00:46.550
with mean 400 and standard deviation 40. So,
the weight, which the bridge can withstand
00:00:46.550 --> 00:00:57.200
is a random variable. And so it is normally
distributed with mean 400 and standard deviation 40.
00:00:57.200 --> 00:01:02.150
Suppose that the weight again in units of
1000 pounds of a car, is a random variable
00:01:02.150 --> 00:01:07.020
with
mean 3 and standard deviation 0.3. So, the
00:01:07.020 --> 00:01:10.680
different cars will have different weights.
Therefore again we have treated this as a
00:01:10.680 --> 00:01:13.770
random… I mean this example – the weight
of
00:01:13.770 --> 00:01:19.670
a car is treated as a random variable. And
therefore the distribution… And the
00:01:19.670 --> 00:01:28.090
distribution is normal – approximately normal
with mean 3 and standard deviation 0.3.
00:01:28.090 --> 00:01:32.860
How many cars would have to be on the bridge
span for the probability of structural
00:01:32.860 --> 00:01:42.270
damage to exceed 0.1? So, at a particular
time, how many cars are there, and then the
00:01:42.270 --> 00:01:49.380
weight of these cars exceeds the weight, which
can cause structural damage. And so you
00:01:49.380 --> 00:01:54.460
want the probability of this event to be
more than 0.1. So, you
00:01:54.460 --> 00:01:56.570
want
to estimate that. You want to estimate the
00:01:56.570 --> 00:02:00.790
number of cars that would be on the bridge,
so
00:02:00.790 --> 00:02:08.810
that the structural damage can occur. So,
we begin by defining P n as the probability
00:02:08.810 --> 00:02:12.610
that,
with n cars on the bridge, their total weight
00:02:12.610 --> 00:02:18.200
exceeds W, because that is… So, the event
is this: when it exceeds W, the structural
00:02:18.200 --> 00:02:26.230
damage can occur. Therefore, this is the same
as P n. So, this is X 1 plus X 2 plus … plus X n
00:02:26.230 --> 00:02:29.370
greater than or equal to W. And, that would
be…
00:02:29.370 --> 00:02:35.349
We can rewrite this as probability X 1 plus
X 2 plus … plus X n minus W greater than or equal
00:02:35.349 --> 00:02:40.700
to 0.
Now, the X i’s: X i is the weight
00:02:40.700 --> 00:02:44.500
of the i-th car; X i denotes the weight of
the i-th
00:02:44.500 --> 00:02:50.280
car. So, this is the total weight of the n
cars, which are on the bridge at that time.
00:02:50.280 --> 00:02:54.760
And
therefore, by central limit theorem, because
00:02:54.760 --> 00:02:57.620
for n large, we have said that, when they
are
00:02:57.620 --> 00:03:02.080
identically distributed random variables – independent
random variables, because the
00:03:02.080 --> 00:03:09.030
weight of each car is independent of the others.
So, then sigma X j, j varying from 1 to n
00:03:09.030 --> 00:03:16.700
would be approximately normal with mean 3
n and variance 0.09 n. Standard deviation
00:03:16.700 --> 00:03:19.560
is
0.3. So, the variance of the weight of a car
00:03:19.560 --> 00:03:23.340
is 0.09. And therefore, the variance of the
n
00:03:23.340 --> 00:03:27.680
cars is 0.09n. So, this is approximately this.
00:03:27.680 --> 00:03:35.250
Now, W is independent of the X i’s because
the weight that the bridge can withstand is
00:03:35.250 --> 00:03:41.120
independent of the weights of the individual
cars. And therefore, where I write sigma X
00:03:41.120 --> 00:03:45.280
i
minus W, this is also approximately normal;
00:03:45.280 --> 00:03:49.989
yes. And, we will again revisit all these
00:03:49.989 --> 00:03:55.269
summation of random variables and their distributions.
But, right now, we have enough
00:03:55.269 --> 00:04:00.819
machinery with us to say that, sigma X i minus
W, because this is normal –
00:04:00.819 --> 00:04:07.310
approximately normal. This is normally distributed.
So, sigma X i minus W is also
00:04:07.310 --> 00:04:13.650
approximately normally distributed. And, the
expectation or the mean of this normal
00:04:13.650 --> 00:04:20.840
variate is 3 n minus 400, because of the minus W. So,
the mean of sigma X i, i varying from 1 to n, is
00:04:20.840 --> 00:04:26.250
3 n, and the mean of W is 400. And, the variance, of
course, comes with
00:04:26.250 --> 00:04:33.060
the plus sign, because they are independent.
So, variance of this plus variance of W,
00:04:33.060 --> 00:04:37.370
which is 1600. So, this is the variance. And
therefore, I can standardize.
00:04:37.370 --> 00:04:45.320
So, the whole idea is that, this is the variate
I am looking at. And, I have said that, this
00:04:45.320 --> 00:04:48.150
is
standard… This is normally distributed with
00:04:48.150 --> 00:04:54.780
mean 3 n minus 400 and variance this. So,
when I standardize, I will say this minus
00:04:54.780 --> 00:05:00.440
the mean divided by the standard deviation.
So,
00:05:00.440 --> 00:05:05.150
that is standardized. So, now, Z is the standard
normal variate and the event. So, when I
00:05:05.150 --> 00:05:11.480
standardize this, this probability now can
be written as probability Z greater than or
00:05:11.480 --> 00:05:14.590
equal
to. So, on this side, it will be minus of
00:05:14.590 --> 00:05:21.539
bracket 3 n minus 400 close bracket,
divided by the standard deviation. So, when
00:05:21.539 --> 00:05:24.600
I do this operation, I mean this probability
is
00:05:24.600 --> 00:05:35.070
the same as this probability, because here
I have standardized the normal variate –
00:05:35.070 --> 00:05:40.900
standard normal variate by subtracting the
mean and dividing by the standard deviation.
00:05:40.900 --> 00:05:49.470
Therefore, this is equal to this. And so Z
is approximately standard normal.
00:05:49.470 --> 00:05:58.639
And now, we want this probability to be greater
than or equal to 0.1; yes. And so we look
00:05:58.639 --> 00:06:03.509
up the tables for the standard normal and
we find that, when Z is greater than or equal
00:06:03.509 --> 00:06:07.520
to
1.28, this is approximately 0.1. So, from
00:06:07.520 --> 00:06:11.120
the normal tables, I get that, this number
should
00:06:11.120 --> 00:06:17.560
be 1.28 for this to be equal to 0.1. And therefore,
greater than or equal to 0.1 is what you want. See
00:06:17.560 --> 00:06:24.740
the whole idea is that, if the number of cars
and it is such… So, now, this number – I
00:06:24.740 --> 00:06:28.650
can
say that is equal to 1.28. Therefore, if you
00:06:28.650 --> 00:06:36.290
take it equal to 1.28, then you get an
approximation for n. And, in the sense that,
00:06:36.290 --> 00:06:42.380
if you write less than or equal to 1.28; then
obviously, this probability will be larger.
00:06:42.380 --> 00:06:45.110
And therefore, the whole thing will still
be
00:06:45.110 --> 00:06:52.220
larger than 0.1; this is the whole idea. So,
I get a value of n by equating this to 1.28.
00:06:52.220 --> 00:06:58.240
And then n should be greater than or equal
to 117. So, here of course, this is a little
00:06:58.240 --> 00:07:04.910
complex thing to solve, but you can do it
or you can start by putting in values of n;
00:07:04.910 --> 00:07:07.380
and
then you can find out for which value of n
00:07:07.380 --> 00:07:11.990
this is almost equal to this, or a little less
than
00:07:11.990 --> 00:07:17.080
this. So, one can… There are a lot of numerical
ways of actually getting the value of n,
00:07:17.080 --> 00:07:22.380
which satisfies this inequality. So, we can
do that. And therefore, it turns out that,
00:07:22.380 --> 00:07:23.380
n
00:07:23.380 --> 00:07:28.690
greater than or equal to 117 satisfies the
above inequality. And so… that means, if
00:07:28.690 --> 00:07:32.830
there
are more than 117 cars, then structural
00:07:32.830 --> 00:07:39.650
damage may occur with probability at least 0.1. So,
there is a chance of 1 in 10 that, the bridge
00:07:39.650 --> 00:07:48.430
will suffer structural damage. So, this was
another interesting example. Actually, you
00:07:48.430 --> 00:07:53.910
can see the application in the sense that…
And then also I chose this example for the
00:07:53.910 --> 00:08:00.060
reason that, this also is a random variable.
And therefore, to convert this event to this
00:08:00.060 --> 00:08:09.280
event; and then to reduce this to use the
central limit theorem and transform this to
00:08:09.280 --> 00:08:12.449
a standard normal variate; and therefore,
get
00:08:12.449 --> 00:08:23.449
the estimate of the probability that, the
bridge may suffer structural damage. So, this is an
00:08:23.449 --> 00:08:26.879
interesting example of the central limit theorem.
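The search for the smallest n can be checked numerically. This is my own Python sketch, not part of the lecture (the lecture uses normal tables); the function names are mine, but the model is exactly the one above: sigma X i minus W is approximately normal with mean 3 n minus 400 and variance 0.09 n plus 1600, and we want the probability of this being at least 0 to reach 0.1.

```python
import math

def phi_tail(z):
    # P(Z >= z) for a standard normal Z, via the stdlib erfc
    return 0.5 * math.erfc(z / math.sqrt(2))

def damage_prob(n, mu_w=400.0, sd_w=40.0, mu_car=3.0, sd_car=0.3):
    # sum X_i - W is approx Normal(3n - 400, 0.09n + 1600)
    mean = mu_car * n - mu_w
    sd = math.sqrt(n * sd_car**2 + sd_w**2)
    return phi_tail(-mean / sd)  # P(sum X_i - W >= 0)

# smallest n for which the damage probability reaches 0.1
n = 1
while damage_prob(n) < 0.1:
    n += 1
print(n)
```

Running the search recovers the lecture's answer of 117 cars.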
00:08:26.879 --> 00:08:34.710
The next example: in a town of 20,000 people, 44 percent
support an upcoming referendum vote.
00:08:34.710 --> 00:08:41.930
Say for example, currently, the hot thing
is Anna Hazare going to form political party
00:08:41.930 --> 00:08:47.560
or
not. So, you might take a referendum; that
00:08:47.560 --> 00:08:53.130
means you might ask people to vote on this
whether he should do it or not. So, let us
00:08:53.130 --> 00:08:56.270
say… And, it is… That, the feeling is
there. So,
00:08:56.270 --> 00:09:02.330
maybe this is a small town; and, the feeling
is that, only 44 percent will support the
00:09:02.330 --> 00:09:09.610
upcoming referendum, but… So, then what
you do is you conduct a pre-vote poll. So,
00:09:09.610 --> 00:09:15.180
this happens very often; media persons do it;
lots of magazines – they do it; they conduct
00:09:15.180 --> 00:09:22.300
their own pre-vote poll to get a feeling of
the opinion of the eligible voters
00:09:22.300 --> 00:09:26.670
in the
town and surveys 100 people. Therefore, if
00:09:26.670 --> 00:09:33.390
on conducting a pre-vote poll of the eligible
voters in their town and surveying 100 people,
00:09:33.390 --> 00:09:41.811
what is the probability that the survey will
show that the referendum will pass? So, one
00:09:41.811 --> 00:09:43.850
needs to understand what we mean by the
00:09:43.850 --> 00:09:50.510
referendum will pass. In order for a referendum
to pass, it requires a majority vote or 51
00:09:50.510 --> 00:09:53.670
percent.
See even though the feeling is there that,
00:09:53.670 --> 00:09:56.050
44 percent support, but you never know at
the
00:09:56.050 --> 00:10:01.910
time of the voting, more people may vote for
the referendum and so on. Therefore, when
00:10:01.910 --> 00:10:12.180
you conduct a pre-vote poll and you survey
let us say 100 people, then if in that pre-vote
00:10:12.180 --> 00:10:18.700
poll, it turns out that, 51 percent or more
support the referendum; then you can say
00:10:18.700 --> 00:10:24.900
that, the pre-vote poll suggests that, the
referendum will pass. But, actually, when
00:10:24.900 --> 00:10:29.320
the
voting is done, and then if more than 51 percent
00:10:29.320 --> 00:10:33.920
of the people who have voted – that is, 51 percent of those people
00:10:33.920 --> 00:10:36.330
– if they have supported the referendum,
the
00:10:36.330 --> 00:10:41.750
referendum will pass. So, right now, this
is just conducting a pre-vote survey of 100
00:10:41.750 --> 00:10:47.600
people. So, then you want to know what is
the probability that, the referendum will
00:10:47.600 --> 00:10:51.160
pass.
Therefore, the question is… And therefore,
00:10:51.160 --> 00:10:53.990
that means, if you are taking a referendum
–
00:10:53.990 --> 00:10:59.700
if you are taking a survey of 100 people,
then you want 51 people to… Out of those
00:10:59.700 --> 00:11:03.079
100
people, 51 should say yes for the referendum
00:11:03.079 --> 00:11:06.370
or support the referendum. This is what you
want to find out.
00:11:06.370 --> 00:11:15.750
So, the probability – therefore, one can
model the situation using binomial random
00:11:15.750 --> 00:11:23.480
variables. So, X i is… I mean if the person
supports; if the voter or the people you are
00:11:23.480 --> 00:11:29.580
surveying – they support the referendum,
then X i will be counted as a success;
00:11:29.580 --> 00:11:36.200
otherwise, it is a failure. So, you will say
that, sigma X i, i varying from 1 to 100, is
00:11:36.200 --> 00:11:45.649
binomial 100 with mean 0.44 into 100, because
the probability of a success – that means, P
00:11:45.649 --> 00:11:53.950
is 0.44. So, here on the slide, I should have
written only 0.44. This is not… This is
00:11:53.950 --> 00:12:00.730
only… So, the P is 0.44. And then the mean
of the binomial distribution will be np. And,
00:12:00.730 --> 00:12:04.800
you want to find out the probability that,
the people that you are surveying – the
00:12:04.800 --> 00:12:09.000
100
people that you have surveyed, how many would
00:12:09.000 --> 00:12:13.089
support; that means, the number of successes
here should be greater than or equal to
00:12:13.089 --> 00:12:21.410
51, because then the referendum will pass.
And, that is why I chose this, because this
00:12:21.410 --> 00:12:27.269
is depicting a new situation and we are just
trying to model it through this thing here
00:12:27.269 --> 00:12:29.899
and applying central limit theorem. So, this
is a
00:12:29.899 --> 00:12:37.890
whole idea. And therefore… So, I hope this
is clear that, this is sigma X i, i varying
00:12:37.890 --> 00:12:40.490
from
1 to 100 should be greater than or equal to
00:12:40.490 --> 00:12:43.320
51. So, from these 100 people, if they get
a
00:12:43.320 --> 00:12:49.980
feeling that, 51 or more will support the
referendum, then they can sort of advertise
00:12:49.980 --> 00:12:53.030
and
they can try to influence people and say that,
00:12:53.030 --> 00:12:58.720
the pre-vote poll says that, referendum will
pass and so on.
00:12:59.720 --> 00:13:05.410
So, standardizing this variate – sigma X
i, i varying from 1 to 100; this will be sigma
00:13:05.410 --> 00:13:09.310
X i,
i varying from 1 to 100, minus 44 – the mean of
00:13:09.310 --> 00:13:18.399
this random variable, which is np, that is 44 – divided
by under root of the variance, which is npq. So, 44 into
00:13:18.399 --> 00:13:25.350
0.56, because p is 0.44. So, q is 0.56.
Therefore, this is the variance. And so under
00:13:25.350 --> 00:13:31.160
root of that – the standard deviation.
Therefore, this probability is equal to this
00:13:31.160 --> 00:13:33.740
probability. So, this is greater than or equal
to
00:13:33.740 --> 00:13:40.899
51 minus 44 upon under root 44 into 0.56.
Now, as I have been telling you that, wherever
00:13:40.899 --> 00:13:49.250
you want to approximate a binomial
probability by standardizing the random variable
00:13:49.250 --> 00:13:55.029
and using a standard normal
probability, then you should also use the
00:13:55.029 --> 00:13:58.050
continuity correction factor, which I have
not
00:13:58.050 --> 00:14:02.250
done here. So, anyway. Therefore… So, that
would be… If you are saying greater than or equal to
00:14:02.250 --> 00:14:09.990
51, then it will be 50.5; that would be the
right figure. But, anyways, you can do that
00:14:09.990 --> 00:14:17.529
computation later on. So, right now, the whole
idea is just to see that. Therefore... So,
00:14:17.529 --> 00:14:19.820
to
get a feeling for the kind of numbers that
00:14:19.820 --> 00:14:22.750
you have that, will the referendum pass or
not.
00:14:22.750 --> 00:14:29.460
So, this is this. And therefore, under root
of this comes out to be 4.96. So, this
00:14:29.460 --> 00:14:33.899
probability; and, this is a standard normal
variate. Therefore, probability – this is
00:14:33.899 --> 00:14:36.269
equal
to; or, we are approximating this probability
00:14:36.269 --> 00:14:40.490
by probability Z greater than or equal to
7
00:14:40.490 --> 00:14:48.320
upon 4.96, which comes out to be 1.41. So,
Z greater than or equal to 1.41. So, this
00:14:48.320 --> 00:14:53.550
probability, which from the tables gives you
the number 0.079. Therefore, this is a very
00:14:53.550 --> 00:14:59.290
small probability. And hence, the chance of
the referendum passing is very slim.
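The 0.079 figure can be verified with a short Python computation. This sketch is my own addition: it reproduces the lecture's normal approximation (without continuity correction, as on the slide) and also computes the exact binomial tail for comparison.

```python
import math

def tail(z):
    # P(Z >= z) for a standard normal Z
    return 0.5 * math.erfc(z / math.sqrt(2))

n, p = 100, 0.44
mean, sd = n * p, math.sqrt(n * p * (1 - p))

# CLT approximation used in the lecture (no continuity correction)
z = (51 - mean) / sd
approx = tail(z)

# exact binomial tail P(X >= 51) for comparison (my addition)
exact = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(51, n + 1))
print(round(z, 2), round(approx, 3))
```

The standardized value comes out to 1.41 and the approximate probability to about 0.079, matching the lecture; the exact tail is slightly larger, which is the continuity-correction effect mentioned above.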
00:14:59.290 --> 00:15:05.070
Then, the town is 20,000 people and you are only
surveying 100 people. And, when you know
00:15:05.070 --> 00:15:14.220
that, the chances are… There is 44 percent
support for the upcoming referendum. So, the
00:15:14.220 --> 00:15:22.870
probability of 51 percent or more supporting
the referendum is small; and, that is reflected
00:15:22.870 --> 00:15:28.310
here. So, through the central limit theorem,
you have made this approximation to the
00:15:28.310 --> 00:15:33.800
required probability,
and it turns out to be 0.079. So, the chances
00:15:33.800 --> 00:15:39.990
of when you survey the 100 people and ask
for their opinion – whether they support
00:15:39.990 --> 00:15:44.120
the
referendum or not, it shows that, the chances
00:15:44.120 --> 00:15:47.360
are very small that the referendum will pass.
00:15:47.360 --> 00:15:54.860
So, again, I mean one can go on and on about
the applications of central limit theorem
00:15:54.860 --> 00:16:00.949
and how, in various different situations, you
can apply it. Now, I want to get back to
00:16:00.949 --> 00:16:06.790
And, another thing that we had also sort of
used… We had used the central limit theorem,
00:16:06.790 --> 00:16:13.310
but probably did not… And, I have said that,
we will prove it later on. But, I just want
00:16:13.310 --> 00:16:16.649
to
add a word of caution to it. So, here this
00:16:16.649 --> 00:16:20.750
is that, we had X equal to… X is a binomial
n
00:16:20.750 --> 00:16:26.060
comma p; and then we said that, if you want
to compute this probability – X less than
00:16:26.060 --> 00:16:30.279
or
equal to s; then you will have to compute
00:16:30.279 --> 00:16:34.561
these numbers; and, this can be quite messy;
i
00:16:34.561 --> 00:16:40.720
varying from 0 to s, of n choose i, p raise to
i, 1 minus p raise to n minus i. So, this
00:16:40.720 --> 00:16:43.570
can be
quite cumbersome to compute. But, then
00:16:43.570 --> 00:16:49.019
we said that, we can approximate it by
standardizing this thing. And so here this
00:16:49.019 --> 00:16:52.639
is X minus np divided by under root of npq
–
00:16:52.639 --> 00:16:57.019
the standard deviation. And then this is less
than or equal to s minus np upon under root npq.
00:16:57.019 --> 00:17:02.759
Now, you add 0.5. Remember I had talked about
the correction factor when the binomial
00:17:02.759 --> 00:17:09.150
is a discrete random variable and we are approximating
it by a continuous distribution.
00:17:09.150 --> 00:17:15.709
Therefore, this continuity correction factor
is also added. So, you have 0.5 and this.
00:17:15.709 --> 00:17:22.970
Therefore, this probability – this cumbersome
thing can be approximated by the normal
00:17:22.970 --> 00:17:29.660
probability, which is s minus np plus 0.5
upon under root npq. And, we look up the
00:17:29.660 --> 00:17:34.700
normal tables and we can compute this number.
Now, the thing is that, of course, when
00:17:34.700 --> 00:17:39.980
you are approximating, the question does arise
– how good an approximation it is?
00:17:39.980 --> 00:17:47.660
And, see what happens is that, for a
binomial distribution, if p is close to half,
00:17:47.660 --> 00:17:52.830
then
the binomial distribution is symmetric; in
00:17:52.830 --> 00:17:57.500
the sense that, the values keep on increasing
and decreasing in a symmetric manner. And
00:17:57.500 --> 00:18:04.860
then because the normal itself is also a symmetric
distribution about its mean; therefore, a
00:18:04.860 --> 00:18:11.650
normal distribution will give a good
approximation as long as p is close to half,
00:18:11.650 --> 00:18:17.070
because then you are approximating a
symmetric distribution – a discrete symmetric
00:18:17.070 --> 00:18:23.110
distribution by a continuous symmetric
distribution. And so… But, when the p is
00:18:23.110 --> 00:18:28.640
away from half, then the binomial will be
skewed, maybe to the right or to the left.
00:18:28.640 --> 00:18:32.840
And, in that case, it is not necessary that,
the
00:18:32.840 --> 00:18:38.200
normal distribution will give you a good approximation
of the binomial probabilities.
00:18:38.200 --> 00:18:45.559
Now, it is said often that, if np is greater
than or equal to 30 or np into 1 minus p is
00:18:45.559 --> 00:18:52.130
greater than or equal to 10, then the central
limit theorem will always give you a good
00:18:52.130 --> 00:18:58.520
approximation of the binomial probabilities,
but… And, these are empirical statements.
00:18:58.520 --> 00:19:05.600
And, in some cases, it may turn out that,
when you have np greater than or equal to
00:19:05.600 --> 00:19:08.510
30 or
np into 1 minus p greater than or equal to
00:19:08.510 --> 00:19:13.920
10, you may get good approximations, but it
cannot be said that, this will happen all
00:19:13.920 --> 00:19:16.600
the time, because certainly, symmetry plays
a
00:19:16.600 --> 00:19:26.580
role. And, for p small and n large such that
np equal to lambda is moderate; so then in
00:19:26.580 --> 00:19:33.400
that case, Poisson approximation may be a
good approximation. And, I had… When we
00:19:33.400 --> 00:19:38.730
were discussing discrete random variables,
I had shown you how the Poisson
00:19:38.730 --> 00:19:43.540
probabilities can approximate the binomial
probabilities. But, then of course, the
00:19:43.540 --> 00:19:51.100
condition was that, p is small and n is large,
and np is moderately small – a reasonable
00:19:51.100 --> 00:19:56.040
number; then Poisson may give a good approximation
for the binomial probabilities.
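The point about small p can be seen in a short comparison. This Python sketch is my own illustration with made-up numbers (n = 100, p = 0.05, so lambda = np = 5, and s = 3); it computes the exact binomial cdf, the continuity-corrected normal approximation from above, and the Poisson approximation.

```python
import math

n, p, s = 100, 0.05, 3       # illustrative numbers: small p, moderate np
lam = n * p

# exact binomial cdf P(X <= s)
exact = sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(s + 1))

# normal approximation with the 0.5 continuity correction
mu, sd = n * p, math.sqrt(n * p * (1 - p))
z = (s + 0.5 - mu) / sd
normal_cc = 0.5 * math.erfc(-z / math.sqrt(2))   # Phi(z)

# Poisson approximation with lambda = np
poisson = sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(s + 1))

print(round(exact, 4), round(normal_cc, 4), round(poisson, 4))
```

For these numbers the Poisson value lies closer to the exact binomial cdf than the normal value does, which is the behaviour the lecture describes for p away from half.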
00:19:56.040 --> 00:20:02.929
So, with this word of caution, of course,
these approximations can be used and they
00:20:02.929 --> 00:20:05.870
are
very helpful. And so I just thought that,
00:20:05.870 --> 00:20:11.720
once we have talked about the central limit
theorem, we have proved it and shown its applications.
00:20:11.720 --> 00:20:17.220
I will just revisit what we had
done earlier when we talked about approximating
00:20:17.220 --> 00:20:19.160
the binomial probabilities by
00:20:19.160 --> 00:20:25.860
standardizing the variate – normal variate
and reducing it to a standard normal variate,
00:20:25.860 --> 00:20:29.340
and then computing the probabilities.
00:20:29.340 --> 00:20:35.510
Now, here are some problems for you to try on
Chebychev’s inequality, the central limit
00:20:35.510 --> 00:20:40.090
theorem and law of large numbers – weak
law of large numbers. Now, the first problem
00:20:40.090 --> 00:20:45.960
is straightforward; it says that a random
sample of size n equal to 81 is taken from
00:20:45.960 --> 00:20:49.720
a
distribution with mu equal to 128 and standard
00:20:49.720 --> 00:20:56.750
deviation sigma equal to 6.3. With what
probability can we assert that, the value
00:20:56.750 --> 00:21:02.820
we obtain for X bar will not fall between
126.6
00:21:02.820 --> 00:21:09.290
and 129.4? Use Chebychev’s inequality. So, you
can see that, it will be… You will have
00:21:09.290 --> 00:21:11.760
the
absolute value. So, X bar minus… When you
00:21:11.760 --> 00:21:22.230
essentially… I was saying X bar would be
greater than 129.4 or less than 126.6. So,
00:21:22.230 --> 00:21:27.309
I have given it specifically, because I want
you to then convert it to the form of the…
00:21:27.309 --> 00:21:31.670
when you are saying that, it is… when you
apply Chebychev’s inequality or the central
00:21:31.670 --> 00:21:38.700
limit theorem. So, we have already –
I have discussed examples where
00:21:38.700 --> 00:21:46.100
you can compute the
probabilities given
00:21:46.100 --> 00:21:50.011
that, n is 81 and the standard deviation and
mean are given to you.
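For this first problem, the two numbers can be computed side by side. This is my own Python sketch, not part of the problem set: 126.6 and 129.4 are exactly two standard errors from the mean (the standard error is 6.3 divided by 9, which is 0.7), so Chebyshev bounds the probability by 1 over k squared while the CLT gives a much smaller value.

```python
import math

n, mu, sigma = 81, 128.0, 6.3
se = sigma / math.sqrt(n)        # standard error of X bar = 0.7
k = (129.4 - mu) / se            # 2 standard errors on each side

# Chebyshev: P(|Xbar - mu| >= k*se) <= 1/k^2
cheb_bound = 1 / k**2

# CLT: P(|Z| >= k) = 2 * P(Z >= k) for standard normal Z
clt_prob = math.erfc(k / math.sqrt(2))
print(round(cheb_bound, 4), round(clt_prob, 4))
```

Chebyshev gives the loose bound 0.25, while the CLT value is about 0.0455 – the comparison made in the remarks below.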
00:21:50.011 --> 00:21:59.720
Now, I just want to make a comment here,
which is that, as we have seen through examples
00:21:59.720 --> 00:22:07.960
in the lectures that, the number n… For
example, the probability that you get – the
00:22:07.960 --> 00:22:12.020
bound
that you get by using Chebychev’s inequality
00:22:12.020 --> 00:22:18.960
on the required probability would be a loose
bound; and, the central limit theorem will
00:22:18.960 --> 00:22:21.810
give you a tighter bound – a tighter estimate
00:22:21.810 --> 00:22:29.330
on the probability. Now, the thing is… And,
of course, you can also say that… But, one
00:22:29.330 --> 00:22:35.230
point that is important is that, the probability
when you compute it by the central limit
00:22:35.230 --> 00:22:42.179
theorem, may sometimes depend on the distribution
that you are handling; whereas,
00:22:42.179 --> 00:22:48.059
Chebychev’s is a universal inequality. And
therefore, it may give you a loose bound;
00:22:48.059 --> 00:22:50.860
but,
then the number does not change with respect
00:22:50.860 --> 00:22:57.660
to different distributions. So, Chebychev’s
is the general statement – a universal statement.
00:22:57.660 --> 00:23:02.610
And, later on when I have occasion, I
will again point out the difference between
00:23:02.610 --> 00:23:08.650
the… Even though we say that, Chebychev’s
is a looser bound, there are other advantages
00:23:08.650 --> 00:23:15.789
of using the Chebychev’s inequality.
Question 2 – that the random variables Y
00:23:15.789 --> 00:23:19.220
n have a distribution that is binomial n,
p;
00:23:19.220 --> 00:23:27.820
prove that, Y n by n converges to p in probability.
So, this is the use of weak law of large
00:23:27.820 --> 00:23:35.020
numbers. I may have already done it for you
in the lectures; but, anyway go through it
00:23:35.020 --> 00:23:44.320
and try to prove it by yourself. Then, the
third problem is: consider the sequence X n
00:23:44.320 --> 00:23:49.421
of
random variables, where p n of x – the probability
00:23:49.421 --> 00:23:56.600
of X n equal to x – is 1, if x is 4 plus 2 by
n
00:23:56.600 --> 00:24:04.641
and 0 otherwise. So, now, here the probability…
So, X n is equal to X – the probability
00:24:04.641 --> 00:24:11.100
of that is equal to 1, if X is 4 plus 2 by
n. Does it converge in distribution to some
00:24:11.100 --> 00:24:17.779
random variable X? So, that means, find out
the… You will define the cumulative
00:24:17.779 --> 00:24:25.460
distribution function. As n goes to infinity,
can you find its limiting distribution? If so, find
00:24:25.460 --> 00:24:29.100
the
distribution function of X; show that the
00:24:29.100 --> 00:24:37.780
sequence X n converges in probability to X
also. So, it should be an interesting problem, but
00:24:37.780 --> 00:24:40.690
we go by the basic definitions and then try
to
00:24:40.690 --> 00:24:42.279
solve the problem.
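By the basic definitions, the cdf of X n jumps from 0 to 1 at the single point 4 plus 2 by n. A minimal Python sketch of this (my own illustration, not a solution to hand in):

```python
# X_n takes the single value 4 + 2/n with probability 1,
# so its cdf is a step function at 4 + 2/n
def F_n(x, n):
    return 1.0 if x >= 4 + 2 / n else 0.0

# candidate limit: X is the constant 4, with cdf F(x) = 1 for x >= 4, else 0;
# F_n(x) -> F(x) at every continuity point x != 4
for n in (1, 10, 1000):
    print(n, F_n(4.5, n), F_n(3.9, n))
```

For any fixed x greater than 4, F n of x eventually equals 1, and for x less than 4 it is always 0, which is the convergence in distribution to the constant 4 that the problem asks about.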
00:24:42.279 --> 00:24:48.170
X 1 to X n are identically, independently
distributed random variables with density
00:24:48.170 --> 00:24:56.460
function f x equal to 1 by theta for 0 less than x less than
theta, and 0 otherwise; here, 0 less than
00:24:56.460 --> 00:25:06.650
theta less than infinity. Let M n be max of
X 1, X 2, …, X n. So, M n is the random variable,
00:25:06.650 --> 00:25:13.990
which is the maximum of these n sample values;
find the distribution function F n of M
00:25:13.990 --> 00:25:20.500
n. Does F n converge to some F? Yes, it
will. But, see, we will not talk much
00:25:20.500 --> 00:25:23.860
about
it because… The second part is a little
00:25:23.860 --> 00:25:26.950
difficult part, but you can certainly see
that, F n
00:25:26.950 --> 00:25:35.460
will converge to some F. So, find the distribution
function F n of M n. So, that part is
00:25:35.460 --> 00:25:40.580
okay; that you can do through the tools that
you have already learnt, because when you
00:25:40.580 --> 00:25:46.080
find out the… To find the distribution function,
you have to say probability M n less than
00:25:46.080 --> 00:25:50.510
or equal to t.
Now, since M n is the max of X 1, X 2, …, X n,
00:25:50.510 --> 00:25:53.990
this will reduce to probability that, each
X 1
00:25:53.990 --> 00:26:00.290
is less than or equal to t; X 2 less than or equal to t; …; X n is
less than or equal to t. And, since they are
00:26:00.290 --> 00:26:05.520
independent, this will reduce to probability
X 1 less than or equal to t raise to n.
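Written out for this uniform density (assuming, as above, that each X i is uniform on 0 to theta), that product formula gives F n of t equal to t by theta, the whole thing raised to n. A small Python sketch of my own, taking theta equal to 1 for concreteness:

```python
# Mn = max of n i.i.d. Uniform(0, theta): F_n(t) = P(X1 <= t)^n = (t/theta)^n
def F_n(t, n, theta=1.0):
    if t < 0:
        return 0.0
    if t > theta:
        return 1.0
    return (t / theta) ** n

# as n grows, F_n(t) -> 0 for every t < theta, while F_n(theta) stays 1:
# the limit F is the cdf of the constant random variable theta
for n in (1, 10, 100, 1000):
    print(n, F_n(0.9, n), F_n(1.0, n))
```

This is the feeling for convergence mentioned in the lecture: the maximum piles up at theta, so F n converges to the degenerate cdf at theta.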
00:26:05.520 --> 00:26:11.460
Therefore, you can sort of do it in the regular
way, and then see if you can get a feeling
00:26:11.460 --> 00:26:19.289
for convergence of F n; that is all; we will
not talk in detail about it, because this
00:26:19.289 --> 00:26:25.809
becomes a little complex. Next, you are given
that f x is 1 upon x square, with x varying
00:26:25.809 --> 00:26:29.779
from 1 to infinity, 0 elsewhere.
So, this is how you are defining this pdf;
00:26:29.779 --> 00:26:35.330
and, this is the pdf of a random variable
X.
00:26:35.330 --> 00:26:43.831
Consider a random sample of size 72 from the
distribution having this pdf. So, that
00:26:43.831 --> 00:26:48.860
means, the sample – identically independently
distributed random variables – they are
00:26:48.860 --> 00:26:52.270
72
of them; compute approximately the probability
00:26:52.270 --> 00:26:59.059
that more than 50 of the items of the
random sample are less than 3. See the thing
00:26:59.059 --> 00:27:04.470
is now with this problem… I have included
this problem because of these two steps. See
00:27:04.470 --> 00:27:11.919
first is that, you want the probability that,
more than 50 of the items of the random sample
00:27:11.919 --> 00:27:17.519
are less than 3. So, there is a probability
I will use this here.
00:27:17.519 --> 00:27:18.519
..
00:27:18.519 --> 00:27:27.650
See you are given that, f x is 1 by x square;
1 less than x less than infinity. So, you
00:27:27.650 --> 00:27:31.260
are
wanting to find probability x less than or
00:27:31.260 --> 00:27:35.710
equal to 3; this is the problem – that more
than
00:27:35.710 --> 00:27:41.149
50 of the items of a random sample are less
than 3. So, this is x less than or equal to
00:27:41.149 --> 00:27:48.400
3;
this will be the integral from 1 to 3 of 1 by x square dx – the
00:27:48.400 --> 00:27:52.610
probability that random variable, which has
this pdf. So, then the probability of x less
00:27:52.610 --> 00:27:55.149
than or equal to 3 will be given by this,
which
00:27:55.149 --> 00:28:06.160
is minus 1 by x from 1 to 3. So, this comes
out to be minus 1 by 3 plus 1, which is 2
00:28:06.160 --> 00:28:11.190
by
3. So, now, what I will do is you are selecting
00:28:11.190 --> 00:28:15.100
a sample of size 72 and we will say that,
if
00:28:15.100 --> 00:28:20.840
a sample has a value less than 3, then that
is a success. Therefore, the probability of
00:28:20.840 --> 00:28:24.860
a
success would be 2 by 3. So, now, this gets
00:28:24.860 --> 00:28:32.480
converted to a binomial situation; where we
are selecting a sample of size 72 and we say
00:28:32.480 --> 00:28:35.929
that, if a sample value is less than 3, then
it
00:28:35.929 --> 00:28:43.450
is a success. So, that means…
Now, the question is that, from a binomial
00:28:43.450 --> 00:28:49.050
72 comma p – 2 by 3, I want a sample of
the
00:28:49.050 --> 00:28:56.490
items. So, more than 50; that means, you want
that, if you are writing sigma X i; so
00:28:56.490 --> 00:29:02.529
random variable X coming from binomial 72…
Maybe I can write it here. So,
00:29:02.529 --> 00:29:11.610
essentially, what I am treating is that, X
is binomial 72 and this is this. So, I am
00:29:11.610 --> 00:29:14.330
wanting
that, probability X is greater than or equal
00:29:14.330 --> 00:29:27.429
to 50. And so when you standardize, this will
be X minus… The mean is 2 by 3 into 72,
00:29:27.429 --> 00:29:34.429
which is… This is 24. So, 48 – 48. And,
that
00:29:34.429 --> 00:29:44.890
will be 1 by 3. So, minus 9 – 16 – 4.
So, this is greater than or equal to 50 minus
00:29:44.890 --> 00:29:51.270
48, that is 2, divided by 4;
this comes out to be 0.5. So, this
00:29:51.270 --> 00:29:57.080
is the whole thing. So,
that is why I chose this example. Therefore,
00:29:57.080 --> 00:30:00.419
you have converted this to a binomial
00:30:00.419 --> 00:30:04.690
situation, and then you are computing the approximate probability that more than 50 are successes.
00:30:04.690 --> 00:30:07.600
So,
here again, I am now using the central limit
00:30:07.600 --> 00:30:15.360
theorem; I am standardizing the variate there
and then… Therefore, you are computing the
00:30:15.360 --> 00:30:20.530
approximate probability; because to
compute the actual probability, would be – you
00:30:20.530 --> 00:30:27.529
will have to sum up, for binomial 72, the probabilities of 50 and beyond – 50, 51,
00:30:27.529 --> 00:30:31.950
52, and so on up to 72. So, this is this problem.
.
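To see what the normal approximation buys here, the following sketch (mine, not the lecturer's; no continuity correction is applied) compares the exact binomial tail P(X greater than or equal to 50), X binomial(72, 2 by 3), against the CLT value 1 minus Phi of (50 minus 48) by 4:

```python
import math

n, p = 72, 2 / 3
mu = n * p                           # 48
sigma = math.sqrt(n * p * (1 - p))   # sqrt(16) = 4

# Exact tail: summing the binomial probabilities of 50, 51, ..., 72.
exact = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(50, n + 1))

def Phi(z):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# CLT approximation without continuity correction: P(Z >= (50 - 48)/4).
approx = 1.0 - Phi((50 - mu) / sigma)
print(exact, approx)
```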
00:30:31.950 --> 00:30:39.930
Now, let us go to the next problem: measurements are recorded to several decimal places. Each of these 48
00:30:39.930 --> 00:30:46.290
numbers is rounded off to the nearest integer, and the sum is approximated by the sum of these integers. So, when you say rounding
00:30:46.290 --> 00:30:54.779
off; that means, for each of the original 48 numbers, if the decimal part is below 0.5, then
00:30:54.779 --> 00:30:59.100
you drop the fractional part. And, if it
00:30:59.100 --> 00:31:02.900
is 0.6, 0.7, then you take it to the next
integer.
00:31:02.900 --> 00:31:08.520
This is what we mean when we say that, on rounding off, the sum of the numbers is approximated by the sum of
00:31:08.520 --> 00:31:15.850
these integers. If we assume that, the errors
made by rounding off are independent and
00:31:15.850 --> 00:31:22.549
have a uniform minus 1 by 2 comma 1 by 2 distribution,
compute approximately the
00:31:22.549 --> 00:31:27.309
probability that the sum of the integers is
within 2 units of the true sum.
00:31:27.309 --> 00:31:31.840
So, now, here we are assuming that, the errors
made by the rounding off are independent;
00:31:31.840 --> 00:31:39.419
surely, that you can expect because the errors
that occur are not dependent on each other.
00:31:39.419 --> 00:31:43.890
And then… Therefore, the rounding-off error you are making is between minus 0.5 and
00:31:43.890 --> 00:31:52.210
0.5; thus, I said if the number is something
like 10.4, then you will round it off to 10.
00:31:52.210 --> 00:31:55.640
If
the number is 9.7, you will round it off to
00:31:55.640 --> 00:32:01.649
10 – an integer. Therefore, you are assuming
that, the error part; that means, the actual
00:32:01.649 --> 00:32:05.340
number minus the rounding – that difference
is
00:32:05.340 --> 00:32:11.419
uniformly distributed between minus 1 by
2 and 1 by 2. The approximate probability
00:32:11.419 --> 00:32:18.880
that, the sum of the integers is within 2
units of the true sum. Therefore, what we
00:32:18.880 --> 00:32:22.230
are
doing is… So, you have 48 errors – 48
00:32:22.230 --> 00:32:26.610
numbers that you are rounding off. So, sigma…
And, each is…
00:32:26.610 --> 00:32:27.610
.
00:32:27.610 --> 00:32:42.340
Yes, I can again write here that… See epsilon
i is the error in the i-th number. So, we
00:32:42.340 --> 00:32:45.029
are
wanting that, summation epsilon… And, each
00:32:45.029 --> 00:32:48.850
epsilon i is… And, this is uniform minus
1
00:32:48.850 --> 00:32:55.809
by 2, 1 by 2; each error is uniformly distributed.
Now, you are wanting the probability
00:32:55.809 --> 00:33:04.890
that, this thing should be less than or equal
to 2; I think this is the question that, the
00:33:04.890 --> 00:33:07.420
sum
of the integers is within 2 units of the true
00:33:07.420 --> 00:33:19.279
sum; which means that, total error that occurs
should be within 2 of the original; so that
00:33:19.279 --> 00:33:22.789
means, sigma epsilon i, i varying from 1 to
48
00:33:22.789 --> 00:33:28.460
– this should be within minus 2 and 2; the
error can occur either on the… when you
00:33:28.460 --> 00:33:32.770
round down or you round up.
Therefore, this total error we are saying,
00:33:32.770 --> 00:33:35.740
what is the probability that this error is
within
00:33:35.740 --> 00:33:41.250
two of the original sum of numbers. So, I
have added up the errors. And so this sum
00:33:41.250 --> 00:33:45.649
should be greater than or equal to minus 2
and less than or equal to 2. This is what
00:33:45.649 --> 00:33:48.090
we
want to approximate – this probability.
00:33:48.090 --> 00:33:53.679
And, that again by the use of central limit
theorem, we will say because, now, epsilon
00:33:53.679 --> 00:34:00.190
i's are all uniform. Therefore, sigma epsilon
i
00:34:00.190 --> 00:34:06.510
– expectation of this i varying from 1 to
48, is because the mean is 0. So, this is
00:34:06.510 --> 00:34:09.860
0. They
are all independent; the errors we have assumed
00:34:09.860 --> 00:34:10.860
are independent.
00:34:10.860 --> 00:34:19.610
And similarly, the variance of sigma epsilon i, i varying from 1 to 48, will be the sum of the
00:34:19.610 --> 00:34:24.200
variances. And, remember – the variance of a uniform a comma b random variable is b
00:34:24.200 --> 00:34:33.160
minus a whole square divided by 12 – b minus a whole
00:34:33.160 --> 00:34:40.230
square by 12. So, the variance here is 1 by 12, and so the total variance will
00:34:40.230 --> 00:34:47.270
be 48 by 12. So, the variance will be 48 by 12, which is 4.
00:34:47.270 --> 00:34:53.020
And therefore, the standard deviation will be under root of 48 by 12 – that is, 2. So, I standardize.
00:34:53.020 --> 00:34:56.380
And, here this is what we get; and then by
the
00:34:56.380 --> 00:35:01.180
normal approximation, it gives the probability.
So, this is actually equal to probability
00:35:01.180 --> 00:35:07.000
mod Z is less than or equal to 2 divided by under root of 48 by 12.
00:35:07.000 --> 00:35:12.640
So, this is 1. And, that comes out to be 0.6826
from the normal tables.
00:35:12.640 --> 00:35:17.680
Of course, you have to do some more computations here and this will be… Therefore;
00:35:17.680 --> 00:35:23.780
that means the error can be kept within 2;
the total errors of rounding up and rounding
00:35:23.780 --> 00:35:31.770
down can be kept within 2 with probability
0.6826. So, that is a reasonably high
00:35:31.770 --> 00:35:36.300
probability. But, if you look at a loose upper
bound; that means, suppose you are
00:35:36.300 --> 00:35:41.500
rounding up all the numbers, then this will
be 0.5 into 48, which will be 24. So, that
00:35:41.500 --> 00:35:50.340
means, an upper bound on the total error that can occur – it can go up to
00:35:50.340 --> 00:35:54.200
24.
But, here the central limit theorem gives
00:35:54.200 --> 00:35:59.540
you the idea that, the probability that, the
errors
00:35:59.540 --> 00:36:06.730
will be within 2 is reasonably high.
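The 0.6826 figure is easy to sanity-check by simulation. This is my own sketch (the seed and trial count are arbitrary choices): draw 48 independent Uniform(minus 1 by 2, 1 by 2) errors, estimate the probability that their sum lies within 2, and compare with 2 Phi(1) minus 1.

```python
import math
import random

random.seed(42)

def Phi(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# CLT: the total error has mean 0 and variance 48 * (1/12) = 4, so sd = 2;
# P(|total| <= 2) = P(|Z| <= 1) = 2*Phi(1) - 1, about 0.6826.
clt = 2.0 * Phi(1.0) - 1.0

trials = 100_000
hits = sum(
    abs(sum(random.uniform(-0.5, 0.5) for _ in range(48))) <= 2.0
    for _ in range(trials)
)
mc = hits / trials
print(clt, mc)
```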
So, this is something about the problem I
00:36:06.730 --> 00:36:10.530
wanted to talk to you about. Next: X n, for n varying from 1, 2
00:36:10.530 --> 00:36:16.560
and so on, is a sequence of independent, identically distributed random variables with
00:36:16.560 --> 00:36:23.090
expected value of X n equal to mu, and variance of X n equal to sigma square. Now, if S n is the
00:36:23.090 --> 00:36:30.010
sum of the first n sample values, show that S n upon n goes to mu in probability.
00:36:30.010 --> 00:36:36.820
This is again just reiteration of the weak
law of large numbers. Then, I want you to
00:36:36.820 --> 00:36:44.520
sit
down and work out the proof by yourself. X
00:36:44.520 --> 00:36:54.640
n… Show that the mgf of Y n, as n goes to infinity, tends to e raise to t for t greater than 0; and find the limiting distribution of Y
00:36:54.640 --> 00:36:59.960
n. And, chi square… See – let me explain the notation, because we
00:36:59.960 --> 00:37:00.960
could not get it.
00:37:00.960 --> 00:37:01.960
.
00:37:01.960 --> 00:37:08.231
So, X n I am saying is chi square n. So, we
have talked about the chi square n
00:37:08.231 --> 00:37:16.040
distribution also. So, here this is the notation;
it looks like… In the print, it looks like
00:37:16.040 --> 00:37:19.241
X n
square, but it is actually chi square. So
00:37:19.241 --> 00:37:24.950
X n is chi square n and then Y n is X n upon
n,
00:37:24.950 --> 00:37:29.470
because again we wrote X n instead of chi,
because chi was not coming out nicely. So,
00:37:29.470 --> 00:37:33.280
Y
n is X n upon n and show that, moment generating
00:37:33.280 --> 00:37:38.980
function of Y n will go to e raise to t
as n goes to infinity, for t greater than
00:37:38.980 --> 00:37:43.760
0. So, it is defined for t greater than 0.
So, this you
00:37:43.760 --> 00:37:48.960
can work out. And then what is the limiting
distribution of Y n.
00:37:48.960 --> 00:37:54.550
So, once you get the limiting mgf of Y n,
then you will be able to say what is a
00:37:54.550 --> 00:37:59.160
distribution of Y n – limiting distribution
of Y n. This is the whole idea through this
00:37:59.160 --> 00:38:04.580
exercise. And then show that X n minus n; where X n is actually chi square n. So,
00:38:04.580 --> 00:38:08.650
chi
square n has mean n and variance 2 n. Therefore,
00:38:08.650 --> 00:38:15.850
now we are standardizing this. So, this
is actually the use of central limit theorem,
00:38:15.850 --> 00:38:17.730
because remember – central limit theorem
is
00:38:17.730 --> 00:38:24.940
convergence in law. So, X n minus n upon under
root 2 n for n large will converge to a
00:38:24.940 --> 00:38:29.030
standard normal variate. So, this is again
the central limit theorem.
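For the mgf exercise, a numeric sketch (my own; the exercise asks you to derive this yourself) shows where e raise to t comes from: the mgf of chi square n is (1 minus 2t) raise to minus n by 2, so the mgf of Y n equal to X n upon n is (1 minus 2t by n) raise to minus n by 2, which tends to e raise to t – the mgf of the constant 1, so Y n converges in distribution to the degenerate value 1.

```python
import math

# mgf of chi-square with n degrees of freedom at t: (1 - 2t)^(-n/2), for t < 1/2.
# Replacing t by t/n gives the mgf of Y_n = X_n / n (defined once n > 2t).
def mgf_Yn(t, n):
    return (1.0 - 2.0 * t / n) ** (-n / 2.0)

t = 0.7
for n in (10, 100, 10_000):
    print(n, mgf_Yn(t, n))
print("e^t =", math.exp(t))  # the limit as n grows
```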
00:38:29.030 --> 00:38:35.160
Next, X 1, X 2, …, X n are independent random
variables with probability X i equal to 1
00:38:35.160 --> 00:38:37.690
– p
and probability X i equal to 0 – 1 minus
00:38:37.690 --> 00:38:40.839
p for i varying from 1 to 2n; that means,
each X
00:38:40.839 --> 00:38:46.140
i’s… So, X i's are independent, identically
distributed Bernoulli random variables; p
00:38:46.140 --> 00:38:49.440
is
of course, between 0 and 1 and it is unknown.
00:38:49.440 --> 00:38:57.369
So, this is what we have to estimate. I will
get back to this thing. So, now, if you define
00:38:57.369 --> 00:39:00.920
S n as X 1 plus X 2 plus … plus X n and you fix the
00:39:00.920 --> 00:39:08.250
t, then the problem says: using Chebychev’s inequality, how large an n will guarantee
00:39:08.250 --> 00:39:14.650
that, the probability that mod of S n upon n minus p is greater than or equal to t – the
00:39:14.650 --> 00:39:20.520
probability of this event is less than or
equal to 0.01, no matter what value the unknown
00:39:20.520 --> 00:39:25.570
p
has. So, obviously, we are trying to say that,
00:39:25.570 --> 00:39:31.320
we want to find out how many sample
values we should take – X 1, X 2, X n, so
00:39:31.320 --> 00:39:34.220
that this ratio S n upon n or the average
of the
00:39:34.220 --> 00:39:41.530
sample values is different from p by… So,
greater than or… t we have fixed. So, this
00:39:41.530 --> 00:39:47.230
difference greater than t – probability
of that is less than 0.01. So, you want to
00:39:47.230 --> 00:39:49.110
use
Chebychev’s inequality.
00:39:49.110 --> 00:39:50.110
.
00:39:50.110 --> 00:39:57.500
So, here by Chebychev’s inequality, as we
said, this is S n by n minus p. So, this you
00:39:57.500 --> 00:40:04.230
want greater than t – probability of this;
and, this you want less than or equal to 0.01.
00:40:04.230 --> 00:40:15.700
Now, by Chebychev’s inequality… We need the variance of S n by n, because
00:40:15.700 --> 00:40:24.329
remember – now, S n is what? Each X i is
a Bernoulli. Therefore, this is binomial.
00:40:24.329 --> 00:40:30.460
And
so the variance of S n is npq. And
00:40:30.460 --> 00:40:36.800
dividing by n, the 1 by n gets squared – n square. So, this is pq by
00:40:36.800 --> 00:40:46.030
n. Therefore, by Chebychev’s inequality,
this probability is less than or equal to…
00:40:46.030 --> 00:40:52.660
This
is pq by n into t square. And, this you want
00:40:52.660 --> 00:40:57.950
to be less than or equal to 0.01. So, now,
what it says is: p is unknown. So, q is
00:40:57.950 --> 00:41:01.290
also unknown. And therefore, no matter what
the
00:41:01.290 --> 00:41:08.180
value of p is…
Now, since maximum of pq… We have already
00:41:08.180 --> 00:41:14.410
gone through this in the lecture also;
maximum pq is 1 by 4. So, if I take the…
00:41:14.410 --> 00:41:17.640
If I write the maximum value here since n
is in
00:41:17.640 --> 00:41:23.910
the denominator; so this will… I will get a value of n that works for every p. See what
00:41:23.910 --> 00:41:27.230
I am
saying is that, this probability is less than
00:41:27.230 --> 00:41:34.010
or equal to pq by n t square, which is at most 1 by 4 into n t
00:41:34.010 --> 00:41:42.010
square. And, this we want less than or equal
to 0.01. So, suppose I put this equal to 0.01.
00:41:42.010 --> 00:41:50.010
And, this tells me that, n should be equal
to… From here n should be equal to… If
00:41:50.010 --> 00:41:55.569
you
take n to the other side, it will be 1 divided by 0.04 into
00:41:55.569 --> 00:42:00.010
t square. And, since I have written, see…
So, this
00:42:00.010 --> 00:42:04.310
value has become 1 by 4, is the maximum value
of pq.
00:42:04.310 --> 00:42:09.349
So, now… that means, for n greater than
or equal to this, this will always be satisfied
00:42:09.349 --> 00:42:13.210
–
less than or equal to 0.01; can you see that?
00:42:13.210 --> 00:42:18.260
See here I am writing the maximum value;
this upon nt square is less than this. So,
00:42:18.260 --> 00:42:21.650
n would be greater than or equal to this.
So, I am
00:42:21.650 --> 00:42:26.730
taking it… So, if I put the maximum value
here, then obviously, I get a value of n,
00:42:26.730 --> 00:42:30.170
which
will meet this inequality, because n will
00:42:30.170 --> 00:42:33.130
be greater than… Otherwise, if I write the
actual
00:42:33.130 --> 00:42:39.140
value of pq, then what I get – the value
of n would be smaller than what I am getting
00:42:39.140 --> 00:42:45.230
here. Therefore, this will always satisfy
this inequality; this is the idea. Therefore,
00:42:45.230 --> 00:42:47.589
by
Chebychev’s inequality, this is the thing.
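A small sketch of this part (a) arithmetic (my own; exact fractions avoid floating-point fuzz): with the worst case pq equal to 1 by 4, Chebychev gives 1 by (4 n t square) less than or equal to 0.01, that is, n greater than or equal to 25 by t square.

```python
import math
from fractions import Fraction

def n_chebyshev(t, eps=Fraction(1, 100)):
    # smallest n with 1/(4*n*t^2) <= eps, using the worst case pq = 1/4
    return math.ceil(Fraction(1, 4) / (eps * t * t))

# For t = 0.01 this reproduces the 250000 quoted later in the lecture.
print(n_chebyshev(Fraction(1, 100)))
```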
00:42:47.589 --> 00:42:55.060
Now, part 2 says that, using CLT, find the
approximate n needed, so that… Now, here
00:42:55.060 --> 00:43:00.420
you see it has put the word minimum of this
probability. And, the probability here is
00:43:00.420 --> 00:43:06.010
the
complement of the event that you had in the
00:43:06.010 --> 00:43:11.710
part a. Therefore, it is the same thing, because
here the probability of less than t is greater
00:43:11.710 --> 00:43:17.930
than 0.99. So, exactly… But, the minimum
part I will explain again here, because this
00:43:17.930 --> 00:43:18.930
is now…
.
00:43:18.930 --> 00:43:27.980
By the central limit theorem… So, the minimum of this probability will be attained
00:43:27.980 --> 00:43:33.230
when I put the maximum value of this. And
therefore, the minimum… When you write
00:43:33.230 --> 00:43:41.930
this here, this will be twice Phi of root n t over the standard deviation; and, pq is at most 1 upon 4, so the standard deviation is at most 1 by 2,
00:43:41.930 --> 00:43:45.849
so that the 2 comes into the argument – twice Phi of 2 root n t. So, this minus 1. So,
00:43:45.849 --> 00:43:50.470
that satisfies this. And now, you want to
compute again. This you want to say is equal
00:43:50.470 --> 00:43:54.099
to 0.99; which means that, twice Phi of 2 root n
00:43:54.099 --> 00:44:04.400
into t is equal to 1.99. And, now, you can
continue. And, in fact, you will find out
00:44:04.400 --> 00:44:08.800
the value of 2 root n t from the tables, because I think… Maybe we will
00:44:08.800 --> 00:44:18.230
complete the problem. So, Phi here is 1 plus 0.99 divided by 2 – 0.995; and, the corresponding Z
00:44:18.230 --> 00:44:20.550
value here; I think from the tables if you
look
00:44:20.550 --> 00:44:29.220
up, it says that 2 root n t is 2.57; I think
that is the thing. And so you can compute
00:44:29.220 --> 00:44:32.859
root n
from here. And now, what it says is again…
00:44:32.859 --> 00:44:42.520
Since you have the numbers… In this case,
n comes out to be equal to 2.57 divided by
00:44:42.520 --> 00:44:50.470
2 into t whole square.
And, in the third part, it asks you to…
00:44:50.470 --> 00:44:52.980
When you fix the value of t, I think the value
of t
00:44:52.980 --> 00:45:00.150
is given – 0.01. If you do this, then it
wants you to compare. So, for example, from
00:45:00.150 --> 00:45:05.040
here
when t is this, for t equal to 0.01, n comes
00:45:05.040 --> 00:45:13.891
out to be equal to 250000. And then when you
compare it with the central limit thing, I
00:45:13.891 --> 00:45:23.510
think this comes out to be… It is computed
somewhere; I have done it here. So,
00:45:23.510 --> 00:45:29.140
n will be greater than or equal to 16500.
So,
00:45:29.140 --> 00:45:35.510
this is the idea, because the Chebychev’s
inequality gives you a loose upper bound.
00:45:35.510 --> 00:45:38.329
And
therefore, the numbers will be different.
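Putting the two bounds side by side (my own sketch; 2.57 is the table value used in the lecture, though tables also give 2.575 or 2.576 for Phi equal to 0.995): the CLT requirement 2 root n t greater than or equal to 2.57 gives n about (2.57 divided by 2t) whole square, far below the Chebychev answer.

```python
import math

z = 2.57  # z-value with Phi(z) ~ 0.995, as read off in the lecture

def n_clt(t):
    # need 2*Phi(2*sqrt(n)*t) - 1 >= 0.99, i.e. 2*sqrt(n)*t >= z
    return math.ceil((z / (2.0 * t)) ** 2)

def n_chebyshev(t, eps=0.01):
    # worst-case Chebychev bound: 1/(4*n*t^2) <= eps
    return math.ceil(1.0 / (4.0 * eps * t * t))

t = 0.01
print(n_clt(t), n_chebyshev(t))  # roughly 16500 versus 250000
```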
00:45:38.329 --> 00:45:40.839
So, this is the idea behind this thing. And
now,
00:45:40.839 --> 00:45:48.710
you can sit down and work it out yourself
to get a better feeling.
00:47:38.120 --> 00:47:39.120
.