IARE PTSP QUIZ Final1


INSTITUTE OF AERONAUTICAL ENGINEERING

(Autonomous)
Dundigal, Hyderabad - 500 043

PROBABILITY THEORY AND STOCHASTIC PROCESSES

CONTINUOUS INTERNAL ASSESSMENT (CIA)

MULTIPLE CHOICE QUESTIONS FOR QUIZ

Regulations : IARE-R16

Academic Year : 2017-2018

Semester : III

Course Name : B. Tech

Course Code : R16 – AEC003

Branch : Electronics & Communication Engineering

Course Coordinator : Ms. G. Mary Swarna Latha, Assistant Professor, Dept. of ECE

Team of Instructors : Mr. R. Mahendhar Reddy, Associate Professor;
Mr. G. Anil Kumar Reddy, Assistant Professor;
Mrs. Ajitha G., Assistant Professor.
UNIT- I

PROBABILITY and RANDOM VARIABLE:


PROBABILITY: Probability introduced through Sets and Relative Frequency: Experiments and
Sample Spaces, Discrete and Continuous Sample Spaces, Events, Probability Definitions and Axioms,
Mathematical Model of Experiments, Probability as a Relative Frequency, Joint Probability, Conditional
Probability, Total Probability, Bayes’ Theorem, and Independent Events.
RANDOM VARIABLE: Definition of a Random Variable, Conditions for a Function to be a Random
Variable, Discrete and Continuous, Mixed Random Variable.

1 The probability of any event A always satisfies 0 ≤ P(A) ≤ 1. This statement is


A. False B
B. True
2 If S is a sample space with a finite number of samples, then P(S) = ___
A. 0
B.1
C. ∞ B
D. 2

3 Probability of causes is also called ……………. Theorem


A. Multiplication
B. addition C
C. Bayes’
D. subtraction
4 Two events are said to be independent if the happening or failure of one does
not affect the happening or failure of the other
B
A. false
B. true
5 Two events, A and B are said to be mutually exclusive if they ………. together
A. cannot occur
B. occur A
C. may be
D. may not be
6 P(A U B) ______ P(A) + P(B)
A. ≤
B. ≥ A
C. =
D. +
7 If P(B) > 0 then P(A|B) =
A. [option lost in extraction]

B. P(A ∩ B)/P(B) B

C. [option lost in extraction]
D. 0
8 Let S1 and S2 be the sample spaces of two sub experiments. The combined sample
space S is given by
A. S1 X S2
A
B. S1 - S2
C. S1 _ S2
D. S1 | S2
9 [Question was an equation image that did not survive extraction.]
A. Relative frequency
B. P(H) A
C. P(A)
D. P(B)
10 Arrangement of n things in a specified manner is called……….
A. permutations
B. combinations A
C. probability
D. Relative Frequency
11 A set containing only one element is called a …… set
A. null set
B. subset C
C. singleton
D. Doubleton
13 An experiment whose outcome can be predicted with certainty is
called a ….. experiment
A. deterministic
A
B. non deterministic
C. periodic
D. aperiodic
14 If S= {1,2,3,4,5,6,7,8,9} then find P(S)=
A. 1
B. 0 A
C. -1
D. undefined
15 [Question and option equations were images that did not survive extraction; keyed answer: C.]
16 If Bm ∩ Bn = Ф for m ≠ n, m, n = 1, 2, 3, …, N and the events Bn are exhaustive, then B1 ∪ B2 ∪ … ∪ BN =
A. 0
B. 1 C
C. S
D. infinite
17 Tickets numbered 1 to 20 are mixed up and then a ticket is drawn at random. What
is the probability that the ticket drawn has a number which is a multiple of 3 or 5?
A. 1/2
D
B. 2/5
C. 8/15
D. 9/20
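A quick enumeration in Python (a sketch added here, not part of the original quiz) confirms the keyed answer:

```python
from fractions import Fraction

# Count tickets 1..20 whose number is a multiple of 3 or 5.
favorable = [n for n in range(1, 21) if n % 3 == 0 or n % 5 == 0]
p = Fraction(len(favorable), 20)
print(favorable)  # [3, 5, 6, 9, 10, 12, 15, 18, 20]
print(p)          # 9/20
```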
18 What is the probability of getting a sum of 9 from two throws of a die?
A. 1/6
B. 1/8 C
C. 1/9
D. 1/12
19 Three unbiased coins are tossed. What is the probability of getting at most two D
heads?
A. 3/4
B. 1/4
C. 3/8
D. 7/8
20 From a pack of 52 cards, two cards are drawn together at random. What is the
probability of both the cards being kings?
A. 1/15
D
B. 25/57
C. 35/256
D. 1/221
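The counting argument behind the keyed answer can be checked with a short sketch (not part of the quiz) using `math.comb`:

```python
from math import comb
from fractions import Fraction

# P(both drawn cards are kings) = C(4,2) / C(52,2)
p = Fraction(comb(4, 2), comb(52, 2))
print(p)  # 1/221
```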
21 Two dice are tossed. The probability that the total score is a prime number is:
A. 1/6
B. 5/12 B
C. 1/2
D. 7/9
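As a sanity check (a sketch, not part of the original quiz), enumerating all 36 outcomes of two dice confirms 5/12:

```python
from fractions import Fraction

# Sums 2, 3, 5, 7, 11 are the primes reachable with two dice.
primes = {2, 3, 5, 7, 11}
hits = sum(1 for a in range(1, 7) for b in range(1, 7) if a + b in primes)
p = Fraction(hits, 36)
print(hits, p)  # 15 5/12
```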
22 If two events X and Y are considered as partially overlapping events, then the rule of
addition can be written as
A. P(X or Y) = P(X) - P(Y) + P(X and Y)
B. P(X or Y) = P(X) + P(Y) * P(X - Y)
C. P(X or Y) = P(X) * P(Y) + P(X - Y) D
D. P(X or Y) = P(X) + P(Y) - P(X and Y)

23 According to the combination rule, if 'r' outcomes are selected from 'n' distinct
outcomes, then the number of combinations is calculated as
A. n! ⁄ r!(n - r)!
A
B. n! ⁄ r!(n + r)!
C. r! ⁄ n!(n - r)!
D. r! ⁄ n!(n + r)!
24 Outcomes of an experiment are classified as
A. logged events
B. exponential results D
C. results
D. events
25 Types of probabilities for independent events must includes
A. joint events
B. marginal events D
C. conditional events
D. joint, conditional and marginal events
26 Joint probability of two statistical dependent events Y and Z can be written as P(Y
and Z) =
A.P(Z + Y) * P(Y|Z)
B
B.P(Y) * P(Z|Y)
C.P(Y) * P(Z|Y) + P(Z)
D.P(Y) * P(Z|Y) - P(Z + Y)
27 In a Venn diagram used to represent probabilities, sample space of events is
represented by
A. square
B. triangle D
C. circle
D. rectangle
28 Consider an event A, non occurrence of event A is represented by
A. union of A
B. complement of A B
C. intersection of A
D. A is equal to zero
29 Method of counting outcomes in which number of outcomes are determined while
considering ordering is classified as
A. intersection combinations
D
B. union combinations
C. listed combination
D. permutations
30 Important rules in computation of experimental outcomes includes
A. multiple experiments
B. permutations D
C. combinations
D. permutations, combinations and multiple experiments
31 For two events, probability of occurrence of both events at same time or occurrence
in series is classified as
A. joint probability
A
B. dependent probability
C. series probability
D. conditional probability
32 A card is drawn from a pack of 52 cards. The probability of getting a queen of club
or a king of heart is:
A. 1/13
C
B. 2/13
C. 1/26
D. 1/52
33 In probability theory, events are denoted by
A. Greek letters
B. capital letters B
C. small letters
D. Latin letters
34 If a brown sack contains 4 white balls and 3 black balls, then the probability that one
randomly drawn ball is white is
A
A.4 ⁄ 7
B.1 ⁄7
C.4 ⁄ 4
D.4 ⁄ 3
35 Difference between sample space and subset of sample space is considered as
A. numerical complementary events
B. equal compulsory events C
C. complementary events
D. compulsory events
36 If a bag contains 16 apples, 30 oranges and 20 other fruits that are neither
oranges nor apples, then the probability of selecting an orange at random is
A.0.3
B.0.45
B

C.0.65

D.0.034
37 Method in which previously calculated probabilities are revised with new
probabilities is classified as
A. updating theorem
C
B. revised theorem
C. Baye’s theorem
D. dependency theorem
38 Probability of events must lie in limits of
A. one to two
B. two to three D
C. minus one to one
D. zero to one
39 Event such as equal chance of heads or tails while tossing coin is an example of
A. numerical events
B. equally likely events B
C. unequal events
D. non-numerical events
40 In a Venn diagram used to represent probabilities, occurred events are represented
by
A. circle
A
B. rectangle
C. square
D. triangle
41 Measure of chance of an uncertain event in form of numerical figures is classified
as
A. probability A
B. variability
C. durability
D. likelihood
42 For mutually exclusive events, formula of calculating probability as n(A) ⁄ n(S) +
n(B) ⁄ n(S) is used for
A. rule of marginal count
C
B. rule of comparison
C. rule of addition
D. rule of division
43 If a coin is tossed one time then probability of occurrence of heads is
A.1⁄2
B.1⁄1 A
C.2⁄1
D.2⁄2
44 If a luggage bag contains 40 percent dress shirts, 45 percent T-shirts and 15 percent
blue jeans, then the probability of selecting a dress shirt
in a random sample is
A.0.47 B
B.0.4
C.0.35
D.0.3
45 Probability of event A that does not occur in experiment is equal to
A.1 - P(A)
B.1 + P(A) A
C.1 × P(A)
D.2 - P(A)
46 If in an experiment A and B are two events, then occurrence of event B or event A
is represented by
A.A - B
B
B.A union B
C.A intersection B
D.A + B
47 If two events A and B are statistically independent, then their joint probability is
A.P(A)
B.P(B) C
C.P(A)P(B)
D.P(A)+P(B)
48 Conditional probability of two events Y and Z written as P(Z|Y) = P(Y and Z) ⁄
P(Y) shows that events are
A. statistically dependent events
A
B. descriptive unaffected events
C. statistically independent events
D. statistically unaffected events
49 A bag contains 4 white, 5 red and 6 blue balls. Three balls are drawn at random
from the bag. The probability that all of them are red, is:
A.1/22
C
B.3/22
C.2/91
D.2/77
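The keyed answer follows from C(5,3)/C(15,3); a short verification sketch (not part of the quiz):

```python
from math import comb
from fractions import Fraction

# Choose 3 red balls out of 5, over all ways to choose 3 of the 15 balls.
p = Fraction(comb(5, 3), comb(15, 3))
print(p)  # 2/91
```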
50 Number of favorable occurrences are divided by total number of possible
occurrences to calculate
A. probability of an event
A
B. total outcomes of an event
C.sample space of experiment
D. none of above
UNIT- II

Distribution and Density functions and OPERATION ON ONE RANDOM VARIABLE –


EXPECTATIONS: Distribution and Density functions: Distribution and Density functions and
Properties, Binomial, Poisson, Uniform, Gaussian, Exponential, Rayleigh, Conditional Distribution,
Methods of defining Conditioning Event, Conditional Density, Properties.
OPERATION ON ONE RANDOM VARIABLE – EXPECTATIONS : Introduction, Expected Value
of a Random Variable, Function of a Random Variable, Moments about the Origin, Central Moments,
Variance and Skew, Chebychev’s Inequality, Characteristic Function, Moment Generating Function,
Transformations of a Random Variable: Monotonic Transformations for a Continuous Random Variable,
Nonmonotonic Transformations of a Continuous Random Variable, Transformation of a Discrete Random
Variable.
1 X = X(s) = s²; where S = {0 < s ≤ 12}, then X = ?
A.{0< x ≤ 12}
B.{0 ≤x < 144 } C
C. {0< x ≤ 144}
D.0
2 For a random variable X, P{X = -∞} ≠0 and P{X = ∞} =0, then which statement is
true
A.X is not a Random variable
A
B.X is one-sided random variable
C. X is mixed random variable
D. X is not a mixed Random Variable
3 Fx(-∞)=
A. ∞
B. -∞ C
C. 0
D.2
4 P{x1 < X ≤ x2}
A. Fx(x2)
B. Fx(x1) C
C. Fx(x2)- Fx(x1)
D. Fx(x1)- Fx(x2)
5 fX(x) is always >=
A.∞ C
B. - ∞
C. 0
D. 2
6 Mean of uniform random variable is
A. a+b
B. a-b C
C. (a+b)/2
D ab
7 Area under fX(x) is
A ∞
B- ∞ C
C 1
D 2
8 A random variable is called Gaussian if its density function has the form
A. [option lost in extraction]
B. fX(x) = (1/√(2πσX²)) e^(−(x − aX)²/(2σX²)) B
C. [option lost in extraction]
D. 2
9 Most widely used random variable is
A.Uniform
B. Gaussian B
C .Binomial
D. Poisson
10 P( A | B) = P(A∩B) / ------
A. P(B)
B. P(A) A
C. P(A U B)
D. P(A)-P(B)
11 FX(∞) =
A.0
B. -1 D
C. - ∞
D. 1
12 [Question was an equation image that did not survive extraction.]
A.1
B. 0 B
C. ∞
D. 2
13 In an experiment there are 2 outcomes, which random variable is suitable
A. Uniform
B. Gaussian C
C. Binomial
D. Poisson
14 If S = {1,2,3,4,5,6,7,8,9}, then which one of the following is true? B
A. Continuous random variable can be definable
B. Only discrete random variable is definable
C. Both continuous and Discrete random variables are definable
D. All of the above
15 If the probability density function of a random variable is given by
[density expression lost in extraction]
find the value of k
A.3 C
B. 4
C. 3/2
D. 5
16 Is the function defined as follows a density function?

f(x) = [expression lost in extraction]
B
A. Yes
B. no
C. cant tell
D. un determined
17 A random variable X has the density function
f(x)=kx2 , 0<x<1
0, otherwise
Determine K
A. 1/2 D
B. 1/
C. 2
D. 3
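The normalization condition ∫ f(x) dx = 1 gives k/3 = 1, so k = 3. A numeric sketch (added here for illustration, not part of the quiz) confirms this with a midpoint Riemann sum:

```python
# Normalization of f(x) = k·x², 0 < x < 1: k·∫₀¹ x² dx = k/3 = 1, so k = 3.
N = 100_000
integral = sum(((i + 0.5) / N) ** 2 for i in range(N)) / N  # ≈ 1/3
k = 1 / integral
print(round(k, 3))  # 3.0
```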
18 FX(−∞ | B) =
A.0
B. ∞ A
C.100
D.1
19 ___ ≤ FX(x/B) ≤ 1
A.∞
B. 0 B
C.∞
D. 2
20 C
If P(x) =
Find P{X = 1 or 2}
A.0.2
B.0.5
C.0.3
D. 0.5
21 Mean and variance are equal for which random variable D
A. Uniform
B. Gaussian
C. Binomial
D. Poisson
22 The variance of the random variable, of getting number of heads if two coins are
tossed is
A.2
C
B.3
C.1
D. 6
23 The moment generating function of X, MX(v), is expressed as
A. E[e^v]
B. E[e^(vX)] B
C. e^v
D. E[e^(2X)]
24 The PDF fx(x) is defined as
A. integral of CDF
B. derivative of CDF B
C. equal to CDF
D. partial derivative of CDF
25 Let S1 and S2 be the sample spaces of two sub experiments. The combined sample
space S is given by
A. S1 X S2
A
B. S1 - S2
C. S1 + S2
D. S1 | S2
26 For N random variables X1,X2,X3,……,XN, the k-dimensional marginal density
function is defined as
A. k-fold integration of the k-dimensional marginal distribution function
D
B. k-fold partial derivative of the n-dimensional marginal distribution function
C. k-fold partial integration of the n-dimensional marginal distribution function
D. k-fold partial derivative of the k-dimensional marginal distribution function
27 The probability density function of the sum of a large no. of random variables
approaches
A. Rayleigh distribution
C
B. Uniform distribution
C. Gaussian distribution
D. Poisson distribution
28
X -3 -2 -1 0 1 3

P (x) 0.5 0.2 0.2 0 0.1 0.0

A
Find E [X + 3]
A .2
B .-1
C .4
D. 6
29 The distribution function of one random variable X conditioned by a second
D
random variable Y With interval {ya ≤y ≤yb} is known as
A. Moment Generation
B. Point Conditioning
C .Expectation
D .Interval Conditioning
30 Central limiting theorem is mostly applicable to statistically
A. Dependent Random variables
B .Independent Random variables B
C .All Random variables
D .Any Random variables
31 E [X + Y] =
A m x + my
B mx * my A
C we can’t find
D m x + my
32 If the X and Y random variables are independent the E [X.Y] =
A .E [X].E [Y]
B . E [X] /E [Y] A
C .NOT POSSIBLE
D .Undetermined
33 |E [g(x)] | ≤
A.E [ |g(x)| ]
B. E [g(x) ] A
C. 0
D. 1
34 E [aX + Y/b] =
A.. Not possible
B. E[X] + E [Y] C
C. a E[X] + E [Y]/b
D.0
35 What does the variance of a sinusoidal voltage signal x(t) across a resistance of 1 Ω represent?
A. Sinusoidal current signal
B. sinc signal
C
C. Power signal
D. energy signal

36 The variance σX² =
A .E [(X – E(X))2]
B . E [(X2 - E(X))] A
C . E [(E – E(X)2)]2
D .0
37 Variance of Uniform random variable is
A. (b + a)2/12
B. (b − a)/12 C
C. (b − a)2/12
D. (b − a)2
38 Mean and variance of the standard (normal) Gaussian distribution are
A.0,1
B.1,0 A
C.1,1
D.1,2
39
X -3 -2 -1 0 1 3

P (x) 0.5 0.2 0.2 0 0.1 0.0

A
Find E [2X + 3]
A. 1
B. 3
C. -8
D. 8
40
X -3 -2 -1 0 1 3

P (x) 0.5 0.2 0.2 0 0.1 0.0


B
Find E [3X]
A.6
B. -3
C. 3
D.0
41 What is the moment about the origin (mn)?
A. E(Xn)
B. E [(X – E [X])2) where E [X] is ‘0’ A
C.0
D. Only a but not b
42 Give the expression for E[X^n] =
A. ∫(−∞ to ∞) x^n fX(x) dx A
B. [option lost in extraction]
C. [option lost in extraction]
D. 0

43 Skew is a _________ order moment about _________


A.5, mean
B. 3, mean B
C. 5, origin
D.3, origin
44 _____ is a measure of asymmetry of the probability density function
A. Second order moment about origin
B. Second order moment about mean D
C. Third order moment about origin
D. Third order moment about mean
45 µ3 = E[(X − E[X])³] = ___________
A. E[X³] − 3 E[X] − E[X]
B. E[X³] − 3 E[X] − E[X]³ C
C. E[X³] − 3 E[X] σX² − E[X]³
D. 0
46 Write Chebychev’s Inequality C
A. P{ |X − E[X]| ≥ ε } ≤ 1
B. P{ |X − E[X]| ≥ ε } ≤ 0
C. P{ |X − E[X]| ≥ ε } ≤ σX²/ε²
D. P{ |X² − E[X]| ≥ ε } ≤ σX²/ε²
47 Mean and variance of binomial random variable are

A. np and np(1+p)
B. np and np(1-p) B
C. np and np
D. np(1-p) and np(1-p)
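The binomial mean np and variance np(1−p) can be verified exactly from the pmf; a sketch (not part of the quiz) with an assumed example n = 6, p = 1/3:

```python
from fractions import Fraction
from math import comb

# Exact binomial pmf for n = 6, p = 1/3, then mean and variance from definition.
n, p = 6, Fraction(1, 3)
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
mean = sum(k * pk for k, pk in enumerate(pmf))
var = sum((k - mean) ** 2 * pk for k, pk in enumerate(pmf))
print(mean, var)  # 2 4/3  → np and np(1-p)
```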

48 How can Nth order moments be calculated?


A. Using Characteristic Function
B. Using Moment generating Function C
C. Using Characteristic and Moment generating Function
D. Not Possible
49 ΦX(ω) =
A. E[e^(−jωX)]
B. E[e^(jωX)] B
C. E[e^(ωX)]
D. 0
50 The moment generating function MX(v) =
A. E[e^(vX)] A
B. E[e^(jωX)]
C. E[e^(−vX)]
D. E(X)

UNIT-III

MULTIPLE RANDOM VARIABLES : Vector Random Variables, Joint Distribution Function, Properties of Joint
Distribution, Marginal Distribution Functions, Conditional Distribution and Density – Point Conditioning,
Conditional Distribution and Density – Interval conditioning, Statistical Independence, Sum of Two Random
Variables, Sum of Several Random Variables, Central Limit Theorem, (Proof not expected). Unequal Distribution,
Equal Distributions.
OPERATIONS ON MULTIPLE RANDOM VARIABLES: Expected Value of a Function of Random Variables:
Joint Moments about the Origin, Joint Central Moments, Joint Characteristic Functions, Jointly Gaussian Random
Variables: Two Random Variables case, N Random Variable case, Properties, Transformations of Multiple Random
Variables, Linear Transformations of Gaussian Random Variables.

E [X - Y] =
A. mX + mY
B. mX * mY
1 D
C.we can’t find
D. mX - mY

If the X and Y random variables are independent then E [X2+ Y2]


A. E [X].E [Y]
2 D
B. E [X] /E [Y]
C. not possible
D. E [X2]+.E [Y2]
fxy(x,y)=xy/9; 0<x<2,0<y<3
=0; elsewhere
Then x and y are statistically
3 A. independent A
B. dependent
C.Uncorrelated
D. correlated
E [aX - Y/b] =
A. Not possible
4 B. E[X] + E [Y] C
C. aE[X] - E [Y]/b
D. . E[X] - E [Y]
f x1,x2(x1,x2)=1/16; |x1|<4,2<x2<4 are
A. Independent
B. Dependent
5 C. Orthogonal A
D. Not Orthogonal

Fxy(∞,-∞)=…………
A.1
6 B.2 C
C.0
D.3
Fxy(-∞,-∞)=…………
A.1
7 B.2 C
C.0
D.3
In an electrical circuit, both current and voltage can be ………….. random variables
A.1
8 B.2 B
C.0
D.3
If fx(x) = (k/4)x, 0 < x < 2 is the probability density function of a random variable X, then k
= …….
A.4
9 B
B.2
C.1
D.3
If Fxy(x,y)=Fx(x).Fy(y) then x and y are statistically
A.Independent
10 B.Dependent A
C. correlated
D. Uncorrelated
If fxy(x,y)=fx(x).fy(y) then x and y are statistically
A.dependent
11 B.independent B
C. correlated
D. Uncorrelated
The density function of a sum of a large number of random variables will be
A.gaussian
12 B.poisson A
C.binomial
D. 0
If events A and B refer to the sample space ‘S’,while events X≤x and Y≤y refer
to the
A.joint momentum
13 B
B.Joint sample space
C.JPDF
D. PDF
The joint density function is
A. Integral of joint CDF
14 B. Partial derivative of Joint CDF B
C. Derivative of CDF
D. Equal to joint CDF
15 [Question lost in extraction.] C
If Random variables are X and Y then Fxy(x,y) is
A. Between 0 and 1
16 B.0 A
C.1
D.2
Fxy(x,∞)=
A.FX(x)
17 B. Fx(y) A
C.0
D.1
Fxy(∞,y)=
A.Fx(x)
B. FY(y)
18 B
C.0
D.1

f xy(x,y) is
A.≤0
19 B.≥0 B
C.1
D.2
FX,Y(x,y) is a …………….function of both x and y
A. Decreasing
20 B. linear C
C. Non decreasing
D. Monotonically
If two random variables have the same MGF, then they are said to
be ………
A. differently
21 B
B .identically
C. numerically
D. integrally
fX,Y(x/y)=
A.fX,Y(x,y)/fY(y)
22 B. f(x,y)/f(x) A
C. f(x,y)
D. f(x)/f(y)
The marginal density of Y is…………..
A. f(x)=∫-∞∞ fX,Y(x,y)dy
B..f(y) )=∫-∞∞ fX,Y(x,y)dx
23 B
C. 0
D. 1

The marginal density of X is……………….


A. f(x))=∫-∞∞ fX,Y(x,y)dy
24 B. f(y)=∫-∞∞f(x) f(y/x)dy A
C. 0
D. 1
Fxy(x,-∞)=…………
A.1
25 B.2 C
C.0
D.3

Cxy= 0, then X and Y are either


A.only independent
26 B.uncorrelated or independent B
C. corelated
D. Dependent
σ²(X ± Y) =
A. σX² + σY²
27 B. σX² + σY² ± cov (X,Y) C
C. σX² + σY² ± 2 cov (X,Y)
D. 0
If X and Y random variables are independent, σ²(X ± Y) =
A. σX² + σY²
28 B. σX² + σY² ± cov (X,Y) A
C. σX² + σY² ± 2 cov (X,Y)
D. 0
Cxy =-E[X]E[Y] then X and Y are
A.Orthogonal
29 B.Normal A
C.independent
D.Dependent
Correlation coefficient of X and Y value is
A. −1 ≤ ρ ≤ 1
30 B. 0 ≤ ρ ≤ 1 A
C. −2 ≤ ρ ≤ 1
D. −3 ≤ ρ ≤ 1
If the pdf of a random variable is symmetric about the origin, then its
31 C
moment generating function is
A. asymmetric
B. general
C. symmetric
D. not general
Characteristic function is used to find moments about ………………… of a
random variable
A. origin
32 A
B. mean
C. variance
D. Skew
Moment generating function of a random variable is used to generate
moments about …………………of the random variable
A. mean
33 B
B. origin
C. infinite
D. variance
The covariance of two independent random variable is………….
A.0
34 B.1 A
C.2
D.3
∫-∞∞∫-∞∞fxy(x,y)dxdy is
A.1
35 B.0 A
C.2
D.3
If X and Y are two random variables, then the covariance of aX,bY, Where ‘a’and
‘b’ are constants is ………………
A. Cov (aX,bY) = aCov (X,Y) = aCXY
36 B. Cov (aX,bY) = abCov (X,Y) = abCXY B
C. Cov (aX,bY) = bCov (X,Y) = bCXY
D. Cov (aX,bY) = Cov (X,Y) = CXY

Two random variables X and Y with identical moment generating functions


are…….. Distributed
A. Identically
37 A
B. differently
C. randomly
D. Unequally
If X and Y are two random variables, then the covariance of X+a,Y+b, Where
‘a’and ‘b’ are constants is
A. Cov (X+a,Y+b) = Cov (X,Y) = CXY
38
B. Cov (X+a,Y+b) = aCov (X,Y) = aCXY A
C. Cov (X+a,Y+b) = bCov (X,Y) = bCXY
D. Cov (X+a,Y+b) = abCov (X,Y) = abCXY
The joint moments about the origin for two random variables, X, Y is the expected
value of the function g(X,Y) =
A. E( Xn,Yk )
39 A
B. E( Xn )
C. E( Yk )
D. E( X,Y )
E(X+Y)=E(X)+E(Y) for
A.only independent X and Y
40 B. any X and Y B
C.Orthogonal X and Y
D.uncorrelated X and Y
In an Electrical circuit,current only can be ……. Variable
A.1
41 B.2 A
C.0
D.3
The first order joint moment about the origin is________
A.m2,2
42 B.m1,1 C
C.m0,1 and m1,0
D.m2,0
The second order joint moment about the origin is________
A.m2,2
43 B.m1,1 D
C.m0,1 and m1,0
D.m2,0,m0,2 ,m1,1
The jacobian of the transformation x=u/v; y=v is
A.v
44 B.-v C
C.1/v
D.-1/v
The density of a random variable X is given by fX(x) = k for a < x < b
= 0 otherwise
If a = −1 and b = 2, then P(|X| ≤ 0.5) is
45 A. 1/3 A
B. 2/3
C. 1/2
D. 3/4
∫-∞∞E(Y/X=x).f(x).dx is equal to
A.E(X)
46 B.E(Y/X) C
C.E(Y)
D.f(y)
The jacobian of the transformation x=v; y=0.5(u-v) is
A.0.5
47 B.-0.5 B
C.1
d.2
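The Jacobians in questions 44 and 47 above can be checked numerically. A short sketch (not part of the quiz) estimates the determinant with central differences at an arbitrary test point:

```python
# Numeric Jacobian determinant of (x(u,v), y(u,v)) via central differences.
def jac_det(fx, fy, u, v, h=1e-6):
    dxu = (fx(u + h, v) - fx(u - h, v)) / (2 * h)
    dxv = (fx(u, v + h) - fx(u, v - h)) / (2 * h)
    dyu = (fy(u + h, v) - fy(u - h, v)) / (2 * h)
    dyv = (fy(u, v + h) - fy(u, v - h)) / (2 * h)
    return dxu * dyv - dxv * dyu

# Q44: x = u/v, y = v  →  J = 1/v  (0.25 at v = 4)
j44 = jac_det(lambda u, v: u / v, lambda u, v: v, 2.0, 4.0)
# Q47: x = v, y = 0.5(u - v)  →  J = -0.5
j47 = jac_det(lambda u, v: v, lambda u, v: 0.5 * (u - v), 2.0, 4.0)
print(round(j44, 4), round(j47, 4))  # 0.25 -0.5
```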
If X1, X2, …, Xn are identically distributed random variables such that Xk = 1 with prob. 1/2,
= 2 with prob. 1/3,
= −1 with prob. 1/6, then
E(X1² + X2² + … + Xn²) is
48 B
A.n
B.2n
C.n/2
D.n2
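The key step is E[Xk²] = 2, so the expectation of the sum is 2n; an exact-arithmetic sketch (not part of the quiz):

```python
from fractions import Fraction

# E[Xk²] = 1²·(1/2) + 2²·(1/3) + (−1)²·(1/6)
e_x2 = Fraction(1, 2) + 4 * Fraction(1, 3) + Fraction(1, 6)
print(e_x2)  # 2 → E[X1² + … + Xn²] = 2n
```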
X and Y are two statistically independent random variables. Then the variance of the
49 B
random variable Z = 3X − Y is
A.9.var(X)-var(Y)+6.cov(X,Y)
B.9.var(X)+var(Y)
C.3.var(X)+var(Y)
D.3.var(X)-var(Y)
X and Y are Gaussian random variables with variances σ²x and σ²y. Then the random
variables V = X + kY and W = X − kY are statistically independent for k equal to
A.σx σy
50 B
B. σx/ σy
C. σy/ σx
D. σx +σy
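Since V and W are jointly Gaussian, independence reduces to Cov(V, W) = Var(X) − k²·Var(Y) = 0. A sketch with assumed example variances (not part of the quiz):

```python
from fractions import Fraction

# Cov(V, W) = Var(X) − k²·Var(Y); zero iff k = σx/σy.
var_x, var_y = Fraction(9), Fraction(4)   # example (assumed) variances
k = Fraction(3, 2)                        # σx/σy = 3/2
cov_vw = var_x - k**2 * var_y
print(cov_vw)  # 0 → uncorrelated, hence independent for jointly Gaussian V, W
```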

UNIT-IV

STOCHASTIC PROCESSES – TEMPORAL CHARACTERISTICS: The Random Process Concept,


Classification of Processes, Deterministic and Nondeterministic Processes, Distribution and Density
Functions, concept of Stationary and Statistical Independence. First-Order Stationary Processes, Second-
Order and Wide-Sense Stationary, (N-Order) and Strict-Sense Stationary, Time Averages and Ergodicity,
Mean-Ergodic Processes, Correlation-Ergodic Processes, Autocorrelation Function and Its Properties,
Cross-Correlation Function and Its Properties, Covariance Functions, Gaussian Random Processes,
Poisson Random Process.

1 A random process X(t) has time averages equal to ensemble averages; such a
random process is called
A. Stationary
B
B. ergodic
C. cyclostationary
D. Non stationary
2 E[X+Y] = E[X] +E[Y] for
A. only independent X&Y
B .any X&Y B
C. orthogonal X&Y
D. uncorrelated X&Y
3 Two wide sense stationary processes x(t) and y(t) are jointly wide sense stationary
if
A. E[x(t) y(t)] = E[x(t) + E[y(t)]
D
B. cov[x(t) y(t)] = var[x(t)]+var[y(t)]
C. Rxy(t, t+ τ) = Rxy(τ)
D. E[x(t)] = const & E[y(t) ]= const, Rxy(t, t+ τ) = Rxy(τ)
4 A random process is a function of [ ]
A. sample space
B..time C
C.sample space and time
D. frequency
5 RXX(τ) = 4 + cos ωτ. Find the average power in the process, which has zero mean. A
A. 4
B. 3
C. 2
D. 5
6 An ergodic random process has E[X(t)]=5, then A[X(t)]=
A. 4
B. 3 D
C. 2
D. 5
7 The covariance of two independent random variables is
A. -1
B. 1 C
C. 0
D. 0.5
8 Two random variables X & Y are uncorrelated and their covariance is CXY = E[XY] −
20. What is the value of the cross correlation RXY =
A. 10
B
B. 20
C. 5
D. 2
9 A process stationary to all orders N = 1, 2, …, for Xi = X(ti) where i = 1, 2, …, N,
is called
A. Strict-sense stationary
B. Wide-sense stationary A
C. Strictly stationary
D. Independent
10 Let X(t) be a random process which is wide sense stationary; then
A. E[X(t)] = constant
B. E[X(t) · X(t + τ)] = RXX(τ) C
C. E[X(t)] = constant and E[X(t) · X(t + τ)] = RXX(τ)
D. E[X²(t)] = 0
11 A random process is defined as X(t) = cos(wot + θ), where θ is a uniform random
variable over (-π, π). The second moment of the process is
A. 0
B.0.5 B
C.0.25
D. 1
12 For the random process X(t) = A cos wt, where w is a constant and A is a uniform
R.V over (0, 1), the mean square value is
A. 1/3 B
B. (1/3) cos²(wt)
C. (1/3) cos(wt + θ)
D. 1/9
13 The random process X(t) and Y(t) are said to be independent, if fXY (x1, y1 : t1, t2)
is =
C
A. Fx(x1 : t1)
B. Fy(y1 : t2)
C. Fx(x1 : t1) fy(y1 : t2)
D. 0
14 The auto covariance of random process X(t) is CovXX(t1, t2) =
A. RXX(t1, t2) − E[X(t1)]
B. RXX(t1, t2) + E[X(t2)] C
C. RXX(t1, t2) − E[X(t1)] · E[X(t2)]
D. RXX(t1, t2) + E[X(t1)] + E[X(t2)]
15 Two R.V's X1 and X2 have variances K and 2 respectively. A random variable Y is
defined as Y = 3X2 − X1. If var(Y) = 25, then K =
A. 6 B
B. 7
C. 8
D. 9
16 Consider a random process X(t) defined as X(t) = A cos wt + B sin wt, where w is a
constant and A and B are random variables. Which of the following is a condition for
its stationarity?
A. E(A) = 0, E(B) = 0 A
B. E(AB) ≠ 0
C. E(A) ≠ 0; E(B) ≠ 0
D. A and B should be independent
17 Let X(t) and Y(t) be two jointly stationary random processes. Then RYX(−τ) =
A. Rxy(τ)
B. Ryx(τ) A
C. Rxy(-τ)
D. - Ryx(τ)
18 A stationary random process X(t) is periodic with period 2T. Its auto correlation
function is
A. Non periodic
B. Periodic with period T C
C. Periodic with period 2T
D. Periodic with period T/2
19 RXX(τ) =
A. E[X²(t)]
B. E[X] · E[Y] D
C. E[X] · E[Y²]
D. E[X(t) · X(t + τ)]
20 The auto correlation function of a stationary random process X(t) is RXX(τ) =
25 + 4/(1 + τ²). The value of the variance is
A. 4
B. 2 A
C. 12
D. 5
21 The auto correlation function of a stationary random process X(t) is RXX(τ) =
25 + 4/(1 + τ²). The value of E[X] is
A. 4
B. 2 D
C. 12
D. 5
22 Two processes X(t) & Y(t) are called (mutually) orthogonal if for every t1 and t2. A
A. RXY(t1, t2) = 0
B. RXY(t1, t2) > 0
C. RXY(t1, t2) < 0
D. RXY(t1, t2) = 1
23 The auto correlation function of a stationary random process X(t) is RXX(τ) =
25 + 4/(1 + τ²). The value of E[X²] is
A. 4
B. 2 C
C. 29
D. 5
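Questions 20, 21 and 23 all follow from standard autocorrelation properties: E[X]² = lim(τ→∞) RXX(τ), E[X²] = RXX(0), and σ² = E[X²] − E[X]². A small sketch (not part of the quiz) works them out:

```python
import math

# RXX(τ) = 25 + 4/(1 + τ²)
def rxx(tau):
    return 25 + 4 / (1 + tau**2)

mean = math.sqrt(25)      # E[X]² = lim RXX(τ) = 25, so E[X] = 5
power = rxx(0)            # E[X²] = RXX(0) = 29
var = power - mean**2     # σ² = 29 − 25 = 4
print(mean, power, var)   # 5.0 29.0 4.0
```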
24 Cross-correlation function represented as
A.RXY(t,t+ τ)
B. RYY(t,t+ τ) A
C. RXX(t,t+ τ)
D. RYX(t,t+ τ)
25 Difference of two independent Poisson processes is
A. Poisson process
B. Not a poisson process B
C. Process with mean = 0 [λ1 t ≠ λ2t]
D. Process with variance = 0 [λ1 t ≠ λ2t]
26 For a Gaussian random process (which is known to be WSS), the auto covariance and
autocorrelation functions will depend only on
A. Time differences
B. Absolute time A
C. Mean
D. Mean & variance
25 If X(t) and Y(t) are independent then RXY(τ) Is
A.1
B.0 C
C.E[X]E[Y]
D.E[X]+E[Y]
26 Let X(t) and Y(t) be two random processes with respective auto correlation
functions RXX(τ) and RYY(τ). Then |RXX(τ)| is

A. [option lost in extraction]

B. [option lost in extraction] C

C. ≤ RXX(0)

D. [option lost in extraction]

27 RXX(τ) is equals to
A. RXX(2τ)
B. RXX(-τ) B
C. RXX(τ2)
D.0
28 Time average of a quantity x(t) is defined as A[x(t)] = C
A. [option lost in extraction]
B. [option lost in extraction]
C. lim(T→∞) (1/2T) ∫(−T to T) x(t) dt
D. [option lost in extraction]
29 For an ergodic process
A. Mean is necessarily zero
B. Mean square value is infinity D
C. All time averages are zero
D. Mean square value is independent of time
30 A stationary continuous process X(t) with auto-correlation function RXX(τ) is
called autocorrelation-ergodic or ergodic in the autocorrelation if, and only if, for all
τ

[Options were equation images that did not survive extraction; keyed answer: B.]

31 The auto correlation function of a stationary random process X(t) is


RXX(τ) = 4/(1 + τ²). The value of E[X²] is
A. 4
B. 2 A
C. 29
D. 5
32 If a sample of X(t) is a R.V, then the cumulative distribution function FX(x1 : t1) is
A. P(x(t1))
B. P(x(t1) ≤ 0) C
C. P(x(t1) ≤ x1)
D. P (x(t1) ≥ x1)
33 The collection of all the sample functions is referred to as
A. ENSEMBLE
B. ASSEMBLE A
C. AVERAGE
D. SET
34 If the future value of a sample function cannot be predicted based on its past
values, the process is referred to as
A. Deterministic process
B. Non-deterministic process B
C. Independent process
D. Statistical process
35 If RXY = 0 then X and Y are
A. Independent
B. Orthogonal C
C. Independent & orthogonal
D. Statistically independent
36 If the future value of the sample function can be predicted based on its past values,
the process
is referred to as
A. Deterministic process A
B. Non-deterministic process
C. Independent process
D. Statistical process
37 The mean of a random process X(t) is the expected value of the R.V X at time t,
i.e., the mean m(t) =

A. [option lost in extraction]

B. ∫(−∞ to ∞) x fX(x; t) dx B

C. [option lost in extraction]

D. [option lost in extraction]

38 The auto covariance C(t1, t2) of a process X(t) is (assume that E[X(t)] = η(t))

A. C(t1, t2) = r(t1, t2)

B. C(t1, t2) = r(t1, t2) − η(t1)η(t2) B
C. [option lost in extraction]
D. [option lost in extraction]
39 If X(t) is ergodic, zero mean and has no periodic component then
B
A. [option lost in extraction]
B. lim(τ→∞) RXX(τ) = 0
C. Mean is 0
D. Constant mean
40 Two processes X(t) and Y(t) are called mutually orthogonal if for every t1 and t2
A. RXY(t1, t2) = 0
B. RXY(t1, t2) > 0 A
C. RXY(t1, t2) < 0
D. RXY(t1, t2) = 1
41 A stationary random process X(t) will have its ------properties not affected by a shift
in time
A.mathematical
C
B.normal
C.statistical
D. natural
42 If a process is stationary to all orders N and its mean is constant, then it is
A.Random variable
B.Strict sense stationary
B
C.wide sense stationary
D.0

43 If all the statistical properties of x(t) are not affected by a time shift, it is referred to as
A. SSS
B .WSS A
C. Non-stationary
D. Continuous
44 --------averages are computed by considering all the sample functions
A.time
B.ensemble B
C. Space
D. Cordinates
45 A random process X(t) is said to be mean ergodic or ergodic in the mean sense if its
statistical average is---------to its time average
A.equal
A
B.not equal
C.less than
D. greater than
46
X(t) is a random process with mean 3 and autocorrelation R(t1, t2) = 9 + 4e^(−0.2|t1 − t2|).
The covariance of the R.V's Z = X(5) and W = X(8) is
A. e^(−0.6) B
B. 4e^(−0.6)
C. 8e^(−0.6)
D. (1/6)e^(−0.6)
47 The auto covariance of random process X(t) is CovXX(t1, t2) =
A. RXX(t1, t2) − E[X(t1)]
C
B. RXX(t1, t2) + E[X(t2)]
C. RXX(t1, t2) − E[X(t1)] · E[X(t2)]
D. RXX(t1, t2) + E[X(t1)] + E[X(t2)]
48 A random process X(t) is called stationary if its ------ properties are unaffected by
time shift.
A. Statistical
A
B. Non statistical
C.Time
D.Frequency
49 RXY(τ) =
A. RXY(−τ)
B. RYX(τ)
C
C. RYX(−τ)
D. −RXY(τ)

50 A random process is described by X(t) = A; where A is a continuous random
variable uniformly distributed on (0,1). The mean of X(t) is
A.1/2
A
B.0
C.1
D.1/3

UNIT-V

STOCHASTIC PROCESSES – SPECTRAL CHARACTERISTICS: The Power Spectrum: Properties,


Relationship between Power Spectrum and Autocorrelation Function, The Cross-Power Density Spectrum,
Properties, Relationship between Cross-Power Spectrum and Cross-Correlation Function. Spectral Characteristics of
System Response: Power Density Spectrum of Response, Cross-Power Density Spectrums of Input and Output of a
linear system
1 PSD of WSS is always
A.Negative
B. Non-negative B
C.0
D.can be negative or positive
2 For a periodic signal _________ is used for the study of its spectral behaviour
A. Fourier series
B. Fourier transform A
C. Z-transform
D. Laplace transform
3 For an aperiodic signal _________ is used for the study of its spectral behaviour
A. Fourier series
B. Fourier transform B
C. Z-transform
D. Laplace transform
4 When two random variables are statistically independent, RXY = ______
A.E[X].E[Y]
B. E[X.Y] A
C. E[X]/E[Y]
D. 0
5 When RXY = 0, the random variables X and Y are
A. Statistically independent B
B. Uncorrelated
C. Correlated
D. Statistically dependent
6 Time average of autocorrelation function and PSD form a ----- pair
A. Fourier series
B. Fourier transform B
C. Z-transform
D. Laplace transform
7 SXY(w)=0 if X(t) and Y(t) are
A.orthogonal
B. parallel A
C. vertical
D. dependent
8 A random process n(t) is called white noise if its power spectral density
is………..over the entire frequency range
A.Non uniform
B
B.uniform
C.0
D.1
9 The curve SXX(w)/A always encloses _______ area
A.double
B.unit B
C. triple
D. 0
10 Pxx(W) is always
A.0
B.1 C
C. greater than equal to zero
D. greater than zero
11 Pxx(-W) =
A Pxx(W)
B.- Pxx(W) A
C. 2Pxx(W)
D. 3Pxx(W)
12 Pxx(-W) is
A. Real
B. imaginary A
C. Fractional
D. Complex
13 The PSD of the derivative of the random process X(t) is ---------- times the PSD of
X(t)
A .w
B
B. w2
C.0
D.1
14 Pxy(W) =
A. - Pxx(W)
B. 2 Pxx(W) C
C. Pyx*(W)
D. -Pyx*(W)
15 The average power of the periodic random process signal whose auto correlation B
function is [expression lost in extraction]
A. 0
B. 1
C. 2
D. 3
16 If [the given spectra were equation images that did not survive extraction]
and X(t) and Y(t) are of zero mean, then let U(t) = X(t) + Y(t). Then SXU(w) is
[Options were equation images; keyed answer: D.]

17 If Y(t) = X(t) − X(t − a) is a random process, X(t) is a WSS process and a > 0 is
a constant, the PSD of Y(t) in terms of the corresponding quantities of X(t) is
[Options were equation images that did not survive extraction.]
18 A random process is given by Z(t) = A. X(t) + B.Y(t) where 'A' and 'B' are real
constants and X(t)
and Y(t) are jointly WSS processes. The power spectrum SZZ(w) is
A. AB SXY(w) + AB SYX(w) C
B. A² + B² + AB SXY(w) + AB SYX(w)
C. A² SXX(w) + AB SXY(w) + AB SYX(w) + B² SYY(w)
D. 0
19 A random process is given by Z(t) = A. X(t) + B.Y(t) where 'A' and 'B' are real
constants and X(t)
and Y(t) are jointly WSS processes. If X(t) and Y(t) are uncorrelated then SZZ (w)
B
A. A² + B²
B. A² SXX(w) + B² SYY(w)
C. AB SXY(w) + AB SYX(w)
D. 0
20 PSD is a(n) ______ function of frequency
A. Even
B. Odd
C. Periodic
D. Asymmetric
A
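A quick numerical check of question 20, not part of the original quiz: the PSD of a real-valued process satisfies S(-w) = S(w). In a length-N periodogram, bin N-k plays the role of frequency -w_k, so evenness reads S[k] == S[N-k]. Record length and seed are arbitrary.

```python
import numpy as np

# Sketch: the PSD of a real-valued signal is an even function of frequency.
rng = np.random.default_rng(2)
x = rng.normal(size=512)                 # a real-valued sample path

S = np.abs(np.fft.fft(x)) ** 2 / len(x)  # periodogram PSD estimate
# Evenness S(-w) = S(w) means S[k] == S[N-k] for k = 1..N-1,
# i.e. S[1:] reads the same forwards and backwards.
print(np.allclose(S[1:], S[1:][::-1]))   # True
```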
21 For a WSS process, the PSD at zero frequency gives
A. The area under the graph of the power spectral density
B. The area under the graph of the autocorrelation of the process
C. The mean of the process
D. The variance of the process
B
22 The mean square value of a WSS process equals
A. The area under the graph of the PSD
B. The area under the graph of the autocorrelation of the process
C. Zero
D. The mean of the process
A
23 X(t) = A cos(w0t + θ), where A and w0 are constants and θ is a random variable uniformly distributed over (0, π). The average power of X(t) is [options missing in source]
B
24 A WSS process X(t) has an autocorrelation function [equation missing in source]. The PSD is [options missing in source]
A
25 The real and imaginary parts of SYX(w) are ______ and ______ functions of w, respectively
A. Odd, odd
B. Odd, even
C. Even, odd
D. Even, even
C
26 If the cross-correlation is [equation missing in source], the cross power spectrum SXY(w) = [options missing in source]
D
27 If X(t) and Y(t) are uncorrelated and have constant means E(X) and E(Y), respectively, then SXY(w) is
A. E(X) E(Y)
B. 2πE(X) E(Y) δ(w)
C. 2E(X) E(Y) δ(w)
D. 0
B
28 The means of two independent WSS processes X(t) and Y(t) are 2 and 3, respectively. Their cross-spectral density is
A. 6δ(w)
B. 12πδ(w)
C. 5πδ(w)
D. δ(w)
B
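The result in questions 27 and 28 follows from taking the Fourier transform of the constant cross-correlation R_XY(τ) = E[X]E[Y], using F{c} = 2πc δ(ω). With means 2 and 3 this gives:

```latex
S_{XY}(\omega) = \int_{-\infty}^{\infty} R_{XY}(\tau)\, e^{-j\omega\tau}\, d\tau
= E[X]\,E[Y] \int_{-\infty}^{\infty} e^{-j\omega\tau}\, d\tau
= 2\pi\, E[X]\, E[Y]\, \delta(\omega)
= 2\pi \cdot 2 \cdot 3 \cdot \delta(\omega) = 12\pi\, \delta(\omega).
```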
29 The time average of the cross-correlation function and the cross-spectral density function form a ______ pair
A. Laplace transform
B. Z-transform
C. Fourier transform
D. Convolution
C
30 SYX(w) =
A. SXY(w)
B. SXY(-w)
C. SYX(-w)
D. -SYX(w)
B
31 If X(t) and Y(t) are orthogonal, then
A. SXY(w) = 0
B. SXY(w) = 1
C. SXY(w) > 1
D. SXY(w) < 1
A
32 The cross power formula PXY is given by [options missing in source]
D
33 The average power PXY is [options missing in source]
C
34 A random process n(t) has a PSD G(f) = η/2 for -∞ ≤ f ≤ ∞. The random process is passed through a low-pass filter which has a transfer function H(f) = 2 for -fm ≤ f ≤ fm and H(f) = 0 otherwise. Find the PSD of the waveform at the output of the filter.
A. 2η
B. η/3
C. 4η
D. η/4
A
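A numerical sketch of question 34 (not part of the original quiz): a linear filter maps an input PSD through S_out(f) = |H(f)|² S_in(f), so white noise with S_in = η/2 through an ideal low-pass of gain 2 comes out with PSD 4 · η/2 = 2η in the band and 0 outside. The frequency grid, η = 1, and cutoff fm = 4 are arbitrary values chosen for the sketch.

```python
import numpy as np

# Sketch: S_out(f) = |H(f)|^2 * S_in(f) for a linear filter.
eta = 1.0                                 # assumed noise level for illustration
f = np.linspace(-10.0, 10.0, 2001)        # frequency axis (arbitrary units)
f_m = 4.0                                 # assumed cutoff for illustration

S_in = np.full_like(f, eta / 2.0)         # white-noise PSD, flat everywhere
H = np.where(np.abs(f) <= f_m, 2.0, 0.0)  # ideal low-pass, gain 2 in band
S_out = np.abs(H) ** 2 * S_in

print(S_out[np.abs(f) <= f_m].max())      # 2.0, i.e. 2*eta in the passband
print(S_out[np.abs(f) > f_m].max())       # 0.0 outside the passband
```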
35 If [equation missing in source], then RXY(τ) is [options missing in source]
D
36 Consider a random process x(t) = cos(ωt + θ), where ω is a real constant and θ is a uniform random variable in (0, π/2). Find the average power in the process.
A. 2/7
B. 1/4
C. 1/2
D. 1/6
C
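A quick numerical check of question 36, not part of the original quiz: the time-average power of cos(ωt + θ) is 1/2 regardless of the phase, since cos² averages to 1/2 over many periods. The values of ω, θ, and the averaging window below are arbitrary choices for the sketch.

```python
import numpy as np

# Sketch: time-average power of x(t) = cos(w*t + theta) is 1/2.
w = 3.0                                   # assumed frequency for illustration
theta = np.pi / 6                         # any fixed phase gives the same result
t = np.linspace(0.0, 1000.0, 1_000_000)   # long averaging window

x = np.cos(w * t + theta)
power = np.mean(x ** 2)                   # time average of x(t)^2
print(round(power, 2))                    # 0.5
```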
37 Consider a linear system [diagram missing in source] with input x(t) and output y(t). The autocorrelation of x(t) is RXX(τ) = 3δ(τ). Find the PSD, the autocorrelation function, and the mean square value of the output y(t).
A. 1/2
B. 1/3
C. 2/4
D. 1/4
38 If [equation missing in source], then RXY(τ) = [options missing in source]
B
39 A random process is given by Z(t) = A.X(t) + B.Y(t), where A and B are real constants and X(t) and Y(t) are jointly WSS processes. The cross power spectrum SXZ(w) is
A. A SYX(w) + B SYY(w)
B. A SXX(w) + B SYY(w)
C. A SXX(w) + B SXY(w)
D. A SYY(w) + B SXY(w)
C
40 For the above problem, SYZ(w) is
A. A SYX(w) + B SYY(w)
B. A SXX(w) + B SYY(w)
C. A SXX(w) + B SXY(w)
D. A SYY(w) + B SXX(w)
A
41 The autocorrelation function of a WSS random process X(t) is [equation missing in source]. The area enclosed by the PSD curve of X(t) is
A. 6
B. 2
C. 8
D. 4
A
42 The time average of the autocorrelation function and the power spectral density form a pair of
A. Z-transform
B. Laplace transform
C. Fourier transform
D. Convolution
C
43 If X(t) is an ergodic process with an autocorrelation function of the form [equation missing in source], the spectral density SXX(w) is [options missing in source]
D
44 If SXX(w) is [equation missing in source] for -2πB ≤ w ≤ 2πB, then RXX(τ) = [options missing in source]
A
45 The autocorrelation function of a process with PSD [equation missing in source] is [options missing in source]
C
46 If Z(t) = X(t) + Y(t), where X(t) and Y(t) are two uncorrelated WSS processes of zero mean, then RZZ(τ) =
A. RXY(τ) + RYX(τ)
B. RXX(τ) + RYY(τ)
C. RXY(τ) - RYX(τ)
D. RXX(τ) + RYY(τ)
B
47 For the above problem, SZZ(w) is
A. SXX(w) - SYY(w)
B. SXY(w) - SYX(w)
C. SXX(w) + SYY(w)
D. SXY(w) - SYX(w)
C
48 The mean square value of a ______ process equals the area under the graph of the power spectral density
A. Wide-sense stationary (WSS) process
B. Strict-sense stationary process
C. Random process
D. Stationary process
A
49 The average power of a process is given by
A. E[X(t)]
B. RXX(0)
C. {E[X(t)]}²
D. RXX(1)
B
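A numerical sketch of question 49 (not part of the original quiz): the average power of a WSS process equals its autocorrelation at zero lag, P = E[X(t)²] = RXX(0). The distribution and seed below are arbitrary choices for the sketch.

```python
import numpy as np

# Sketch: average power P = E[X^2] equals R_XX(0).
rng = np.random.default_rng(3)
x = rng.normal(1.0, 2.0, 100_000)   # stationary samples, mean 1, variance 4

# R_XX(0) estimated as the zero-lag sample autocorrelation
r_0 = np.correlate(x, x, mode='valid')[0] / len(x)
power = np.mean(x ** 2)             # sample estimate of E[X^2]

print(np.isclose(r_0, power))       # True: same quantity
print(abs(r_0 - 5.0) < 0.15)        # True: E[X^2] = var + mean^2 = 5
```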
50 If Z(t) = X(t) + Y(t), where X(t) and Y(t) are two uncorrelated WSS processes of zero mean, then RZZ(τ) =
A. RXY(τ) + RYX(τ)
B. RXX(τ) + RYY(τ)
C. RXY(τ) - RYX(τ)
D. RXX(τ) + RYY(τ)
B