IARE PTSP QUIZ Final1
INSTITUTE OF AERONAUTICAL ENGINEERING
(Autonomous)
Dundigal, Hyderabad - 500 043
Regulations : IARE-R16
Semester : III
8 Let S1 and S2 be the sample spaces of two sub experiments. The combined sample
space S is given by
A. S1 X S2
A
B. S1 - S2
C. S1 + S2
D. S1 | S2
9
A. Relative frequency
B. P(H) A
C. P(A)
D. P(B)
10 Arrangement of n things in a specified manner is called……….
A. permutations
B. combinations A
C. probability
D. Relative Frequency
28 A set containing only one element is called a …… set
A. null set
B. subset C
C. singleton
D. Doubleton
13 An experiment whose outcome can be predicted with certainty is called a ….. experiment
A. deterministic
A
B. non deterministic
C. periodic
D. aperiodic
14 If S= {1,2,3,4,5,6,7,8,9} then find P(S)=
A. 1
B. 0 A
C. -1
D. undefined
15
=
A. –
C
B. U
C.
D. 0
16 If Am ∩ An ≠ Ф, m ≠ n = 1, 2, 3, …, N then
A. 0
B. 1 C
C. S
D. infinite
17 Tickets numbered 1 to 20 are mixed up and then a ticket is drawn at random. What
is the probability that the ticket drawn has a number which is a multiple of 3 or 5?
A. 1/2
D
B. 2/5
C. 8/15
D. 9/20
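A quick brute-force check of question 17 (illustrative only, not part of the original key) can be done by enumerating the tickets; the Python sketch below counts the favourable outcomes directly.

```python
# Count tickets in 1..20 that are multiples of 3 or 5 (question 17).
from fractions import Fraction

tickets = range(1, 21)
favourable = [n for n in tickets if n % 3 == 0 or n % 5 == 0]
print(favourable)                     # [3, 5, 6, 9, 10, 12, 15, 18, 20]
print(Fraction(len(favourable), 20))  # 9/20, matching option D
```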
18 What is the probability of getting a sum of 9 from two throws of a die?
A. 1/6
B. 1/8 C
C. 1/9
D. 1/12
19 Three unbiased coins are tossed. What is the probability of getting at most two heads? D
A. 3/4
B. 1/4
C. 3/8
D. 7/8
20 From a pack of 52 cards, two cards are drawn together at random. What is the
probability of both the cards being kings?
A. 1/15
D
B. 25/57
C. 35/256
D. 1/221
21 Two dice are tossed. The probability that the total score is a prime number is:
A. 1/6
B. 5/12 B
C. 1/2
D. 7/9
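The dice and card questions above (18, 20 and 21) can be verified the same way; the sketch below enumerates the 36 two-dice outcomes and uses binomial coefficients for the card draw. It is an illustrative check, not part of the question bank.

```python
from fractions import Fraction
from itertools import product
from math import comb

# Questions 18 and 21: enumerate the 36 equally likely outcomes of two dice.
outcomes = list(product(range(1, 7), repeat=2))
sum9   = sum(1 for a, b in outcomes if a + b == 9)
primes = sum(1 for a, b in outcomes if a + b in {2, 3, 5, 7, 11})
print(Fraction(sum9, 36))     # 1/9   -> option C of question 18
print(Fraction(primes, 36))   # 5/12  -> option B of question 21

# Question 20: both cards drawn together are kings.
print(Fraction(comb(4, 2), comb(52, 2)))   # 1/221 -> option D of question 20
```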
22 If two events X and Y are considered as partially overlapping events then rule of
addition can be written as
A.P(X or Y) = P(X) - P(Y) + P(X and Y)
B. P(X or Y) = P(X) + P(Y) * P(X - Y)
C..P(X or Y) = P(X) * P(Y) + P(X - Y) D
D.P(X or Y) = P(X) + P(Y) - P(X and Y)
23 According to combination rule, if total number of outcomes are 'r' and distinct
outcome collection is 'n' then combinations are calculated as
A. n! ⁄ r!(n - r)!
A
B. n! ⁄ r!(n + r)!
C. r! ⁄ n!(n - r)!
D. r! ⁄ n!(n + r)!
24 Outcomes of an experiment are classified as
A. logged events
B. exponential results D
C. results
D. events
25 Types of probabilities for independent events must include
A. joint events
B. marginal events D
C. conditional events
D. joint, conditional and marginal events
26 Joint probability of two statistically dependent events Y and Z can be written as P(Y and Z) =
A.P(Z + Y) * P(Y|Z)
B
B.P(Y) * P(Z|Y)
C.P(Y) * P(Z|Y) + P(Z)
D.P(Y) * P(Z|Y) - P(Z + Y)
27 In a Venn diagram used to represent probabilities, sample space of events is
represented by
A. square
B. triangle D
C. circle
D. rectangle
28 Consider an event A, non occurrence of event A is represented by
A. union of A
B. complement of A B
C. intersection of A
D. A is equal to zero
29 Method of counting outcomes in which number of outcomes are determined while
considering ordering is classified as
A. intersection combinations
D
B. union combinations
C. listed combination
D. permutations
30 Important rules in the computation of experimental outcomes include
A. multiple experiments
B. permutations D
C. combinations
D. permutations, combinations and multiple experiments
31 For two events, probability of occurrence of both events at same time or occurrence
in series is classified as
A. joint probability
A
B. dependent probability
C. series probability
D. conditional probability
32 A card is drawn from a pack of 52 cards. The probability of getting a queen of club
or a king of heart is:
A. 1/13
C
B. 2/13
C. 1/26
D. 1/52
33 In probability theory, events are denoted by
A. Greek letters
B. capital letters B
C. small letters
D. Latin letters
34 If a brown sack consists of 4 white balls and 3 black balls then probability of one
randomly drawn ball will be white is
A
A.4 ⁄ 7
B.1 ⁄7
C.4 ⁄ 4
D.4 ⁄ 3
35 Difference between sample space and subset of sample space is considered as
A. numerical complementary events
B. equal compulsory events C
C. complementary events
D. compulsory events
36 If a bag contains 16 apples, 30 oranges and 20 other fruits that are neither oranges nor apples, then the probability of selecting an orange randomly is
A.0.3
B.0.45
B
C.0.65
D.0.034
37 Method in which previously calculated probabilities are revised with new
probabilities is classified as
A. updating theorem
C
B. revised theorem
C. Bayes' theorem
D. dependency theorem
38 Probability of events must lie in limits of
A. one to two
B. two to three D
C. one to two
D. zero to one
39 Event such as equal chance of heads or tails while tossing coin is an example of
A. numerical events
B. equally likely events B
C. unequal events
D. non-numerical events
40 In a Venn diagram used to represent probabilities, occurred events are represented
by
A. circle
A
B. rectangle
C. square
D. triangle
41 Measure of chance of an uncertain event in form of numerical figures is classified
as
A. probability A
B. variability
C. durability
D. likelihood
42 For mutually exclusive events, formula of calculating probability as n(A) ⁄ n(S) +
n(B) ⁄ n(S) is used for
A. rule of marginal count
C
B. rule of comparison
C. rule of addition
D. rule of division
43 If a coin is tossed one time then probability of occurrence of heads is
A.1⁄2
B.1⁄1 A
C.2⁄1
D.2⁄2
44 If a luggage bag contains two types of shirts, 40 percent are dress shirts, 45 percent
are T-shirts and 15 percent are blue jeans then probability of selecting a dress shirt
in random sample is
A.0.47 B
B.0.4
C.0.35
D.0.3
45 The probability that event A does not occur in an experiment is equal to
A.1 - P(A)
B.1 + P(A) A
C.1 × P(A)
D.2 - P(A)
46 If in an experiment A and B are two events, then occurrence of event B or event A
is represented by
A.A - B
B
B.A union B
C.A intersection B
D.A + B
47 If two events A and B are statistically independent, then their joint probability is
A.P(A)
B.P(B) C
C.P(A)P(B)
D.P(A)+P(B)
48 Conditional probability of two events Y and Z written as P(Z|Y) = P(Y and Z) ⁄
P(Y) shows that events are
A. statistically dependent events
A
B. descriptive unaffected events
C. statistically independent events
D. statistically unaffected events
49 A bag contains 4 white, 5 red and 6 blue balls. Three balls are drawn at random
from the bag. The probability that all of them are red, is:
A.1/22
C
B.3/22
C.2/91
D.2/77
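An illustrative combinatorial check of question 49, using binomial coefficients:

```python
# Probability that all three balls drawn from 4 white, 5 red, 6 blue are red.
from fractions import Fraction
from math import comb

favourable = comb(5, 3)      # choose 3 of the 5 red balls
total      = comb(15, 3)     # choose any 3 of the 15 balls
print(Fraction(favourable, total))   # 2/91 -> option C
```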
50 Number of favorable occurrences are divided by total number of possible
occurrences to calculate
A. probability of an event
A
B. total outcomes of an event
C.sample space of experiment
D. none of above
UNIT-II
9 Most widely used random variable is
A.Uniform
B. Gaussian B
C .Binomial
D. Poisson
10 P( A | B) = P(A∩B) / ------
A. P(B)
B. P(A) A
C. P(A U B)
D. P(A)-P(B)
11
A.0
B. -1 D
C. - ∞
D. 1
12
A.1
B. 0 B
C. ∞
D. 2
13 In an experiment there are 2 outcomes, which random variable is suitable
A. Uniform
B. Gaussian C
C. Binomial
D. Poisson
14 If S= {1,2,3,4,5,6,7,8,9} then find which one below is true B
A. Continuous random variable can be definable
B. Only discrete random variable is definable
C. Both continuous and Discrete random variables are definable
D. All of the above
15 If the probability density function of a random variable is given by
f(x) =
B
A. Yes
B. no
C. can't tell
D. undetermined
17 A random variable X has the density function
f(x) = kx², 0 < x < 1
     = 0, otherwise.
Determine k.
A. ½ D
B. 1/
C. 2
D. 3
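For question 17, the density must integrate to one, so k·∫0¹ x² dx = k/3 = 1 and k = 3. The sketch below is a simple midpoint-rule check of that normalisation (illustrative only):

```python
# Check that f(x) = 3x^2 on (0, 1) integrates to 1, confirming k = 3 (option D).
N = 100_000
dx = 1.0 / N
area = sum(3 * ((i + 0.5) * dx) ** 2 for i in range(N)) * dx
print(round(area, 6))   # ~1.0
```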
18 FX(−∞ | B) =
A.0
B. ∞ A
C.100
D.1
19 ___ ≤ FX(x/B) ≤ 1
A.∞
B. 0 B
C.∞
D. 2
20 C
If P(x) =
Find P{X = 1 or 2}
A.0.2
B.0.5
C.0.3
D. 0.5
21 Mean and variance are equal for which random variable D
A. Uniform
B. Gaussian
C. Binomial
D. Poisson
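For question 21, a Poisson(λ) random variable has mean and variance both equal to λ. The sketch below checks this numerically from the pmf for an arbitrary illustrative rate λ = 2.5:

```python
# Mean and variance of a Poisson random variable computed from its pmf.
from math import exp, factorial

lam = 2.5                                   # illustrative rate (assumed)
pmf = [exp(-lam) * lam**k / factorial(k) for k in range(60)]
mean = sum(k * p for k, p in enumerate(pmf))
var  = sum((k - mean) ** 2 * p for k, p in enumerate(pmf))
print(round(mean, 6), round(var, 6))        # both ~2.5 -> mean equals variance (option D)
```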
22 The variance of the random variable, of getting number of heads if two coins are
tossed is
A.2
C
B.3
C.1
D. 6
23 The moment generating function of X, MX(v), is expressed as
A. E[e^v]
B. E[e^vX] B
C. e^v
D. E[e^2x]
24 The PDF fx(x) is defined as
A. integral of CDF
B. derivative of CDF B
C. equal to CDF
D. partial derivative of CDF
25 Let S1 and S2 be the sample spaces of two sub experiments. The combined sample
space S is given by
A. S1 X S2
A
B. S1 - S2
C. S1 + S2
D. S1 | S2
26 For N random variables X1,X2,X3,……,XN, the k-dimensional marginal density
function is defined as
A. k-fold integration of the k-dimensional marginal distribution function
D
B. k-fold partial derivative of the n-dimensional marginal distribution function
C. k-fold partial integration of the n-dimensional marginal distribution function
D. k-fold partial derivative of the k-dimensional marginal distribution function
27 The probability density function of the sum of a large no. of random variables
approaches
A. Rayleigh distribution
C
B. Uniform distribution
C. Gaussian distribution
D. Poisson distribution
28 A discrete random variable X takes the values −3, −2, −1, 0, 1, 3 with given probabilities (table not reproduced). Find E[X + 3]. A
A .2
B .-1
C .4
D. 6
29 The distribution function of one random variable X conditioned by a second random variable Y with the interval {ya ≤ y ≤ yb} is known as D
A. Moment Generation
B. Point Conditioning
C .Expectation
D .Interval Conditioning
30 The central limit theorem is mostly applicable to statistically
A. Dependent Random variables
B .Independent Random variables B
C .All Random variables
D .Any Random variables
31 E[X + Y] =
A. mX + mY
B. mX · mY A
C. we can't find
D. mX + mY
32 If the random variables X and Y are independent, then E[X·Y] =
A. E[X]·E[Y]
B. E[X] / E[Y] A
C. Not possible
D. Undetermined
33 |E [g(x)] | ≤
A.E [ |g(x)| ]
B. E [g(x) ] A
C. 0
D. 1
34 E [aX + Y/b] =
A.. Not possible
B. E[X] + E [Y] C
C. a E[X] + E [Y]/b
D.0
35 What does the variance of a sinusoidal voltage signal x(t) across a 1 Ω resistance represent?
A. Sinusoidal current signal
B. sinc signal
C
C. Power signal
D. energy signal
36 The variance σX² of a random variable X equals
A. E[(X − E(X))²]
B. E[(X² − E(X))] A
C. E[(E − E(X)²)]²
D. 0
37 Variance of Uniform random variable is
A. (b + a)2/12
B. (b − a)/12 C
C. (b − a)2/12
D. (b − a)2
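A Monte Carlo sketch of question 37, comparing the sample variance of a Uniform(a, b) draw with (b − a)²/12 for arbitrary illustrative limits a = 2, b = 8:

```python
# Sample variance of a uniform distribution versus the closed form (b - a)^2 / 12.
import random
from statistics import pvariance

a, b = 2.0, 8.0                          # illustrative limits (assumed)
samples = [random.uniform(a, b) for _ in range(200_000)]
print(round(pvariance(samples), 3))      # ~3.0
print((b - a) ** 2 / 12)                 # 3.0 -> option C
```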
38 Mean and variance of the standard Gaussian (normal) distribution are
A.0,1
B.1,0 A
C.1,1
D.1,2
39 A discrete random variable X takes the values −3, −2, −1, 0, 1, 3 with given probabilities (table not reproduced). Find E[2X + 3]. A
Find E [2X + 3]
A. 1
B. 3
C. -8
D. 8
40 The mean and variance of a binomial random variable are
A. np and np(1+p)
B. np and np(1-p) B
C. np and np
D. np(1-p) and np(1-p)
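For question 40, the mean np and variance np(1 − p) of a binomial random variable can be recovered directly from its pmf; the sketch below uses arbitrary illustrative parameters n = 10, p = 0.3:

```python
# Mean and variance of Binomial(n, p) computed from the pmf and compared with np, np(1-p).
from math import comb

n, p = 10, 0.3                               # illustrative parameters (assumed)
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
mean = sum(k * q for k, q in enumerate(pmf))
var  = sum((k - mean) ** 2 * q for k, q in enumerate(pmf))
print(round(mean, 6), round(var, 6))         # 3.0 and 2.1
print(n * p, n * p * (1 - p))                # np = 3.0, np(1-p) = 2.1 -> option B
```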
UNIT-III
MULTIPLE RANDOM VARIABLES : Vector Random Variables, Joint Distribution Function, Properties of Joint
Distribution, Marginal Distribution Functions, Conditional Distribution and Density – Point Conditioning,
Conditional Distribution and Density – Interval conditioning, Statistical Independence, Sum of Two Random
Variables, Sum of Several Random Variables, Central Limit Theorem, (Proof not expected). Unequal Distribution,
Equal Distributions.
OPERATIONS ON MULTIPLE RANDOM VARIABLES: Expected Value of a Function of Random Variables:
Joint Moments about the Origin, Joint Central Moments, Joint Characteristic Functions, Jointly Gaussian Random
Variables: Two Random Variables case, N Random Variable case, Properties, Transformations of Multiple Random
Variables, Linear Transformations of Gaussian Random Variables.
1 E[X − Y] =
A. mX + mY
B. mX · mY D
C. we can't find
D. mX − mY
6 FXY(∞, −∞) =
A. 1
B. 2 C
C. 0
D. 3
7 FXY(−∞, −∞) =
A. 1
B. 2 C
C. 0
D. 3
8 In an electrical circuit, both current and voltage can be ………….. variables
A. 1
B. 2 B
C. 0
D. 3
9 If fX(x) = (k/4)x, 0 < x < 2 is the probability density function of a random variable X, then k = …….
A. 4
B. 2 B
C. 1
D. 3
10 If FXY(x, y) = FX(x)·FY(y) then X and Y are statistically
A. independent
B. dependent A
C. correlated
D. uncorrelated
11 If fXY(x, y) = fX(x)·fY(y) then X and Y are statistically
A. dependent
B. independent B
C. correlated
D. uncorrelated
12 The density function of a sum of a large number of random variables will be
A. Gaussian
B. Poisson A
C. binomial
D. 0
13 If events A and B refer to the sample space 'S', then the events X ≤ x and Y ≤ y refer to the
A. joint momentum
B. joint sample space B
C. JPDF
D. PDF
14 The joint density function is
A. integral of the joint CDF
B. partial derivative of the joint CDF C
C. derivative of the CDF
D. equal to the joint CDF
15 C
16 If X and Y are random variables, then FXY(x, y) is
A. between 0 and 1
B. 0 A
C. 1
D. 2
17 FXY(x, ∞) =
A. FX(x)
B. FX(y) A
C. 0
D. 1
18 FXY(∞, y) =
A. FX(x)
B. FY(y) B
C. 0
D. 1
19 fXY(x, y) is
A. ≤ 0
B. ≥ 0 B
C. 1
D. 2
20 FX,Y(x, y) is a ……………. function of both x and y
A. decreasing
B. linear C
C. non-decreasing
D. monotonically
21 If two random variables have the same MGF, then they are said to be ………
A. differently
B. identically B
C. numerically
D. integrally
22 fX|Y(x|y) =
A. fX,Y(x, y) / fY(y)
B. f(x, y) / f(x) A
C. f(x, y)
D. f(x) / f(y)
23 The marginal density of Y is …………..
A. f(x) = ∫−∞∞ fX,Y(x, y) dy
B. f(y) = ∫−∞∞ fX,Y(x, y) dx B
C. 0
D. 1
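Question 23 concerns continuous densities; the sketch below uses a discrete analogue with a hypothetical joint pmf to illustrate that the marginal of Y is obtained by summing (integrating) the joint distribution over all x:

```python
# Discrete analogue of marginalisation: p_Y(y) = sum over x of p_XY(x, y).
joint = {                       # hypothetical joint pmf p(x, y), assumed for illustration
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}
marginal_y = {}
for (x, y), p in joint.items():
    marginal_y[y] = marginal_y.get(y, 0.0) + p
print(marginal_y)               # {0: 0.4, 1: 0.6} up to float rounding
```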
UNIT-IV
1 A random process X(t), has time averages equal to ensemble averages, such a
random process is called
A. Stationary
B
B. ergodic
C. cyclostationary
D. Non stationary
2 E[X+Y] = E[X] +E[Y] for
A. only independent X&Y
B .any X&Y B
C. orthogonal X&Y
D. uncorrelated X&Y
3 Two wide sense stationary processes x(t) and y(t) are jointly wide sense stationary
if
A. E[x(t)y(t)] = E[x(t)] + E[y(t)]
D
B. cov[x(t) y(t)] = var[x(t)]+var[y(t)]
C. Rxy(t, t+ τ) = Rxy(τ)
D. E[x(t)] = const & E[y(t) ]= const, Rxy(t, t+ τ) = Rxy(τ)
4 A random process is a function of
A. sample space
B. time C
C.sample space and time
D. frequency
5 RXX(τ) = 4 + cos ωτ. Find the average power in the process, which has zero mean. A
A. 4
B. 3
C. 2
D. 5
6 An ergodic random process has E[X(t)]=5, then A[X(t)]=
A.4
B. 3 D
C. 2
D. 5
7 The covariance of two independent random variables is
A. -1
B. 1 C
C. 0
D. 0.5
8 Two random variables X and Y are uncorrelated and their covariance is CXY = E[XY] − 20. What is the value of the cross correlation RXY?
A. 10
B
B. 20
C. 5
D. 2
9 A process stationary to all orders N = 1, 2, …, for Xi = X(ti) where i = 1, 2, …, N, is called
A. Strict-sense stationary
B. Wide-sense stationary A
C. Strictly stationary
D. Independent
10 Let X(t) be a random process which is wide-sense stationary; then
A. E[X(t)] = constant
B. E[X(t)·X(t + τ)] = RXX(τ) C
C. E[X(t)] = constant and E[X(t)·X(t + τ)] = RXX(τ)
D. E[X²(t)] = 0
11 A random process is defined as X(t) = cos(wot + θ), where θ is a uniform random
variable over (-π, π). The second moment of the process is
A. 0
B.0.5 B
C.0.25
D. 1
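A Monte Carlo sketch of question 11: for X(t) = cos(w0·t + θ) with θ uniform on (−π, π), the second moment works out to 1/2 at any t (w0 and t below are arbitrary illustrative values):

```python
# Estimate E[X^2(t)] for a random-phase cosine by averaging over the phase.
import math
import random

w0, t = 2.0, 1.3                # illustrative constants (assumed)
N = 200_000
second_moment = sum(math.cos(w0 * t + random.uniform(-math.pi, math.pi)) ** 2
                    for _ in range(N)) / N
print(round(second_moment, 3))  # ~0.5 -> option B
```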
12 For the random process X(t) = A coswt where w is a constant and A is a uniform
R.V over (0, 1)
the mean square value is
A. 1/3 B
B.1/3 coswt
C. 1/3 cos(wt+ θ)
D. 1/9
13 The random processes X(t) and Y(t) are said to be independent if fXY(x1, y1 : t1, t2) = C
A. fX(x1 : t1)
B. fY(y1 : t2)
C. fX(x1 : t1)·fY(y1 : t2)
D. 0
14 The auto covariance of random process X(t) is CovXX(t1, t2)
A. Rxx (t1, t2) - E[x(t1)]
B. Rxx (t1, t2) + E[x(t2)] C
C. Rxx (t1, t2) - E[x(t1)] . E[x(t2)]
D. Rxx (t1, t2) + E[x(t1)] + E[x(t2)]
15 Two R.V's X1 and X2 have variances K and 2 respectively. A random variable Y is defined as Y = 3X2 − X1. If var(Y) = 25, then K =
A. 6 B
B. 7
C. 8
D. 9
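A quick check of question 15, assuming X1 and X2 are uncorrelated (implicit in the question): Var(3X2 − X1) = 9·Var(X2) + Var(X1) = 18 + K = 25, so K = 7. The simulation below uses Gaussian samples purely for illustration:

```python
# Simulate Y = 3*X2 - X1 with Var(X1) = 7 and Var(X2) = 2 and check Var(Y) ~ 25.
import random
from statistics import pvariance

K = 7
N = 200_000
x1 = [random.gauss(0, K ** 0.5) for _ in range(N)]   # Var(X1) = K = 7
x2 = [random.gauss(0, 2 ** 0.5) for _ in range(N)]   # Var(X2) = 2
print(round(pvariance([3 * b - a for a, b in zip(x1, x2)]), 2))   # ~25 -> option B
```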
16 Consider a random process X(t) defined as X(t) = A cos wt + B sin wt, where w is a constant and A and B are random variables. Which of the following is a condition for its stationarity?
A. E[A] = 0, E[B] = 0 A
B. E[AB] ≠ 0
C. E[A] ≠ 0; E[B] ≠ 0
D. A and B should be independent
17 Let X(t) and Y(t) be two jointly stationary random processes. Then RYX(−τ) =
A. Rxy(τ)
B. Ryx(τ) A
C. Rxy(-τ)
D. - Ryx(τ)
18 A stationary random process X(t) is periodic with period 2T. Its auto correlation
function is
A. non-periodic
B. periodic with period T C
C. periodic with period 2T
D. periodic with period T/2
19 RXX(τ) =
A. E[x2 (t)]
B. E[x]. E[y] D
C. E[x]·E[y²]
D. E[x(t) . X(t +τ)]
20 The auto correlation function of a stationary random process X(t) is RXX(τ) = 25 + 4/(1 + τ²). The value of the variance is
A. 4
B. 2 D
C. 12
D. 5
21 The auto correlation function of a stationary random process X(t) is RXX(τ) = 25 + 4/(1 + τ²). The value of E[X] is
A. 4
B. 2 D
C. 12
D. 5
22 Two processes X(t) & Y(t) are called (mutually) orthogonal if for every t1 and t2. A
A. RXY(t1, t2) = 0
B. RXY(t1, t2) > 0
C. RXY(t1, t2) < 0
D. RXY(t1, t2) = 1
23 The auto correlation function of a stationary random process X(t) is RXX(τ) = 25 + 4/(1 + τ²). The value of E[X²] is
A. 4
B. 2 C
C. 29
D. 5
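For the autocorrelation RXX(τ) = 25 + 4/(1 + τ²) used in questions 21 and 23, E[X]² is the limit of RXX(τ) as τ → ∞ and E[X²] = RXX(0); a small numerical sketch (illustrative only):

```python
# Extract E[X] and E[X^2] from the given autocorrelation function.
def Rxx(tau):
    return 25 + 4 / (1 + tau ** 2)

mean_sq_limit = Rxx(1e6)            # numerically approaches 25 as tau -> infinity
print(round(mean_sq_limit ** 0.5))  # E[X]   = 5  (question 21, option D)
print(Rxx(0))                       # E[X^2] = 29 (question 23, option C)
```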
24 Cross-correlation function represented as
A.RXY(t,t+ τ)
B. RYY(t,t+ τ) A
C. RXX(t,t+ τ)
D. RYX(t,t+ τ)
25 Difference of two independent Poisson processes is
A. Poisson process
B. Not a poisson process B
C. Process with mean = 0 [λ1 t ≠ λ2t]
D. Process with variance = 0 [λ1 t ≠ λ2t]
26 For a Gaussian random process (which is known to be WSS), the auto covariance and autocorrelation functions will depend only on
A. Time differences
B. Absolute time A
C. Mean
D. Mean & variance
25 If X(t) and Y(t) are independent then RXY(τ) Is
A.1
B.0 C
C.E[X]E[Y]
D.E[X]+E[Y]
26 Let X(t) and Y(t) be two random processes with respective auto correlation functions RXX(τ) and RYY(τ). Then |RXX(τ)| is
A.
B. C
C.
D.
27 RXX(τ) is equals to
A. RXX(2τ)
B. RXX(-τ) B
C. RXX(τ2)
D.0
28 Time average of a quantity x(t) is defined as A[x (t)] = C
A.
B.
C.
D.
29 For an ergodic process
A. Mean is necessarily zero
B. Mean square value is infinity D
C. All time averages are zero
D. Mean square value is independent of time
30 A stationary continuous process X(t) with auto-correlation function RXX(τ) is
called autocorrelation-ergodic or ergodic in the autocorrelation if, and only if, for all
τ
A.
B. B
C.
D.
38 The auto covariance C(t1, t2) of a process X(t) is (assume that E[X(t)] = η(t))
43 If all the statistical properties of x(t) are not affected by Time shift is referred as
A. SSS
B .WSS A
C. Non-stationary
D. Continuous
44 --------averages are computed by considering all the sample functions
A.time
B.ensemble B
C. Space
D. Coordinates
45 A random process X(t) is said to be mean ergodic or ergodic in the mean sense if its
statistical average is---------to its time average
A.equal
A
B.not equal
C.less than
D. greater than
46 X(t) is a random process with mean 3 and autocorrelation R(t1, t2) (expression not reproduced). The covariance of the R.V's Z = X(5) and W = X(8) is
A. e^−0.6 B
B. 4e^−0.6
C. 8e^−0.6
D. (1/6)e^−0.
47 The auto covariance of random process X(t) is CovXX(t1, t2) =
A. Rxx (t1, t2) − E[x(t1)] C
B. Rxx (t1, t2) + E[x(t2)]
C. Rxx (t1, t2) − E[x(t1)] · E[x(t2)]
D. Rxx (t1, t2) + E[x(t1)] + E[x(t2)]
48 A random process X(t) is called stationary if its ------ properties are unaffected by
time shift.
A. Statistical
A
B. Non statistical
C.Time
D.Frequency
49 RX,Y(τ) =
A. RX,Y(−τ)
B. RY,X(τ) B
C. RY,X(−τ)
D. −RX,Y(τ)
UNIT-V
If and
and X(t) and Y(t) are of zero mean, then let U(t) = X(t) + Y(t). Then SXU (w) is
A
D
B
17 If Y(t) = X(t) - X(t - a) is a random process and X(t) is a WSS process and a > 0, is
a constant, the PSD of y(t) in terms of the corresponding quantities of X(t) is
A
B
D
18 A random process is given by Z(t) = A. X(t) + B.Y(t) where 'A' and 'B' are real
constants and X(t)
and Y(t) are jointly WSS processes. The power spectrum SZZ(w) is
A. AB SXY(w) + AB SYX(w) C
B. A² + B² + AB SXY(w) + AB SYX(w)
C. A² SXX(w) + AB SXY(w) + AB SYX(w) + B² SYY(w)
D. 0
19 A random process is given by Z(t) = A. X(t) + B.Y(t) where 'A' and 'B' are real
constants and X(t)
and Y(t) are jointly WSS processes. If X(t) and Y(t) are uncorrelated then SZZ (w)
B
A. A² + B²
B. A² SXX(w) + B² SYY(w)
C. AB SXY(w) + AB SYX(w)
D. 0
20 PSD is _________ function of frequency
A. even
B. odd A
C. periodic
D. asymmetric
21 For a WSS process, the PSD at zero frequency gives
A. the area under the graph of power spectral density
B. area under the graph auto correlation of the process B
C. mean of the process
D. variance of the process
22 The mean square value of WSS process equals
A. the area under the graph of PSD
B. the area under the graph of the auto correlation of the process A
C.zero
D.mean of the process
23 X(t) = A cos(w0t + θ), where A and w0 are constants and θ is a R.V uniformly distributed over (0, π). The average power of X(t) is
Aθ
B B
D
24
A WSS process X(t) has an auto correlation function . The PSD is
B A
D
25 The real part and imaginary part of SYX (w) is ________ and ___________
function of w
respectively.
A. Odd, odd C
B. Odd, even
C. Even, odd
D. Even, even
26
If cross correlation is the cross
power spectrum is D
SXY (w) =
A
D
27 If X(t) and Y(t) are uncorrelated and of constant means E(X) and E(Y) respectively, then SXY(w) is
A. E(X)E(Y)
B. 2πE(X)E(Y)δ(w) B
C. 2E(X)E(Y)δ(w)
D. 0
28 The means of two independent, WSS processes X(t) and Y(t) are 2 and 3 respectively. Their cross-spectral density is
A. 6δ(w)
B. 12πδ(w) B
C. 5πδ(w)
D. δ(w)
29 The time average of the cross correlation function and the cross spectral density function form a ________ pair
A. Laplace transform C
B. Z-transform
C. Fourier transform
D. Convolution
30 SYX (w) =
A. Sxy (w)
B. Sxy (-w) B
C. Syx (-w)
D. - syx (w)
31 If X(t) and Y(t) are orthogonal then
A. Sxy (w) = 0
B. Sxy (w) = 1 A
C. Sxy (w) > 1
D. Sxy (w) < 1
32 The cross power formula Pxy is given by
A.
D
B.
C.
D.
33 The average power PXY is
A.
B. C
C.
D.
34 A random process n(t) has a PSD G(f) = η/2 for −∞ ≤ f ≤ ∞. The random process is passed through a low-pass filter which has a transfer function H(f) = 2 for −fm ≤ f ≤ fm and H(f) = 0 otherwise. Find the PSD of the waveform at the output of the filter.
A
A.2 η
B. η/3
C. η4
D. η/4
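For question 34, the output PSD of an LTI filter is |H(f)|² times the input PSD, giving 4·(η/2) = 2η inside the pass band. The sketch below uses arbitrary illustrative values for η and fm:

```python
# Output PSD of white noise through an ideal low-pass filter with gain 2.
eta, fm = 1.0, 10.0             # illustrative values (assumed)

def H(f):
    return 2.0 if -fm <= f <= fm else 0.0

def S_out(f):
    return abs(H(f)) ** 2 * (eta / 2)

print(S_out(3.0), 2 * eta)      # both 2.0 inside the pass band -> option A
```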
35
C.
D.
36 Consider a random process x(t) = cos(ωt + Ө) where ω is a real constant and Ө is a
uniform random variable in (0, π/2). Find the average power in the process.
A.2/7
C
B. ¼
C. 1/2
D. 1/6
37 Consider a linear system as shown below (figure not reproduced); x(t) is the input and y(t) is the output of the system. The auto correlation of x(t) is Rxx(τ) = 3δ(τ). Find the PSD, auto correlation function and mean square value of the output y(t).
A. 1/2
B. 1/3
C. 2/4
D. ¼
38
If then RXY(τ)
A.
B. B
C.
D.
39 A random process is given by Z(t) = A.X(t) + B.Y(t) where 'A' and 'B' are real constants and X(t) and Y(t) are jointly WSS processes. The cross power spectrum SXZ(w) is
A. A SYX(w) + B SYY(w) C
B. A SXX(w) + B SYY(w)
C. A SXX(w) + B SXY(w)
D. A SYY(w) + B SXY(w)
40 For the above problem SYZ (w) is
A. A SYX(w) + B SYY(w)
B. A SXX(w) + B SYY(w) A
C. A SXX(w) + B SXY(w)
D. A SYY(w) + B SXX(w)
41
The auto correlation function of a WSS random process X(t) is
. The area
enclosed by the PSD curve of X(t) is
A. 6 A
B. 2
C. 8
D. 4
42 The time average of the autocorrelation function and the power spectral density
form a pair of
A. Z-transform
C
B. Laplace transform
C. Fourier transform
D. Convolution
43 If X(t) is an ergodic process with an auto correlation function of the given form (expression not reproduced).
The spectral density SXX(w) is
D
A.
B.
C.
D.
44
If for - 2πB ≤ w ≤ 2πB then RXX (τ) is =
A.
B.
A
C.
D.
45
The auto correlation function of a process with PSD of is
A.
C
B.
C.
D.1
46 If Z(t) = X(t) + Y(t) where X(t) and Y(t) are two WSS processes and X(t) and Y(t) are uncorrelated and of zero mean, then RZZ(τ) =
A. RXY(τ) + RYX(τ) B
B. RXX(τ) + RYY(τ)
C. RXY(τ) − RYX(τ)
D. RXX(τ) + RYY(τ)
47 For the above problem SZZ(w) is
A. SXX(w) − SYY(w)
B. SXY(w) − SYX(w) B
C. SXX(w) + SYY(w)
D. SXY(w) − SYX(w)
48 The mean square value of a __________________ process equals the area under the
graph of a power spectral density
A.wide sense stationary processes (WSS)
A
B. Strictly sense stationary processes
C.Random Process
D. stationary processes
49 The average power of a process is given by
A. E [x(t)]
B
B. Rxx (0)
C. {e[x(t)]}2
D. Rxx (1)
50 If Z(t) = X(t) + Y(t) where X(t) and Y(t) are two WSS processes and X(t) and Y(t) are uncorrelated and of zero mean, then RZZ(τ) =
A. RXY(τ) + RYX(τ) B
B. RXX(τ) + RYY(τ)
C. RXY(τ) − RYX(τ)
D. RXX(τ) + RYY(τ)