Complex random vector

In probability theory and statistics, a complex random vector is typically a tuple of complex-valued random variables, and generally is a random variable taking values in a vector space over the field of complex numbers. If $Z_1,\ldots,Z_n$ are complex-valued random variables, then the $n$-tuple $(Z_1,\ldots,Z_n)$ is a complex random vector. Complex random vectors can always be considered as pairs of real random vectors: their real and imaginary parts. Some concepts of real random vectors have a straightforward generalization to complex random vectors, for example the definition of the mean of a complex random vector. Other concepts are unique to complex random vectors.

Applications of complex random vectors are found in digital signal processing.

Definition

A complex random vector $\mathbf{Z} = (Z_1,\ldots,Z_n)^T$ on the probability space $(\Omega,\mathcal{F},P)$ is a function $\mathbf{Z}\colon \Omega \to \mathbb{C}^n$ such that the vector $(\Re(Z_1),\Im(Z_1),\ldots,\Re(Z_n),\Im(Z_n))^T$ is a real random vector on $(\Omega,\mathcal{F},P)$, where $\Re(z)$ denotes the real part of $z$ and $\Im(z)$ denotes the imaginary part of $z$.[1]:p. 292
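The identification with a real random vector of twice the dimension can be illustrated numerically (a minimal NumPy sketch; the standard-normal distribution is an arbitrary choice, not part of the definition):

```python
import numpy as np

rng = np.random.default_rng(0)

# A complex random vector Z is a function into C^n such that
# (Re Z1, Im Z1, ..., Re Zn, Im Zn) is a real random vector.
# Here we draw that real vector first and assemble Z from it.
n = 3
real_imag = rng.standard_normal(2 * n)        # the underlying real random vector
Z = real_imag[0::2] + 1j * real_imag[1::2]    # Z_k = Re(Z_k) + i * Im(Z_k)

print(Z.shape, Z.dtype)   # (3,) complex128
```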

Cumulative distribution function

The generalization of the cumulative distribution function from real to complex random variables is not obvious because expressions of the form $P(Z \leq 1+3i)$ make no sense. However, expressions of the form $P(\Re(Z) \leq 1, \Im(Z) \leq 3)$ make sense. Therefore, the cumulative distribution function $F_{\mathbf{Z}}\colon \mathbb{C}^n \to [0,1]$ of a random vector $\mathbf{Z} = (Z_1,\ldots,Z_n)^T$ is defined as

$$F_{\mathbf{Z}}(\mathbf{z}) = P(\Re(Z_1) \leq \Re(z_1), \Im(Z_1) \leq \Im(z_1), \ldots, \Re(Z_n) \leq \Re(z_n), \Im(Z_n) \leq \Im(z_n)) \qquad \text{(Eq.1)}$$

where $\mathbf{z} = (z_1,\ldots,z_n)^T$.
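An empirical version of this definition can be checked by Monte Carlo (a sketch for the scalar case; the choice of independent standard-normal real and imaginary parts is an arbitrary assumption):

```python
import numpy as np

rng = np.random.default_rng(1)

# Empirical version of Eq.1 for a single complex random variable:
# F_Z(z) = P(Re(Z) <= Re(z), Im(Z) <= Im(z)), estimated from samples.
samples = rng.standard_normal(100_000) + 1j * rng.standard_normal(100_000)

def empirical_cdf(samples, z):
    """Fraction of samples whose real AND imaginary parts lie below those of z."""
    return ((samples.real <= z.real) & (samples.imag <= z.imag)).mean()

# With independent N(0,1) parts, F_Z(0) = P(Re <= 0) * P(Im <= 0) = 0.25.
value = empirical_cdf(samples, 0 + 0j)
print(value)   # close to 0.25
```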

Expectation

As in the real case, the expectation (also called expected value) of a complex random vector is taken component-wise.[1]:p. 293

$$\operatorname{E}[\mathbf{Z}] = (\operatorname{E}[Z_1],\ldots,\operatorname{E}[Z_n])^T \qquad \text{(Eq.2)}$$
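In sample form, the component-wise mean amounts to averaging real and imaginary parts separately (a sketch; the shifted Gaussian distribution below is an arbitrary illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Eq.2 estimated from samples: the expectation of a complex random
# vector is taken component-wise.
true_mean = 3 + 4j
samples = true_mean + rng.standard_normal((50_000, 2)) \
                    + 1j * rng.standard_normal((50_000, 2))

mean_estimate = samples.mean(axis=0)   # (E[Z1], E[Z2]) estimated from samples
print(mean_estimate)                   # each component close to 3+4j
```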

Covariance matrix and pseudo-covariance matrix

The covariance matrix (also called second central moment) K𝐙𝐙 contains the covariances between all pairs of components. The covariance matrix of an n×1 random vector is an n×n matrix whose (i,j)th element is the covariance between the i th and the j th random variables.[2]:p.372 Unlike in the case of real random variables, the covariance between two random variables involves the complex conjugate of one of the two. Thus the covariance matrix is a Hermitian matrix.[1]:p. 293

$$K_{\mathbf{Z}\mathbf{Z}} = \operatorname{cov}[\mathbf{Z},\mathbf{Z}] = \operatorname{E}[(\mathbf{Z}-\operatorname{E}[\mathbf{Z}])(\mathbf{Z}-\operatorname{E}[\mathbf{Z}])^H] = \operatorname{E}[\mathbf{Z}\mathbf{Z}^H] - \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{Z}^H] \qquad \text{(Eq.3)}$$

In expanded form:

$$K_{\mathbf{Z}\mathbf{Z}} = \begin{bmatrix}
\operatorname{E}[(Z_1-\operatorname{E}[Z_1])\overline{(Z_1-\operatorname{E}[Z_1])}] & \cdots & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])\overline{(Z_n-\operatorname{E}[Z_n])}] \\
\vdots & \ddots & \vdots \\
\operatorname{E}[(Z_n-\operatorname{E}[Z_n])\overline{(Z_1-\operatorname{E}[Z_1])}] & \cdots & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])\overline{(Z_n-\operatorname{E}[Z_n])}]
\end{bmatrix}$$
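A sample estimate of the covariance matrix makes the role of the conjugate explicit (a sketch; the distribution with independent unit-variance real and imaginary parts is an arbitrary choice, for which the true covariance matrix is $2I$):

```python
import numpy as np

rng = np.random.default_rng(3)

# Sample estimate of Eq.3.  The conjugate on the second factor is what
# distinguishes the covariance matrix K_ZZ from the pseudo-covariance
# matrix J_ZZ defined below.
N, n = 100_000, 2
Z = rng.standard_normal((N, n)) + 1j * rng.standard_normal((N, n))

centered = Z - Z.mean(axis=0)
K = centered.T @ centered.conj() / N   # entry (i,j): E[(Z_i - E[Z_i]) conj(Z_j - E[Z_j])]

print(np.round(K.real, 1))   # close to [[2, 0], [0, 2]]
```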

The pseudo-covariance matrix (also called relation matrix) is defined by replacing Hermitian transposition with transposition in the definition above.

$$J_{\mathbf{Z}\mathbf{Z}} = \operatorname{cov}[\mathbf{Z},\overline{\mathbf{Z}}] = \operatorname{E}[(\mathbf{Z}-\operatorname{E}[\mathbf{Z}])(\mathbf{Z}-\operatorname{E}[\mathbf{Z}])^T] = \operatorname{E}[\mathbf{Z}\mathbf{Z}^T] - \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{Z}^T] \qquad \text{(Eq.4)}$$

In expanded form:

$$J_{\mathbf{Z}\mathbf{Z}} = \begin{bmatrix}
\operatorname{E}[(Z_1-\operatorname{E}[Z_1])(Z_1-\operatorname{E}[Z_1])] & \cdots & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])(Z_n-\operatorname{E}[Z_n])] \\
\vdots & \ddots & \vdots \\
\operatorname{E}[(Z_n-\operatorname{E}[Z_n])(Z_1-\operatorname{E}[Z_1])] & \cdots & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])(Z_n-\operatorname{E}[Z_n])]
\end{bmatrix}$$

Properties

The covariance matrix is a Hermitian matrix, i.e.[1]:p. 293

$$K_{\mathbf{Z}\mathbf{Z}}^H = K_{\mathbf{Z}\mathbf{Z}}.$$

The pseudo-covariance matrix is a symmetric matrix, i.e.

$$J_{\mathbf{Z}\mathbf{Z}}^T = J_{\mathbf{Z}\mathbf{Z}}.$$

The covariance matrix is a positive semidefinite matrix, i.e.

$$\mathbf{a}^H K_{\mathbf{Z}\mathbf{Z}} \mathbf{a} \geq 0 \quad \text{for all } \mathbf{a} \in \mathbb{C}^n.$$
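These three properties can be verified numerically on sample estimates (a sketch; the correlated complex Gaussian construction via an arbitrary mixing matrix `A` is purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)

# Numerical check on sample moments: K_ZZ is Hermitian and positive
# semidefinite, while J_ZZ is symmetric.
N, n = 10_000, 3
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Z = (rng.standard_normal((N, n)) + 1j * rng.standard_normal((N, n))) @ A.T

C = Z - Z.mean(axis=0)
K = C.T @ C.conj() / N   # covariance matrix (conjugate on second factor)
J = C.T @ C / N          # pseudo-covariance matrix (plain transpose)

print(np.allclose(K, K.conj().T))               # True: K is Hermitian
print(np.allclose(J, J.T))                      # True: J is symmetric
print(bool(np.all(np.linalg.eigvalsh(K) >= -1e-8)))  # True: K is positive semidefinite
```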

Covariance matrices of real and imaginary parts

By decomposing the random vector $\mathbf{Z}$ into its real part $\mathbf{X} = \Re(\mathbf{Z})$ and imaginary part $\mathbf{Y} = \Im(\mathbf{Z})$ (i.e. $\mathbf{Z} = \mathbf{X} + i\mathbf{Y}$), the pair $(\mathbf{X},\mathbf{Y})$ has a covariance matrix of the form:

$$\begin{bmatrix} K_{\mathbf{X}\mathbf{X}} & K_{\mathbf{X}\mathbf{Y}} \\ K_{\mathbf{Y}\mathbf{X}} & K_{\mathbf{Y}\mathbf{Y}} \end{bmatrix}$$

The matrices $K_{\mathbf{Z}\mathbf{Z}}$ and $J_{\mathbf{Z}\mathbf{Z}}$ can be related to the covariance matrices of $\mathbf{X}$ and $\mathbf{Y}$ via the following expressions:

$$\begin{aligned}
K_{\mathbf{X}\mathbf{X}} &= \operatorname{E}[(\mathbf{X}-\operatorname{E}[\mathbf{X}])(\mathbf{X}-\operatorname{E}[\mathbf{X}])^T] = \tfrac{1}{2}\operatorname{Re}(K_{\mathbf{Z}\mathbf{Z}} + J_{\mathbf{Z}\mathbf{Z}}) \\
K_{\mathbf{Y}\mathbf{Y}} &= \operatorname{E}[(\mathbf{Y}-\operatorname{E}[\mathbf{Y}])(\mathbf{Y}-\operatorname{E}[\mathbf{Y}])^T] = \tfrac{1}{2}\operatorname{Re}(K_{\mathbf{Z}\mathbf{Z}} - J_{\mathbf{Z}\mathbf{Z}}) \\
K_{\mathbf{Y}\mathbf{X}} &= \operatorname{E}[(\mathbf{Y}-\operatorname{E}[\mathbf{Y}])(\mathbf{X}-\operatorname{E}[\mathbf{X}])^T] = \tfrac{1}{2}\operatorname{Im}(J_{\mathbf{Z}\mathbf{Z}} + K_{\mathbf{Z}\mathbf{Z}}) \\
K_{\mathbf{X}\mathbf{Y}} &= \operatorname{E}[(\mathbf{X}-\operatorname{E}[\mathbf{X}])(\mathbf{Y}-\operatorname{E}[\mathbf{Y}])^T] = \tfrac{1}{2}\operatorname{Im}(J_{\mathbf{Z}\mathbf{Z}} - K_{\mathbf{Z}\mathbf{Z}})
\end{aligned}$$

Conversely:

$$\begin{aligned}
K_{\mathbf{Z}\mathbf{Z}} &= K_{\mathbf{X}\mathbf{X}} + K_{\mathbf{Y}\mathbf{Y}} + i(K_{\mathbf{Y}\mathbf{X}} - K_{\mathbf{X}\mathbf{Y}}) \\
J_{\mathbf{Z}\mathbf{Z}} &= K_{\mathbf{X}\mathbf{X}} - K_{\mathbf{Y}\mathbf{Y}} + i(K_{\mathbf{Y}\mathbf{X}} + K_{\mathbf{X}\mathbf{Y}})
\end{aligned}$$
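The decomposition identities hold exactly for sample moments as well, which makes them easy to verify (a sketch; the correlated real pair $(\mathbf{X},\mathbf{Y})$ is generated from an arbitrary mixing matrix):

```python
import numpy as np

rng = np.random.default_rng(5)

# Check K_ZZ = K_XX + K_YY + i(K_YX - K_XY) and
#       J_ZZ = K_XX - K_YY + i(K_YX + K_XY) on sample moments.
N, n = 10_000, 2
A = rng.standard_normal((2 * n, 2 * n))
XY = rng.standard_normal((N, 2 * n)) @ A.T   # correlated real pair (X, Y)
X, Y = XY[:, :n], XY[:, n:]
Z = X + 1j * Y

def cov(U, V):
    """Sample cross-covariance E[(U - E[U])(V - E[V])^T] for real data."""
    return (U - U.mean(axis=0)).T @ (V - V.mean(axis=0)) / len(U)

C = Z - Z.mean(axis=0)
K = C.T @ C.conj() / N
J = C.T @ C / N

print(np.allclose(K, cov(X, X) + cov(Y, Y) + 1j * (cov(Y, X) - cov(X, Y))))  # True
print(np.allclose(J, cov(X, X) - cov(Y, Y) + 1j * (cov(Y, X) + cov(X, Y))))  # True
```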

Cross-covariance matrix and pseudo-cross-covariance matrix

The cross-covariance matrix between two complex random vectors 𝐙,𝐖 is defined as:

$$K_{\mathbf{Z}\mathbf{W}} = \operatorname{cov}[\mathbf{Z},\mathbf{W}] = \operatorname{E}[(\mathbf{Z}-\operatorname{E}[\mathbf{Z}])(\mathbf{W}-\operatorname{E}[\mathbf{W}])^H] = \operatorname{E}[\mathbf{Z}\mathbf{W}^H] - \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}^H] \qquad \text{(Eq.5)}$$

In expanded form:

$$K_{\mathbf{Z}\mathbf{W}} = \begin{bmatrix}
\operatorname{E}[(Z_1-\operatorname{E}[Z_1])\overline{(W_1-\operatorname{E}[W_1])}] & \cdots & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])\overline{(W_n-\operatorname{E}[W_n])}] \\
\vdots & \ddots & \vdots \\
\operatorname{E}[(Z_n-\operatorname{E}[Z_n])\overline{(W_1-\operatorname{E}[W_1])}] & \cdots & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])\overline{(W_n-\operatorname{E}[W_n])}]
\end{bmatrix}$$

And the pseudo-cross-covariance matrix is defined as:

$$J_{\mathbf{Z}\mathbf{W}} = \operatorname{cov}[\mathbf{Z},\overline{\mathbf{W}}] = \operatorname{E}[(\mathbf{Z}-\operatorname{E}[\mathbf{Z}])(\mathbf{W}-\operatorname{E}[\mathbf{W}])^T] = \operatorname{E}[\mathbf{Z}\mathbf{W}^T] - \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}^T] \qquad \text{(Eq.6)}$$

In expanded form:

$$J_{\mathbf{Z}\mathbf{W}} = \begin{bmatrix}
\operatorname{E}[(Z_1-\operatorname{E}[Z_1])(W_1-\operatorname{E}[W_1])] & \cdots & \operatorname{E}[(Z_1-\operatorname{E}[Z_1])(W_n-\operatorname{E}[W_n])] \\
\vdots & \ddots & \vdots \\
\operatorname{E}[(Z_n-\operatorname{E}[Z_n])(W_1-\operatorname{E}[W_1])] & \cdots & \operatorname{E}[(Z_n-\operatorname{E}[Z_n])(W_n-\operatorname{E}[W_n])]
\end{bmatrix}$$

Two complex random vectors $\mathbf{Z}$ and $\mathbf{W}$ are called uncorrelated if

$$K_{\mathbf{Z}\mathbf{W}} = J_{\mathbf{Z}\mathbf{W}} = 0.$$
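Both matrices really are needed: a vector and its complex conjugate have zero cross-covariance but nonzero pseudo-cross-covariance. A scalar sketch (the Gaussian distribution is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(6)

# W = conj(Z) shows why uncorrelatedness requires BOTH matrices to
# vanish: K_ZW is ~0, but J_ZW is not.
N = 100_000
Z = rng.standard_normal(N) + 1j * rng.standard_normal(N)
W = Z.conj()

cz, cw = Z - Z.mean(), W - W.mean()
K_ZW = (cz * cw.conj()).mean()   # E[(Z - E[Z]) conj(W - E[W])]
J_ZW = (cz * cw).mean()          # E[(Z - E[Z]) (W - E[W])]

print(abs(K_ZW))   # close to 0
print(J_ZW)        # close to 2
```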

Independence

Main page: Independence (probability theory)

Two complex random vectors $\mathbf{Z} = (Z_1,\ldots,Z_m)^T$ and $\mathbf{W} = (W_1,\ldots,W_n)^T$ are called independent if

$$F_{\mathbf{Z},\mathbf{W}}(\mathbf{z},\mathbf{w}) = F_{\mathbf{Z}}(\mathbf{z}) \cdot F_{\mathbf{W}}(\mathbf{w}) \quad \text{for all } \mathbf{z},\mathbf{w} \qquad \text{(Eq.7)}$$

where $F_{\mathbf{Z}}(\mathbf{z})$ and $F_{\mathbf{W}}(\mathbf{w})$ denote the cumulative distribution functions of $\mathbf{Z}$ and $\mathbf{W}$ as defined in Eq.1 and $F_{\mathbf{Z},\mathbf{W}}(\mathbf{z},\mathbf{w})$ denotes their joint cumulative distribution function. Independence of $\mathbf{Z}$ and $\mathbf{W}$ is often denoted by $\mathbf{Z} \perp\!\!\!\perp \mathbf{W}$. Written component-wise, $\mathbf{Z}$ and $\mathbf{W}$ are called independent if

$$F_{Z_1,\ldots,Z_m,W_1,\ldots,W_n}(z_1,\ldots,z_m,w_1,\ldots,w_n) = F_{Z_1,\ldots,Z_m}(z_1,\ldots,z_m) \cdot F_{W_1,\ldots,W_n}(w_1,\ldots,w_n) \quad \text{for all } z_1,\ldots,z_m,w_1,\ldots,w_n.$$

Circular symmetry

A complex random vector $\mathbf{Z}$ is called circularly symmetric if for every deterministic $\varphi \in [-\pi,\pi)$ the distribution of $e^{i\varphi}\mathbf{Z}$ equals the distribution of $\mathbf{Z}$.[3]:pp. 500–501

Properties
  • The expectation of a circularly symmetric complex random vector is either zero or it is not defined.[3]:p. 500
  • The pseudo-covariance matrix of a circularly symmetric complex random vector is zero.[3]:p. 584
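The second property can be observed numerically with a standard circularly symmetric Gaussian sample (a sketch; the scaling by $1/\sqrt{2}$ giving unit variance is an arbitrary normalization):

```python
import numpy as np

rng = np.random.default_rng(7)

# For a circularly symmetric Z, e^{i phi} Z has the same distribution
# as Z, and the pseudo-variance E[Z^2] vanishes while E[|Z|^2] does not.
N = 100_000
Z = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
rotated = np.exp(1j * 0.7) * Z   # fixed-phase rotation of every sample

pseudo_var = (Z ** 2).mean()                  # ~0: pseudo-covariance vanishes
var = (np.abs(Z) ** 2).mean()                 # ~1 by construction
rotated_var = (np.abs(rotated) ** 2).mean()   # rotation preserves |Z|^2

print(abs(pseudo_var), var, rotated_var)
```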

Proper complex random vectors

A complex random vector 𝐙 is called proper if the following three conditions are all satisfied:[1]:p. 293

  • $\operatorname{E}[\mathbf{Z}] = 0$ (zero mean)
  • $\operatorname{var}[Z_1] < \infty, \ldots, \operatorname{var}[Z_n] < \infty$ (all components have finite variance)
  • $\operatorname{E}[\mathbf{Z}\mathbf{Z}^T] = 0$

Two complex random vectors $\mathbf{Z},\mathbf{W}$ are called jointly proper if the composite random vector $(Z_1,Z_2,\ldots,Z_m,W_1,W_2,\ldots,W_n)^T$ is proper.

Properties
  • A complex random vector $\mathbf{Z}$ is proper if, and only if, for all (deterministic) vectors $\mathbf{c} \in \mathbb{C}^n$ the complex random variable $\mathbf{c}^T\mathbf{Z}$ is proper.[1]:p. 293
  • Linear transformations of proper complex random vectors are proper, i.e. if $\mathbf{Z}$ is a proper random vector with $n$ components and $A$ is a deterministic $m \times n$ matrix, then the complex random vector $A\mathbf{Z}$ is also proper.[1]:p. 295
  • Every circularly symmetric complex random vector with finite variance of all its components is proper.[1]:p. 295
  • There are proper complex random vectors that are not circularly symmetric.[1]:p. 504
  • A real random vector is proper if and only if it is constant.
  • Two jointly proper complex random vectors are uncorrelated if and only if their covariance matrix is zero, i.e. if $K_{\mathbf{Z}\mathbf{W}} = 0$.
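The three defining conditions can be tested on samples. The sketch below uses a circularly symmetric Gaussian vector (proper, by the property above) and contrasts it with its real part, which as a non-constant real random vector cannot be proper; the helper `looks_proper` and its tolerance are illustrative assumptions, not a standard API:

```python
import numpy as np

rng = np.random.default_rng(8)

# Sample properness check: zero mean and vanishing E[Z Z^T].
# (Finite variance is automatic for a finite sample.)
N, n = 100_000, 2
Z = (rng.standard_normal((N, n)) + 1j * rng.standard_normal((N, n))) / np.sqrt(2)

def looks_proper(samples, tol=0.02):
    """Hypothetical helper: sample test of E[Z] ~ 0 and E[Z Z^T] ~ 0."""
    mean_ok = np.all(np.abs(samples.mean(axis=0)) < tol)
    second = samples.T @ samples / len(samples)   # estimate of E[Z Z^T]
    return bool(mean_ok and np.all(np.abs(second) < tol))

print(looks_proper(Z))            # True: circularly symmetric Gaussian
print(looks_proper(Z.real + 0j))  # False: a non-constant real vector is not proper
```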

Cauchy-Schwarz inequality

The Cauchy-Schwarz inequality for complex random vectors is

$$\left|\operatorname{E}[\mathbf{Z}^H \mathbf{W}]\right|^2 \leq \operatorname{E}[\mathbf{Z}^H \mathbf{Z}] \operatorname{E}\big[|\mathbf{W}^H \mathbf{W}|\big].$$
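Since sample averages define an inner product, the inequality also holds exactly for the empirical moments, which gives a quick numerical check (a sketch; the correlated pair below is an arbitrary construction):

```python
import numpy as np

rng = np.random.default_rng(9)

# Check |E[Z^H W]|^2 <= E[Z^H Z] * E[|W^H W|] on sample moments.
N, n = 10_000, 3
Z = rng.standard_normal((N, n)) + 1j * rng.standard_normal((N, n))
W = 0.5 * Z + rng.standard_normal((N, n)) + 1j * rng.standard_normal((N, n))

lhs = abs(np.mean(np.sum(Z.conj() * W, axis=1))) ** 2    # |E[Z^H W]|^2
rhs = np.mean(np.sum(Z.conj() * Z, axis=1)).real \
      * np.mean(np.abs(np.sum(W.conj() * W, axis=1)))    # E[Z^H Z] * E[|W^H W|]

print(lhs <= rhs)   # True
```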

Characteristic function

The characteristic function of a complex random vector $\mathbf{Z}$ with $n$ components is a function $\mathbb{C}^n \to \mathbb{C}$ defined by:[1]:p. 295

$$\varphi_{\mathbf{Z}}(\boldsymbol{\omega}) = \operatorname{E}\big[e^{i\Re(\boldsymbol{\omega}^H \mathbf{Z})}\big] = \operatorname{E}\big[e^{i(\Re(\omega_1)\Re(Z_1) + \Im(\omega_1)\Im(Z_1) + \cdots + \Re(\omega_n)\Re(Z_n) + \Im(\omega_n)\Im(Z_n))}\big]$$
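The characteristic function can be estimated empirically by averaging $e^{i\Re(\boldsymbol{\omega}^H \mathbf{Z})}$ over samples (a sketch; the Gaussian choice of $\mathbf{Z}$ and the test point `w` are arbitrary assumptions, chosen so that $\Re(\boldsymbol{\omega}^H\mathbf{Z})$ is Gaussian with variance $|\boldsymbol{\omega}|^2$ and hence $\varphi_{\mathbf{Z}}(\boldsymbol{\omega}) \approx e^{-|\boldsymbol{\omega}|^2/2}$):

```python
import numpy as np

rng = np.random.default_rng(10)

# Empirical characteristic function phi_Z(w) = E[exp(i Re(w^H Z))].
N, n = 50_000, 2
Z = rng.standard_normal((N, n)) + 1j * rng.standard_normal((N, n))

def char_fn(samples, w):
    """Average of exp(i * Re(w^H z)) over the sample rows z."""
    return np.exp(1j * (samples @ w.conj()).real).mean()

w = np.array([0.3 + 0.1j, -0.2 + 0.4j])
phi = char_fn(Z, w)

print(abs(phi) <= 1.0)              # True: characteristic functions are bounded by 1
print(char_fn(Z, np.zeros(2)))      # exactly 1 at w = 0
```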


References

  1. Lapidoth, Amos (2009). A Foundation in Digital Communication. Cambridge University Press. ISBN 978-0-521-19395-5.
  2. Gubner, John A. (2006). Probability and Random Processes for Electrical and Computer Engineers. Cambridge University Press. ISBN 978-0-521-86470-1.
  3. Tse, David; Viswanath, Pramod (2005). Fundamentals of Wireless Communication. Cambridge University Press.