Advanced Signals and Systems - Stochastic Processes II

 

5. Bivariate probability density functions.

Task

Consider the following joint pdf of two stationary random variables \(v_1\), \(v_2\):

\begin{equation*} f_{v_1v_2}(V_1,V_2,\kappa) = \begin{cases} P_0, & \sqrt{V_1^2 + V_2^2} \leq \hat{V} \\ 0, & \text{otherwise} \end{cases} \end{equation*}

  1. Sketch \(f_{v_1v_2}(V_1,V_2,\kappa)\) and determine \(P_0\).
    What kind of distribution is \(f_{v_1v_2}(V_1,V_2,\kappa)\)?
  2. Calculate the marginal probability density functions \(f_{v_1}(V_1)\) and \(f_{v_2}(V_2)\).
  3. Are the random variables \(v_1\) and \(v_2\)
    1. independent
    2. orthogonal
    3. uncorrelated?
  4. Specify a uniform joint pdf \(f_{v_1v_2}(V_1,V_2,\kappa)\) of two independent, uncorrelated but not orthogonal random signals \(v_1(n)\), \(v_2(n)\).

Amount and difficulty

  • Working time: approx. xx minutes
  • Difficulty: ?

Additional Material

  1. Joint pdf of the two stationary random variables \(v_1\) and \(v_2\): \begin{equation*} f_{v_1v_2}(V_1,V_2,\kappa)= \begin{cases} \quad P_0, \qquad &\sqrt{V_1^2+V_2^2} \le \hat{V} \\ \quad 0, \qquad &\text{otherwise} \end{cases} \end{equation*}

    Joint pdf \(f_{v_1v_2}(V_1,V_2,\kappa)\):


    Product of the two marginal pdfs \(f_{v_1}(V_1)\cdot f_{v_2}(V_2)\):

  2. The following joint pdf fulfills the conditions:

    \(\Rightarrow\) \(v_1(n)\) and \(v_2(n)\) are independent, as \(f_{v_1v_2}(V_1,V_2,\kappa) = f_{v_1}(V_1)\cdot f_{v_2}(V_2)\)

Solution

  1. The following picture shows the two-dimensional uniform distribution with a circular base that is defined by the pdf given in the problem: a constant value \(P_0\) over a disc of radius \(\hat{V}\) in the \((V_1,V_2)\)-plane.

    The value of \(P_0\) follows from the normalization of the joint pdf. Since the pdf takes the constant value \(P_0\) over the disc of radius \(\hat{V}\) and is zero elsewhere, the double integral equals \(P_0 \cdot \pi \hat{V}^2\): \begin{equation*} F_{v_1v_2}(\infty, \infty, \kappa) = \int \limits_{-\infty}^{\infty} \int \limits_{-\infty}^{\infty} f_{v_1v_2} (V_1,V_2,\kappa) \; dV_1 \;dV_2 = P_0 \cdot \pi \hat{V}^2 = 1 \ \ \ \Longrightarrow \ \ \ P_0 = \frac{1}{\pi \hat{V}^2} \end{equation*}
  2. The marginal probability density function is defined by \begin{equation*} f_{v_1}(V_1) = \int \limits_{V_2 = -\infty}^{\infty} f_{v_1v_2}(V_1, V_2, \kappa) \; dV_2 \end{equation*} For \(|V_1| \leq \hat{V}\) the integrand equals \(P_0\) for \(|V_2| \leq \sqrt{\hat{V}^2 - V_1^2}\) and is zero otherwise. Hence the two marginal probability density functions are \begin{align*} f_{v_1}(V_1) &= 2 \cdot P_0 \sqrt{\hat{V}^2-V_1^2}, \qquad |V_1| \leq \hat{V}\\ f_{v_2}(V_2) &= 2 \cdot P_0 \sqrt{\hat{V}^2-V_2^2}, \qquad |V_2| \leq \hat{V}\\ \end{align*} and zero outside these ranges.
  3. The variables are
    1. independent, if \begin{equation*} f_{v_1v_2} (V_1,V_2,\kappa) = f_{v_1} (V_1) \cdot f_{v_2} (V_2) \text{ .} \end{equation*} The product of the two marginal pdfs is not constant over the circular region, so it cannot equal the joint pdf. \(\longrightarrow\) \(v_1\) and \(v_2\) are not independent
    2. orthogonal, if \begin{equation*} s_{v_1v_2}(\kappa) = \int \limits_{-\infty}^{\infty} \int \limits_{-\infty}^{\infty} V_1^* V_2 \cdot f_{v_1v_2} (V_1,V_2,\kappa) \; dV_1 \; dV_2 = 0 \text{ .} \end{equation*} Here the integrand is an odd function of \(V_1\) (and of \(V_2\)) over a region that is symmetric about the origin, so the integral vanishes. \(\longrightarrow v_1\) and \(v_2\) are orthogonal
    3. uncorrelated, if \begin{equation*} \psi_{v_1v_2}(\kappa) = s_{v_1v_2}(\kappa) - \mu_{v_1}^* \mu_{v_2} = 0 \text{ .} \end{equation*} As both marginal pdfs are symmetric about the origin, the mean values \(\mu_{v_1}\) and \(\mu_{v_2}\) are equal to zero, and together with \(s_{v_1v_2}(\kappa) = 0\) this gives \(\psi_{v_1v_2}(\kappa) = 0\).
      \(\longrightarrow v_1\) and \(v_2\) are uncorrelated (a numerical cross-check of parts 1 to 3 is sketched below, after this solution)
  4. The following joint pdf fulfills the conditions (for example, a pdf that is constant over a rectangular region with sides parallel to the \(V_1\)- and \(V_2\)-axes, centered at a point with \(V_1 \neq 0\) and \(V_2 \neq 0\)):

    \(\Rightarrow\) The shown pdf is uniform as all possible combinations of \(V_1\) and \(V_2\) are equally likely to be observed. 
    \(\Rightarrow\) \(v_1(n)\) and \(v_2(n)\) are independent, as \(f_{v_1v_2}(V_1,V_2,\kappa) = f_{v_1}(V_1)\cdot f_{v_2}(V_2)\).
    \(\Rightarrow\) As \(v_1(n)\) and \(v_2(n)\) are independent, they are also uncorrelated.
    \(\Rightarrow\) \(v_1(n)\) and \(v_2(n)\) are not orthogonal, as \(\mu_{v_1} \neq 0\) and \(\mu_{v_2} \neq 0\).
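
The statements of parts 1 to 3 can also be cross-checked numerically. The following sketch is not part of the original solution; it assumes \(\hat{V} = 1\), a fixed sample size and NumPy as tooling (all of which are illustrative choices only) and draws samples uniformly from the disc by rejection sampling.

    import numpy as np

    # Monte Carlo cross-check for the circular uniform pdf.
    # Illustrative assumptions: V_hat = 1, roughly 10^6 accepted samples.
    rng = np.random.default_rng(0)
    V_hat = 1.0

    # Rejection sampling: draw uniformly on the square [-V_hat, V_hat]^2 and
    # keep only the points that lie inside the disc of radius V_hat.
    cand = rng.uniform(-V_hat, V_hat, size=(2, 2_000_000))
    inside = cand[0]**2 + cand[1]**2 <= V_hat**2
    v1, v2 = cand[0, inside], cand[1, inside]

    print("P_0             :", 1.0 / (np.pi * V_hat**2))  # part 1: normalization constant
    print("mean v1, mean v2:", v1.mean(), v2.mean())       # part 3: both close to zero
    print("E{v1*v2}        :", (v1 * v2).mean())           # part 3: close to zero -> orthogonal

    # Part 3 (independence): the joint pdf P_0 = 1/(pi*V_hat^2) differs from the product
    # of the marginals at the origin, (2*P_0*V_hat)^2 = 4/(pi^2*V_hat^2), so v1 and v2
    # are not independent.
    print("joint pdf at (0,0)  :", 1.0 / (np.pi * V_hat**2))
    print("product of marginals:", (2.0 / (np.pi * V_hat))**2)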

 

6. ACF of a sum process.

Task

Given two stationary random processes \(v_1(n)\) and \(v_2(n)\), determine the auto-correlation function (ACF) \(s_{vv}\) of the sum process \(v(n,\kappa)=v_1(n)+v_2(n+\kappa)\).

Amount and difficulty

      • Working time: approx. xx minutes
      • Difficulty: ?

Solution

Given: The sum process \begin{equation*} v(n,\kappa )=v_1(n)+v_2(n+\kappa ), \qquad \text{where } v_1(n) \text{ and } v_2(n) \text{ are stationary} \end{equation*}

Wanted: The auto-correlation function (ACF) \(s_{vv}(\eta,\kappa )\).

Definition of the ACF \(s_{vv}(\eta,\kappa )\): \begin{align*} s_{vv}(\eta,\kappa )&=E\{v(n,\kappa )\cdot v(n+\eta,\kappa ) \} \\ &= \cdots \\ s_{vv}(\eta,\kappa )&= s_{v_1v_1}(\eta) + s_{v_1v_2}(\kappa +\eta) + s_{v_1v_2}(\kappa -\eta)+ s_{v_2v_2}(\eta) \end{align*} \(\rightarrow\) \(s_{vv}(\eta,\kappa )\) depends on \(\kappa\) although \(v_1(n)\) and \(v_2(n)\) are stationary and the mean \(\mu_v\) is constant, i.e. the sum process \(v(n,\kappa)\) is in general not stationary!
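
For completeness, the step behind the dots is simply the standard expansion of the product, using the linearity of the expectation and the stationarity of \(v_1(n)\) and \(v_2(n)\) (real-valued processes are assumed here, as in the result above): \begin{align*} s_{vv}(\eta,\kappa) &= E\{[v_1(n)+v_2(n+\kappa)]\cdot[v_1(n+\eta)+v_2(n+\eta+\kappa)]\}\\ &= E\{v_1(n)\, v_1(n+\eta)\} + E\{v_1(n)\, v_2(n+\eta+\kappa)\} + E\{v_2(n+\kappa)\, v_1(n+\eta)\} + E\{v_2(n+\kappa)\, v_2(n+\eta+\kappa)\}\\ &= s_{v_1v_1}(\eta) + s_{v_1v_2}(\kappa+\eta) + s_{v_1v_2}(\kappa-\eta) + s_{v_2v_2}(\eta) \end{align*}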

 

7. PDF of a sum process.

Task

Let \(v_1(n)\) and \(v_2(n)\) be two independent random processes with the following probability density functions:

  1. Determine the mean and the variance of the sum process \(v(n)=v_1(n)+v_2(n)\).
  2. Determine and sketch the pdf \(f_v(V)\) of the sum process.

Amount and difficulty

    • Working time: approx. xx minutes
    • Difficulty: ?

Solution

  1. The mean value of a sum process is the sum of the individual mean values: \begin{equation*} \mu_v = \mu_{v_1} + \mu_{v_2} = \cdots = \frac{2}{3} \text{ .} \end{equation*}

    The variance of a sum process is given by \begin{equation*} \sigma^2 _v = \sigma^2_{v_1} + \sigma^2_{v_2} + 2 \psi_{v_1v_2}(\kappa) = \cdots = \frac{2}{9} \text{ .} \end{equation*}

    Hint: The problem states that the two random processes are independent, and independence implies uncorrelatedness \(\rightarrow \psi_{v_1v_2}(\kappa) = 0 \).

  2. In general the pdf of a sum process is defined by the convolution \begin{equation*} f_v(V) = f_{v_1}(V) * f_{v_2}(V) = \int \limits_{X=-\infty}^{\infty} f_{v_1}(X) \cdot f_{v_2}(V-X) \; dX\text{ .} \end{equation*}

    To solve this integral, a case-by-case analysis of the overlapping regions of the two pdfs is carried out, which yields: \begin{equation*} f_v(V) = \begin{cases} -\frac{1}{8}V^2+\frac{1}{4}V+\frac{3}{8} & \text{, } -1\leq V < 1\\ \frac{1}{8}V^2-\frac{3}{4}V+\frac{9}{8} & \text{, } 1\leq V < 3\\ 0 & \text{, otherwise.} \end{cases} \end{equation*}
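
As a plausibility check (not part of the original solution), the piecewise result can be evaluated numerically. The short NumPy sketch below, with an arbitrarily chosen grid resolution, confirms that the stated pdf integrates to one and reproduces the mean \(\mu_v = 2/3\) from part 1.

    import numpy as np

    # Numerical sanity check of the piecewise pdf of the sum process.
    # The grid resolution is an arbitrary choice and not part of the exercise.
    V = np.linspace(-1.0, 3.0, 400_001)
    dV = V[1] - V[0]

    f_v = np.where(V < 1.0,
                   -V**2 / 8 + V / 4 + 3 / 8,     # piece for -1 <= V < 1
                   V**2 / 8 - 3 * V / 4 + 9 / 8)  # piece for 1 <= V < 3

    print("area under f_v:", np.sum(f_v) * dV)      # close to 1 (normalization)
    print("mean of v     :", np.sum(V * f_v) * dV)  # close to 2/3 (matches part 1)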

 

8. Mapping of a random process.

Task

Consider a stationary, uniformly distributed random process \(v(n)\) mapped to \begin{equation*} y(n) = a \cdot v(n) + b \text{ .} \end{equation*} Determine the mean \(\mu_y\), the variance \(\sigma_y^2\) and the pdf \(f_y(Y)\) of the resulting random process \(y(n)\).

Amount and difficulty

    • Working time: approx. xx minutes
    • Difficulty: ?

Solution

The random process is mapped linearly to \(y(n)\), i.e. \(g(V) = a \cdot V + b = Y\). Remember that \(v(n)\) is a uniformly distributed random process.

  1. Find the pdf of this new random variable \(Y\) \begin{equation*} f_y(Y) = \left. \frac{f_v(V)}{|dg(V)/dV|} \right|_{V=g^{-1}[Y]} \ \ \ \text{with } V=g^{-1}[Y]= \frac{Y-b}{a} \text{ , } \frac{dg(V)}{dV}=a \end{equation*} \begin{equation*} f_y(Y) = \begin{cases} \frac{1}{|a|} \frac{1}{V_{max}-V_{min}} &\text{, } Y \in \left[ a \cdot V_{min} + b, a \cdot V_{max} + b \right] \\ 0 &\text{, otherwise}\end{cases} \end{equation*} (For \(a < 0\) the two interval limits change places.)

  2. Find the mean value \begin{equation*} \mu_y = \frac{1}{2} \left[ (a \cdot V_{max} + b) + (a \cdot V_{min} + b) \right] = \cdots = a \cdot \mu_v + b \end{equation*}

  3. Find the variance \begin{equation*} \sigma^2_y = \frac{1}{12} \left[ a\cdot (V_{max}-V_{min}) \right]^2 = \cdots = a^2 \cdot \sigma^2_v \end{equation*}
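
A short simulation illustrates these relations. The sketch below is illustrative only; the values of \(a\), \(b\), \(V_{min}\) and \(V_{max}\) are assumptions that are not given in the exercise.

    import numpy as np

    # Simulation sketch for the linear mapping y(n) = a*v(n) + b of a uniform process.
    # The values of a, b, V_min and V_max are illustrative assumptions only.
    rng = np.random.default_rng(0)
    a, b = 2.0, 1.0
    V_min, V_max = 0.0, 1.0

    v = rng.uniform(V_min, V_max, size=1_000_000)  # uniformly distributed input process
    y = a * v + b                                  # linear mapping

    mu_v = 0.5 * (V_min + V_max)
    var_v = (V_max - V_min)**2 / 12.0
    print("mean of y    :", y.mean(), " theory a*mu_v + b      :", a * mu_v + b)
    print("variance of y:", y.var(),  " theory a^2 * sigma_v^2 :", a**2 * var_v)
    # y is again uniformly distributed, on [a*V_min + b, a*V_max + b] for a > 0,
    # with constant pdf 1/(|a| * (V_max - V_min)).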
