3. Reminders on statistical laws [bib4]#
3.1. Definitions#
Parameter \(t\): discrete \({({t}_{n})}_{n=1,N}\) or continuous (time or a space variable).
\(X(t)\): random process.
Each instant \({t}_{n}\) is associated with a random variable \({X}_{n}\), whose realization is denoted \({x}_{n}\).
Thus \(x(t)={({x}_{n}=x({t}_{n}))}_{n=1,N}\) is a realization of the process \(X(t)\), a process made up of \(N\) random variables that are a priori independent.
Each variable \({X}_{n}\) is characterized by its distribution function \({F}_{n}(x,{t}_{n})=\text{Prob}({X}_{n}\le x)\) or by its probability density \(p(x,{t}_{n})=\frac{\partial {F}_{n}}{\partial x}(x,{t}_{n})\).
The random process is also characterized by its moment functions, of which the first two are of particular importance: the mathematical expectation or mean \(m(t)\), also noted \(E\left[X(t)\right]\), and, for any pair \(({t}_{1},{t}_{2})\), the autocorrelation function \(R({t}_{1},{t}_{2})\) or \({R}_{\text{XX}}({t}_{1},{t}_{2})\), also noted \(E\left[X({t}_{1})X({t}_{2})\right]\).
\(\begin{array}{}m(t)=E\left[X(t)\right]=\int x\,p(x,t)\,dx\\ {R}_{\text{XX}}({t}_{1},{t}_{2})=E\left[X({t}_{1})X({t}_{2})\right]=\int \int {x}_{1}{x}_{2}\,p({x}_{1},{t}_{1};{x}_{2},{t}_{2})\,d{x}_{1}\,d{x}_{2}\end{array}\)
An intercorrelation (cross-correlation) function is also defined for two processes \(X(t)\) and \(Y(t)\).
\({R}_{\text{XY}}({t}_{1},{t}_{2})=E\left[X({t}_{1})Y({t}_{2})\right]=\int \int {x}_{1}{y}_{2}p({x}_{1},{t}_{1};{y}_{2},{t}_{2}){\text{dx}}_{1}{\text{dy}}_{2}\)
The "spread" of the process is characterized by the variance:
\({\sigma }^{2}(t)=E\left[{(X(t)-m(t))}^{2}\right]\)
For a zero mean process (\(m=0\)), the variance, which then characterizes the "intensity" of the phenomenon (square of the standard deviation, i.e. the mean squared value), is equal to the autocorrelation function at \(t={t}_{1}={t}_{2}\):
\({\sigma }^{2}(t)=E\left[X(t)X(t)\right]={R}_{\text{XX}}(t,t)=\int {x}^{2}p(x,t)\,dx\)
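As an illustration, the ensemble averages above can be estimated numerically over many realizations. The sketch below (with arbitrarily chosen parameters; Gaussian white noise is only a convenient example, not a process prescribed by this document) checks that the estimated mean is close to zero and that \(E\left[X(t)X(t)\right]={R}_{\text{XX}}(t,t)\) is close to \({\sigma }^{2}\) at every instant.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: ensemble of realizations of a zero-mean
# Gaussian process (white noise of standard deviation sigma = 2).
n_real, n_times = 100_000, 8
sigma = 2.0
x = rng.normal(0.0, sigma, size=(n_real, n_times))

# Ensemble estimates of the mean m(t) and of R_XX(t, t) = E[X(t) X(t)]
m_t = x.mean(axis=0)        # close to 0 at every instant
r_tt = (x * x).mean(axis=0)  # close to sigma**2 = 4 at every instant
```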
3.2. Hypotheses in random dynamics#
Several hypotheses are classically made in random dynamics: the processes studied are assumed to be stationary, with zero mean, and ergodic.
3.2.1. Stationary processes with zero mean - variance#
A process is said to be stationary if all of its "probabilistic characteristics" are invariant under any translation \({t}_{0}\) of the parameter \(t\). This implies:
\(\begin{array}{}m(t)=\text{Cte}\\ {R}_{\text{XX}}({t}_{1},{t}_{2})={R}_{\text{XX}}({t}_{2}-{t}_{1})={R}_{\text{XX}}(\tau )={R}_{\text{XX}}(-\tau )\end{array}\)
For a zero mean process, \({\sigma }^{2}={R}_{\text{XX}}(0)\).
3.2.2. Ergodicity#
This notion comes from a line of reasoning by Gibbs (1839-1903), for whom the observation time of a physical phenomenon can be considered infinite compared with the time scale at the molecular level. The system then passes through all possible states, staying longest, or returning most often, in the most probable states, so that the time average becomes equal to the statistical mean over the states, that is to say the mathematical expectation. This extends to the correlation and intercorrelation functions.
\(\begin{array}{}m=\underset{T\to +\infty }{\text{lim}}\frac{1}{T}\underset{-T/2}{\overset{+T/2}{\int }}x(t)\,dt\\ {R}_{\text{XX}}(\tau )=\underset{T\to +\infty }{\text{lim}}\frac{1}{T}\underset{-T/2}{\overset{+T/2}{\int }}x(t-\tau )x(t)\,dt\end{array}\)
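The ergodicity hypothesis can be checked numerically on a single long realization. The sketch below uses a first-order autoregressive (AR(1)) process as a hypothetical example, chosen only because its theoretical autocorrelation is known in closed form: \(R_{XX}(\tau)={\sigma }_{e}^{2}\,{a}^{\mid \tau \mid }/(1-{a}^{2})\) for unit-variance driving noise.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical example: one long realization of a stationary,
# zero-mean AR(1) process x[k] = a * x[k-1] + e[k].
n, a = 200_000, 0.9
e = rng.normal(size=n)
x = np.empty(n)
x[0] = e[0]
for k in range(1, n):
    x[k] = a * x[k - 1] + e[k]

# Time averages over the single realization (ergodicity hypothesis)
m_hat = x.mean()  # close to 0
r_hat = np.array([np.mean(x[:n - tau] * x[tau:]) for tau in range(4)])

# Theoretical autocorrelation of the AR(1) model, for comparison
r_theory = np.array([a**tau / (1 - a**2) for tau in range(4)])
```

The time-averaged estimates `m_hat` and `r_hat` approach the statistical mean and the autocorrelation function as the observation length grows, which is precisely the content of the ergodicity hypothesis.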
Note:
For the remainder of the document, the random process will be assumed to be stationary with zero mean. All the developments carried out in Code_Aster satisfy these hypotheses.
3.3. Power spectral density#
Within this statistical framework, a very general definition of the power spectral density, or DSP, can be given. In Code_Aster, the following definitions, expressed in frequency or in pulsation, are used:
\(\begin{array}{}{S}_{\text{XX}}(f)=\underset{-\infty }{\overset{+\infty }{\int }}{R}_{\text{XX}}(\tau ){e}^{-2i\pi f\tau }d\tau ;\quad {G}_{\text{XX}}(f)=\underset{0}{\overset{+\infty }{\int }}{R}_{\text{XX}}(\tau ){e}^{-2i\pi f\tau }d\tau \\ {S}_{\text{XX}}^{p}(\omega )=\frac{1}{2\pi }\underset{-\infty }{\overset{+\infty }{\int }}{R}_{\text{XX}}(\tau ){e}^{-i\omega \tau }d\tau ;\quad {G}_{\text{XX}}^{p}(\omega )=\frac{1}{2\pi }\underset{0}{\overset{+\infty }{\int }}{R}_{\text{XX}}(\tau ){e}^{-i\omega \tau }d\tau \end{array}\)
which lead to the following relationships: \(\begin{array}{}{G}_{\text{XX}}^{p}(\omega )=\frac{1}{2\pi }{G}_{\text{XX}}(f)\\ {S}_{\text{XX}}(f)=2{G}_{\text{XX}}(f)\\ {S}_{\text{XX}}^{p}(\omega )=2{G}_{\text{XX}}^{p}(\omega )\end{array}\)
We can show that \({G}_{\text{XX}}(f)\), obtained from the Fourier transform of \({R}_{\text{XX}}(\tau )\), is real and positive. Refer to [Annexe1], which gathers all the conventions adopted to ensure the consistency of the results.
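A consequence of these conventions worth checking numerically is that integrating the two-sided DSP \({S}_{\text{XX}}(f)\) over all frequencies recovers \({R}_{\text{XX}}(0)={\sigma }^{2}\). The sketch below uses a hypothetical exponential correlation \(R_{XX}(\tau)={\sigma }^{2}{e}^{-\mid \tau \mid /\theta }\), whose transform under the frequency convention above is \(S_{XX}(f)=2{\sigma }^{2}\theta /(1+{(2\pi f\theta )}^{2})\); the parameter values are arbitrary.

```python
import numpy as np

# Hypothetical example: exponential correlation with variance sigma**2
# and correlation time theta; its two-sided DSP in frequency is
# S_XX(f) = 2 * sigma**2 * theta / (1 + (2*pi*f*theta)**2).
sigma, theta = 1.5, 0.2
f = np.linspace(-500.0, 500.0, 1_000_001)
s_xx = sigma**2 * 2.0 * theta / (1.0 + (2.0 * np.pi * f * theta)**2)

# Riemann-sum integration of S_XX over (truncated) frequency support:
# this must give back R_XX(0) = sigma**2 = 2.25 (up to truncation error)
variance = np.sum(s_xx) * (f[1] - f[0])
```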
3.4. Spectral moments#
The following quantities, defined here in pulsation, are called spectral moments:
\({\lambda }_{i}=\underset{-\infty }{\overset{+\infty }{\int }}{\mid \omega \mid }^{i}{S}_{\text{XX}}^{p}(\omega )\,d\omega =\underset{-\infty }{\overset{+\infty }{\int }}{\mid 2\pi f\mid }^{i}{S}_{\text{XX}}(f)\,df\)
In particular: \({\lambda }_{0}={\sigma }^{2}\), \({\lambda }_{2}={\sigma }_{\dot{X}}^{2}\), \({\lambda }_{4}={\sigma }_{\ddot{X}}^{2}\), which are the variances of \(X\) and of its first two derivatives.
These moments are systematically calculated up to order 4; the keyword MOMENT makes it possible to request the calculation of higher-order moments. In Code_Aster, the calculation is performed for a DSP expressed as a function of the frequency \(f\).
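To illustrate the spectral-moment definition in the frequency variable, the sketch below computes \({\lambda }_{0}\), \({\lambda }_{2}\) and \({\lambda }_{4}\) for a band-limited white-noise DSP, a hypothetical example chosen because the moments have simple closed forms (\({\lambda }_{0}=2{s}_{0}{f}_{c}\), \({\lambda }_{2}=\frac{2}{3}{s}_{0}{(2\pi )}^{2}{f}_{c}^{3}\), \({\lambda }_{4}=\frac{2}{5}{s}_{0}{(2\pi )}^{4}{f}_{c}^{5}\)); the level `s0` and cutoff `fc` are arbitrary.

```python
import numpy as np

# Hypothetical example: band-limited white-noise DSP,
# S_XX(f) = s0 for |f| <= fc, zero elsewhere.
s0, fc = 0.5, 10.0
f = np.linspace(-fc, fc, 200_001)
s_xx = np.full_like(f, s0)
df = f[1] - f[0]

# Spectral moments lambda_i = integral of |2*pi*f|**i * S_XX(f) df,
# approximated by a Riemann sum over the support of the DSP
lam = [np.sum(np.abs(2.0 * np.pi * f) ** i * s_xx) * df for i in (0, 2, 4)]

# Closed-form values for this rectangular DSP, for comparison
exact = [2 * s0 * fc,
         2 * s0 * (2 * np.pi) ** 2 * fc**3 / 3,
         2 * s0 * (2 * np.pi) ** 4 * fc**5 / 5]
```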