The Root Mean Square Error (RMSE) is calculated using the following formula:
\begin{equation}
RMSE = \sqrt{\frac{1}{n} \sum_{i=1}^{n}(y_i - \hat{y}_i)^2}
\end{equation}
where:\\
1. $n$ is the number of observations,\\
2. $y_i$ is the actual value of the $i$-th observation,\\
3. $\hat{y}_i$ is the estimated value of the $i$-th observation.\\
The formula computes the square root of the average of the squared differences between the actual and estimated values, giving a measure of the typical deviation of the estimates from the actual values.
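As a quick illustration, RMSE can be computed directly from its definition. The sketch below is our own helper (the name \texttt{rmse} is not from any particular library):

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error: sqrt of the mean squared difference."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

# Differences are (0, 0, 2), so RMSE = sqrt(4/3)
print(rmse([1.0, 2.0, 3.0], [1.0, 2.0, 5.0]))
```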
\section{Estimation of three parameters}
We first estimated the three parameters in the Pearson Type III distribution using the method of moment estimation and maximum likelihood estimation. Detailed descriptions follow.
\subsection{The method of moment estimation}
First, recalling formulas (1) and (2), the raw moments ($m_k$) and central moments ($t_k$) of a probability distribution can be expressed in terms of the probability density function (PDF) $f(x)$ as follows:\\
1. Raw moment ($m_k$):\\
\begin{equation}
m_k = \int_{-\infty}^{\infty} x^k f(x) \, dx=E(X^k)
\end{equation}
The k-th raw moment is the expectation of $x^k$, i.e., the k-th power of the variable weighted by the PDF $f(x)$ over the entire range of possible values. It provides information about the distribution's location, spread, and shape.\\
2. Central Moment ($t_k$):\\
\begin{equation}
t_k = \int_{-\infty}^{\infty} (x - E(x))^k f(x) \, dx= E[(X-E(X))^k]
\end{equation}
The central moment is similar to the raw moment but is taken about the mean $E(X)$ of the distribution; it measures the spread or dispersion of the distribution around its mean.
These integrals represent the area under the PDF weighted by the respective functions of $x$. They are fundamental in statistical analysis for quantifying characteristics of probability distributions such as variance, skewness, and kurtosis.\\
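The two moment integrals can be checked numerically for a distribution with known moments. The sketch below is an illustration, not part of the derivation; it uses the standard exponential distribution, whose mean is 1, second raw moment is 2, variance is 1, and third central moment is 2:

```python
import math
from scipy.integrate import quad

# PDF of the standard exponential distribution: f(x) = e^{-x} for x >= 0
f = lambda x: math.exp(-x)

mean, _ = quad(lambda x: x * f(x), 0, math.inf)              # raw moment m_1 = E(X) = 1
m2, _ = quad(lambda x: x**2 * f(x), 0, math.inf)             # raw moment m_2 = 2
t2, _ = quad(lambda x: (x - mean)**2 * f(x), 0, math.inf)    # central moment t_2 = 1
t3, _ = quad(lambda x: (x - mean)**3 * f(x), 0, math.inf)    # central moment t_3 = 2
```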
Here, we set\\
$\bar{x}=E(X)$, $s^2=t_2=E[(X-E(X))^2]$, and $t_3=E[(X-E(X))^3]$.\\
Then, recalling equations (1.1), (1.3), (1.4), (1.5), (3.2), and (3.3) and equating sample moments to theoretical moments, we obtain\\
\begin{equation}
E(X)= \int_{\mu}^{\infty} x f(x;\mu, \sigma) \, dx=\mu + \beta \cdot \sigma=\bar{x}
\end{equation}
\begin{equation}
Var(X)=\int_{\mu}^{\infty} (x - E(x))^2 f(x) \, dx=\sigma^2 \cdot \beta=s^2
\end{equation}
\begin{equation}
\gamma(X) = \frac{E[(X - E(X))^3]}{SD^3} = \frac{2}{\sqrt{\beta}} = \frac{t_3}{s^3}
\end{equation}\\
where SD denotes the standard deviation of $X$.\\
According to (3.4), (3.5), and (3.6), the moment estimators $\hat{\mu}$, $\hat{\sigma}$, and $\hat{\beta}$ for $\mu$, $\sigma$, and $\beta$ are solved to be\\
\begin{equation}
\hat{\mu}=\bar{x}-s\sqrt{\frac{4s^6}{t_3^2}}
\end{equation}
\begin{equation}
\hat{\sigma}=\frac{s}{\sqrt{\frac{4s^6}{t_3^2}}}
\end{equation}
\begin{equation}
\hat{\beta} = \frac{4s^6}{t_3^2}
\end{equation}\\
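The three moment estimators translate directly into code. The sketch below is our own helper (assuming the sample variance and sample third central moment stand in for $s^2$ and $t_3$), verified on a simulated Pearson Type III sample, which is a shifted, scaled gamma sample:

```python
import numpy as np

def pearson3_moment_estimates(x):
    """Method-of-moments estimates (mu, sigma, beta) per Eqs. (3.7)-(3.9)."""
    x = np.asarray(x, dtype=float)
    xbar = x.mean()
    s2 = x.var()                   # sample counterpart of t_2 = s^2
    t3 = np.mean((x - xbar) ** 3)  # sample counterpart of t_3
    beta_hat = 4.0 * s2**3 / t3**2           # Eq. (3.9): 4 s^6 / t_3^2
    sigma_hat = np.sqrt(s2 / beta_hat)       # from Var(X) = sigma^2 * beta
    mu_hat = xbar - sigma_hat * beta_hat     # from E(X) = mu + beta * sigma
    return mu_hat, sigma_hat, beta_hat

# Simulated sample with true (mu, sigma, beta) = (2, 1.5, 4)
rng = np.random.default_rng(0)
x = 2.0 + 1.5 * rng.gamma(4.0, size=200_000)
mu_hat, sigma_hat, beta_hat = pearson3_moment_estimates(x)
```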
\subsection{The maximum likelihood estimation method}
According to Equations (1.1) and (2.3), the likelihood function of the Pearson Type III distribution is written as follows.
\begin{equation}
\begin{aligned}
& L(\theta)=\prod_{i=1}^n f\left(x_{i}; \mu, \sigma, \beta\right)\\
& = \prod_{i=1}^n \frac{1}{{\sigma^\beta}{\Gamma(\beta)}}(x_i-\mu)^{\beta-1} \cdot e^{-\frac{x_i-\mu}{\sigma}}\\
& = \sigma^{-n \beta} \cdot[\Gamma(\beta)]^{-n} \cdot \prod_{i=1}^n\left(x_i-\mu\right)^{\beta-1} \cdot e^{-\frac{1}{\sigma} \sum_{i=1}^n\left(x_i-\mu\right)}
\end{aligned}
\end{equation}\\
where, $\theta=(\mu, \sigma, \beta)$.\\
Then, taking the natural logarithm of (3.10), we obtain
\begin{equation}
\begin{aligned}
& l(\theta)=\log L(\theta)=-n \beta \log \sigma-n \log \Gamma(\beta)+(\beta-1) \sum_{i=1}^n \log \left(x_i-\mu\right)-\frac{1}{\sigma} \sum_{i=1}^n\left(x_i-\mu\right)
\end{aligned}
\end{equation}\\
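Rather than solving the score equations in closed form, the log-likelihood can also be maximized numerically. The sketch below is one illustrative route, not the paper's method; it uses scipy's \texttt{gammaln} for $\log\Gamma(\beta)$ and a derivative-free Nelder--Mead search, enforcing the parameter-space constraints ($\sigma>0$, $\beta>0$, $\mu<\min_i x_i$) by returning $+\infty$ outside them:

```python
import numpy as np
from scipy.special import gammaln
from scipy.optimize import minimize

def neg_loglik(theta, x):
    """Negative of the Pearson Type III log-likelihood l(theta)."""
    mu, sigma, beta = theta
    if sigma <= 0 or beta <= 0 or np.any(x <= mu):
        return np.inf  # outside the parameter space
    n = len(x)
    return -(-n * beta * np.log(sigma)
             - n * gammaln(beta)
             + (beta - 1) * np.sum(np.log(x - mu))
             - np.sum(x - mu) / sigma)

# Simulated Pearson Type III data with true (mu, sigma, beta) = (2, 1.5, 4)
rng = np.random.default_rng(1)
x = 2.0 + 1.5 * rng.gamma(4.0, size=5000)
res = minimize(neg_loglik, x0=[x.min() - 1.0, 1.0, 2.0], args=(x,),
               method="Nelder-Mead")
mu_hat, sigma_hat, beta_hat = res.x
```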
Applying the first-order condition stated in (2.5), the system of equations to be solved consists of\\
\begin{equation}
\frac{\partial l(\theta)}{\partial \mu}=-(\beta-1) \sum_{i=1}^n \frac{1}{x_i-\mu}+\frac{n}{\sigma}=0
\end{equation}
\begin{equation}
\frac{\partial l(\theta)}{\partial \sigma}=-\frac{n \beta}{\sigma}+\frac{1}{\sigma^2} \sum_{i=1}^n\left(x_i-\mu\right)=0