Original link: http://tecdat.cn/?p=26578
The exponential distribution is the probability distribution of the time between events in a Poisson process, so it is used to model the waiting time until the next event, for example, the time you need to wait at a bus stop until the next bus arrives.
In this article we will use the exponential distribution, assuming that its rate parameter (which governs the mean time between events) changes at some time point k, i.e.:

$$Y_i \sim \text{Exp}(\lambda), \quad i = 1, \dots, k, \qquad Y_i \sim \text{Exp}(\alpha), \quad i = k + 1, \dots, n.$$
Our main goal is to estimate the parameters λ, α, and k using a Gibbs sampler, given n observations from this distribution.
Gibbs Sampler
The Gibbs sampler is a special case of the Metropolis-Hastings sampler and is typically used when the target is a multivariate distribution. With this approach, the chain is generated by sampling each component from its full conditional distribution under the target, so every candidate point is accepted.
The Gibbs sampler generates a Markov chain as follows:

- Let X = (X_1, …, X_d) be a random vector in R^d; initialize X(0) at time t = 0.
- Repeat for each iteration t = 1, 2, 3, …:
  - Set x_1 = X_1(t − 1).
  - For each j = 1, …, d:
    - Generate X_j*(t) from the univariate full conditional density f(x_j | x_(−j)), where x_(−j) denotes all components other than x_j held at their most recent values.
    - Update x_j = X_j*(t).
  - Since every candidate point is accepted, set X(t) = (X_1*(t), …, X_d*(t)).
  - Increment t.
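As a quick illustration of the steps above, consider a toy target unrelated to the article's model: a bivariate normal with correlation rho, whose full conditionals are the univariate normals X | Y = y ~ N(rho·y, 1 − rho²) and Y | X = x ~ N(rho·x, 1 − rho²). The example and its variable names are ours:

```r
# Minimal Gibbs sampler for a bivariate normal with correlation rho.
# Each coordinate is drawn from its full conditional; every draw is accepted.
set.seed(1)
rho   <- 0.7
niter <- 5000
x <- y <- numeric(niter)
for (t in 2:niter) {
  x[t] <- rnorm(1, mean = rho * y[t - 1], sd = sqrt(1 - rho^2))  # X | Y
  y[t] <- rnorm(1, mean = rho * x[t],     sd = sqrt(1 - rho^2))  # Y | X
}
cor(x[-(1:500)], y[-(1:500)])  # close to rho after discarding a burn-in
```

Because each draw comes from an exact conditional of the target, there is no accept/reject step, unlike in general Metropolis-Hastings.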
Bayesian Formulation

A simple formulation of the change point problem assumes observations with known densities f and g before and after the change point:

$$Y_i \sim f(y), \quad i = 1, \dots, k, \qquad Y_i \sim g(y), \quad i = k + 1, \dots, n,$$

where k is unknown and k = 1, 2, …, n. Let Y_i be the elapsed time (in minutes) between bus arrivals at the bus stop. Assume that the change point occurs at the k-th observation, that is:

$$Y_i \sim \text{Exp}(\lambda), \quad i = 1, \dots, k, \qquad Y_i \sim \text{Exp}(\alpha), \quad i = k + 1, \dots, n.$$

With Y = (Y_1, Y_2, …, Y_n), the likelihood L(Y | k, λ, α) is given by:

$$L(Y \mid k, \lambda, \alpha) = \lambda^{k} e^{-\lambda \sum_{i=1}^{k} y_i} \; \alpha^{\,n-k} e^{-\alpha \sum_{i=k+1}^{n} y_i}.$$

A Bayesian model assuming independent priors is given by:

$$\lambda \sim \text{Gamma}(a_1, b_1), \qquad \alpha \sim \text{Gamma}(a_2, b_2), \qquad k \sim \text{Uniform}\{1, \dots, n\}.$$

The joint distribution of data and parameters is:

$$f(y, \lambda, \alpha, k) \propto \lambda^{\,k + a_1 - 1}\, \alpha^{\,n - k + a_2 - 1} \exp\!\left(-\lambda \Big(b_1 + \sum_{i=1}^{k} y_i\Big) - \alpha \Big(b_2 + \sum_{i=k+1}^{n} y_i\Big)\right),$$

where a_1, b_1, a_2, b_2 are the hyperparameters of the Gamma priors.
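For concreteness, the likelihood can be evaluated on the log scale in R. This helper is a sketch of ours (the name loglik and the argument names lam and alp are not from the original article):

```r
# Log-likelihood of the change-point exponential model:
# rate lam for y[1..k], rate alp for y[(k+1)..n].
loglik <- function(y, k, lam, alp) {
  n <- length(y)
  s_before <- sum(y[1:k])
  s_after  <- if (k < n) sum(y[(k + 1):n]) else 0
  k * log(lam) - lam * s_before + (n - k) * log(alp) - alp * s_after
}
```

Up to floating point error, it agrees with summing dexp(..., log = TRUE) over the two segments.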
As mentioned before, the implementation of the Gibbs sampler requires sampling from the full conditional distributions of the target, so we need to find the complete conditional distributions of λ, α, and k. How can you do this? Simply put, you select from the joint distribution introduced above only the terms that depend on the parameter of interest and ignore the rest.
The complete conditional distribution of λ is given by:

$$\lambda \mid \alpha, k, Y \sim \text{Gamma}\!\left(a_1 + k,\; b_1 + \sum_{i=1}^{k} y_i\right).$$
The complete conditional distribution of α is given by:

$$\alpha \mid \lambda, k, Y \sim \text{Gamma}\!\left(a_2 + n - k,\; b_2 + \sum_{i=k+1}^{n} y_i\right).$$
The complete conditional distribution of k is discrete, with

$$f(k \mid \lambda, \alpha, Y) \propto \left(\frac{\lambda}{\alpha}\right)^{k} \exp\!\left(-(\lambda - \alpha) \sum_{i=1}^{k} y_i\right), \qquad k = 1, \dots, n.$$
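Because k takes only the values 1, …, n, this conditional can be sampled by normalizing the n weights directly. The helper below is a sketch of ours (the name draw_k and the log-scale computation for numerical stability are not from the original article):

```r
# Draw k from its full conditional given the current rates lam and alp.
# Weight of k = j is (lam/alp)^j * exp(-(lam - alp) * sum(y[1:j])).
draw_k <- function(y, lam, alp) {
  n <- length(y)
  logw <- (1:n) * (log(lam) - log(alp)) - (lam - alp) * cumsum(y)
  w <- exp(logw - max(logw))  # subtract the max to avoid overflow/underflow
  sample(1:n, 1, prob = w / sum(w))
}
```

Working on the log scale matters when n is large, since (λ/α)^k can otherwise underflow to zero for every k.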
Calculation method
Here you will learn how to estimate the parameters λ, α, and k with the Gibbs sampler in R.
Data
First, we generate data from an exponential distribution with a change point:
set.seed(98712)
y <- c(rexp(25, rate = 2), rexp(35, rate = 10))
In terms of the bus stop example, buses initially arrive at a rate of 2 per minute (a mean waiting time of 1/2 minute), but from observation i = 26 onward they arrive at a rate of 10 per minute (a mean waiting time of 1/10 minute).
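As a sanity check (ours, not part of the original article), the empirical rate in each simulated segment can be compared against the values passed to rexp:

```r
# Empirical rate = 1 / mean inter-arrival time in each segment
set.seed(98712)
y <- c(rexp(25, rate = 2), rexp(35, rate = 10))
1 / mean(y[1:25])   # roughly 2
1 / mean(y[26:60])  # roughly 10
```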
Implementation of Gibbs sampler
First, we need to initialize k, λ and α.
n <- length(y)                  # number of observations in the sample
lcan <- 10000                   # chain length
lambda <- alpha <- k <- numeric(lcan)
a1 <- b1 <- a2 <- b2 <- 0.01    # Gamma prior hyperparameters (assumed; cut off in the original)
k[1] <- sample(1:n, 1)          # initialize k uniformly on 1..n
lambda[1] <- alpha[1] <- 1      # initial rates (assumed; cut off in the original)
Now, for each iteration of the algorithm, we generate λ(t), α(t), and k(t) as follows (remember that there are no observations after the change point when k + 1 > n):
for (i in 2:lcan) {
  kt <- k[i - 1]
  # generate lambda from its full conditional
  lambda[i] <- rgamma(1, shape = a1 + kt, rate = b1 + sum(y[1:kt]))
  # generate alpha (no observations after the change point when kt + 1 > n)
  sa <- if (kt + 1 > n) 0 else sum(y[(kt + 1):n])
  alpha[i] <- rgamma(1, shape = a2 + n - kt, rate = b2 + sa)
  # generate k from its discrete full conditional
  L <- numeric(n)
  for (j in 1:n) {
    L[j] <- (lambda[i] / alpha[i])^j * exp(-(lambda[i] - alpha[i]) * sum(y[1:j]))
  }
  k[i] <- sample(1:n, 1, prob = L / sum(L))
}
# delete the first 9000 values in the chain as burn-in
burnIn <- 9000
Results
In this section we present the chains generated by the Gibbs sampler and the posterior distributions of the parameters λ, α, and k. The true values of the parameters are shown by red lines.
The following table shows the actual values of the parameters and the average of the estimated values obtained using the Gibbs sampler:
res <- c(mean(k[-(1:burnIn)]), mean(lambda[-(1:burnIn)]), mean(alpha[-(1:burnIn)]))
res
Conclusion
From the results, we can conclude that the posterior means of the parameters k, λ, and α obtained with the Gibbs sampler in R are close to the actual values of the parameters of the exponential distribution with a change point, although we would expect better estimates. This may be due to the choice of the initial values of the chain or the choice of the prior distributions of λ and α.
This article is selected from "R Language Bayesian Metropolis-Hastings Gibbs Sampler Estimation of a Change-Point Exponential Distribution: Poisson Process Bus Stop Waiting Time Analysis".