Original link: http://tecdat.cn/?p=26578
The exponential distribution describes the time between events in a Poisson process, so it is used to model the wait until the next event occurs, for example, the time you wait at a bus stop until the next bus arrives.
In this article we work with the exponential distribution, assuming that its rate parameter changes from λ to α at some unknown point k, namely:

Yi ~ Exp(λ) for i = 1, …, k and Yi ~ Exp(α) for i = k + 1, …, n.
Our main goal is to estimate the parameters λ, α, and k with a Gibbs sampler, given a sample of n observations from this distribution.
Gibbs Sampler
The Gibbs sampler is a special case of the Metropolis-Hastings sampler and is often used when the target is a multivariate distribution. With this approach, the chain is generated by sampling from the full conditional distributions of the target, so every candidate is accepted.
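To see why every candidate is accepted, consider the Metropolis-Hastings acceptance ratio when the proposal for component j is the full conditional itself (a short derivation added for clarity, not in the original article):

```latex
% Proposal: x'_j \sim \pi(\cdot \mid x_{-j}), keeping x'_{-j} = x_{-j}.
\alpha(x, x') = \min\!\left\{1,
  \frac{\pi(x')\, q(x \mid x')}{\pi(x)\, q(x' \mid x)}\right\}
= \min\!\left\{1,
  \frac{\pi(x'_j \mid x_{-j})\,\pi(x_{-j})\,\pi(x_j \mid x_{-j})}
       {\pi(x_j \mid x_{-j})\,\pi(x_{-j})\,\pi(x'_j \mid x_{-j})}\right\} = 1.
```

Since the ratio is identically 1, no proposal is ever rejected.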
The Gibbs sampler generates a Markov chain as follows:

- Let X = (X1, …, Xd) be a random vector in R^d; initialize X(0) at time t = 0.
- Repeat for each iteration t = 1, 2, 3, …:
  - Set x = X(t−1).
  - For each j = 1, …, d:
    - Generate Xj(t) from f(Xj | x(−j)), the univariate conditional density of Xj given all the other components x(−j).
    - Update xj to Xj(t).
  - Since every candidate point is accepted, set X(t) = x.
  - Increase t.
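The steps above can be sketched with a toy example (not part of the original analysis): a bivariate standard normal target with correlation rho, whose full conditionals are univariate normals, so each component is drawn directly and accepted.

```r
set.seed(1)
rho <- 0.6                # correlation of the bivariate normal target
iters <- 5000
x <- y <- numeric(iters)  # chains, initialized at (0, 0)
for (t in 2:iters) {
  # X | Y = y ~ N(rho * y, 1 - rho^2); every draw is accepted
  x[t] <- rnorm(1, mean = rho * y[t - 1], sd = sqrt(1 - rho^2))
  # Y | X = x ~ N(rho * x, 1 - rho^2)
  y[t] <- rnorm(1, mean = rho * x[t], sd = sqrt(1 - rho^2))
}
cor(x[-(1:1000)], y[-(1:1000)])  # close to rho after discarding burn-in
```

The sample correlation of the chain, after burn-in, should be near the target value rho.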
Bayesian formulation
A simple formulation of the change point problem assumes known densities f and g on the two sides of the change point:

Yi ~ f for i = 1, …, k and Yi ~ g for i = k + 1, …, n,

where k is unknown, k = 1, 2, …, n. Let Yi be the elapsed time in minutes between successive bus arrivals at the bus stop, and suppose the change occurs at the k-th observation, namely:

Yi ~ Exp(λ) for i ≤ k and Yi ~ Exp(α) for i > k.

With Y = (Y1, Y2, …, Yn), the likelihood is:

L(Y | λ, α, k) = λ^k exp(−λ Σ_{i=1}^{k} Yi) · α^(n−k) exp(−α Σ_{i=k+1}^{n} Yi).

Assume a Bayesian model with independent priors, for example:

λ ~ Gamma(a1, b1), α ~ Gamma(a2, b2), k ~ Uniform{1, …, n}.

The joint distribution of data and parameters is then the product of the likelihood and the priors:

f(Y, λ, α, k) = L(Y | λ, α, k) · π(λ) π(α) π(k),

where π(·) denotes the corresponding prior density.
As mentioned before, implementing the Gibbs sampler requires sampling from the full conditional distributions of the target, so we need to find the full conditional distribution of each of λ, α, and k. How can you do this? In simple terms, from the joint distribution above you select the terms that depend on the parameter of interest and ignore the rest.
Under a Gamma(a1, b1) prior on λ, its full conditional distribution is:

λ | Y, α, k ~ Gamma(a1 + k, b1 + Σ_{i=1}^{k} Yi).
Under a Gamma(a2, b2) prior on α, its full conditional distribution is:

α | Y, λ, k ~ Gamma(a2 + n − k, b2 + Σ_{i=k+1}^{n} Yi).
The full conditional distribution of k is discrete on {1, …, n}:

P(k = j | Y, λ, α) ∝ (λ/α)^j exp(−(λ − α) Σ_{i=1}^{j} Yi), for j = 1, …, n.
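To make this discrete conditional concrete, here is one way to draw a single value of k in R (an illustrative sketch; lam and alp stand for hypothetical current values of λ and α, and the weights are computed on the log scale for numerical stability):

```r
set.seed(42)
y <- c(rexp(25, rate = 2), rexp(35, rate = 10))  # data with a change at i = 26
n <- length(y)
lam <- 2; alp <- 10                 # hypothetical current chain values
S <- cumsum(y)                      # S[j] = Y_1 + ... + Y_j
# log of (lam/alp)^j * exp(-(lam - alp) * S_j)
logw <- (1:n) * log(lam / alp) - (lam - alp) * S
w <- exp(logw - max(logw))          # subtract the max before exponentiating
k_draw <- sample(1:n, 1, prob = w / sum(w))
k_draw
```

The sample() call normalizes the weights internally, so only relative values matter.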
Calculation method
Here you will learn how to estimate the parameters λ, α, and k with a Gibbs sampler in R.
Data
First, we generate data from an exponential distribution with a change point:
set.seed(98712)
y <- c(rexp(25, rate = 2), rexp(35, rate = 10))
In the bus stop setting, buses initially arrive at rate λ = 2 per minute (a mean wait of half a minute), but from observation i = 26 the rate changes to α = 10 per minute (a mean wait of a tenth of a minute).
Implementation of Gibbs sampler
First, we need to initialize k, λ, and α.
n <- length(y)                        # number of observations in the sample
lcan <- 10000                         # chain length
lambda <- alpha <- k <- numeric(lcan) # storage for the chains
k[1] <- sample(1:n, 1)                # random initial change point
lambda[1] <- alpha[1] <- 1            # initial rates
Now, in each iteration of the algorithm, we generate λ(t), α(t), and k(t) as follows (remember that if k + 1 > n there is no change point):
a1 <- b1 <- a2 <- b2 <- 1               # hyperparameters of the assumed Gamma priors
for (i in 2:lcan) {
  kt <- k[i - 1]
  # generate lambda
  lambda[i] <- rgamma(1, shape = a1 + kt, rate = b1 + sum(y[1:kt]))
  # generate alpha (if kt + 1 > n there are no observations after the change point)
  sa <- if (kt + 1 > n) 0 else sum(y[(kt + 1):n])
  alpha[i] <- rgamma(1, shape = a2 + n - kt, rate = b2 + sa)
  # generate k from its discrete full conditional
  L <- numeric(n)
  for (j in 1:n) {
    L[j] <- (lambda[i] / alpha[i])^j * exp(-(lambda[i] - alpha[i]) * sum(y[1:j]))
  }
  k[i] <- sample(1:n, 1, prob = L / sum(L))
}
# delete the first 9000 values of the chain as burn-in
burn <- 9000
Results
In this section we show the chains generated by the Gibbs sampler and the distributions of the parameters λ, α, and k. The true values of the parameters are indicated by the red lines.
The following table shows the actual values of the parameters and the mean of the estimated values obtained using the Gibbs sampler:
# posterior means after discarding the burn-in
res <- c(mean(k[-(1:burn)]), mean(lambda[-(1:burn)]), mean(alpha[-(1:burn)]))
res
Conclusion
From the results we can conclude that the posterior means of k, λ, and α obtained with the Gibbs sampler in R for an exponential distribution with a change point are close to the true parameter values, although we would expect better estimates. The gap could be due to the choice of initial values for the chain or the choice of prior distributions for λ and α.
This article is selected from “R Language Bayesian METROPOLIS-HASTINGS GIBBS Gibbs Sampler Estimation Change Point Exponential Distribution Analysis Poisson Process Station Waiting Time”.