7.6 Simulation and Prediction

In some applications we may want to simulate realizations of a point process, either over the observed time domain or over some future interval of time. To do so, we need a way to simulate the event times. One general approach is the Lewis-Shedler thinning algorithm, which is analogous to rejection sampling from a probability density function.

The steps of the Lewis-Shedler thinning algorithm for simulating from a non-stationary Poisson process with known conditional intensity $$\lambda(t)$$ over the interval $$[0,T]$$ are as follows.

1. Simulate a stationary Poisson process of rate $$M$$ on $$[0, T]$$, where $$M$$ is chosen so that $$\lambda(t)\leq M$$ for all $$t$$ in $$[0, T]$$. Suppose that this results in $$n^\star$$ points $$t^\star_1,\dots,t^\star_{n^\star}$$.

2. Simulate independent Uniform$$(0, 1)$$ random variables $$u_1,u_2,\dots,u_{n^\star}$$.

3. For each simulated event time $$t^\star_j$$, retain the point if $$u_j\leq\lambda(t^\star_j)/M$$; otherwise delete it. Relabel the retained points $$t_1,\dots,t_n$$.

4. The retained points are a realization of the point process governed by $$\lambda(t)$$.
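The steps above can be sketched in a few lines of NumPy. This is a minimal illustration, not a definitive implementation; the function name `thin_poisson` and the example intensity are my own choices.

```python
import numpy as np

def thin_poisson(lam, M, T, rng=None):
    """Lewis-Shedler thinning for a non-stationary Poisson process
    with intensity lam(t) <= M on [0, T]."""
    rng = np.random.default_rng() if rng is None else rng
    # Step 1: stationary Poisson process of rate M on [0, T].
    n_star = rng.poisson(M * T)
    t_star = np.sort(rng.uniform(0.0, T, size=n_star))
    # Step 2: independent Uniform(0, 1) variables, one per candidate.
    u = rng.uniform(size=n_star)
    # Step 3: retain candidate j when u_j <= lam(t_j*) / M.
    return t_star[u <= lam(t_star) / M]

# Example: lam(t) = 10 * (1 + sin(t)), which is bounded by M = 20.
times = thin_poisson(lambda t: 10 * (1 + np.sin(t)), M=20.0, T=2 * np.pi)
```

The retained `times` (step 4) are then a realization from the process with intensity $$\lambda(t)$$.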

If $$\lambda(t)$$ is very spiky, a single bound $$M$$ satisfying $$\lambda(t)\leq M$$ for all $$t$$ may be difficult to find, or may make the algorithm very inefficient, because most candidate points are rejected wherever $$\lambda(t)$$ is far below $$M$$. Therefore, it might make sense to simulate the process in a piecewise fashion over the interval $$[0,T]$$, using a different (tighter) value of $$M$$ for each segment of the interval.
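One way to sketch the piecewise idea is to split $$[0,T]$$ at user-supplied breakpoints and run the thinning step separately on each segment with its own bound. Here the per-segment bound is taken as the maximum of $$\lambda(t)$$ on a fine grid plus a small safety margin; this grid-based bound is a heuristic assumption, and a tighter analytic bound should be substituted when one is available.

```python
import numpy as np

def thin_piecewise(lam, breakpoints, rng=None):
    """Piecewise Lewis-Shedler thinning: each segment [a, b) gets
    its own bound M, so flat regions of lam waste fewer candidates."""
    rng = np.random.default_rng() if rng is None else rng
    pieces = []
    for a, b in zip(breakpoints[:-1], breakpoints[1:]):
        # Heuristic segment bound: grid maximum with a 5% margin.
        grid = np.linspace(a, b, 200)
        M = lam(grid).max() * 1.05
        # Thin a rate-M stationary process on this segment only.
        n = rng.poisson(M * (b - a))
        t = rng.uniform(a, b, size=n)
        keep = rng.uniform(size=n) <= lam(t) / M
        pieces.append(t[keep])
    return np.sort(np.concatenate(pieces))
```

For a sharply peaked intensity, segments away from the peak use a much smaller $$M$$, so far fewer candidate points are generated and discarded there.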

For cluster models, where the conditional intensity depends on the occurrence of past events, it is generally impossible to find a valid value of $$M$$ in advance of simulating the series. Such models must therefore usually be simulated sequentially: a smaller value of $$M$$ suffices initially, but the bound must be increased each time an event occurs and the conditional intensity function spikes upward.
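As a concrete sketch of this sequential approach, consider Ogata's modified thinning algorithm applied to a Hawkes process with exponential kernel, $$\lambda(t) = \mu + \alpha\sum_{t_i < t} e^{-\beta(t - t_i)}$$. The parameter names `mu`, `alpha`, and `beta` are illustrative; the key point is that the bound $$M$$ is recomputed after every candidate, since the conditional intensity jumps upward by $$\alpha$$ at each event and then decays.

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, T, rng=None):
    """Sequential (Ogata-style) thinning for an exponential-kernel
    Hawkes process on [0, T]. The bound M is updated as events occur."""
    rng = np.random.default_rng() if rng is None else rng
    events = []
    t = 0.0
    while True:
        # Current intensity at t dominates lam on (t, next event),
        # because lam only decays between events. Use it as the bound.
        past = np.array(events)
        M = mu + alpha * np.sum(np.exp(-beta * (t - past))) if events else mu
        # Candidate from a rate-M stationary process.
        t += rng.exponential(1.0 / M)
        if t > T:
            break
        lam_t = mu + alpha * np.sum(np.exp(-beta * (t - past))) if events else mu
        # Thinning step: accept with probability lam(t) / M.
        if rng.uniform() <= lam_t / M:
            events.append(t)
    return np.array(events)
```

Because the bound is refreshed from the current conditional intensity, it starts near the background rate $$\mu$$ and grows only when events accumulate, exactly as described above.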