Poisson processes

Heuristics and definition of Poisson processes

Informally, a Poisson process \((N_t)_{t\ge0}\) keeps track of the number of customers (or tasks) arriving at a service point. It is a special case of the so-called renewal processes, which are often used to model a variety of situations in insurance (flows of insurance claims), traffic flow (cars passing a monitoring point), inventory systems (time evolution of the customer demand and stock replenishment of a particular item) or counter processes (activation times of Geiger counters), to name just a few.

In the simplest case, it is natural to assume that: a) the customers arrive at constant average rate; b) the customers arrive one at a time; c) the numbers of arriving customers during non-overlapping time intervals are independent. Under these assumptions we get \[\label{eq:Poisson-process-jump-probabilities} \begin{gathered} \mathsf{P}\bigl(N_{t+\Delta t}=N_t\bigr) = 1-\lambda \Delta t+o(\Delta t)\,, \\ \mathsf{P}\bigl(N_{t+\Delta t}=N_t+1\bigr) = \lambda \Delta t+o(\Delta t)\,, \\ \mathsf{P}\bigl(N_{t+\Delta t}>N_t+1\bigr) = o(\Delta t)\,, \end{gathered}\] where \(\lambda>0\) is the arrival rate and the error terms \(o(\Delta t)\) satisfy \[o(\Delta t)/\Delta t\to0\qquad\text{ as \quad$\Delta t\to0$\,.}\] Informally, if \(N_0=0\), then \((N_t)_{t\ge0}\) is the Poisson process with rate \(\lambda>0\).

To find the distribution of \(N_t\), notice that for each integer \(n>1\), \[\label{eq:Poisson-process-sum-of-increments} N_t=\sum_{k=1}^n\bigl(N_{\frac{kt}n}-N_{\frac{(k-1)t}n}\bigr)\,,\] where the right hand side is a sum of independent identically distributed increments. For large \(n\) the last relation in [eq:Poisson-process-jump-probabilities] implies \[\begin{aligned} \mathsf{P}\Bigl(N_{\frac{kt}n}-N_{\frac{(k-1)t}n}&\ge2\text{ for some }k=1,\dots,n\Bigr) \\ &\le\sum_{k=1}^n\mathsf{P}\bigl(N_{\frac{kt}n}-N_{\frac{(k-1)t}n}\ge2\bigr) \le no(t/n)\to0 \end{aligned}\] as \(n\to\infty\). In other words, for large \(n\) the distribution on the right of [eq:Poisson-process-sum-of-increments] is approximately \(\mathsf{Bin}(n,\tfrac{\lambda t}n)\), implying that \(N_t\sim\mathsf{Poi}(\lambda t)\), i.e., \[\label{eq:Poisson-process-distribution} \mathsf{P}\bigl(N_t=k\bigr)=\frac{(\lambda t)^k}{k!}\,e^{-\lambda t}\,,\qquad k\ge0\,.\]
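The binomial approximation above is easy to check numerically. The following sketch (the values \(\lambda=2\), \(t=1.5\) and \(n=10^4\) are arbitrary choices) compares the pmf of \(\mathsf{Bin}(n,\tfrac{\lambda t}n)\) with that of \(\mathsf{Poi}(\lambda t)\) term by term:

```python
import math

def binom_pmf(n, p, k):
    # P(Bin(n, p) = k)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(mu, k):
    # P(Poi(mu) = k)
    return mu**k * math.exp(-mu) / math.factorial(k)

lam, t, n = 2.0, 1.5, 10_000      # rate, horizon, number of subintervals
mu = lam * t

# compare Bin(n, lambda*t/n) with Poi(lambda*t) term by term
max_err = max(abs(binom_pmf(n, mu / n, k) - poisson_pmf(mu, k))
              for k in range(30))
print(f"max |Bin - Poi| over k < 30: {max_err:.2e}")
```

The pointwise discrepancy is of order \(1/n\), consistent with the limit above.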

It is instructive to derive [eq:Poisson-process-distribution] using appropriate differential equations. For simplicity, put \[p_k(t):=\mathsf{P}(N_t=k)\,,\qquad k\ge0\,.\] It follows from [eq:Poisson-process-jump-probabilities] that \[p_0(t+\Delta t)=\mathsf{P}\bigl(N_t=0\bigr)\mathsf{P}\bigl(N_{t+\Delta t}=0\mid N_t=0\bigr)=p_0(t)\bigl(1-\lambda \Delta t+o(\Delta t)\bigr)\,,\] which implies \[\label{eq:Poisson-process-null-differential-equation} p'_0(t)=\lim_{\Delta t\to0}\frac{p_0(t+\Delta t)-p_0(t)}{\Delta t}=-\lambda p_0(t)\,,\qquad t\ge0\,.\] For \(k\ge1\) we similarly get \[\begin{aligned} p_k(t+\Delta t)&=\mathsf{P}\bigl(N_t=k\bigr)\mathsf{P}\bigl(N_{t+\Delta t}=k\mid N_t=k\bigr) \\ &+\mathsf{P}\bigl(N_t=k-1\bigr)\mathsf{P}\bigl(N_{t+\Delta t}=k\mid N_t=k-1\bigr) \\ &+\mathsf{P}\bigl(N_t\le k-2\bigr)\mathsf{P}\bigl(N_{t+\Delta t}=k\mid N_t\le k-2\bigr) \,, \end{aligned}\] which after straightforward rearrangement becomes \[p_k(t+\Delta t)=p_k(t)\bigl(1-\lambda \Delta t\bigr)+p_{k-1}(t)\lambda\Delta t+o(\Delta t)\,.\] As a result, for each integer \(k\ge1\) we get \[\label{eq:Poisson-process-higher-differential-equations} p'_k(t)=\lim_{\Delta t\to0}\frac{p_k(t+\Delta t)-p_k(t)}{\Delta t}=-\lambda p_k(t)+\lambda p_{k-1}(t)\,,\qquad t\ge0\,.\]

We solve equations [eq:Poisson-process-null-differential-equation] and [eq:Poisson-process-higher-differential-equations] with initial conditions \[p_k(0)=\delta_{k0}\equiv\mathbf{1}_{k=0}=\begin{cases}1\,,&k=0\,,\\0\,,&k\neq0\,.\end{cases}\] It thus follows that \[p_0(t)=p_0(0)\,e^{-\lambda t}\equiv e^{-\lambda t}\,,\] in complete agreement with the \(k=0\) case of [eq:Poisson-process-distribution]. For \(k>0\) it is convenient to rewrite [eq:Poisson-process-higher-differential-equations] in terms of the function \(f_k(t):=p_k(t)\,e^{\lambda t}\). We then get \[f'_k(t)=\lambda f_{k-1}(t)\,,\qquad f_k(0)=0\,,\] (and \(f_0(t)\equiv1\) for \(t\ge0\)). A straightforward induction gives \(f_k(t)=(\lambda t)^k/k!\), in agreement with the \(k>0\) case of [eq:Poisson-process-distribution].
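The solution can also be verified numerically: a simple forward Euler scheme for the system of differential equations above, truncated at some level \(K\), should reproduce the Poisson probabilities. A minimal sketch (the rate, horizon, step count and truncation level are arbitrary choices):

```python
import math

lam = 1.7                            # arrival rate (arbitrary choice)
t_end, steps, K = 2.0, 20_000, 15    # horizon, Euler steps, truncation level
dt = t_end / steps

# p[k] approximates p_k(t); initial condition p_k(0) = delta_{k0}
p = [1.0] + [0.0] * K
for _ in range(steps):
    new = [p[0] - dt * lam * p[0]]   # p_0' = -lam * p_0
    for k in range(1, K + 1):
        # p_k' = lam * p_{k-1} - lam * p_k
        new.append(p[k] + dt * (lam * p[k - 1] - lam * p[k]))
    p = new

exact = [(lam * t_end)**k / math.factorial(k) * math.exp(-lam * t_end)
         for k in range(K + 1)]
err = max(abs(a - b) for a, b in zip(p, exact))
print(f"max Euler error at t = {t_end}: {err:.2e}")
```

Since the system is lower triangular in \(k\), truncating it at \(K\) does not affect the components \(p_0,\dots,p_K\).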

Definition 1. \(\bigl(N_t\bigr)_{t\ge0}\) is a Poisson process with rate \(\lambda>0\), if:

  1. \(N_0=0\);

  2. \(N_{t+s}-N_s\sim\mathsf{Poi}(\lambda t)\) for all \(s\), \(t\ge0\);

  3. \(N_t\) has independent increments: for all \(0\le s_1<t_1\le s_2<t_2<\ldots\) the differences \(N_{t_1}-N_{s_1}\), \(N_{t_2}-N_{s_2}\), … are independent.

Constructive approach to Poisson processes

We start by recalling the following properties of exponential random variables. Recall that \(\xi\sim\mathsf{Exp}(\lambda)\) iff \(\mathsf{P}(\xi>x)=e^{-\lambda x}\) for all \(x\ge0\).

  1. if \(\xi\sim\mathsf{Exp}(\lambda)\), then the following “loss of memory” property holds: for all \(s,t\ge0\), \[\mathsf{P}\bigl(\xi>t+s\mid\xi>s\bigr)=\mathsf{P}\bigl(\xi>t\bigr)\,.\]

  2. [min-of-exp-is-exp] if \(\xi\sim\mathsf{Exp}(\lambda)\) and \(\eta\sim\mathsf{Exp}(\mu)\) are independent, then \[\zeta:=\min(\xi,\eta)\sim\mathsf{Exp}(\lambda+\mu)\,.\]

  3. [proba-of-which-exp-clock-rings-first] if \(\xi\sim\mathsf{Exp}(\lambda)\) and \(\eta\sim\mathsf{Exp}(\mu)\) are independent, then \[\mathsf{P}\bigl(\xi<\eta\bigr)=\frac\lambda{\lambda+\mu}\,.\]

  4. [extended-Markov-property-of-exponential-clocks] if \(\xi\sim\mathsf{Exp}(\lambda)\) and \(\eta\sim\mathsf{Exp}(\mu)\) are independent, then for all \(t\ge0\), \[\mathsf{P}\bigl(t<\xi<\eta\bigr)=\frac\lambda{\lambda+\mu}\,e^{-(\lambda+\mu)t}=\mathsf{P}\bigl(\xi<\eta\bigr)\,\mathsf{P}(\zeta>t)\,.\]

  5. if \(\xi\sim\mathsf{Exp}(\lambda)\) and \(\eta\sim\mathsf{Exp}(\mu)\) with \(\lambda\le\mu\), then \(\eta\) is stochastically smaller than \(\xi\), \(\eta\preccurlyeq\xi\).

  6. [sum-of-exp-is-gamma] if \((\xi_k)_{k\ge1}\) are i.i.d.r.v. \(\xi\sim\mathsf{Exp}(\lambda)\) and \(X_n=\xi_1+\dots+\xi_n\), then \(X_n\sim{\sf Gam}(n,\lambda)\) and thus has density \[\label{eq:gamma-density} f_{X_n}(x)=\lambda e^{-\lambda x}\, \frac{(\lambda x)^{n-1}}{(n-1)!}\,\mathbf{1}_{x>0}\,.\]
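Properties (P[min-of-exp-is-exp]) and (P[proba-of-which-exp-clock-rings-first]) are easy to test by simulation; a quick Monte Carlo sketch (the rates \(\lambda=1\), \(\mu=2\) and the sample size are arbitrary choices):

```python
import random
import statistics

random.seed(0)
lam, mu, n = 1.0, 2.0, 200_000

xi   = [random.expovariate(lam) for _ in range(n)]
eta  = [random.expovariate(mu) for _ in range(n)]
zeta = [min(x, y) for x, y in zip(xi, eta)]

# min(xi, eta) ~ Exp(lam + mu), hence E[zeta] = 1/(lam + mu)
mean_min = statistics.mean(zeta)
print(f"mean of min: {mean_min:.4f} (theory {1/(lam + mu):.4f})")

# P(xi < eta) = lam/(lam + mu)
frac = sum(x < y for x, y in zip(xi, eta)) / n
print(f"P(xi < eta): {frac:.4f} (theory {lam/(lam + mu):.4f})")
```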

 [exse:gamma-density] (**).

Use induction to verify property (P[sum-of-exp-is-gamma]): if \((\xi_k)_{k\ge1}\) are i.i.d.r.v. with \(\xi\sim\mathsf{Exp}(\lambda)\) and \(X_n=\xi_1+\dots+\xi_n\), then \(X_n\sim{\sf Gam}(n,\lambda)\) and thus has the density \(f_{X_n}(x)=\lambda e^{-\lambda x}\, \frac{(\lambda x)^{n-1}}{(n-1)!}\,\mathbf{1}_{x>0}\) given in [eq:gamma-density].

Definition 2. For fixed \(\lambda>0\), let \((\tau_k)_{k\ge1}\) be i.i.d. with common distribution \(\mathsf{Exp}(\lambda)\). If \(T_k:=\tau_1+\dots+\tau_k\) with \(T_0=0\), the Poisson process with rate \(\lambda\) is defined via \[\label{eq:Poisson-process-renewal-definition} \widetilde{N}_t:=\max\bigl\{k\ge0:T_k\le t\bigr\}\,.\]

The parameter \(\lambda\) is often referred to as the “rate” or “intensity” of \(\widetilde{N}_t\), and \(\tau_k\) is the \(k\)th “inter-arrival” time. Then \(T_k\) is the time of the \(k\)th arrival event and \(\widetilde{N}_t\) counts the number of customers who have arrived by time \(t\).
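Definition 2 translates directly into a simulation recipe: generate \(\mathsf{Exp}(\lambda)\) inter-arrival times until their running sum exceeds \(t\). The sketch below (rate, horizon and sample size are arbitrary choices) checks that the resulting counts have mean and variance close to \(\lambda t\), as a \(\mathsf{Poi}(\lambda t)\) variable must:

```python
import random
import statistics

random.seed(1)

def poisson_count(lam, t):
    """Number of renewal epochs T_k = tau_1 + ... + tau_k falling in [0, t]."""
    total, count = 0.0, 0
    while True:
        total += random.expovariate(lam)   # next Exp(lam) inter-arrival time
        if total > t:
            return count
        count += 1

lam, t, n = 3.0, 2.0, 100_000
samples = [poisson_count(lam, t) for _ in range(n)]
m, v = statistics.mean(samples), statistics.variance(samples)
print(f"mean {m:.3f}, variance {v:.3f} (both should be close to {lam * t})")
```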

We now check that Definition 1 and Definition 2 give rise to the same process.

Equivalence of two approaches

We start by checking properties of \(\widetilde{N}_t\).

Lemma 3. Let \(\widetilde{N}_t\) be as in [eq:Poisson-process-renewal-definition]. Then for each \(t\ge0\), \(\widetilde{N}_t\sim\mathsf{Poi}(\lambda t)\).

Proof. Fix integer \(k\ge0\). For \(k=0\) we have directly \(\mathsf{P}(\widetilde{N}_t=0)=\mathsf{P}(\tau_1>t)=e^{-\lambda t}\). For \(k\ge1\), notice that \(\{\widetilde{N}_t=k\}=\{T_k\le t<T_{k+1}\}\) and therefore \[\mathsf{P}(\widetilde{N}_t=k)=\int_0^tf_{T_k}(s)\,\mathsf{P}(\tau_{k+1}>t-s)\,ds= e^{-\lambda t}\int_0^t\lambda\,\frac{(\lambda s)^{k-1}}{(k-1)!}\,ds=\frac{(\lambda t)^k}{k!}\,e^{-\lambda t}\,,\] where the second equality follows from (P[sum-of-exp-is-gamma]) and the independence of \(T_k\) and \(\tau_{k+1}\). ◻

Lemma 4. Fix \(s\ge0\) and for \(t\ge0\) define the process \(\overline{N}_t:=\bigl(\widetilde{N}_{t+s}-\widetilde{N}_s\bigr)\). Then for all integer \(m\), \(k\ge0\) we have \[\label{eq:Poisson-process-Markov-property-with-future-increment-distribution} \mathsf{P}\bigl(\overline{N}_t=m\mid\widetilde{N}_s=k\bigr)=\frac{(\lambda t)^m}{m!}\,e^{-\lambda t}\,.\]

Remark 1. The lemma claims that the future increments \(\widetilde{N}_{t+s}-\widetilde{N}_s\) of the process \(\widetilde{N}_t\) are independent of its past trajectory \((\widetilde{N}_u)_{0\le u\le s}\), i.e., \((\widetilde{N}_t)_{t\ge0}\) is a Markov process. Further, \(\overline{N}_t\sim\widetilde{N}_t\sim\mathsf{Poi}(\lambda t)\) for all \(t\ge0\).

Proof. Fix integer \(k\ge0\). We first show that \(\mathsf{P}(T_{k+1}>s+t\mid\widetilde{N}_s=k)=\mathsf{P}(\tau_{k+1}>t)\), which implies the \(m=0\) version of [eq:Poisson-process-Markov-property-with-future-increment-distribution], \(\mathsf{P}(\overline{N}_t=0\mid\widetilde{N}_s=k)=e^{-\lambda t}\).

By the memoryless property, for each \(u\in[0,s]\) we have \[\mathsf{P}(\tau_{k+1}>t+s-u)=\mathsf{P}(\tau_{k+1}>t)\,\mathsf{P}(\tau_{k+1}>s-u)=\mathsf{P}(\tau_{k+1}>t)\,e^{-\lambda(s-u)}\,.\] Consequently, as in the proof of Lemma 3, \[\begin{aligned} \mathsf{P}(T_{k+1}>s+t, &\widetilde{N}_s=k)\equiv\int_0^sf_{T_k}(u)\,\mathsf{P}(\tau_{k+1}>t+s-u)\,du \\ &=\mathsf{P}(\tau_{k+1}>t)\,e^{-\lambda s}\int_0^sf_{T_k}(u)\,e^{\lambda u}\,du=\mathsf{P}(\tau_{k+1}>t)\,\mathsf{P}(\widetilde{N}_s=k)\,, \end{aligned}\] equivalently, \(\bigl(T_{k+1}-s\mid\widetilde{N}_s=k\bigr)\sim\mathsf{Exp}(\lambda)\), for all real \(s\ge0\) and integer \(k\ge0\).

Denote \(\overline{T}_n:=T_{n+k}-s\) with \(\overline{T}_0=0\) and let \(\bar\tau_{k+1}:=T_{k+1}-s\). By the above, \[\overline{T}_n\equiv \bar\tau_{k+1}+\tau_{k+2}+\dots+\tau_{k+n}\] is a sum of \(n\) i.i.d. \(\mathsf{Exp}(\lambda)\) random variables, and \(\overline{T}_n\) is independent of \((\widetilde{N}_u)_{0\le u\le s}\) on the event \(\{\widetilde{N}_s=k\}\). Hence, by Lemma 3, given \(\{\widetilde{N}_s=k\}\), the variable \[\overline{N}_t=\max\{n\ge0:\overline{T}_n\le t\}\] has the same distribution as \(\widetilde{N}_t\sim\mathsf{Poi}(\lambda t)\). In particular, for all \(s\), \(t\ge0\), we have \(\widetilde{N}_{s+t}-\widetilde{N}_s\sim\mathsf{Poi}(\lambda t)\). ◻

Lemma 5. The process \(\widetilde{N}_t\) has independent increments: for all integer \(n>0\) and all \(0\le t_0<t_1<\ldots<t_n\), the differences \(\widetilde{N}_{t_1}-\widetilde{N}_{t_0}\), \(\widetilde{N}_{t_2}-\widetilde{N}_{t_1}\), …, \(\widetilde{N}_{t_n}-\widetilde{N}_{t_{n-1}}\) are independent random variables.

Proof. By Lemma 4, the increment \(\widetilde{N}_{t_n}-\widetilde{N}_{t_{n-1}}\) is independent of \((\widetilde{N}_s)_{0\le s\le t_{n-1}}\), and hence of the collection of increments \(\{\widetilde{N}_{t_k}-\widetilde{N}_{t_{k-1}}\}_{k=1}^{n-1}\). The claim of the lemma follows by induction. ◻

As a result, Definition 1 and Definition 2 are equivalent:

Theorem 6. Let \(\widetilde{N}_t\) be as in [eq:Poisson-process-renewal-definition]. Then \((\widetilde{N}_t)_{t\ge0}\) is a Poisson process in the sense of Definition 1.

Conversely, let \((N_t)_{t\ge0}\) be as in Definition 1. Then \(N_t\) can be constructed along the lines of Definition 2 from a sequence \(0=T_0<T_1<T_2<\dots\) with i.i.d. inter-arrival times \(\tau_k:=T_k-T_{k-1}\), where \(\tau\sim\mathsf{Exp}(\lambda)\).

Remark 2. It is clear from the definition that the trajectories \((N_t)_{t\ge0}\) are piece-wise constant and non-decreasing in \(t\). It can also be shown that \(N_t\) is right-continuous at every \(t\ge0\), namely, \(\mathsf{P}(\lim_{k\to\infty}N_{t_k} = N_t)=1\) for each sequence \(t_k\searrow t\) as \(k\to\infty\), see [exse:Poisson-process-right-continuity]. The times of arrival events \((T_k)_{k\ge0}\) can be inductively defined via \(T_0=0\) and \(T_k=\min\{t>0:N_t>N_{T_{k-1}}\}\) for \(k>0\). Notice that the minimum is attained due to right-continuity of the trajectories \((N_t)_{t\ge0}\).

Example 7. Let \((N_t)_{t\ge0}\) be the Poisson process with rate \(\lambda>0\). Fix real \(0<s<t\) and integer \(0\le k\le m\). By the independence of increments, we have \[\mathsf{P}\bigl(N_t=m\mid N_s=k\bigr)=\mathsf{P}(N_t-N_s=m-k)=\frac{(\lambda(t-s))^{m-k}}{(m-k)!}\,e^{-\lambda(t-s)}\,;\] in fact, \(\bigl(N_t-N_s\mid N_s=k\bigr)\sim\mathsf{Poi}\bigl(\lambda(t-s)\bigr)\) so that \(\mathsf{E}\bigl(N_t\mid N_s=k\bigr)=k+\lambda(t-s)\).

On the other hand, \[\begin{aligned} \mathsf{P}\bigl(N_s=k\mid N_t=m\bigr) &=\frac{\mathsf{P}\bigl(N_s=k,N_t-N_s=m-k\bigr)}{\mathsf{P}(N_t=m)} \\ &=\frac{m!}{k!(m-k)!}\Bigl(\frac{s}{t}\Bigr)^k\Bigl(1-\frac{s}{t}\Bigr)^{m-k}\,, \end{aligned}\] i.e., \(\bigl(N_s\mid N_t=m\bigr)\sim\mathsf{Bin}(m,s/t)\), implying that \(\mathsf{E}\bigl(N_s\mid N_t=m\bigr)=sm/t\).
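The conditional binomial distribution above can be checked by a rejection-sampling experiment: simulate trajectories on \([0,t]\), keep those with \(N_t=m\), and record \(N_s\). A sketch with arbitrary parameter values:

```python
import random
import statistics

random.seed(2)
lam, s, t, m = 2.0, 1.0, 3.0, 6
counts = []
while len(counts) < 20_000:
    arrivals, clock = [], 0.0
    while True:                              # arrival times on [0, t]
        clock += random.expovariate(lam)
        if clock > t:
            break
        arrivals.append(clock)
    if len(arrivals) == m:                   # keep only trajectories with N_t = m
        counts.append(sum(a <= s for a in arrivals))   # record N_s

avg = statistics.mean(counts)
print(f"E(N_s | N_t = {m}) ≈ {avg:.3f} (theory s*m/t = {s * m / t:.3f})")
```

The empirical mean should be close to \(sm/t\), in line with \(\bigl(N_s\mid N_t=m\bigr)\sim\mathsf{Bin}(m,s/t)\).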

Remark 3. Notice that \(\mathsf{P}\bigl(T_1\le s\mid N_t=1\bigr)=\mathsf{P}\bigl(N_s=1\mid N_t=1\bigr)=s/t\), i.e., \(\bigl(T_1\mid N_t=1\bigr)\sim \mathcal{U}(0,t]\). Actually, one can show that \(\bigl(T_1,\dots,T_k\mid N_t=k\bigr)\) is the order statistic of a \(k\)-sample from \(\mathcal{U}(0,t]\), see [exse:Poisson-process-uniform-arrivals-given-total-count].

Transformations of Poisson processes

Superposition of Poisson processes

A straightforward application of Definition 1 gives the following property.

Lemma 8. Let \(N_1(t)\) and \(N_2(t)\) be independent Poisson processes with rates \(\lambda_1\) and \(\lambda_2\). Then \(N(t):=N_1(t)+N_2(t)\) is a Poisson process with rate \(\lambda=\lambda_1+\lambda_2\).

Proof. Clearly, \(N(0)=0\), the process \((N_t)_{t\ge0}\) has independent increments, while for all \(s\), \(t\ge0\) we have \(N_{t+s}-N_s=(N_1(t+s)-N_1(s))+(N_2(t+s)-N_2(s))\sim\mathsf{Poi}(\lambda t)\), by the addition property of Poisson distributions. ◻
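Lemma 8 is also easy to confirm by simulation: summing two independently simulated processes should give counts whose mean and variance are both close to \((\lambda_1+\lambda_2)t\). A sketch (all parameter values are arbitrary):

```python
import random
import statistics

random.seed(3)

def n_t(lam, t):
    # value at time t of a rate-lam Poisson process, via Exp(lam) inter-arrivals
    clock, k = 0.0, 0
    while True:
        clock += random.expovariate(lam)
        if clock > t:
            return k
        k += 1

lam1, lam2, t, n = 1.0, 2.5, 2.0, 50_000
merged = [n_t(lam1, t) + n_t(lam2, t) for _ in range(n)]
m, v = statistics.mean(merged), statistics.variance(merged)
print(f"mean {m:.3f}, variance {v:.3f} (theory: both = {(lam1 + lam2) * t})")
```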

Remark 4. Let \((N_t)_{t\ge0}\) be a Poisson process (with rate \(\lambda>0\)). As its trajectory can be uniquely identified from the sequence \((T_k)_{k\ge1}\) of its jumps, there is a one-to-one correspondence between Poisson processes and their jump sequences. Geometrically, the above argument shows that if \(\mathcal{T}_1:=\{T^1_k\}_{k\ge1}\) is the (random) set of jump epochs in \((N_1(t))_{t\ge0}\) and, similarly, \(\mathcal{T}_2:=\{T^2_k\}_{k\ge1}\) is the set of jump epochs in \((N_2(t))_{t\ge0}\), then \(\mathcal{T}=\mathcal{T}_1\cup\mathcal{T}_2\) generates the jump sequence for \((N_t)_{t\ge0}\). This stochastic geometric interpretation implicitly uses the fact that, by independence of \((N_1(t))_{t\ge0}\) and \((N_2(t))_{t\ge0}\), their jump sequences are disjoint almost surely, i.e., \(\mathcal{T}_1\cap\mathcal{T}_2=\varnothing\) with probability one, see [exse:union-independent-Poisson-point-processes].

Corollary 9. Let \((N_1(t))_{t\ge0}\) and \((N_2(t))_{t\ge0}\) be Poisson processes with respective rates \(\lambda_1\) and \(\lambda_2\). If \(0<\lambda_1\le\lambda_2\), then \((N_1(t))_{t\ge0}\) is stochastically dominated by \((N_2(t))_{t\ge0}\).

Proof. We construct \(\widetilde{N}_1(t)\sim N_1(t)\) and \(\widetilde{N}_2(t)\sim N_2(t)\) on the same probability space so that \[\label{eq:Poisson-processes-stochastic-domination} \mathsf{P}\bigl(\widetilde{N}_1(t)\le\widetilde{N}_2(t)\text{ for all }t\ge0\bigr)=1\,.\]

To this end, let \(\bigl(Z_1(t)\bigr)_{t\ge0}\) and \(\bigl(Z_2(t)\bigr)_{t\ge0}\) be independent Poisson processes with respective rates \(\lambda_1\) and \(\lambda_2-\lambda_1\ge0\) (if the latter vanishes, we just have \(Z_2(t)\equiv0\) for all \(t\ge0\)). Defining \(\widetilde{N}_1(t):=Z_1(t)\) and \(\widetilde{N}_2(t):=Z_1(t)+Z_2(t)\), we obtain [eq:Poisson-processes-stochastic-domination]. ◻

Remark 5. The stochastic geometry point of view on Poisson processes outlined in Remark 4 can be used to derive a step-by-step coupling proof of the above result, see [exse:stochastic-order-Poisson-processes-via-coupling].

Thinning of Poisson processes

Let \((N_t)_{t\ge0}\) be as in Definition 2. Classify each arrival time \(T_k\), \(k>0\), randomly into type I or type II by independently flipping a (possibly biased) coin showing the type I side with probability \(p\in(0,1)\). Let \(N_1(t)\) and \(N_2(t)\) be the numbers of type I and type II arrivals among the total of \(N_t\) arrivals by time \(t\).

Lemma 10. \(N_1(t)\) and \(N_2(t)\) are independent Poisson processes with respective rates \(\lambda p\) and \(\lambda(1-p)\).

Proof. The fact that \(N_1(t)\sim\mathsf{Poi}(\lambda pt)\) follows from the properties of random sums of random variables, e.g., \(G_{N_1(t)}(s)\equiv\mathsf{E}\bigl(s^{N_1(t)}\bigr)=\mathsf{E}\bigl[\bigl(1+p(s-1)\bigr)^{N_t}\bigr]=e^{\lambda pt(s-1)}\); similarly, \(N_2(t)\sim\mathsf{Poi}\bigl(\lambda(1-p)t\bigr)\). To check independence, notice that \[G_{N_1(t),N_2(t)}(s,u):=\mathsf{E}\bigl[s^{N_1(t)}\,u^{N_2(t)}\bigr]\equiv\mathsf{E}\bigl[\bigl(ps+(1-p)u\bigr)^{N_t}\bigr]=G_{N_1(t)}(s)\,G_{N_2(t)}(u)\,,\] so that a straightforward differentiation (at \(s=u=0\)) implies the claim.

The remaining properties in Definition 1 are obvious. ◻
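A quick Monte Carlo check of the thinning lemma (rates, \(p\) and sample size are arbitrary choices): the type counts should have the stated Poisson means, and their empirical covariance should be close to zero, consistent with independence.

```python
import random
import statistics

random.seed(4)
lam, p, t, n = 4.0, 0.3, 1.0, 50_000
n1s, n2s = [], []
for _ in range(n):
    clock, n1, n2 = 0.0, 0, 0
    while True:
        clock += random.expovariate(lam)
        if clock > t:
            break
        if random.random() < p:    # type I with probability p
            n1 += 1
        else:                      # type II otherwise
            n2 += 1
    n1s.append(n1)
    n2s.append(n2)

m1, m2 = statistics.mean(n1s), statistics.mean(n2s)
cov = sum(a * b for a, b in zip(n1s, n2s)) / n - m1 * m2   # empirical covariance
print(f"mean N1 {m1:.3f} (theory {lam * p * t}), "
      f"mean N2 {m2:.3f} (theory {lam * (1 - p) * t}), cov {cov:.4f}")
```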

Remark 6. Notice that by [exse:geometric-sum-of-poisson-variables-is-poisson], the inter-arrival times of \(N_1(t)\) and \(N_2(t)\) are, respectively, \(\mathsf{Exp}(\lambda p)\) and \(\mathsf{Exp}(\lambda(1-p))\) distributed.

More general Poisson processes

The classical theory of Poisson processes rests upon two key assumptions: that the customers arrive at a constant rate, and that they arrive one at a time. In some applications these assumptions may not be reasonable, and one is forced to use more general Poisson processes. Here we mention two possibilities.

Compound Poisson processes

Suppose, as before, that the time \(T_k\) of the \(k\)th arrival event, \(k\ge1\), satisfies \(T_k=\tau_1+\dots+\tau_k\), where \(\tau_j\) are i.i.d. with common distribution \(\tau\sim\mathsf{Exp}(\lambda)\). On the other hand, assume that \(Y_k\ge1\) customers arrive at time \(T_k\), where \(\{Y_k\}_{k\ge1}\) are i.i.d. with common generating function \(G_Y(s):=\mathsf{E}(s^Y)\); we also assume that the collection \(\{Y_k\}_{k\ge1}\) is independent of \(\{T_1,T_2,\dots\}\), equivalently, of the Poisson process \((N_t)_{t\ge0}\) generated by the arrival times \((T_k)_{k\ge1}\). Then \[S_t:=\sum_{j=1}^{N_t}\,Y_j\] counts the number of individuals who have arrived by time \(t\ge0\). Notice that \[G_{S_t}(u):=\mathsf{E}\bigl(u^{S_t}\bigr)\equiv G_{N_t}\bigl(G_Y(u)\bigr)=\exp\Bigl\{\lambda t\bigl(G_Y(u)-1\bigr)\Bigr\}\,,\] implying that \[\mathsf{E}S_t=\mathsf{E}N_t\cdot\mathsf{E}Y=\lambda t\mathsf{E}Y, \qquad \mathsf{Var}(S_t)=\mathsf{E}N_t\cdot\mathsf{Var}(Y)+\mathsf{Var}(N_t)\cdot(\mathsf{E}Y)^2\,.\]
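Since \(\mathsf{E}N_t=\mathsf{Var}(N_t)=\lambda t\), the two formulas above combine into \(\mathsf{Var}(S_t)=\lambda t\,\mathsf{E}(Y^2)\). The sketch below simulates a compound Poisson process with batch sizes \(Y\) uniform on \(\{1,2,3\}\) (an arbitrary illustrative choice, giving \(\mathsf{E}Y=2\) and \(\mathsf{E}Y^2=14/3\)):

```python
import random
import statistics

random.seed(5)
lam, t, n = 1.5, 2.0, 100_000
ey, ey2 = 2.0, 14 / 3       # E(Y) and E(Y^2) for Y uniform on {1, 2, 3}

samples = []
for _ in range(n):
    clock, s_t = 0.0, 0
    while True:
        clock += random.expovariate(lam)
        if clock > t:
            break
        s_t += random.choice((1, 2, 3))   # batch of Y_k customers at this arrival
    samples.append(s_t)

m, v = statistics.mean(samples), statistics.variance(samples)
print(f"E S_t ≈ {m:.3f} (theory {lam * t * ey}), "
      f"Var S_t ≈ {v:.3f} (theory {lam * t * ey2:.3f})")
```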

Non-homogeneous Poisson processes

Recall that if \(X_i\sim\mathsf{Ber}(p_n)\) are i.i.d. for \(i=1,\dots,n\), then \(X:=X_1+\dots+X_n\sim\mathsf{Bin}(n,p_n)\). If \(n\cdot p_n\to\lambda>0\), then the generating function of \(X\) is close to that of \(Y\sim\mathsf{Poi}(\lambda)\). Using coupling, one can show that (with \(p_n=\lambda/n\)) \[{\mathsf{d_{TV}}}(X,Y)=\frac12\sum_{k\ge0}\bigl|\mathsf{P}(X=k)-\mathsf{P}(Y=k)\bigr|\le\frac{\lambda^2}n\,.\]

We now relax the homogeneity assumption by supposing that \(X_i\sim \mathsf{Ber}(p_{n,i})\) are independent with \(\sum_{i=1}^np_{n,i}\to\lambda\in(0,\infty)\) and \(\max_ip_{n,i}\to0\) as \(n\to\infty\). Then the distribution of \(X:=X_1+\dots+X_n\) approaches that of \(Y\sim\mathsf{Poi}(\lambda)\); also, for \(Y_n\sim\mathsf{Poi}(\lambda_n)\) with \(\lambda_n=\sum_{i=1}^np_{n,i}\to\lambda\) as \(n\to\infty\), \[{\mathsf{d_{TV}}}(X,Y_n)\le\sum_{i=1}^n(p_{n,i})^2\le\lambda_n\max_{i=1,\ldots,n}p_{n,i}\to0\qquad\text{ as $n\to\infty$\,.}\]

This observation motivates the following definition.

Definition 11. A process \((N_t)_{t\ge0}\) is a (non-homogeneous) Poisson process with rate function \(\lambda(r)\), if

  1. it has the initial value \(N_0=0\);

  2. the process \(N_t\) has independent increments;

  3. its increments are Poisson distributed, \(N_{t+s}-N_s\sim\mathsf{Poi}\bigl(\int_s^{s+t}\lambda(r)\,dr\bigr)\).

Remark 7. Decomposing \(N_{s+t}-N_s=\sum_{j=1}^n\bigl(N_{s+jt/n}-N_{s+(j-1)t/n}\bigr)\), a telescopic sum of independent Poisson variables, for continuous \(\lambda(\cdot)\) we deduce \[\mathsf{E}\bigl(N_{s+t}-N_s\bigr)=\sum_{j=1}^n\bigl[\lambda(s+\tfrac{jt}n)\tfrac{t}n+o\bigl(\tfrac{t}n\bigr)\bigr]\approx\int_s^{s+t}\lambda_r\,dr\,.\]

Example 12. Writing \(\mu_t:=\int_0^t\lambda_s\,ds\), we get \(\mathsf{P}(\tau_1>t)=\mathsf{P}(N_t=0)=e^{-\mu_t}\). Hence the density of the first inter-arrival time is \(f_{\tau_1}(t)=\lambda_t\,e^{-\mu_t}\), implying that \(\tau_1\) is not exponentially distributed when \(\lambda_r\) is not identically constant.

Similarly, the arrival times \(T_k:=\tau_1+\dots+\tau_k\) satisfy \(\mathsf{P}\bigl(\tau_{k+1}>t\mid T_k=s\bigr)=\mathsf{P}(N_{s+t}-N_s=0)\) and therefore \[f_{T_1,T_2}(u,v)=\lambda_u\,e^{-\mu_u}\cdot\lambda_v\,e^{-(\mu_v-\mu_u)}\,\mathbf{1}_{u<v}\] implying that \[f_{\tau_1,\tau_2}(s,t)=\lambda_s e^{-\mu_s}\cdot\lambda_{s+t}e^{-(\mu_{s+t}-\mu_s)}\,;\] consequently, the inter-arrival times \(\tau_1\) and \(\tau_2\) are independent only if \(\lambda_r\) is identically constant, i.e., only if \((N_t)_{t\ge0}\) is a homogeneous Poisson process.

The Poisson process \(N_t\) can still be related to a (dependent) renewal system as in Definition 2; however, the intrinsic dependence between the inter-arrival times \(\tau_k\) makes the analysis of \(N_t\) much more challenging.
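Despite this dependence, non-homogeneous processes are easy to simulate by thinning a homogeneous one: run a Poisson process with constant rate \(\lambda_{\max}\ge\lambda(r)\) and keep each arrival at time \(u\) independently with probability \(\lambda(u)/\lambda_{\max}\) (the Lewis–Shedler algorithm). A sketch with the arbitrary choice \(\lambda(r)=1+\sin r\), checking \(\mathsf{E}N_t=\mu_t\):

```python
import math
import random
import statistics

random.seed(6)

def inhomogeneous_count(rate, lam_max, t):
    """Thinning (Lewis–Shedler): run a rate-lam_max homogeneous process and
    keep each arrival at time u independently with probability rate(u)/lam_max."""
    clock, count = 0.0, 0
    while True:
        clock += random.expovariate(lam_max)
        if clock > t:
            return count
        if random.random() < rate(clock) / lam_max:
            count += 1

rate = lambda r: 1 + math.sin(r)   # example rate function, bounded by lam_max = 2
t = 2 * math.pi
mu_t = t                            # integral of (1 + sin r) over a full period
samples = [inhomogeneous_count(rate, 2.0, t) for _ in range(50_000)]
avg = statistics.mean(samples)
print(f"E N_t ≈ {avg:.3f} (theory mu_t = {mu_t:.3f})")
```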

Additional problems

 [exse:Poisson-variable-moments] (**).

Let \(X\sim\mathsf{Poi}(\lambda)\). Find \(\mathsf{E}X\) and \(\mathsf{Var}(X)\), a) using generating functions; b) by direct computation.

 [exse:sum-of-Poissons-is-Poisson-via-generating-functions] (*).

If \((X_k)_{k\ge1}\) are independent with \(X_k\sim\mathsf{Poi}(\lambda_k)\), use generating functions to show that \(Y:=X_1+\dots+X_n\sim\mathsf{Poi}(\lambda)\), where \(\lambda=\lambda_1+\dots+\lambda_n\).

 [exse:sum-of-Poissons-is-Poisson-via-convolution] (**).

Let \((X_k)_{k\ge1}\) be independent with \(X_k\sim\mathsf{Poi}(\lambda_k)\).
a) Using the convolution formula, \(\mathsf{P}(X_1+X_2=n)=\sum_{k=0}^n\mathsf{P}(X_1=k)\,\mathsf{P}(X_2=n-k)\), show that \(Y:=X_1+X_2\sim\mathsf{Poi}(\lambda_1+\lambda_2)\).
b) Using induction and the result in a), derive an alternative solution to [exse:sum-of-Poissons-is-Poisson-via-generating-functions].

 [exse:min-of-exp-is-exp] (*).

Generalise property (P[min-of-exp-is-exp]) to any number of variables: if \((\xi_k)_{k=1}^n\) are independent with \(\xi_k\sim\mathsf{Exp}(\lambda_k)\), show that \(\eta:=\min(\xi_1,\dots,\xi_n)\sim\mathsf{Exp}(\lambda)\), where \(\lambda=\lambda_1+\dots+\lambda_n\).

 [exse:proba-of-which-exp-clock-rings-first] (**).

Generalise property (P[proba-of-which-exp-clock-rings-first]) to any number of variables: If \((\xi_k)_{k=1}^n\) are independent with \(\xi_k\sim\mathsf{Exp}(\lambda_k)\), show that \(\eta:=\min(\xi_1,\dots,\xi_n)\) satisfies \(\mathsf{P}(\eta=\xi_k)=\lambda_k/(\lambda_1+\dots+\lambda_n)\).

 [exse:extended-Markov-property-of-exponential-clocks] (**).

Generalise property (P[extended-Markov-property-of-exponential-clocks]) to any number of variables: If \((\xi_k)_{k=1}^n\) are independent with \(\xi_k\sim\mathsf{Exp}(\lambda_k)\), show that with \(\eta:=\min(\xi_1,\dots,\xi_n)\) we have \(\mathsf{P}(\eta>x,\eta=\xi_k)=\mathsf{P}(\eta>x)\mathsf{P}(\eta=\xi_k)\).

 [exse:Poisson-process-many-jumps-estimate] (**).

Let \(T_n=\tau_1+\dots+\tau_n\), where \(\tau_k\) are i.i.d.r.v. with \(\tau\sim\mathsf{Exp}(\lambda)\). By using induction or otherwise, show that \(\mathsf{P}(T_n\le t)\le\tfrac{(\lambda t)^n}{n!}\), for all real \(t\ge0\) and integer \(n\ge1\). Deduce that \(T_n\to\infty\) almost surely.

 [exse:Poisson-process-inhomogeneous-many-jumps-estimate] (***).

Let \(T_n=\tau_1+\dots+\tau_n\), where \(\tau_k\) are independent with \(\tau_k\sim\mathsf{Exp}(\lambda_k)\). Use induction to show that \(\mathsf{P}(T_n\le t)\le\tfrac{t^n}{n!}\prod_{k=1}^n\lambda_k\), for all real \(t\ge0\) and integer \(n\ge1\).

 [exse:finiteness-of-inhomogeneous-sum-of-exponentials] (**).

Let \((\xi_k)_{k\ge1}\) be independent with \(\xi_k\sim\mathsf{Exp}(\lambda_k)\) and put \(X=\sum_{k\ge1}\xi_k\). Show that:
a) If \(\sum_{k\ge1}(\lambda_k)^{-1}<\infty\), then \(\mathsf{P}(X<\infty)=1\).
b) If \(\sum_{k\ge1}(\lambda_k)^{-1}=\infty\), then \(\mathsf{P}(X=\infty)=1\).

 [exse:Poisson-process-increments-properties] (**).

Let \((N_t)_{t\ge0}\) be a Poisson process with rate \(\lambda\). Fix arbitrary real \(0<s<t\) and integer \(0\le m\le n\).
a) Compute the probability \(\mathsf{P}\bigl(N_t-N_s=m\mid N_s=n\bigr)\).
b) Compute the probability \(\mathsf{P}\bigl(N_s=m\mid N_t=n\bigr)\).

 [exse:Poisson-process-increments-expectations] (**).

Let \((N_t)_{t\ge0}\) be a Poisson process with rate \(\nu\) and let \(T_k\) be the time of its \(k\)th arrival. For fixed integers \(0<m<n\) and real \(0<u<v\), find the following expectations: a) \(\mathsf{E}T_n\); b) \(\mathsf{E}(T_n\mid N_u=m)\); c) \(\mathsf{E}(N_v\mid N_u=m)\).

 [exse:Poisson-process-multiple-increments-properties] (**).

Let \((N_t)_{t\ge0}\) be a Poisson process with rate \(\lambda\). Fix arbitrary real \(0<s<t<u\) and integer \(0\le m\le n\).
a) Compute the probability \(\mathsf{P}\bigl(N_u-N_t=m\mid N_s=n\bigr)\).
b) Compute the probability \(\mathsf{P}\bigl(N_t-N_s=m\mid N_u=n\bigr)\).

 [exse:geometric-sum-of-poisson-variables-is-poisson] (**).

For \(p\in(0,1)\), let \(L\sim\mathsf{Geom}(p)\); namely, \(\mathsf{P}(L>k)=q^k\) for integer \(k\ge0\), where \(q=1-p\). Given a sequence \((X_k)_{k\ge1}\) of i.i.d.r.v. with \(X\sim\mathsf{Exp}(\nu)\), put \(Y:=\sum_{k=1}^LX_k\). Using moment generating functions or otherwise, find the distribution of \(Y\). Deduce that the process obtained from a rate-\(\nu\) Poisson process by keeping each arrival independently with probability \(p\) (the \(p\)-decimated process) is a Poisson process, and find its rate.

 [exse:Poisson-processes-addition-property] (**).

Let \(N_1(t)\), …, \(N_k(t)\) be independent Poisson processes with rates \(\lambda_1\), …, \(\lambda_k\). Show that \(N(t):=N_1(t)+\dots+N_k(t)\) is a Poisson process with rate \(\lambda=\lambda_1+\dots+\lambda_k\).

 [exse:Poisson-process-right-continuity] (***).

To verify that trajectories of Poisson processes are right continuous with probability one, fix \(t\ge0\) and a sequence \((t_k)_{k\ge1}\) such that \(t_k\searrow t\) as \(k\to\infty\) and consider the event \(A_k:=\{N_{t_k}>N_t\}\). Notice that \(A_k\) is a decreasing sequence of events with \(\mathsf{P}(A_k)=1-e^{-\lambda(t_k-t)}\le\lambda(t_k-t)\) and such that \(\{N_{t_k}\to N_t\}\equiv\{A_k \text{ finitely often}\}\). Verify that \(\{A_k\text{ infinitely often}\}=\cap_kA_k\) and deduce that \(\mathsf{P}(N_{t_k}\to N_t)=1\). Argue that the claim does not depend on the sequence \((t_k)_{k\ge1}\).

 [exse:Poisson-processes-joint-generating-function] (**).

Finish the argument of Lemma 10 by verifying that \[\tfrac{\partial^{k+m}}{\partial s^k\,\partial u^m}G_{N_1(t),N_2(t)}(s,u)\big|_{s=u=0}\equiv k!\,m!\,\mathsf{P}\bigl(N_1(t)=k,N_2(t)=m\bigr)\,,\] similarly, that \[\tfrac{\partial^k}{\partial s^k}G_{N_1(t)}(s)\big|_{s=0}\equiv k!\,\mathsf{P}\bigl(N_1(t)=k\bigr)\,,\quad \tfrac{\partial^m}{\partial u^m}G_{N_2(t)}(u)\big|_{u=0}\equiv m!\,\mathsf{P}\bigl(N_2(t)=m\bigr).\]

 [exse:Poisson-process-last-arrival-average] (**).

Let \((N_t)_{t\ge0}\) be a Poisson process with rate \(\lambda\) generated by a sequence \((T_n)_{n\ge0}\) of arrival times. Find the average \(\mathsf{E}(T_n\mid N_t=n)\) and deduce that it is strictly smaller than \(t\).

 [exse:Poisson-process-uniform-arrivals-given-total-count] (***).

a) Let \(X_1\), …, \(X_n\) be independent with common distribution \(X_i\sim\mathcal{U}(0,t]\). Let \(\mathbf{Y}=(Y_1,\ldots,Y_n)\) be the corresponding order statistic, namely, \(\{Y_1,\ldots,Y_n\}=\{X_1,\ldots,X_n\}\) with \(Y_1<\ldots<Y_n\); in other words, \(Y_1=\min(X_1,\ldots,X_n)\), \(Y_2=\min(X_i:X_i>Y_1)\), …, \(Y_n=\max(X_1,\ldots,X_n)\), so that \(Y\)s are \(X\)s written in the ascending order. Show that the joint pdf of \(Y\)s satisfies \[f_{Y_1,\ldots,Y_n}(y_1,\ldots,y_n)=\frac{n!}{t^n}\,\mathbf{1}_{0<y_1<\ldots<y_n\le t}\,.\] b) Let \(\bigl(N(t)\bigr)_{t\ge0}\), \(N_0=0\), be the Poisson process with rate \(\lambda>0\). Assume that \(N_t=m>0\) for some \(t>0\), so that the first \(m+1\) arrival times satisfy \(0<T_1<\ldots<T_m\le t<T_{m+1}\). By using that \(\mathsf{P}(T_k>t\mid T_{k-1}=s)=e^{-\lambda(t-s)}\) for \(t>s\ge0\), or otherwise, show that \[f_{T_1,\ldots,T_m\mid N_t}(s_1,\ldots,s_m\mid m)=\frac{m!}{t^m}\,\mathbf{1}_{0<s_1<\ldots<s_m\le t}\,.\] c) Deduce the claim of Remark 3, namely, that \((T_1,\ldots,T_n\mid N_t=n)\) from part b) has the same distribution as the order statistic \(\mathbf{Y}=(Y_1,\ldots,Y_n)\) in part a).

 [exse:union-independent-Poisson-point-processes] (**).

In the setting of Remark 4, namely, if \(\mathcal{T}_1:=\{T^1_k\}_{k\ge1}\) is the set of jump epochs in the Poisson process \((N_1(t))_{t\ge0}\) with rate \(\lambda_1>0\) and, similarly, \(\mathcal{T}_2:=\{T^2_k\}_{k\ge1}\) is the set of jump epochs in the Poisson process \((N_2(t))_{t\ge0}\) with rate \(\lambda_2>0\), show that \(\mathcal{T}_1\cap\mathcal{T}_2=\varnothing\) almost surely.

 [exse:stochastic-order-Poisson-processes-via-coupling] (***).

The step-by-step coupling mentioned in Remark 5 can be constructed as follows: let \(t\ge0\) with \(\widetilde{N}_1(t)=k\) and \(\widetilde{N}_2(t)=m\), for some integer \(0\le k\le m\). For independent \(\xi_1\sim\mathsf{Exp}(\lambda_1)\) and \(\xi_2\sim\mathsf{Exp}(\lambda_2-\lambda_1)\) let \(\eta:=\min(\xi_1,\xi_2)\). Then define \[\widetilde{N}_1(s)\equiv\widetilde{N}_1(t)=k\,,\qquad \widetilde{N}_2(s)\equiv\widetilde{N}_2(t)=m\,,\qquad\text{for all $s\in[t,t+\eta)$\,,}\] while \[\bigl(\widetilde{N}_1(t+\eta),\widetilde{N}_2(t+\eta)\bigr) =\begin{cases}(k+1,m+1)\,,& \text{ if } \xi_1<\xi_2\,, \\ (k,m+1)\,,& \text{ if } \xi_1\ge\xi_2\,. \end{cases}\] In words, if the \(\xi_1\) clock rings first, both processes jump at time \(t+\xi_1=t+\eta\); otherwise, the first jump of \(\widetilde{N}_2(s)\) occurs at time \(t+\xi_2=t+\eta\), while \(\widetilde{N}_1(s)\) does not jump for \(s\in[t,t+\eta]\). Show that, starting from \(t=0\) and \(\widetilde{N}_1(0)=\widetilde{N}_2(0)=0\), the above construction generates two trajectories such that \(\widetilde{N}_2(t)\ge\widetilde{N}_1(t)\) for all \(t\ge0\). Further, show that \((\widetilde{N}_1(t))_{t\ge0}\) is a Poisson process with rate \(\lambda_1>0\) while \((\widetilde{N}_2(t))_{t\ge0}\) is a Poisson process with rate \(\lambda_2\ge\lambda_1\).