Poisson Process


Lecture 1

Exponential dist

  • memoryless continuous random variable X
    • \(\Pr(X>t+s|X>s)=\Pr(X>t)\) <==> \(\Pr(X>t+s)=\Pr(X>t)\Pr(X>s)\)
    • survival function of X: \(G(t)=\Pr(X>t)\) (right continuous)
    • G(t+s)=G(t)G(s)
      • easy to check \(G(nt)=G(t)^n\), or \(G(t)=G(nt)^{1/n}\).
      • so \(G(1/p)=G(1)^{1/p}, G(q/p)=G(1/p)^q=G(1)^{q/p}\) for any positive integers p, q.
      • the rationals are dense in the reals and G is right continuous, so \(G(t)=G(1)^t=\exp(-\log(1/G(1))t)\): an exponential dist with rate \(\lambda=\log(1/G(1))\).
    • geometric dist is the discrete analog: survival prob \((1-p)^k\).
    • Exponential dist with rate \(\lambda\)
      • \(f(x)=I(x>0)\lambda e^{-\lambda x}\)
      • \(F(x)=I(x\geq 0)(1-e^{-\lambda x}), G(x)=e^{-\lambda x}\)
      • \(E(X)=1/\lambda, Var(X)=1/\lambda^2\)
      • MGF: \(\phi(t)=E[e^{tX}]=\lambda/(\lambda-t)\) for \(t<\lambda\) (infinite otherwise)
      • Failure/hazard rate
        • r(t)=f(t)/G(t)
        • \(\Pr(X\in (t,t+dt)|X>t)=r(t)dt\).
        • \(G(t) = \exp(-\int_0^t r(s)ds)\).
        • \(r(t)=\lambda\) for exponential dist.
    • minimum of independent exp is still exp (rate is the sum of individual rates).
      • \(\Pr(\min(X,Y)>t)=\Pr(X>t)\Pr(Y>t)=e^{-t\lambda_x-t\lambda_y}\)
        • \(\Pr(\min_iY_i>t)=\prod_i\Pr(Y_i>t)=e^{-t\sum_i\lambda_i^y}\)
      • Pr(an exp is the minimum among a set of independent exps) \(\propto\) its rate
        • \(\Pr(X\lt Y)=E[E[I(X\lt Y)|X]]=E[e^{-X\lambda_y}]=\lambda_x/(\lambda_x+\lambda_y)\)
        • \(\Pr(X\lt Y_1,\cdots,X\lt Y_m)=E[E[I(X\lt Y_i,\forall i)|X]]\)
          • = \(E[e^{-X\sum_i\lambda_{i}^y}]=\lambda_x/(\lambda_x+\sum_i\lambda_i^y)\)
    • sum of n iid Exp(\(\lambda\)) rvs is Gamma(n, \(\lambda\)).
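A minimal simulation sketch of the two minimum properties above (assuming numpy; the rates are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
rates = np.array([0.5, 1.0, 2.5])   # hypothetical rates lambda_i
n = 200_000
# numpy's exponential() takes the scale 1/lambda; one column per rate
samples = rng.exponential(1.0 / rates, size=(n, len(rates)))

# min of independent Exps is Exp(sum of rates)
print("E[min]:", samples.min(axis=1).mean(), "theory:", 1 / rates.sum())

# Pr(X_i is the minimum) is proportional to its rate
freq = np.bincount(samples.argmin(axis=1)) / n
print("Pr(argmin):", freq, "theory:", rates / rates.sum())
```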
  • Hyperexponential dist: finite mixture of Exp dist
    • independent \(X_i\sim\mbox{Exp}(\lambda_i)\) with \(\Pr(T=i)=P_i\), \(\sum_{i=1}^nP_i=1\), \(i=1,\ldots,n\)
    • Consider \(X=X_T\)
    • \(f(x)=\sum_{i=1}^nP_i\lambda_ie^{-\lambda_i x}, \sum_{i=1}^nP_i=1, P_i>0\)
    • \(F(x)=\sum_{i=1}^nP_i(1-e^{-\lambda_ix}), x\geq 0\)
    • \(\Pr(T=i|X>t)=P_ie^{-\lambda_it}/\sum_{j=1}^nP_je^{-\lambda_jt}\) (posterior prob of component i given survival past t)
    • \(r(x)=\sum_{i=1}^nP_i\lambda_ie^{-\lambda_ix}/\sum_{i=1}^nP_ie^{-\lambda_ix}\)
  • MLE for Exp dist
    • iid obs \((x_1,\ldots,x_n)\) from one Exp(\(\lambda\))
      • \(\hat{\lambda}=n/\sum_{i=1}^nx_i\)
      • derivation: \(\ell(\lambda)=n\log\lambda-\lambda\sum_{i=1}^nx_i\); setting \(\ell'(\lambda)=n/\lambda-\sum_{i=1}^nx_i=0\) gives \(\hat{\lambda}\).
  • MLE for hyperexp
    • \((x_1,\ldots,x_n)\) from hyperexp \(\sum_{k=1}^Kp_k\lambda_k e^{-\lambda_k x}\)
      • direct optimization: variable transformation with \(\tau_k=\log(\lambda_k)\)
      • EM algorithm derivation: \(\Pr(X_i,T_i)=\Pr(T_i)\Pr(X_i|T_i)\)
        • \(\Pr(T_i)=\prod_{k=1}^K p_k^{I(T_i=k)}\)
        • \(\Pr(X_i|T_i) = \prod_{k=1}^K \Pr(X_i|T_i=k)^{I(T_i=k)}\)
        • need to compute \(\Pr(T_i=k|X_i)\)
        • \(\max_{p_k,\lambda_k}\sum_iE[\log\Pr(T_i)+\log\Pr(X_i|T_i)|X_i]\)
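A sketch of these EM updates for the hyperexponential mixture, assuming numpy; `em_hyperexp`, the initialization, and the iteration count are illustrative choices rather than a prescribed algorithm:

```python
import numpy as np

def em_hyperexp(x, K=2, iters=500, seed=0):
    """EM for f(x) = sum_k p_k lambda_k exp(-lambda_k x)."""
    rng = np.random.default_rng(seed)
    p = np.full(K, 1.0 / K)                    # mixing probs p_k
    lam = rng.uniform(0.5, 2.0, K) / x.mean()  # rates lambda_k
    for _ in range(iters):
        # E-step: responsibilities w[i, k] = Pr(T_i = k | X_i)
        dens = p * lam * np.exp(-np.outer(x, lam))
        w = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted versions of the Exp MLE above
        p = w.mean(axis=0)
        lam = w.sum(axis=0) / (w * x[:, None]).sum(axis=0)
    return p, lam

# usage on synthetic data from 0.3*Exp(0.5) + 0.7*Exp(3.0)
rng = np.random.default_rng(1)
comp = rng.choice(2, size=20_000, p=[0.3, 0.7])
x = rng.exponential(np.where(comp == 0, 1 / 0.5, 1 / 3.0))
print(em_hyperexp(x))
```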
  • Hypoexponential dist: sum of independent Exp dists with different means
    • independent \(X_i\sim Exp(\lambda_i)\) and \(\lambda_i\neq \lambda_j, \forall i\neq j\)
    • \(X=\sum_{i=1}^nX_i\) is a hypoexponential rv.
    • n=2: \(f_{X_1+X_2}(t)=\int_0^t f_{X_1}(s)f_{X_2}(t-s)ds\)
      • \(f_{X_1}(s)=\lambda_1e^{-\lambda_1s}\) and \(f_{X_2}(t-s)=\lambda_2e^{-\lambda_2(t-s)}\)
      • \(=\lambda_1/(\lambda_1-\lambda_2)\lambda_2e^{-\lambda_2t}+\lambda_2/(\lambda_2-\lambda_1)\lambda_1e^{-\lambda_1t}\)
    • in general \(f_{X_1+\ldots+X_n}(t)=\sum_{i=1}^nC_{i,n}\lambda_ie^{-\lambda_it}\)
      • \(C_{i,n}=\prod_{j\neq i} \lambda_j/(\lambda_j-\lambda_i)\)
      • induction proof: assume the form holds for \(n-1\), convolve with \(f_{X_n}\), and collect the exponential terms to recover \(C_{i,n}\).
    • survival function: \(G(t)=\sum_{i=1}^nC_{i,n}e^{-\lambda_it}\)
      • failure rate: \(r(t)= \frac{\sum_{i=1}^nC_{i,n}\lambda_ie^{-\lambda_it}}{\sum_{i=1}^nC_{i,n}e^{-\lambda_it}}\)
      • \(\lim_{t\to\infty} r(t) = \min_j\lambda_j\)
      • for large t the failure rate approaches the smallest individual rate: the longest-lived component dominates the tail.
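A quick numerical sketch of the coefficients \(C_{i,n}\) and the failure-rate limit (numpy assumed; the rates are illustrative):

```python
import numpy as np

lam = np.array([1.0, 2.0, 5.0])   # distinct hypothetical rates
n = len(lam)
C = np.array([np.prod([lam[j] / (lam[j] - lam[i])
                       for j in range(n) if j != i]) for i in range(n)])

def r(t):
    # failure rate f(t)/G(t) of the hypoexponential
    return (C * lam * np.exp(-lam * t)).sum() / (C * np.exp(-lam * t)).sum()

for t in [0.5, 2.0, 10.0]:
    print(t, r(t))   # tends to min(lam) = 1.0 as t grows
```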

Lecture 2

Poisson process

Counting process

  • A stochastic process \(\{N(t),t\geq 0\}\) is said to be a counting process if N(t) represents the total number of 'events' that occur by time t.
    • \(N(t)\geq 0\).
    • N(t) is integer valued.
    • If s < t, then \(N(s)\leq N(t)\) (monotone increasing).
    • For s < t, N(t) − N(s) equals the number of events that occur in the interval (s, t].
  • A counting process has independent increments if the numbers of events that occur in disjoint time intervals are independent.
  • A counting process has stationary increments if the distribution of the number of events that occur in any interval of time depends only on the length of the time interval.

Poisson process

  • Counting process \(\{N(t),t\geq 0\}\) is said to be a Poisson process having rate \(\lambda, \lambda>0\), if
    • N(0) = 0.
    • The process has independent increments.
    • The number of events in any interval of length t is Poisson distributed with mean \(\lambda t\). That is, \(\forall s,t\geq 0\),
      • \(\Pr[N(t + s)−N(s)=n]=e^{-\lambda t}(\lambda t)^n/n!, n=0,1,\cdots\)
      • i.e., stationary increments with \(E[N(t)]=\lambda t\)
      • note \(\Pr[N(t + s)−N(s)=0]=e^{-\lambda t}\approx 1-\lambda t\) and \(\Pr[N(t + s)−N(s)=1]=\lambda t e^{-\lambda t}\approx \lambda t\)
  • equivalent definition: counting process with independent and stationary increments
    • N(0) = 0.
    • \(\Pr[N(h)=1]=\lambda h+ o(h)\), \(\Pr[N(h)\geq 2]=o(h)\)
      • \(f(h)=o(h)\) <==> \(\lim_{h\to 0}f(h)/h=0\).
  • Proof of equivalence
    • define \(g(t)=E[e^{−\mu N(t)}]\)
    • decompose \(g(t+h)=g(t)E[e^{−\mu N(h)}]\)
      • for small \(h\), split on \(N(h)=0\), \(N(h)=1\), \(N(h)\geq 2\), with probs \(1-\lambda h+o(h)\), \(\lambda h+o(h)\), \(o(h)\)
      • \(E[e^{−\mu N(h)}]=1-\lambda h + e^{-\mu}\lambda h+ o(h)\)
    • derive \(g(t)=\exp\{\lambda t(e^{-\mu}-1)\}\), the transform of a Poisson rv with mean \(\lambda t\), hence \(N(t)\) is Poisson.
  • Intuition based on Poisson approx to Binomial dist
    • divide \([0,t]\) into k equal parts with \(k\to\infty\)
    • each small interval contains one event with prob \(\lambda t/k\) and more than one with negligible prob
      • hence can be treated as a Bernoulli rv.
    • with independent and stationary increments, we will have a Binomial dist with \((k,\lambda t/k)\)
    • for large \(k\), the Binomial dist will approach Poisson with mean \(k\lambda t/k=\lambda t\)
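The Binomial-to-Poisson intuition in a few lines (numpy assumed; parameters illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t, k = 2.0, 3.0, 10_000
# k Bernoulli slots, each holding one event with prob lam*t/k
counts = rng.binomial(k, lam * t / k, size=100_000)
# a Poisson(lam*t) rv has equal mean and variance
print("mean:", counts.mean(), "var:", counts.var(), "theory:", lam * t)
```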

Interarrival and Waiting Time Distributions

  • \(T_n\) denote the elapsed time between the \((n−1)\)-st and the \(n\)-th event.
    • sequence \(\{T_n, n=1,2,\cdots\}\) is called the sequence of interarrival times.
  • Proposition 5.1 \(T_n\), n=1,2,…, are independent identically distributed exponential random variables having mean \(1/\lambda\).
    • \(\Pr(T_1>t)=\Pr[N(t)=0]=e^{-\lambda t}\)
    • \(\Pr(T_2>t)=E[\Pr(T_2>t|T_1)]=e^{-\lambda t}\)
      • \(\Pr(T_2>t|T_1=s)=\Pr(\mbox{0 events in } (s,s+t]|T_1=s)=\Pr(\mbox{0 events in } (s, s + t])=e^{-\lambda t}\).
    • The assumption of stationary and independent increments is basically equivalent to asserting that, at any point in time, the process probabilistically restarts itself.
      • That is, the process from any point on is independent of all that has previously occurred (by independent increments), and also has the same distribution as the original process (by stationary increments).
      • In other words, the process has no memory, and hence exponential interarrival times are to be expected.
  • \(S_n\), the arrival time of the \(n\)-th event, also called the waiting time until the \(n\)-th event.
    • \(S_n=\sum_{i=1}^nT_i, n\geq 1\).
      • hence \(S_n\) has a gamma distribution with parameters \(n\) and \(\lambda\)
      • \(f_{S_n}(t)=\lambda e^{-\lambda t} (\lambda t)^{n-1}/(n-1)!, t\geq 0\)
    • alternatively
      • \(N(t)\geq n\) <=> \(S_n\leq t\)
        • \(F_{S_n}(t)=\Pr(S_n\leq t)=\Pr[N(t)\geq n]=\cdots\) (Poisson dist)
      • \(\Pr(t\lt S_n\lt t+h)=\Pr(N(t)=n-1, \mbox{one event in } (t,t+h))+o(h)\)
        • \(=e^{-\lambda t} (\lambda t)^{n-1}/(n-1)![\lambda h+o(h)] + o(h)\)
        • \(=\lambda e^{-\lambda t} (\lambda t)^{n-1}/(n-1)!h+o(h)\)
  • equivalent definition of Poisson process
    • Start with an iid sequence \(\{T_n,n\geq 1\}\) of exponential rvs with mean \(1/\lambda\)
    • Define a counting process with the \(n\)-th event occurring at time \(S_n=\sum_{i=1}^nT_i\)
      • \(N(t)=\max\{n: S_n\leq t\}, S_0=0\).
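A sketch of this constructive definition (numpy assumed): build \(N(t)\) from iid exponential interarrivals and check it against the Poisson(\(\lambda t\)) moments.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t = 1.5, 4.0

def N_of_t():
    s, n = 0.0, 0
    while True:
        s += rng.exponential(1.0 / lam)   # next interarrival T_n ~ Exp(lam)
        if s > t:
            return n
        n += 1

counts = np.array([N_of_t() for _ in range(50_000)])
print("mean:", counts.mean(), "var:", counts.var(), "theory:", lam * t)
```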

Lecture 3

Multiple Poisson processes

  • Consider a Poisson process \(\{N(t),t\geq 0\}\) having rate \(\lambda\)
    • each time an event occurs it is classified as either a type I or a type II event.
    • each event is classified as a type I event with probability \(p\) or a type II event with probability \(1−p\), independently of all other events
    • Let \(N_1(t)\) and \(N_2(t)\) denote respectively the number of type I and type II events occurring in [0, t]. Note \(N(t)=N_1(t)+N_2(t)\).
    • Proposition 5.2 \(\{N_1(t),t\geq 0\}\) and \(\{N_2(t),t\geq 0\}\) are both Poisson processes having respective rates \(\lambda p\) and \(\lambda(1−p)\). Furthermore, the two processes are independent.
      • \(N_1(0)=0\) follows from the fact that N(0)=0.
      • \(\{N_1(t),t\geq 0\}\) inherits the stationary and independent increment properties of \(\{N(t)\}\).
        • This is true because the distribution of the number of type I events in an interval can be obtained by conditioning on the number of events in that interval, and the distribution of this latter quantity depends only on the length of the interval and is independent of what has occurred in any nonoverlapping interval.
      • \(\Pr[N_1(h)=1]=\Pr[N_1(h)=1|N(h)=1]\Pr[N(h)=1]+\Pr[N_1(h)=1|N(h)\geq 2]\Pr[N(h)\geq 2]\)
        • \(=p(\lambda h + o(h)) + o(h) = \lambda ph + o(h)\)
      • \(\Pr[N_1(h)\geq 2]\leq\Pr[N(h)\geq 2] = o(h)\)
      • Since the probability of a type I event in (t,t+h) is independent of everything occurring in intervals that do not overlap (t,t+h), it is in particular independent of when type II events occur; hence the two Poisson processes are independent.
  • Consider a Poisson process \(\{N(t)\}\) with rate \(\lambda\).
    • each time an event occurs it is classified as type \(j\) with prob \(p_j\)
      • \(j=1,\ldots,m, \sum_{j=1}^mp_j=1\).
    • \(N_j(t)\) denote the number of type j events by time t.
    • \(\{N_j(t)\},j=1,\ldots,m\) are independent Poisson processes with rate \(\lambda p_j\).
  • Consider \(m\) independent Poisson processes \(\{N_j(t)\}\) with rate \(\lambda_j\) (Ex 5.17, Page 306)
    • \(E[N]\)? N: number of events needed to have a complete collection of at least one of each process
    • \(X_j\) denote the time of the first event of the \(j\)-th process
    • \(X=\max_jX_j\)
      • time at which a complete collection of events is amassed
      • \(\Pr(X\lt t)=\prod_j\Pr(X_j\lt t)=\prod_j(1-e^{-\lambda_jt})\)
      • \(E[X]=\int_0^{\infty}\Pr(X>t)dt\)
    • \(T_i\) denote the \(i\)-th interarrival time of the Poisson process that counts the number of events
      • \(X=\sum_{i=1}^N T_i\).
      • \(T_i\) are independent exp with rate \(\lambda=\sum_{j=1}^m\lambda_j\) (minimum of m Exps)
      • \(T_i\) is independent of \(N\)
      • \(E[X|N]=NE[T_i]=N/\lambda\)
        • \(E[X]=E[N]/\lambda\), i.e., \(E[N]=\lambda E[X]\)
    • expected number of types that appear only once in the complete collection
      • Need \(\sum_{i=1}^m\Pr(I_i=1)\), where \(I_i=1\) if there is only one type i event in the final set
      • \(\Pr(I_i=1)=\Pr(X_j\lt S_i, \forall j\neq i)\) where \(S_i\) is the time when the second type i event happens.
        • \(S_i\) follows Gamma(2,\(\lambda_i\)) dist: \(f(x)=\lambda_i^2xe^{-\lambda_ix}\)
        • \(\Pr(X_j\lt S_i, \forall j\neq i|S_i)=\prod_{j\neq i}[1-\exp(-\lambda_jS_i)]\) (exp dist)
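A numerical sketch of Ex 5.17 (numpy assumed; rates illustrative): evaluate \(E[X]=\int_0^{\infty}\Pr(X>t)dt\) on a grid, then \(E[N]=\lambda E[X]\).

```python
import numpy as np

lam = np.array([1.0, 2.0, 4.0])    # hypothetical rates lambda_j
t = np.linspace(0, 50, 200_001)    # the tail beyond 50 is negligible here
G = 1.0 - np.prod(1.0 - np.exp(-np.outer(lam, t)), axis=0)  # Pr(X > t)
EX = ((G[:-1] + G[1:]) / 2 * np.diff(t)).sum()              # trapezoid rule
print("E[X]:", EX, " E[N]:", lam.sum() * EX)
```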
  • Two independent Poisson Processes
    • Consider \(\Pr(S_n^1\lt S_m^2)\)
    • \(S_n^1\) denote the time of the \(n\)-th event of the first process, and \(S_m^2\) the time of the \(m\)-th event of the second process
    • Probability that n events occur in one Poisson process before m events have occurred in the second
      • type 1 event happens before type 2 event with prob \(p_1=\Pr(S_1^1\lt S_1^2)=\lambda_1/(\lambda_1+\lambda_2)\) (exp dist property)
      • so \(S_n^1\lt S_m^2\) <==> at least n of the first n+m-1 events are type 1
      • \(\Pr(S_n^1\lt S_m^2)=\sum_{k=n}^{n+m-1} \binom{n+m-1}{k} p_1^k(1-p_1)^{n+m-1-k}\)
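A simulation check of this formula (numpy assumed), using the fact from the waiting-time section that \(S_n\) for a rate-\(\lambda\) process is Gamma(\(n,\lambda\)):

```python
import numpy as np
from math import comb

rng = np.random.default_rng(0)
l1, l2, n, m = 1.0, 2.0, 3, 4
S1 = rng.gamma(n, 1 / l1, 200_000)   # S_n^1 ~ Gamma(n, l1)
S2 = rng.gamma(m, 1 / l2, 200_000)   # S_m^2 ~ Gamma(m, l2)

p1 = l1 / (l1 + l2)
theory = sum(comb(n + m - 1, k) * p1**k * (1 - p1)**(n + m - 1 - k)
             for k in range(n, n + m))
print("simulated:", (S1 < S2).mean(), "theory:", theory)
```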

Lecture 4

Conditional Distribution of the Arrival Times

  • Consider \(\Pr[T_1\lt s|N(t)=1]=\Pr(\mbox{1 event in } [0,s))\Pr(\mbox{0 events in }[s,t])/\Pr[N(t)=1]=s/t\)
    • Poisson dist: \(\lambda se^{-\lambda s} e^{-\lambda(t-s)}/[\lambda te^{-\lambda t}]=s/t\)
    • exactly one event has taken place by time t, and the time at which the event occurred is uniform over [0,t].
  • Joint density of the order statistics of iid continuous random variables \(Y_{(1)},\ldots,Y_{(n)}\)
    • \(f(y_1,\ldots,y_n)=n!\prod_{i=1}^nf(y_i), y_1\lt y_2\lt \ldots\lt y_n\).
    • When \(Y_i\sim U[0,t]\), \(f(y_1,\ldots,y_n)=n!/t^n, 0\lt y_1\lt\ldots\lt y_n\lt t\)
  • Theorem 5.2 Given that N(t)=n, the n arrival times \(S_1,\ldots,S_n\) have the same distribution as the order statistics corresponding to n independent random variables uniformly distributed on the interval (0,t).
    • event that \(\{S_1=s_1,\ldots,S_n=s_n,N(t)=n\}\) is equivalent to the event that the first n+1 interarrival times satisfy \(\{T_1=s_1,\ldots,T_n=s_n−s_{n−1},T_{n+1}>t−s_n\}\)
    • \(T_i\) are independent exp with rate \(\lambda\) (mean \(1/\lambda\))
    • \(f(s_1,\ldots,s_n|N(t)=n)=f(s_1,\ldots,s_n,n)/\Pr[N(t)=n]=n!/t^n, 0\lt s_1\lt\ldots\lt s_n\lt t\)
    • under the condition that n events have occurred in (0,t), the times \(S_1,\ldots,S_n\) at which events occur, considered as unordered random variables, are distributed independently and uniformly in the interval (0,t).
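A simulation sketch of Theorem 5.2 (numpy assumed): conditional on \(N(t)=n\), the arrival times should match sorted Uniform(0,t) draws, whose means are \(tk/(n+1)\).

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t, n = 2.0, 5.0, 4
accepted = []
while len(accepted) < 5_000:
    times = np.cumsum(rng.exponential(1 / lam, 60))  # 60 draws >> N(t) w.h.p.
    times = times[times <= t]
    if len(times) == n:               # condition on N(t) = n
        accepted.append(times)
print("mean S_k:", np.mean(accepted, axis=0))
print("theory:  ", t * np.arange(1, n + 1) / (n + 1))
```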
  • Given a Poisson process \(N(t)\) with rate \(\lambda\)
    • an event at time t is classified as type \(i=1,\ldots,k\) with prob \(P_i(t), \sum_{i=1}^kP_i(t)=1\).
    • \(N_i(t)\): # of type i events in [0,t]
    • \(N_i(t)\) independently follows Poisson dist with mean \(\lambda\int_0^tP_i(s)ds\)
      • nonhomogeneous Poisson process: nonstationary increments
      • intensity function: \(\lambda_i(t)=\lambda P_i(t)\)
      • mean value function: \(m_i(t)=\lambda\int_0^tP_i(s)ds\)
    • Given an event in [0,t]
      • if it happened at time s, then prob of being type i event is \(P_i(s)\)
      • recall that given one event in [0,t], its time is uniform on [0,t]
      • so the prob of being a type i event is \(\bar{P}_i=\frac{1}{t}\int_0^t P_i(s)ds\)
    • conditional on \(N(t)=\sum_{i=1}^k n_i\), \(\{N_i(t)=n_i\}\) follows a multinomial dist with probs \(\bar{P}_i\)
      • marginally \(N(t)\) follows a Poisson dist with mean \(\lambda t\)
      • can derive the joint dist of \(\{N_i(t)\}\) => independent Poisson

Generalizations of the Poisson Process

  • Nonhomogeneous Poisson Process \(N(t)\)
    • independent but nonstationary increments
    • intensity function \(\lambda(t)\): \(\Pr(N(t+h)-N(t)=1)=\lambda(t)h+o(h), \Pr(N(t+h)-N(t)\geq 2)=o(h)\)
      • mean value function: \(m(t)=\int_0^t\lambda(s)ds\)
      • hazard/failure rate (constant for regular Poisson process)
      • waiting time distribution
        • \(f(t),F(t),G(t)\) derived from \(\lambda(t)=f(t)/G(t)\)
        • \(G(t)=e^{-m(t)}, f(t)=\lambda(t)e^{-m(t)}\)
        • \(f(t|T\geq s)=f(t)/G(s)=\lambda(t)G(t)/G(s)=\lambda(t)/e^{m(t)-m(s)}\)
    • \(N(t+s)-N(s)\) follows Poisson dist with mean \(m(t+s)-m(s)\)
      • study \(E[e^{-uN(t+h)}]\)
      • Poisson approx to limit of sum of independent Bernoulli
    • arrival time of first event: \(\Pr(T_1\leq t)=\Pr[N(t)\geq 1]=1-e^{-m(t)}\) with PDF \(f(t)=\lambda(t)e^{-m(t)}\)
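A standard way to simulate such a process is thinning: run a homogeneous process at an upper-bound rate \(\lambda_{\max}\) and keep a point at time s with prob \(\lambda(s)/\lambda_{\max}\). A sketch (numpy assumed; the intensity is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 10.0
lam_fn = lambda s: 1.0 + np.sin(s) ** 2   # hypothetical intensity lambda(t)
lam_max = 2.0                             # upper bound for lam_fn on [0, T]

def nhpp():
    pts, s = [], 0.0
    while True:
        s += rng.exponential(1 / lam_max)       # candidate arrival
        if s > T:
            return pts
        if rng.random() < lam_fn(s) / lam_max:  # accept w.p. lam(s)/lam_max
            pts.append(s)

counts = [len(nhpp()) for _ in range(20_000)]
m_T = T + T / 2 - np.sin(2 * T) / 4   # m(T) = int_0^T (1 + sin^2 s) ds
print("mean count:", np.mean(counts), "theory m(T):", m_T)
```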
  • Compound Poisson Process: \(X(t)=\sum_{i=1}^{N(t)}Y_i\)
    • \(N(t)\) is a Poisson process with rate \(\lambda\)
    • \(Y_i\) are iid rvs: \(\mu=E(Y_i), \sigma^2=Var(Y_i), E(Y_i^2)=\mu^2+\sigma^2\)
      • \(E[X(t)]=E[E[X(t)|N(t)]]=E[N(t)\mu]=\mu\lambda t\)
      • \(Var[X(t)]=Var[E[X(t)|N(t)]]+E[Var[X(t)|N(t)]]=\mu^2\lambda t + \sigma^2\lambda t\)
    • when \(Y_i=1\): Poisson process \(N(t)\)
    • if \(Y_i\) follows Bernoulli with prob \(p\) => Poisson process with rate \(\lambda p\)
    • if \(Y_i\) follows a finite dist with value \(\alpha_j\) and prob of \(p_j\)
      • \(X(t)=\sum_i\sum_j I(Y_i=\alpha_j)Y_i=\sum_j\sum_{i:Y_i=\alpha_j}Y_i=\sum_j \alpha_j N_j(t)\)
      • \(N_j(t)=\sum_{i=1}^{N(t)} I(Y_i=\alpha_j)\) a Poisson process with rate \(\lambda p_j\)
        • note \(I(Y_i=\alpha_j)\) is Bernoulli with prob of \(p_j\)
    • a sum of independent Poisson rvs is still a Poisson rv (approximately a sum of Bernoullis)
      • by the CLT, \(N(t)\approx\mathcal{N}(\lambda t,\lambda t)\) as \(t\to\infty\)
      • likewise \(N_j(t)\) and \(X(t)\) are approximately normal for large t
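Finally, a simulation sketch of the compound Poisson moment formulas (numpy assumed; normal \(Y_i\) are an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t, mu, sigma = 2.0, 3.0, 1.5, 0.5
N = rng.poisson(lam * t, 100_000)
X = np.array([rng.normal(mu, sigma, k).sum() for k in N])
print("E[X(t)]:", X.mean(), "theory:", mu * lam * t)
print("Var[X(t)]:", X.var(), "theory:", (mu**2 + sigma**2) * lam * t)
```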