(2003-10-24)
Poisson processes and the becquerel (Bq) SI unit:
What's a simple way to define a Poisson process?

If a so-called arrival keeps occurring with probability
λ dt in the infinitesimal time dt,
then the process of such arrivals is called a Poisson process
when λ does not change over time
(and a modulated Poisson process otherwise).

In the International System of Units (SI) where times are expressed in seconds (s),
the proper unit for the parameter λ is the becquerel (Bq),
the SI unit of activity.
The becquerel may be loosely described as the
"stochastic reciprocal of the second"
(the hertz is another "reciprocal of the second"
which is used for periodic phenomena, not random ones).
This unit is named after the discoverer of radioactivity,
Antoine Henri Becquerel
(1852-1908, X1872, Nobel 1903).

Just like motion need not be uniform
(i.e., of constant velocity V),
an arrival process need not be a Poisson process
(of constant activity λ).

The term arrival is used for the beginning
of a random event which may also be endowed with some duration
(called packet size in a specialized IT context).
This may also be emphasized by calling arrival processes "point processes".

Basic Properties of Poisson Processes:

If you start observing some Poisson process (of activity λ)
at time t = 0, the probability you'll see
the first arrival between t and t+dt is:

λ exp(-λt) dt

Such a probability density function is called an exponential distribution.

It's probably best to explain this in an "elementary" way:
For the first arrival to occur as prescribed, two independent things
must happen jointly (with a probability which is thus the product of the
two separate probabilities):
First, no arrival should occur before time t. Second, an arrival
should occur between t and t+dt. Because of the way activity is defined,
the latter occurs with probability λ dt...
To estimate the probability of the former, let's divide the time t into k tiny intervals
of duration t/k. The probability of having no arrival in each such
interval is very nearly (1-λt/k) so the probability of not having
any arrival in any interval is close to
(1-λt/k)^k.
As k becomes very large, this tends to
exp(-λt), which is thus the probability
of no arrivals before time t.
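This limit is easy to check numerically; here is a quick Python sketch (the values λ = 0.7 Bq and t = 2 s are arbitrary illustrative choices, not from the text):

```python
import math

lam, t = 0.7, 2.0               # arbitrary activity (Bq) and observation time (s)
k = 10**6                       # number of tiny intervals of duration t/k
no_arrival = (1 - lam*t/k)**k   # probability of no arrival in any of the k intervals
limit = math.exp(-lam*t)        # the claimed limit exp(-lambda t)
print(no_arrival, limit)        # the two values agree to about 6 decimal places
```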

The mean time between arrivals in a Poisson process is 1/λ
(in a memoryless process, waiting between arrivals is like waiting for
the first arrival discussed above).
The activity λ of a Poisson process is sometimes defined this way.

In a Poisson process with an activity of λ becquerels,
the probability of observing exactly n arrivals in t seconds is given by:

P_n  =  exp(-λt) (λt)^n / n!

Such a discrete probability distribution is called a Poisson distribution.

We've previously explained the special case n=0 (no arrivals in time t).
We may offer a similar explanation for the general case by considering k intervals
of a duration t/k short enough to make the possibility of multiple arrivals
in a single interval utterly negligible.
Assume k is much larger than n.
The probability of an arrival in exactly n intervals is nearly
C(k,n) (λt/k)^n (1-λt/k)^{k-n},
which goes toward the advertised limit for very large values of k, because the limit
of C(k,n)/k^n turns out to be 1/n!
(this much is left as an interesting exercise for the reader).
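A numerical sketch in Python of that convergence (the values λt = 2 and n = 3 are arbitrary illustrative choices):

```python
import math

lam_t = 2.0      # lambda*t, the dimensionless mean number of arrivals
n = 3            # number of arrivals considered
k = 10**6        # number of tiny intervals
p = lam_t / k    # probability of an arrival in any one tiny interval

# Binomial probability of arrivals in exactly n of the k intervals
binomial = math.comb(k, n) * p**n * (1 - p)**(k - n)
# Poisson limit: exp(-lambda t) (lambda t)^n / n!
poisson = math.exp(-lam_t) * lam_t**n / math.factorial(n)

# The limit of C(k,n)/k^n is 1/n! (the "interesting exercise")
ratio = math.comb(k, n) / k**n
```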

(2003-10-28)
Simulating a Poisson Process
How do I simulate a Poisson process
[of activity λ] on a computer?

In most computer languages, a uniform random generator is available
that gives a number between 0 and 1;
the result is between x and x+dx with probability |dx|.
Let's call X a random variable so obtained and consider the random variable Y:

Y = - ln ( X ) / λ

So defined, the random variable Y is exponentially distributed in
[0,+∞)...
That's because the values x and y of X and Y verify
x = exp(-λy),
so that Y is between y and y+dy with probability
|dx| = | -λ exp(-λy) dy |.

All you have to do, then, is compute successive random values of Y (using the
built-in random generator for the value of X) and let these be the successive durations
between the arrivals of your simulated Poisson process.
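A minimal Python sketch of that recipe (the function name and parameters are mine; 1-X is used in place of X only to avoid taking the logarithm of an exact zero, which changes nothing in distribution):

```python
import math
import random

def poisson_arrival_times(lam, horizon):
    """Arrival times of a simulated Poisson process of activity lam (Bq),
    up to the given horizon (s), via the inverse transform Y = -ln(X)/lam."""
    times, t = [], 0.0
    while True:
        # random.random() is uniform in [0,1); 1 - random.random() is in (0,1]
        t += -math.log(1.0 - random.random()) / lam   # exponential gap
        if t > horizon:
            return times
        times.append(t)

random.seed(1)
arrivals = poisson_arrival_times(lam=2.5, horizon=10000.0)
mean_gap = arrivals[-1] / len(arrivals)   # should be close to 1/lam = 0.4 s
```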

(2003-10-24)
Markov Chains & Markov Processes
What are Markov processes?

There are two related flavors of Markov processes, discrete and continuous:

Discrete Markov Chains:

A discrete Markov chain is a sequence of random variables (X_i)
where the probability distribution of X_{n+1}
depends only on the current value of X_n.

A Markov chain is said to be finite if the random variables take only
finitely many values. A transition probability matrix
is then defined, whose element at row i and column j is the probability for
X_{n+1} to be in state j, when X_{n} is in state i:

P_ij  =  P( X_{n+1} = j | X_n = i )

Any row ( i ) of such a transition matrix consists of
nonnegative elements which must add up to unity.
A matrix with this property is called a
stochastic matrix.
The product of such matrices is a stochastic matrix,
which gives the conditional probabilities after the sequence of random transitions
described by each factor.

After n transitions,
the unconditional probability to be in the j-th state
is the j-th coordinate of a row vector p_n.
The following relation holds:

p_{n+1}  =  p_n P     [therefore: p_n = p_0 P^n, if P is constant]

Although one could consider modulated Markov chains
(whose transition matrices vary with n),
Markov processes are usually understood to be stationary
unless otherwise specified
(their transition probabilities are constant).
Markov chains are often time series evolving discretely
from one finite time interval to the next,
which are best known as Discrete Time Markov Chains (DTMC).
In the usual stationary case,
these are sometimes called "time homogeneous".
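As a small illustrative sketch of the relation p_{n+1} = p_n P (the 2-state stochastic matrix below is a made-up example, not one from the text), the distribution can simply be iterated until it stops changing:

```python
# Hypothetical 2-state stochastic matrix: each ROW adds up to 1.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(p, P):
    """One transition: the row vector p_{n+1} = p_n P."""
    return [sum(p[i] * P[i][j] for i in range(len(P))) for j in range(len(P))]

p = [1.0, 0.0]            # start surely in state 0
for _ in range(100):
    p = step(p, P)
# p now approximates the stationary distribution (5/6, 1/6) of this matrix
```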

Continuous Markov Processes:

When considered as arrival processes occurring over time,
these are known as Continuous Time Markov Chains (CTMC).
For such a process, a generator matrix Q is defined
(also called a transition rate matrix) as the time derivative of the
stochastic matrix that gives the probability of ending up
in state j at time t, starting from state i at time 0.
For a stationary Markov process, Q is constant.

Pui (2003-10-01, 2003-10-03; homework)
Handling Telephone Calls:
5 erlangs of load is offered to 15 circuits for half an hour,
and 15 erlangs in the next half-hour.
The mean call holding time is 3 minutes.
Use the Erlang B Formula to find the blocking probability for each half-hour
and for the whole hour.
Compare this with a uniform load of 10 erlangs.

Capitalization:
There's a delicate spelling issue botched by many authors.
Like the name of any other unit, "erlang" is
never capitalized in English.
The symbol for the unit is "E", which is capitalized,
as is always the case for a unit named after a person (e.g., "Hz" for hertz).
The plural form is "erlangs", whereas the symbol "E" is invariable.
Thus, correct usage includes "five erlangs", "5 erlangs" or "5 E"
(while "five E", "five Erlangs", "5 e" and "5 Es"
are unacceptable).
For example, the North-American traffic unit called
centum-call-second (CCS) corresponds to 100 call-seconds per hour,
and is defined via:

1 E = 36 CCS

The erlang unit measures incoming telecommunication traffic and
actually corresponds to what would continuously occupy one single channel or trunk
(called "circuit" in the above question)
if each call always came exactly when the previous one ended.
However, as calls do "bunch up", the peaks in an
average load of a erlangs
require a capacity n
(the number of trunks) significantly greater than a.
Even so, there will always be
some probability P
(called "Grade of Service", GoS or GOS) that an incoming call is blocked
(a telephone caller would get a busy signal to
indicate no circuit is available).

Under the simplifying assumption
("blocked calls cleared") that rejected callers don't keep trying
immediately after getting busy signals, the relation between
a, n and P was worked out by the
Danish scientist Agner Krarup Erlang (1878-1929)
and is known as the Erlang B Formula:

1/P  =  (n!/a^n)  Σ_{k=0}^{n}  a^k/k!

(Another relation, called Erlang C, is relevant for "blocked calls delayed";
when callers are put on hold until they can be handled.)
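On a computer, the Erlang B formula is usually evaluated with the well-known stable recurrence B(0) = 1, B(k) = a·B(k-1) / (k + a·B(k-1)), which avoids the huge factorials; a Python sketch:

```python
def erlang_b(n, a):
    """Blocking probability ("blocked calls cleared") for a erlangs
    of offered load on n circuits, via the standard recurrence
    B(0) = 1,  B(k) = a*B(k-1) / (k + a*B(k-1))."""
    b = 1.0
    for k in range(1, n + 1):
        b = a * b / (k + a * b)
    return b
```

With n = 15 circuits, erlang_b(15, 5), erlang_b(15, 10) and erlang_b(15, 15) reproduce the three blocking probabilities quoted below.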

Applying the Erlang B formula to the question at hand
(n = 15), we obtain:

For a = 5, P = 0.0001572562863... (one call in 6359 is blocked)

For a = 10, P = 0.036496945472... (5 calls in 137 are blocked)

For a = 15, P = 0.180316399143... (11 calls in 61 are blocked)

As we're told that calls are expected to have the same duration
[the actual value of 3 min is irrelevant here] during either the
quiet half-hour (5 E) or the busy one (15 E),
we know that a random call is 3 times more likely to occur
during the busy period than during the quiet one.
Therefore, the probability that a given random call is blocked is about 13.53%.
More precisely:  (1/4)(0.0001572562863) + (3/4)(0.180316399143)  ≈  0.135276613

Although as many calls take place under these conditions as with
a uniform load of 10 erlangs (200 calls of 3 min per hour),
the blocking probability of 13.53% is much larger than with a uniform load (3.65%),
because most calls occur during the busy
period, during which the circuits are far more likely to be saturated.

Pui Yu Carol Chiu (2003-10-23; e-mail)
With the Markov-Modulated Poisson Process (MMPP) specified below,
how do I build a sample of arrivals for 175278 intervals over a 1-hour period?

The same way you would for 175277 intervals...

"Markov-Modulated Poisson Processes" are of
fairly recent origin
(1990)...
An MMPP is a stochastic arrival process where the instantaneous
activity (λ)
is given by the state of a Markov process, instead of being constant
(as would be the case in an ordinary Poisson process).
The term Switched Poisson Process (SPP) may be used when
the Markov chain has only 2 states, as is the case here.

Warning / Erratum

The rest of this article deals with a Poisson process
modulated by a discrete Markov chain
(with transitions at regular intervals).
This is not what Pui had in mind, since an
MMPP is normally modulated by a continuous Markov process.
We'll fix this as time permits. Sorry.

An MMPP is best defined by the (constant) transition matrix P of
its Markov chain and the (diagonal) matrix Λ
of the activity rates for all the Markov states.
Since an MMPP is also a special case of a Batch Markovian Arrival Process (BMAP),
it may be described with the notations used in that context, namely:

D_0 = P - I - Λ,    D_1 = Λ,    and    D_n = 0  for  n > 1

P  =  | 0.65   0.14 |          Λ  =  | 3.15   0    |
      | 0.35   0.86 |                | 0      0.14 |

For the question at hand, this translates into the values of
P and Λ given above.
For each interval, you would have whichever activity λ
(either 3.15 Bq or 0.14 Bq) is given by the Markov process.
Both activities are fairly low with respect to the durations of the intervals
(about 20.54 ms) and multiple arrivals in a single interval are rather rare...
Also, we may remark that:

P^k  =  | 2/7   2/7 |  +  0.51^k  | 5/7   -2/7 |
        | 5/7   5/7 |             | -5/7   2/7 |

which tends, as k → ∞, to the limit matrix

| 2/7   2/7 |
| 5/7   5/7 |

P^k expresses the conditional probabilities for the k-th interval,
from a given initial state (which becomes ultimately irrelevant,
because both columns of the limit are identical).
Convergence toward the limit is quite fast
and intervals are rather short (for the activities involved),
so the thing will thus pretty much behave like an ordinary
Poisson process whose activity (1.00 Bq) is the mean activity:

1.00 = (2/7) (3.15) + (5/7) (0.14)
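That closed form for P^k can be verified numerically; a Python sketch, using the column convention of this section (each column of P adds up to 1):

```python
def matmul(A, B):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

P = [[0.65, 0.14],
     [0.35, 0.86]]

# Direct power P^10 by repeated multiplication
Pk = P
for _ in range(9):
    Pk = matmul(Pk, P)

# Closed form P^k = M + 0.51^k N, with the two matrices given in the text
k = 10
M = [[2/7, 2/7], [5/7, 5/7]]
N = [[5/7, -2/7], [-5/7, 2/7]]
closed = [[M[i][j] + 0.51**k * N[i][j] for j in range(2)] for i in range(2)]
```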

The details of arrivals will differ from that "average" Poisson process, though.
Quiet intervals are likely (86%) to be followed by quiet intervals, and busy intervals
tend to be followed by busy ones as well (65%).
Arrivals tend to "bunch up" more in the MMPP than they would
in a Poisson process of 1 Bq.
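The whole scheme can also be simulated directly. The sketch below is mine, not part of the original assignment: it modulates the activity with the discrete chain described above, and approximates each short interval by a single Bernoulli trial (multiple arrivals in one interval being neglected, as discussed):

```python
import random

random.seed(42)
INTERVALS = 175278               # intervals in one hour (from the question)
DT = 3600.0 / INTERVALS          # interval duration, about 20.5 ms
RATES = [3.15, 0.14]             # activity (Bq) in the busy / quiet state
# From the column-stochastic P above: a busy interval is followed by a busy
# one with probability 0.65, a quiet interval by a quiet one with 0.86.
STAY = [0.65, 0.86]

state, arrivals, busy = 0, 0, 0
for _ in range(INTERVALS):
    busy += (state == 0)
    # One Bernoulli trial per interval; RATES[state]*DT is at most about 0.065,
    # so neglecting multiple arrivals is a reasonable approximation here.
    if random.random() < RATES[state] * DT:
        arrivals += 1
    if random.random() >= STAY[state]:   # switch to the other Markov state
        state = 1 - state

busy_fraction = busy / INTERVALS   # should be near the stationary value 2/7
```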

(2006-07-19)
Decay Processes
Superposition of independent monoexponential decays into active processes.

(2006-07-14)
Superposed Independent Poisson Processes
The activities of independent superposed Poisson processes are additive.

Circumstances exist in which several stochastic processes
are usefully combined into a single one where an arrival
is defined as being any arrival from one of the component processes.

Such processes may influence each other. We'll consider only the case
of independent processes, which don't.

The very definition of the activity of a stochastic process makes
it clear that when several Poisson processes are so combined, the resulting process
is a process whose activity is the sum of the activities of the component
processes. (Such activities need not be constant.)
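A quick Python sketch of this additivity, merging two independent simulated streams (the rates 2 Bq and 3 Bq are arbitrary illustrative choices):

```python
import random

random.seed(7)

def stream(lam, horizon):
    """Arrival times of a Poisson process of activity lam (Bq) up to horizon (s)."""
    t, times = 0.0, []
    while True:
        t += random.expovariate(lam)   # exponential inter-arrival gap
        if t > horizon:
            return times
        times.append(t)

T = 1000.0
merged = sorted(stream(2.0, T) + stream(3.0, T))   # the superposed process
rate = len(merged) / T   # empirical activity, close to 2 + 3 = 5 Bq
```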