HYPEREXPONENTIAL MODEL OF TOKEN BUCKET SYSTEM

Juraj Smieško *

In the present paper we apply the Theory of Markov Chains to a formal description of a hyperexponential model of the Token Bucket System (TBS). This paper extends the topic covered in [3] by replacing the exponential model of human speech with a more precise hyperexponential model. In the first section we analyse the hyperexponential model of an On-Off source of human speech. In the second section we perform a steady-state analysis of the Token Bucket System and solve the balance equations for the state probabilities. In the third section we use real parameters of VoIP to calculate the characteristics of the TBS, and in the last section the values of the probability of packet loss are approximated by an exponential regression function, which allows these values to be computed directly.


Hyperexponential model of On-Off Source
The On-Off source model of human speech consists of two alternating periods: the On period corresponds to active speech and the Off period to silence. There are two main states, On and Off, and the source switches from one state to the other.
The talk-spurt duration is modelled by the random variable T_1 and the pause duration by the random variable T_2. The probability densities of T_1 and T_2 are modelled by two weighted geometric distribution functions.
Every increment of the variables T_1 and T_2 equals 5 ms; the average talk-spurt duration is E[T_1] = 227 ms and the average pause duration is E[T_2] = 596 ms (see [1]).
The best approximation of these processes is the hyperexponential model. The discrete geometric distributions are approximated by continuous exponential ones, while the rest of the model keeps exactly the same structure as the geometric one. In the new model we can keep the same averages.
We construct the transition diagram of the Markov model of the On-Off source (loops are omitted). The rate matrix of this Markov chain follows from the diagram.
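The fitted phase rates of the hyperexponential source are not repeated here, so the sketch below uses assumed branch probabilities and phase rates chosen only to reproduce the averages E[T_1] = 227 ms and E[T_2] = 596 ms; the parameter values are illustrative, not the paper's fitted ones. It builds the 4-state rate matrix of the source and solves the balance equations numerically:

```python
import numpy as np

# Illustrative hyperexponential On-Off source: two On phases and two
# Off phases.  Branch probabilities and phase means are assumptions,
# chosen only so that E[T_1] = 227 ms and E[T_2] = 596 ms hold.
p_on  = [0.5, 0.5]          # branch probabilities into the On phases
p_off = [0.5, 0.5]          # branch probabilities into the Off phases
on_rates  = [1/100, 1/354]  # 0.5*100 + 0.5*354 = 227 ms mean talk-spurt
off_rates = [1/300, 1/892]  # 0.5*300 + 0.5*892 = 596 ms mean pause

# State order (On1, On2, Off1, Off2).  Leaving On phase i (rate
# on_rates[i]) branches into Off phase j with probability p_off[j],
# and vice versa; diagonal entries make each row sum to zero.
Q0 = np.zeros((4, 4))
for i in range(2):
    for j in range(2):
        Q0[i, 2 + j] = on_rates[i] * p_off[j]
        Q0[2 + j, i] = off_rates[j] * p_on[i]
np.fill_diagonal(Q0, -Q0.sum(axis=1))

# Solve pi . Q0 = 0 together with the normalisation sum(pi) = 1.
A = np.vstack([Q0.T, np.ones(4)])
b = np.zeros(5); b[-1] = 1.0
pi = np.linalg.lstsq(A, b, rcond=None)[0]
print(pi, pi[0] + pi[1])
```

Whatever phase split is assumed, the long-run fraction of time in the On states must equal E[T_1] / (E[T_1] + E[T_2]) = 227/823, which provides a quick sanity check of the matrix.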

We compute the probabilities from the balance equations together with the normalisation condition. For the real parameters of VoIP the steady-state vector is p = (0.04902, 0.22546, 0.04579, 0.67973).
Let P_ON and P_OFF be the probabilities that the source is in the On or Off state. Summing the corresponding components of p gives P_ON = 0.27448 and P_OFF = 0.72552. Notice that these probabilities are very similar to those of the exponential model (see [3], P_ON = 0.27586 and P_OFF = 0.72414), but the hyperexponential model gives a more detailed structure of the On-Off source than the exponential model.
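The On and Off probabilities follow directly by summing components of the steady-state vector quoted above; the short check below assumes the first two components correspond to the On phases:

```python
# Steady-state vector of the hyperexponential source for the real
# VoIP parameters, assuming its first two components are the On
# phases and the last two the Off phases.
p = (0.04902, 0.22546, 0.04579, 0.67973)

p_on = p[0] + p[1]    # probability the source is On
p_off = p[2] + p[3]   # probability the source is Off
print(p_on, p_off)    # close to the exponential-model 0.27586 / 0.72414
```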

Steady-state Analysis of Token Bucket System
The Token Bucket System is related to VoIP problems. We perform a steady-state analysis of VoIP under Token Bucket control. Our main goals are to compute the probability characteristics of the TBS and to find the relation between the probability of packet loss and the bucket depth. First we analyse the operation of the TBS in general.
A flow of packets enters the TBS. The Token Bucket System generates tokens (marks) and marks each packet with one of them; only marked packets pass through the network. For practical reasons we assume a limited bucket with depth n. When the bucket is empty (there are no tokens), the TBS cannot mark an arriving packet, which is then lost. Let P_lst be the probability of this random event; we call it the probability of packet loss.
We deal with the hyperexponential model of the On-Off source of human speech (see above). When the source is in the On state it generates a flow of packets, which represents human speech. The packet rate in usual VoIP systems (using G.729A) is 50 p/s. We assume that the flow of packets is modelled by a Poisson process N_P(t) with rate λ = 0.05 p/ms.
In general the token rate can be arbitrary; usually it equals the average number of packets entering the TBS. The flow of tokens is modelled by a Poisson process N_T(t) with rate ν. In the Token Bucket System there are three elementary random phenomena: packet arrival, token generation, and switching between the On and Off states of the source. Since we have assumed that each of them is a Poisson process, we can model the TBS by a Markov chain.
The probability that the number of tokens in the bucket increases by one during a "short" time Δt is P(N_T(Δt) = 1) = νΔt + o(Δt). If the bucket contains k ≥ 1 tokens, the probability that the number of tokens decreases by one during Δt is P(N_P(Δt) = 1) = λΔt + o(Δt). If a packet arrives while the bucket is empty, it cannot be marked and is lost; the probability of this event was denoted P_lst.
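The o(Δt) approximations above can be checked numerically. For the packet rate λ = 0.05 p/ms, the exact Poisson probability of one arrival in Δt differs from the first-order term λΔt only by a quantity of order (λΔt)²:

```python
import math

lam = 0.05   # packet rate in packets/ms (G.729A, 50 p/s)
dt = 1.0     # "short" interval in ms

# Exact Poisson probability of exactly one arrival in dt ...
exact = (lam * dt) * math.exp(-lam * dt)
# ... versus the first-order approximation lam*dt used as the
# transition rate of the Markov chain.
approx = lam * dt
print(exact, approx, abs(exact - approx))
```

The difference λΔt·(1 − e^(−λΔt)) is below (λΔt)², which is exactly what the o(Δt) notation promises as Δt shrinks.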
We construct the transition diagram of the whole system. The columns of the diagram correspond to the number of tokens in the token bucket; we call this the "k-level", for k = 0, …, n. For easier reading, loops in the transition diagram are omitted.

Let X_k and Y_k be the probabilities that the TBS holds k tokens and the source is in one of the On states, and let Z_k and W_k be the probabilities that the TBS holds k tokens and the source is in one of the Off states. Let P_k be the probability of the k-level. Then P_k = X_k + Y_k + Z_k + W_k and P_lst = X_0 + Y_0.

Following the Theory of Markov Chains [2], we write the balance equations for the state probabilities of the steady-state Markov chain and solve p · Q = 0, where p = (X_0, Y_0, Z_0, W_0, …, X_n, Y_n, Z_n, W_n). The rate matrix Q is rather complicated, so we write it as a block matrix, where Q_0 denotes the rate matrix of the hyperexponential On-Off source.

We can determine recurrent formulas for the probabilities W_i and Z_i, but the equations for the probabilities Y_i and X_i are very complicated. If we set p_k^on = X_k + Y_k and p_k^off = Z_k + W_k, we obtain some interesting relations between the level probabilities.
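The block structure described above can be assembled and solved numerically. The sketch below is illustrative: the source phase rates are assumptions chosen only to match the means 227 ms and 596 ms, and the token rate ν is set to the average packet rate, as suggested earlier; none of these parameter values are the paper's fitted ones.

```python
import numpy as np

def tbs_steady_state(n, lam, nu, on_rates, off_rates, p_on, p_off):
    """Steady-state vector of the TBS chain with bucket depth n.

    Level k holds 4 states (X_k, Y_k: On phases; Z_k, W_k: Off
    phases).  Tokens arrive at rate nu (refused when k = n); an
    arriving packet consumes a token at rate lam while the source
    is On (at k = 0 the packet is simply lost, no transition)."""
    m = 4 * (n + 1)
    Q = np.zeros((m, m))
    idx = lambda k, s: 4 * k + s
    for k in range(n + 1):
        for i in range(2):                      # On phases X_k, Y_k
            for j in range(2):                  # talk-spurt ends
                Q[idx(k, i), idx(k, 2 + j)] += on_rates[i] * p_off[j]
            if k >= 1:                          # packet takes a token
                Q[idx(k, i), idx(k - 1, i)] += lam
        for j in range(2):                      # Off phases Z_k, W_k
            for i in range(2):                  # pause ends
                Q[idx(k, 2 + j), idx(k, i)] += off_rates[j] * p_on[i]
        if k < n:                               # token generated
            for s in range(4):
                Q[idx(k, s), idx(k + 1, s)] += nu
    np.fill_diagonal(Q, -Q.sum(axis=1))
    A = np.vstack([Q.T, np.ones(m)])            # p . Q = 0, sum(p) = 1
    b = np.zeros(m + 1); b[-1] = 1.0
    return np.linalg.lstsq(A, b, rcond=None)[0]

# Assumed phase parameters (only the means 227 ms / 596 ms come from
# the paper); token rate nu equals the average packet rate.
lam = 0.05
nu = lam * 227 / (227 + 596)
n = 5
pi = tbs_steady_state(n, lam, nu, [1/100, 1/354], [1/300, 1/892],
                      [0.5, 0.5], [0.5, 0.5])
p_lst = pi[0] + pi[1]                           # P_lst = X_0 + Y_0
print(p_lst)
```

Since the bucket level does not influence the source, the marginal On probability of the full chain must still equal 227/823, which is a useful check on the assembled matrix.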

TBS with real parameters of VoIP
Deriving an explicit formula for the probability of packet loss P_lst = P_lst(n) = X_0 + Y_0 is unrealistic, but there is no problem computing the values of P_lst numerically for the real parameters of our Token Bucket System with the hyperexponential On-Off source. For example, the reader can compare the results for n = 4 and n = 5. For practical use it is enough to consider bucket depths n = 1, …, 10. We now increase the bucket depth n and calculate the characteristics of the model:
P_lst(n) - probability of packet loss, i.e. probability of an empty bucket at the time of a packet arrival
λ · P_lst(n) - average number of lost packets
P_n - probability of a full bucket (the bucket refuses further tokens)
EK - average number of tokens in the bucket
H = EK/n - token bucket usage
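Given a steady-state vector ordered level-major as (X_k, Y_k, Z_k, W_k) for k = 0, …, n, the characteristics listed above follow by simple sums. The vector below is an illustrative normalised example for n = 2, not values from the paper:

```python
lam = 0.05  # packet rate in p/ms

def characteristics(pi, n, lam):
    # Level probabilities P_k = X_k + Y_k + Z_k + W_k.
    levels = [sum(pi[4*k:4*k+4]) for k in range(n + 1)]
    p_lst = pi[0] + pi[1]              # empty bucket while source On
    ek = sum(k * pk for k, pk in enumerate(levels))
    return {
        "P_lst": p_lst,                # probability of packet loss
        "lost_rate": lam * p_lst,      # average lost packets per ms
        "P_n": levels[n],              # probability of a full bucket
        "EK": ek,                      # average number of tokens
        "H": ek / n,                   # token bucket usage
    }

# Illustrative normalised state vector for n = 2.
pi = [0.05, 0.10, 0.02, 0.08,   # level 0
      0.04, 0.09, 0.03, 0.14,   # level 1
      0.03, 0.07, 0.05, 0.30]   # level 2
c = characteristics(pi, 2, lam)
print(c)
```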

Relation between Probability of packet loss and Token Bucket depth
The most interesting characteristic is the probability of packet loss P_lst(n). In Fig. 7 we compare the values of P_lst(n) for the exponential model (see [3]) and the hyperexponential model.
The approximation by hyperexponential distributions is the best one, and therefore the hyperexponential model of the TBS models the real TBS more accurately than the exponential model; however, the state probabilities had to be computed by numerical methods.
The approximation by exponential distributions is not as good, but for that model we obtained recurrent formulas for the state probabilities, and the values of the characteristics obtained from the exponential model can be used as lower estimates of the characteristics of the real TBS.

Now we use approximation by a regression function f(n), fitted by the least-squares method, to find the relation between the probability of packet loss P_lst(n) and the token bucket depth n. To give a solid approximation, the regression function must satisfy these conditions:

lim_{n→∞} f(n) = 0,  there is no n_0 with f(n_0) = 0,  and for all n_1 < n_2: f(n_1) > f(n_2).

If we want a regression function with few parameters, the exponential function f(n) = a·e^{bn} is ideal.
We measure the quality of the approximation by the square root of the sum of squared residuals ε. For practical use it is enough to consider bucket depths n = 1, …, 10. The exponential regression function is then f(n) = 0.2025·e^{−0.0312n}, with ε = 0.0144 and mean error ε/n = 0.0014. The exponential function approximates P_lst(n) (except for n = 1) with a maximum error of 4.4·10^{−3}. If we are satisfied with this precision, we can replace P_lst(n) by f(n) for n = 2, …, 10.
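The least-squares fit of f(n) = a·e^{bn} reduces to ordinary linear regression after taking logarithms: ln f(n) = ln a + bn. The sketch below demonstrates the method on samples of the quoted regression curve itself, since the paper's table of computed P_lst(n) values is not reproduced here:

```python
import math

# Samples of the quoted curve f(n) = 0.2025 * e^(-0.0312 n) for
# n = 1..10; in the paper the fit is done on computed P_lst(n).
ns = list(range(1, 11))
ys = [0.2025 * math.exp(-0.0312 * n) for n in ns]

# Linearise: ln y = ln a + b*n, then least squares for (ln a, b).
logy = [math.log(y) for y in ys]
N = len(ns)
mean_n = sum(ns) / N
mean_l = sum(logy) / N
b = sum((n - mean_n) * (l - mean_l) for n, l in zip(ns, logy)) \
    / sum((n - mean_n) ** 2 for n in ns)
a = math.exp(mean_l - b * mean_n)
print(a, b)   # recovers a = 0.2025 and b = -0.0312
```

Note that regressing on the logarithms weights the residuals differently from a direct least-squares fit of f(n); for the small range n = 1, …, 10 and a slowly decaying exponential the difference is minor.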
For practical use we are therefore satisfied with a piecewise function: the computed value P_lst(1) for n = 1 and f(n) = 0.2025·e^{−0.0312n} for n = 2, …, 10.

Conclusion
We carried out a steady-state analysis of the hyperexponential model of the Token Bucket System and created the balance equations for the states of the Markov chain. The analytical solution led to complicated recurrent formulas for the state probabilities, so we used numerical methods to compute the probability of packet loss. We approximated these results by an exponential function and finally obtained a function for the direct calculation of this probability.