Leaky Bucket problem

I have been trying to solve the following numerical problem........Any help is appreciated in making the concept clear.

"A given source request admission to a QoS network requesting avg throughput of 2 Mbits/sec and burst capacity of 2 Mbits. The source then tr开发者_JAVA百科ansmitt data at 50 Mbits/sec for a duration of 1 millisecond. Right after that the source scales down the throughput to 1.8 Mbits/sec. Plot the size of the data in the buffer reserved for this flow as a function of time side by side with the throughput described above. How much data loss will this source experience? What is the burst capacity this source should use to ensure no data loss with throughput function show above?"

Thank you.


Assume

  • the client is the only source of traffic.
  • the buffer empties at the rate of 2 Mbits/sec
  • at T = 0, the buffer is at 100% of its 2 Mbit capacity (about 2*10^6 bits)

At T = 1 ms, 10^-3 seconds have elapsed, so 2*10^3 bits have been drained from the buffer. However, in that time the client has been transmitting at 50*10^6 bits/sec for the 1 ms duration, for a total of 50*10^3 bits.

As the available space is only 2*10^3 bits, the first 2*10^3 bits will be read correctly "off the wire"; the remaining 48*10^3 bits will be lost, or cause a fatal buffer overflow.
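
As a rough check of those numbers in Python (a hypothetical snippet; the variable names are mine, the quantities come straight from the problem statement):

    # All quantities in bits and seconds.
    drain_rate = 2_000_000            # buffer empties at 2 Mbits/sec
    burst_rate = 50_000_000           # source transmits at 50 Mbits/sec
    burst_time = 1e-3                 # burst lasts 1 millisecond

    drained = drain_rate * burst_time   # 2*10^3 bits of space freed during the burst
    arrived = burst_rate * burst_time   # 50*10^3 bits sent during the burst
    lost = arrived - drained            # 48*10^3 bits with nowhere to go
    print(drained, arrived, lost)       # 2000.0 50000.0 48000.0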

Somewhere, there needs to be AT LEAST another 48*10^3 bits of memory if data loss is to be avoided. Beyond this data burst, the rest of the problem statement does not change the result, because the question is asking about the buffering required to absorb the given burst, and that burst is the peak data rate over the given graph.
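
For the "plot the buffer size as a function of time" part, a minimal simulation sketch is below, using the same assumptions as above (2 Mbit buffer, draining at 2 Mbits/sec, full at T = 0); the step size and names are my own choices, not part of the problem:

    # Step the buffer model in 1-microsecond slices and record occupancy over time.
    BUFFER_BITS = 2_000_000      # burst capacity reserved for this flow: 2 Mbits
    DRAIN_RATE = 2_000_000       # bits/sec removed from the buffer
    DT = 1e-6                    # simulation step: 1 microsecond

    def input_rate(t):
        # Source throughput: 50 Mbits/sec for the first 1 ms, then 1.8 Mbits/sec.
        return 50_000_000 if t < 1e-3 else 1_800_000

    occupancy = BUFFER_BITS      # assumption above: the buffer is full at T = 0
    lost = 0.0
    history = []                 # (time, occupancy) samples, ready for plotting

    t = 0.0
    while t < 5e-3:              # look at the first 5 ms
        occupancy = max(0.0, occupancy - DRAIN_RATE * DT)  # leak first
        arrived = input_rate(t) * DT
        accepted = min(arrived, BUFFER_BITS - occupancy)   # only what fits is stored
        lost += arrived - accepted                         # the rest is dropped
        occupancy += accepted
        history.append((t, occupancy))
        t += DT

    print(f"data lost: {lost / 1e3:.1f} kbits")  # ~48 kbits with a full buffer at T = 0

Feeding the (time, occupancy) samples in history to any plotting tool gives the occupancy-vs-time curve the question asks for, and starting the run with at least 48*10^3 bits of free headroom in the buffer drives the loss to zero, which matches the extra memory figure above.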

I'm not sure exactly what answer you are seeking, but I hope this description of the network mechanics is helpful.
