We simulated a discrete-time queuing system (representing a downlink channel) with unlimited buffer size and a channel transmission rate of 20 Mbps.
The transmission slot had a duration of 40 ms (equal to the inverse of the video frame rate). The assumption in this simulation is that up to 10 packets (a value taken from [22]) of length 48 B (the payload of an ATM-sized packet) may be served in each slot of the TD/CDMA channel frame, which has a duration of 12 ms and contains 62 slots [41]; this amounts to roughly 2,067 packets per 40 ms transmission slot, consistent with the 20 Mbps channel rate. We use the packet waiting time and the packet loss ratio to validate the model for various load factors, given the delay constraints of real-time video streams: the packets of a video frame need to be transmitted before the arrival of the next video frame, i.e., within 40 ms, otherwise they are dropped, and an upper bound of just 0.01% is allowed for videoconference packet dropping [41].
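As a minimal sketch of this queuing setup (the function and variable names below are ours, and the queue is driven by an arbitrary sequence of per-video-frame packet counts rather than by the paper's traffic model), the slot-based queue with the 40 ms deadline could look as follows:

```python
import collections

# Parameters stated in the text: 48 B ATM-sized packets, a TD/CDMA channel
# frame of 12 ms with 62 slots [41], and up to 10 packets served per slot [22].
PKTS_PER_SLOT = 10
SLOT_S = 0.012 / 62                   # one TD/CDMA slot (~0.194 ms)
FRAME_PERIOD_S = 0.040                # video inter-frame time = packet deadline
SLOTS_PER_PERIOD = round(FRAME_PERIOD_S / SLOT_S)   # ~207 slots per 40 ms
CAPACITY = PKTS_PER_SLOT * SLOTS_PER_PERIOD
# Sanity check: CAPACITY * 48 B / 40 ms ~= 19.9 Mbps, i.e., the 20 Mbps channel.

def simulate(arrivals_per_frame):
    """FIFO queue with unlimited buffer. `arrivals_per_frame` is a sequence of
    per-video-frame packet counts; packets not served within 40 ms (before the
    next video frame arrives) are dropped.
    Returns (mean packet waiting time in seconds, packet loss ratio)."""
    queue = collections.deque()       # arrival time of each queued packet
    total_wait = 0.0
    served = dropped = offered = 0
    t = 0.0
    for n_arr in arrivals_per_frame:
        offered += int(n_arr)
        queue.extend([t] * int(n_arr))
        for s in range(SLOTS_PER_PERIOD):
            now = t + s * SLOT_S
            # Drop head-of-line packets that have missed the 40 ms deadline.
            while queue and now - queue[0] >= FRAME_PERIOD_S:
                queue.popleft()
                dropped += 1
            # Serve up to 10 packets in this TD/CDMA slot.
            for _ in range(min(PKTS_PER_SLOT, len(queue))):
                total_wait += now - queue.popleft()
                served += 1
        t += FRAME_PERIOD_S
    return total_wait / max(served, 1), dropped / max(offered, 1)
```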
A load of 0.4, for example, corresponds to 0.4 × 20 Mbps = 8 Mbps.
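Under this convention, each load factor maps to a mean per-frame packet count of load × CAPACITY in the sketch above. Purely for illustration (the fitted traffic model, not a Poisson source, is what should drive the queue), the loads of interest could be swept as follows:

```python
import numpy as np

rng = np.random.default_rng(1)
for load in (0.2, 0.4, 0.6, 0.8):
    # Poisson counts are a stand-in for the actual traffic model.
    counts = rng.poisson(load * CAPACITY, size=50_000)   # 50,000 video frames
    mean_wait, loss = simulate(counts)
    print(f"load {load:.1f} ({load * 20:.0f} Mbps): "
          f"mean wait {mean_wait * 1e3:.2f} ms, "
          f"loss ratio {loss:.2e} (bound: 1e-4 [41])")
```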
Finally, we compare the results obtained for the complete actual traces with those obtained from the models.
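One possible way to carry out this comparison, reusing simulate() from the sketch above and assuming both the actual trace and the model output are available as per-video-frame packet counts (the file name and the Poisson placeholder below are hypothetical), is to drive the same queue with both arrival sequences:

```python
import numpy as np

# Actual trace as per-video-frame packet counts (hypothetical file name).
trace_counts = np.loadtxt("trace_packets_per_frame.txt", dtype=int)

# The fitted model would generate its counts here; a mean-matched Poisson
# source is only a placeholder for it.
rng = np.random.default_rng(2)
model_counts = rng.poisson(trace_counts.mean(), size=len(trace_counts))

for name, counts in (("actual trace", trace_counts), ("model", model_counts)):
    mean_wait, loss = simulate(counts)
    print(f"{name}: mean wait {mean_wait * 1e3:.2f} ms, loss ratio {loss:.2e}")
```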
Figures 21, 22, 23, and 24 are needed for these results. Furthermore, Figure 3 is also needed; the corresponding requirement can be found in the attached files.