Two computers using TDM take turns to send


Two computers using TDM take turns sending 1000-byte packets over a shared channel that operates at 64000 bits per second. The hardware takes 100 microseconds after one computer stops sending before the other can begin. How long will it take each computer to send a one-megabyte data file?

The channel rate is 64000 bits per second, or 8000 bytes per second.

Thus one 1000-byte packet takes 1000/8000 = 0.125 seconds to send. A one-megabyte file contains 1000000/1000 = 1000 packets of 1000 bytes each. Since the two computers each send a one-megabyte file, 2000 packets will be sent in total, taking 2000 × 0.125 = 250 seconds of transmission time.

The hardware turnaround delay is 100 microseconds, or 0.0001 seconds.

The computers send packets turn by turn, so there is a 0.0001-second gap between each pair of consecutive packets. For 2000 packets the total gap time is 2000 × 0.0001 = 0.2 seconds.

The total time is 250 + 0.2 = 250.2 seconds. Because the two transfers interleave packet by packet, each computer's file finishes at essentially the same time, so each computer takes about 250.2 seconds.
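The arithmetic above can be checked with a minimal Python sketch; all variable names are illustrative, and the gap count follows the solution's convention of one turnaround per packet sent.

```python
# Packet-timing arithmetic for two computers sharing a TDM channel.
RATE_BPS = 64000        # channel rate in bits per second
PACKET_BYTES = 1000     # packet size in bytes
FILE_BYTES = 1_000_000  # one megabyte per computer
GAP_S = 100e-6          # 100-microsecond hardware turnaround

rate_bytes = RATE_BPS / 8                         # 8000 bytes per second
packet_time = PACKET_BYTES / rate_bytes           # 0.125 s to send one packet
packets_total = 2 * (FILE_BYTES // PACKET_BYTES)  # 2000 packets across both files
send_time = packets_total * packet_time           # 250.0 s of transmission
gap_time = packets_total * GAP_S                  # 0.2 s of turnaround delay
total = send_time + gap_time                      # 250.2 s overall
print(total)
```

Running it prints the 250.2-second total, confirming the gap time is small relative to the transmission time at this channel rate.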
