Calculating total transmission time of a packet

Novastorm · Dec 31, 2014

I'm having some difficulty calculating the total time it takes a packet to get from A to B. The question is:

"We have 200 bytes of data to send from A to B, with a distance of 200km between them. Calculate the total transmission time, assuming the speed of the signal is 200,000 km/s and that the data rate is 1Mbps and that a header of 40 bytes has to be added to the data before it is sent."

My understanding is that at some point you need to factor in propagation delay and the signal speed, but I'm unsure whether that's needed in this case. Is there a formula that can be used to work these types of questions out?

Answer

Daniel · Dec 31, 2014

So we have a total of 200 bytes of payload + 40 bytes of header = 240 bytes. The data can be put on the wire at a rate of 1 Mbps, which equals 1,000,000 bits per second (unless the question actually means Mibps, which is 1,048,576 bits per second; we'll assume Mbps is correct and use 1,000,000).

240 bytes is equal to 1920 bits (240 * 8), so it takes

1920 bits / 1,000,000 bits per second = 0.00192 seconds

to get the data on the wire.
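If it helps to check the arithmetic, here is that step as a small Python sketch (the variable names are my own, not part of the question):

    # Transmission (serialization) delay: time to push all bits onto the link.
    payload_bytes = 200
    header_bytes = 40
    data_rate_bps = 1_000_000  # 1 Mbps

    total_bits = (payload_bytes + header_bytes) * 8   # 240 bytes = 1920 bits
    transmission_delay = total_bits / data_rate_bps   # 1920 / 1,000,000
    print(transmission_delay)                         # 0.00192 seconds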


Next, the signal has to propagate 200 km at 200,000 km/s:

200 km / 200,000 km/s = 0.001 seconds.
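Continuing the sketch, the propagation delay is simply distance divided by signal speed:

    # Propagation delay: time for the signal to traverse the link.
    distance_km = 200
    signal_speed_km_s = 200_000  # given in the question; roughly 2/3 of c

    propagation_delay = distance_km / signal_speed_km_s  # 200 / 200,000
    print(propagation_delay)                             # 0.001 seconds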


Note that B reads the bits off the wire as they arrive, so reception overlaps with propagation rather than adding a third serial delay. The clock stops when the last bit arrives: it goes onto the wire at 0.00192 seconds and propagates for a further 0.001 seconds.

So the total amount of time is equal to

0.00192 + 0.001 = 0.00292 seconds = 2.92 milliseconds.
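That also answers the formula part of the question: for a single link, total time = (bits / data rate) + (distance / signal speed). A minimal Python sketch under that assumption (the function and parameter names are my own, just for illustration):

    def total_link_time(payload_bytes, header_bytes, data_rate_bps,
                        distance_km, signal_speed_km_s):
        """End-to-end time over a single link:
        transmission delay (bits / rate) + propagation delay (distance / speed)."""
        total_bits = (payload_bytes + header_bytes) * 8
        return total_bits / data_rate_bps + distance_km / signal_speed_km_s

    print(total_link_time(200, 40, 1_000_000, 200, 200_000))  # 0.00292 seconds

The same function works for any single-hop question of this type; for multi-hop store-and-forward paths you would sum the per-link times instead.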