
Confused about transmission delay. I was reading a textbook which says: _**The transmission delay is the amount of time required to push (that is, transmit) all of the packet's bits into the link.**_

Does this mean that transmission delay is determined by the router? If you have a stronger/higher-capacity router which pushes bits into the link faster, is the transmission delay lower? But the book also says: _**Denote the length of the packet by L bits, and denote the transmission rate of the link from router A to router B by R bits/sec. For example, for a 10 Mbps Ethernet link, the rate is R = 10 Mbps; for a 100 Mbps Ethernet link, the rate is R = 100 Mbps. The transmission delay is L/R.**_

From this, the transmission delay seems to be determined only by the link's rate. But isn't that contradicting "the amount of time required to push bits into a link"?

> Does it mean that transmission delay is determined by router?

The delay is determined by the characteristics of the transmission medium, not by how powerful the router is. Remember that most media are serial -- that is, one bit is sent at a time*. A 10 Mbps link means that you can transmit 10 million bits per second, or 1 bit every 100 ns. So a 100-byte message would take


    100 bytes * 8 bits/byte * 100 ns/bit = 80,000 ns = 80 µs (microseconds)

to be sent on the link.

(* Note that this is a simplified example. In real life, encoding and modulation methods such as phase changes are used to carry more than one bit per signaling symbol.)
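To make the arithmetic above concrete, here is a minimal sketch of the L/R formula in Python (the function name and values are illustrative, not from the textbook):

```python
def transmission_delay(packet_bits, rate_bps):
    """Transmission delay = L / R: the time (in seconds) needed to
    push all of the packet's bits onto the link."""
    return packet_bits / rate_bps

# The 100-byte message on a 10 Mbps link from the example above
L = 100 * 8          # packet length: 800 bits
R = 10_000_000       # link rate: 10 Mbps
delay = transmission_delay(L, R)
print(delay * 1e6, "µs")  # 80.0 µs
```

Note that the router's CPU power does not appear anywhere: given a fixed link rate R, the delay depends only on L and R.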
