
Why does throughput (consumed bandwidth) increase when the packet error rate is increased in TCP? I was simulating TCP traffic from dailymotion.com, and I noticed that when I increased the packet error rate, the consumed bandwidth also increased. How can that be?

Transmission errors are detected at the TCP layer.

The receiving TCP discards the damaged segment, forcing the sender to retransmit it. Each retransmission puts the same data on the wire again, so the total traffic on the link (the consumed bandwidth) grows with the packet error rate, even though the goodput, the amount of useful data actually delivered, does not increase.
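To see this effect numerically, here is a minimal sketch assuming a simplified stop-and-wait retransmission model (every damaged segment is resent until it arrives intact); the function and parameter names (simulate, segment_size, error_rate) are illustrative, not taken from any particular simulator:

```python
import random

def simulate(num_segments=10_000, segment_size=1500, error_rate=0.01, seed=42):
    """Simplified sender: retransmit each corrupted segment until it arrives.

    Returns (goodput_bytes, consumed_bytes): useful payload delivered vs.
    total bytes put on the wire, including retransmissions.
    """
    rng = random.Random(seed)
    consumed = 0
    for _ in range(num_segments):
        while True:
            consumed += segment_size        # every attempt consumes bandwidth
            if rng.random() >= error_rate:  # segment arrived undamaged
                break                       # otherwise: retransmit the segment
    goodput = num_segments * segment_size
    return goodput, consumed

if __name__ == "__main__":
    for per in (0.0, 0.01, 0.05, 0.10):
        good, used = simulate(error_rate=per)
        print(f"error rate {per:4.0%}: goodput {good} B, "
              f"consumed {used} B ({used / good:.2f}x overhead)")
```

Running this shows the consumed bytes rising with the error rate while the goodput stays constant, which matches what a bandwidth monitor (measuring all traffic, not just useful data) would report in the simulation.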
