Network devices and software usually display some kind of current throughput. For example, my home router shows the throughput for each connected device in its web interface, and every web browser shows the rate while downloading a file.
My question is: is this usually the raw data rate, including potentially corrupt packets (e.g. ones that failed the checksum) that need to be re-transmitted? Or is it the net throughput, counting only the good packets?
Background of the question: with video streams (between two stationary devices) over my local WLAN, I sometimes observe bad video quality and sometimes not, for the same video at the same position, on the same network, and with the same network throughput (according to the device logs). I'm currently trying to find the root cause and need to fully understand what the different parameters actually mean. If the reported throughput includes corrupt packets, this could explain the sporadic bad quality (for example, external signals interfering with my local WLAN).
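In case it matters: I can also read counters directly on the streaming devices. Below is a minimal sketch of what I'm currently checking, assuming a Linux device whose WLAN interface is called `wlan0` (both assumptions on my side). It compares the byte counters with the error/drop counters from `/sys/class/net/<iface>/statistics` over a short interval: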
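```python
# Sketch: sample kernel interface counters over an interval and compare
# the computed receive rate with the error/drop counters.
# Assumes Linux and a WLAN interface named "wlan0".
from pathlib import Path
import time

STATS = Path("/sys/class/net/wlan0/statistics")

def read_counter(name: str) -> int:
    """Read one cumulative interface counter (counts since boot)."""
    return int((STATS / name).read_text())

def sample(interval: float = 5.0) -> None:
    names = ("rx_bytes", "rx_errors", "rx_dropped")
    before = {n: read_counter(n) for n in names}
    time.sleep(interval)
    delta = {n: read_counter(n) - before[n] for n in names}
    rate_mbit = delta["rx_bytes"] * 8 / interval / 1e6
    print(f"rx throughput: {rate_mbit:.2f} Mbit/s, "
          f"errors: {delta['rx_errors']}, dropped: {delta['rx_dropped']}")

if __name__ == "__main__":
    sample()
```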
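My understanding is that `rx_bytes` here only counts good received frames, while 802.11-level retransmissions would only be visible in driver/station statistics (e.g. `iw dev wlan0 station dump`), which is exactly the raw-vs-net distinction I'm unsure about for the throughput numbers shown by routers and device logs.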