Why is the bitrate of infrared smaller than the bitrate of WiFi?

Infrared has a frequency of roughly 300 GHz – 430 THz, while WiFi operates at 2.4 GHz or 5 GHz.

Since the frequency of infrared is far greater than the frequency of WiFi, the transfer rate (bitrate) of IR should, by that reasoning, be greater than WiFi's.

In reality, IR transmission runs at kbps, while WiFi on a WLAN reaches about 100 Mbps.

We know that a higher-frequency wave can carry more data, but it copes poorly with obstacles such as thick walls and therefore has a shorter range; the reverse holds for lower frequencies.

That statement seems to hold between the mobile cellular LTE band (2.3 GHz) and WiFi (2.4 GHz), where the WiFi bitrate is higher than LTE's, but not between IR and WiFi.
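One way to see why carrier frequency alone doesn't set the bitrate is the Shannon–Hartley capacity formula, where the limit depends on the *channel bandwidth* (and SNR), not on where the carrier sits in the spectrum. A rough back-of-envelope sketch, assuming an illustrative 20 MHz WiFi channel, a few-kHz IR-remote modulation bandwidth, and the same ~30 dB SNR for both (all assumed figures, not measurements):

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley limit: C = B * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Assumed, illustrative figures:
wifi_bw = 20e6       # 802.11 channel bandwidth, 20 MHz
ir_remote_bw = 4e3   # IR remote modulation bandwidth, a few kHz
snr = 1000           # ~30 dB, used for both links for comparison

print(f"WiFi-like channel: {shannon_capacity_bps(wifi_bw, snr) / 1e6:.0f} Mbps")
print(f"IR-remote-like channel: {shannon_capacity_bps(ir_remote_bw, snr) / 1e3:.0f} kbps")
# → WiFi-like channel: 199 Mbps
# → IR-remote-like channel: 40 kbps
```

The three-orders-of-magnitude gap in usable modulation bandwidth, not the carrier frequency, is what separates the two bitrates in this sketch.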

Edit: I said kbps after reading a random article about an IR remote.
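For the IR-remote case specifically, the low bitrate follows from the modulation scheme rather than the carrier: remote protocols such as NEC key a ~38 kHz carrier on and off with bit times on the order of a millisecond. A rough estimate from the widely documented NEC timings (assuming an equal mix of 0 and 1 bits):

```python
# NEC IR protocol bit timings (562.5 us base unit):
# logical 0: 562.5 us mark + 562.5 us space  = 1.125 ms
# logical 1: 562.5 us mark + 1687.5 us space = 2.25 ms
bit0_s = 1.125e-3
bit1_s = 2.25e-3
avg_bit_time = (bit0_s + bit1_s) / 2  # assumed equal mix of 0s and 1s

print(f"~{1 / avg_bit_time:.0f} bits/s")  # → ~593 bits/s
```

So even before any channel considerations, the protocol itself caps an IR remote at well under a kilobit per second.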
