I like how they quickly gloss over the fact that you need line of sight to connect and call it a good thing because people behind a wall can't steal your data.
Within the same room, it is possible to use a frequency of light that will reflect off of almost anything. I just got a window AC unit with a remote that defies physics: I can have a desk and closed plantation shutters (slats and doors) in front of the receiver on the front of the unit, point the remote in entirely the wrong direction, and still activate the thing. It's just an IR LED transmitter setup; I've never seen one quite this powerful. And it's uber-cheapo General Electric, bottom of the consumer-grade junk category, too.
This is the NSA's wet dream tech. Anyone with line of sight could intercept the data stream.
For low data rates, sure, but at high speeds the dispersion caused by light taking multiple paths becomes unacceptable. The reason single-mode fiber is so thin is to make sure light can only travel along one path. If you want multi-gigabit speeds, you will need a direct line of sight.
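Quick back-of-envelope on why multipath hurts at high speeds: a bounced path arrives later than the direct one, and if that delay spread approaches your symbol time, symbols smear into each other. The geometry and the 10x rule of thumb here are just illustrative assumptions, not real Li-Fi figures:

```python
# Back-of-envelope: multipath delay spread vs. usable symbol rate.
# Assumed geometry: 3 m direct path, 5 m reflected path (hypothetical room).
C = 3e8  # speed of light, m/s

direct_m, reflected_m = 3.0, 5.0
delay_spread_s = (reflected_m - direct_m) / C  # extra travel time of the bounce

# Rough rule of thumb: keep the symbol period ~10x the delay spread
# to avoid heavy inter-symbol interference (without equalization).
max_symbol_rate = 1 / (10 * delay_spread_s)

print(f"delay spread: {delay_spread_s * 1e9:.1f} ns")
print(f"rough max symbol rate: {max_symbol_rate / 1e6:.0f} Msym/s")
```

Even a 2 m path difference caps you in the tens of Msym/s without equalization, which is why the reflected-light case is fine for an AC remote but not for multi-gigabit links.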
It absolutely is a good thing where security is concerned. WiFi is easy to snoop even if you're not physically in the room, if you know what you're doing. Sure, there are encryption standards that are very good at tamping down on this. What's even better with LiFi, though, is that you must be physically in the room to intercept any transmissions being sent.
This is by design one of the largest advantages to LiFi. There are other practical uses as well, but it's not like LiFi is designed to explicitly replace WiFi.
I could easily imagine having both this and traditional WiFi on a device, so that it can fall back to radio frequencies if higher-frequency light fails it. WiFi is super cheap these days.
It's going to be great if you don't imagine it as a one-for-one replacement for traditional wifi and use it in applications that are specific to its strengths and weaknesses.
Also, just how dusty is your house that one week's accumulation is enough to snuff out a signal?
The key technical difference being that Wi-Fi uses radio frequency to induce a voltage in an antenna to transmit data, whereas Li-Fi uses the modulation of light intensity to transmit data.
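To make "modulation of light intensity" concrete, here's a toy sketch of the simplest scheme, on-off keying (OOK): the LED is on for a 1 and off for a 0, and a photodiode thresholds the received intensity back into bits. Purely illustrative; real Li-Fi systems use things like OFDM, error coding, and clock recovery on top of this idea:

```python
# Toy sketch of intensity modulation: on-off keying (OOK).
# Each bit maps to one symbol period of LED intensity.

def ook_modulate(bits):
    """Transmitter side: map bits to LED intensity (1.0 = on, 0.0 = off)."""
    return [1.0 if b else 0.0 for b in bits]

def ook_demodulate(samples, threshold=0.5):
    """Receiver side: threshold photodiode readings back into bits."""
    return [1 if s > threshold else 0 for s in samples]

bits = [1, 0, 1, 1, 0]
received = ook_demodulate(ook_modulate(bits))
print(received)  # round-trips back to the original bits
```

The LED flickers far too fast for the eye to notice, which is why the same bulb can light the room and carry data at once.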
Is this expected to be a niche technology, or is it something that regular people will use? Seems like it would be a hassle to make sure that your li-fi receivers are within line of sight of your li-fi transmitters or whatever.
I could imagine it being installed on ceilings within certain rooms. Devices could be connected to both lifi and wifi. If lifi isn’t working it could fall back to wifi. But in reality, I have a feeling this will just be in niche scenarios, yes. I can imagine wifi getting 100x faster before this catches on.
One in the ceiling of every room you want coverage in would be fine. Enterprise grade ones in stores.
More importantly, though, it is more secure and higher performing. Could see the government using this for wireless SIPR rooms. They won’t until the tech is tested and refined first though.
Let the hobby community do that part and the regular consumer will see something very usable in a few years.
This is cool and all, but Wi-Fi and Li-Fi are equally "light-based"; they're just using different frequencies. A higher frequency means potentially faster data transmission, but at the cost of faster attenuation. We see this with 2.4GHz vs 5GHz WiFi already, and this sounds to me like a more extreme version of that.
Yes and no. They're both electromagnetic waves, but the frequencies are very, very far apart. So far apart, in fact, that the techniques we use to emit and receive them are fundamentally different. Their propagation and transmission characteristics are also very different.
Also, the data transmission rate (in theory) depends only on the bandwidth of the transmission channel, not the absolute frequency. But there's more "room" for large bands at higher frequencies, of course.
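This is just Shannon-Hartley: capacity scales with channel bandwidth (and SNR), with the carrier frequency nowhere in the formula. Rough numbers below are made up for illustration (160 MHz is a Wi-Fi-sized channel; 1 GHz stands in for the much wider bands available up at optical frequencies):

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_linear):
    """Shannon-Hartley channel capacity: C = B * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 1000  # ~30 dB, hypothetical
wifi_like = shannon_capacity_bps(160e6, snr)  # Wi-Fi-sized 160 MHz channel
lifi_like = shannon_capacity_bps(1e9, snr)    # illustrative 1 GHz optical band

print(f"160 MHz channel: {wifi_like / 1e9:.2f} Gbit/s")
print(f"1 GHz channel:   {lifi_like / 1e9:.2f} Gbit/s")
```

Same SNR, same math; the only thing the higher carrier buys you is that carving out a 1 GHz slice is easy at hundreds of THz and impossible at 2.4 GHz.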
Idk. LiFi uses actual light waves, which are quite high up the spectrum. For sure Wi-Fi and Li-Fi are both electromagnetic waves, but light itself is a very small section of the EM spectrum. At shorter wavelengths you get ionizing radiation that gives you cancer, and at longer wavelengths you get harmless non-ionizing infrared and radio waves.
Radio waves are light too. The article says that they're planning to use the near-infrared range for Li-Fi. It will basically be mostly limited to short distances and line of sight. I also wonder how ambient light in those frequencies from cooking, exhaust, etc. would affect the signal.
This is slightly less practical than just connecting your devices with fiber; at least you can run a cable around corners, and your connection won't drop if a piece of paper blocks the line of sight to the access point.
Sounds like a cool replacement for point to point WiFi bridges. I wonder what sort of distances start to impact data rates and quality just due to air density or weather.
As I understand it, this is very short range, basically for office settings. What ISPs will do to connect to or provide connectivity to a remote site is install point-to-point microwave radios. They are not impacted by weather too much, but they do lose signal strength if the radios are misaligned. There have been some funny situations where signals go out of spec because protected birds like bald eagles are nesting on the radio and it's illegal to disturb their nests, or squirrels are storing acorns in them.
That is what service providers will do if they want to offer cellular connectivity to a small town where running fiber would cost millions of dollars. They will contract a service provider to provide CTBH (Cell Tower Backhaul) via point-to-point microwave radios. Multiple radios can be used for redundancy or to add bandwidth by bonding channels together, and suddenly they can provide 4G/5G cellular connectivity without needing to spend millions of dollars installing fiber.
Really depends on the size of the receiver. It's possible to use it at interplanetary distances if we are willing to build a mirror tens of square miles in size. For point-to-point, my guess would be a few miles. The horizon is the hard cutoff: one beam could never reach farther than the line-of-sight horizon at the altitude of the receiver.
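For the horizon cutoff, the standard approximation (ignoring atmospheric refraction) is d ≈ √(2·R·h) for a receiver at height h above a sphere of radius R. The mast heights below are arbitrary examples:

```python
import math

R_EARTH_M = 6_371_000  # mean Earth radius, m

def horizon_km(height_m):
    """Approximate line-of-sight horizon distance: d = sqrt(2 * R * h)."""
    return math.sqrt(2 * R_EARTH_M * height_m) / 1000

print(f"10 m mast:   {horizon_km(10):.1f} km to the horizon")
print(f"100 m tower: {horizon_km(100):.1f} km to the horizon")
```

So even a rooftop-height link tops out around 11 km before the curve of the Earth gets in the way, which roughly matches the "few miles" guess.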
You'd probably need a modified protocol for interplanetary distances; the lightspeed delay would cause timeouts and other problems with the usual approach.
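The delays involved make this obvious with some quick arithmetic (distances are approximate; Mars is at its closest approach here):

```python
# One-way light delay over interplanetary distances, to show why
# ordinary handshake-heavy protocols like TCP break down out there.
C_KM_S = 299_792  # speed of light, km/s

moon_km = 384_400            # Earth-Moon distance, roughly
mars_close_km = 54_600_000   # Earth-Mars at closest approach, roughly

moon_delay_s = moon_km / C_KM_S
mars_delay_min = mars_close_km / C_KM_S / 60

print(f"Moon one-way delay: {moon_delay_s:.1f} s")
print(f"Mars one-way delay: {mars_delay_min:.1f} min (at closest approach)")
```

A three-minute one-way delay means any round trip takes six-plus minutes, which is why deep-space links use store-and-forward, delay-tolerant designs instead of chatty acknowledgements.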
A more realistic application might be if there's a way to get the signal up to a Starlink satellite or similar low-Earth-orbit relay. Cloud cover might be a problem for that though.