LiFi specification released, enabling wide adoption of light-based wireless internet

Daniel Sims

Why it matters: Multiple companies have been developing Light Fidelity (LiFi) technology for years, and the IEEE recently cleared a significant hurdle on the path to standardization. The wireless communication format won't replace WiFi or 5G, but it has clear advantages that could make it useful in specific settings.

The IEEE has released the 802.11bb standard for LiFi internet, a wireless transmission system that can be faster and more secure than traditional wireless methods. The new standard paves the way for different manufacturers to make LiFi devices that are compatible with one another. However, don't expect WiFi and Ethernet to go away anytime soon.

Instead of radio waves, Light Fidelity (LiFi) transmits data through flickering light from common LED bulbs to receivers that detect the photons and convert them back into information. Users shouldn't notice the flickering because it occurs at frequencies above 60Hz – too fast for the human eye to perceive. Moreover, LiFi signals can be 100 times faster than WiFi, potentially reaching 224 Gb/s.
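To picture the basic mechanism, here is a minimal, hypothetical sketch of on-off keying – the simplest way bits could be mapped onto an LED switching on and off – written in Python purely as an illustration of the concept, not as code from the 802.11bb standard or any vendor's stack:

# Hypothetical on-off keying (OOK) sketch: LED on = 1, LED off = 0.
# Illustration only; real LiFi hardware uses more sophisticated modulation.

def encode_ook(data: bytes) -> list[int]:
    """Turn a byte string into a list of light levels (1 = on, 0 = off)."""
    symbols = []
    for byte in data:
        for bit in range(7, -1, -1):           # most significant bit first
            symbols.append((byte >> bit) & 1)  # one light sample per bit
    return symbols

def decode_ook(symbols: list[int]) -> bytes:
    """Group received light levels back into bytes."""
    out = bytearray()
    for i in range(0, len(symbols) - 7, 8):
        byte = 0
        for level in symbols[i:i + 8]:
            byte = (byte << 1) | (1 if level else 0)
        out.append(byte)
    return bytes(out)

message = b"LiFi"
assert decode_ook(encode_ook(message)) == message
# At gigabit rates these on/off transitions happen billions of times per
# second, far too fast for the eye to register as flicker.

Actual LiFi equipment layers more sophisticated modulation and error correction on top of this idea, but the principle of turning data into rapid changes in light intensity is the same.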

Companies working on the technology, including pureLiFi, Fraunhofer HHI, and Philips, have integrated it into lighting systems so devices can receive internet through ceiling lights in homes or offices. Fraunhofer HHI has proposed using LiFi to enhance transportation by transmitting data through street lamps, stop lights, and vehicle headlights, potentially enabling vehicle-to-vehicle communication.

Light-based internet has a few distinct advantages over WiFi or 5G besides higher potential speeds. Since it doesn't use radio waves, it could be helpful in places where the radio spectrum is already congested. Furthermore, LiFi can maintain a strong signal in settings where other wireless technologies usually struggle, such as inside tunnels. LiFi is also more secure because light doesn't penetrate opaque objects, preventing anyone from tracking, jamming, or intercepting a network through walls or from outside a light source's reach.

However, the requirement for a connection through either line-of-sight or reflection presents a significant drawback, as it limits a LiFi network's potential range. For this reason, it will likely complement, rather than replace, current wireless technologies.

Additionally, using LiFi requires different receivers than the ones vendors already offer for WiFi or 5G, adding to tech clutter. PureLiFi has proposed a light antenna for future smartphones, a USB receiver for existing PCs, and other LiFi-based products.


 
MB/s means "megabytes per second" and GB/s means "gigabytes per second". They are saying "researchers were able to reach bidirectional transfer speeds of 224 gigabits per second," which is 224 Gb/s, not GB/s. Data is measured in gigabytes, speed in gigabits; an 8 Gb/s link transfers 1 gigabyte per second. Such a mistake means the writer doesn't understand the topic.
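To make that concrete, here is a tiny, purely illustrative Python calculation of the factor-of-8 conversion (the 224 figure is the one quoted above, not an independently verified number):

# Gigabits per second (Gb/s) to gigabytes per second (GB/s): divide by 8.
link_gbps = 224                  # quoted speed in gigabits per second
link_gBps = link_gbps / 8        # 8 bits per byte
print(f"{link_gbps} Gb/s is {link_gBps:.0f} GB/s")   # prints: 224 Gb/s is 28 GB/s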
 
I can finally get a tan while inside!

While agencies absorb your data.
It will be like a reverse Alan Wake. You will be safer in the dark.
Hell, what with the massive BS that is "let's pretend the Putin/Ukraine fiasco is causing issues so we can put prices up on everyday necessities like food and electricity," I won't be able to use this – can't afford to keep the lights on. =P
 
"Users shouldn't notice the flickering because it occurs at frequencies above 60Hz" so you are saying my 144Hz and 240Hz monitor are useless? I arent think that. Is this console ad?
 
Can't say that I am a fan of this. Also feels like a step backwards for wireless technology. All they really are doing is shifting to the visible light spectrum...
 
MB/s means "megabytes per second" and GB/s means "gigabytes per second". They are saying "researchers were able to reach bidirectional transfer speeds of 224 gigabits per second," which is 224 Gb/s, not GB/s. Data is measured in gigabytes, speed in gigabits; an 8 Gb/s link transfers 1 gigabyte per second. Such a mistake means the writer doesn't understand the topic.
224 GB/s is not outside the bounds of what is possible with light. At any frequency, the theoretical maximum data transfer rate is on the order of half that frequency. Given that visible light ranges from about 400 THz to about 750 THz, halving the lower figure and dividing by 8 (bits per byte) gives a theoretical maximum, using visible light, of roughly 25 TB/s at minimum.
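For anyone who wants to rerun that back-of-the-envelope estimate, here is a short Python sketch applying the commenter's own rule of thumb (maximum data rate of roughly half the carrier frequency, divided by 8 to express it in bytes); the numbers are rough illustrations, not figures from the 802.11bb spec:

# Back-of-the-envelope ceiling using the rule of thumb above:
# max data rate ~ carrier frequency / 2, then / 8 to convert bits to bytes.
for freq_hz in (400e12, 750e12):               # rough bounds of visible light
    max_bits_per_s = freq_hz / 2               # rule-of-thumb bit-rate ceiling
    max_bytes_per_s = max_bits_per_s / 8       # bytes per second
    print(f"{freq_hz/1e12:.0f} THz carrier -> ~{max_bytes_per_s/1e12:.0f} TB/s")
# Prints roughly 25 TB/s at 400 THz and 47 TB/s at 750 THz,
# both far above the 224 Gb/s figure being discussed.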

Can't say that I am a fan of this. Also feels like a step backwards for wireless technology. All they really are doing is shifting to the visible light spectrum...
Not necessarily the visible portion of the spectrum.
"Users shouldn't notice the flickering because it occurs at frequencies above 60Hz" so you are saying my 144Hz and 240Hz monitor are useless? I arent think that. Is this console ad?
Those frequencies are how many frames per second your monitor can paint. If the monitor is synchronized with the graphics card, a slower monitor refresh rate will hold back the graphics card. Does it make a difference? Not for most people, IMO, but perhaps for competitive gamers it might.
 
"Users shouldn't notice the flickering because it occurs at frequencies above 60Hz" so you are saying my 144Hz and 240Hz monitor are useless? I arent think that. Is this console ad?
The concept is that anything over 60 Hz is not detectable, but that is not true. The eye responds to blinking up to at least 120 Hz. I haven't seen the eye respond to anything above 150 Hz in my "tests" (eat something crunchy while watching the light; if it jumps around, your eye is responding to it), but the eyes definitely respond to 80 Hz. In other words, you shouldn't see the blinking in anything over 150 Hz (though you can still see the light), which is what the article is trying to say. If you notice the blinking, the light can cause eye strain, short-term headaches, and potentially long-term damage. This is important to know for VR headsets and so forth.
 
I recall years back, with 10 Mbps hubs, they began making the traffic activity light flash a few times a second instead of flickering with activity, because someone back then showed that a high-speed photodetector could be used to recover the Ethernet traffic from the rapid flickering of the activity lights.

Anyway, very cool. Of course I am only running 1 Gbps Ethernet myself (and 802.11ac wireless), so I don't have any 20 Gbps hardware to feed that.. umm.. light bulb, I guess?
 
People have been getting headaches from monitors that dim by flickering at speeds much, much higher than 60 Hz… I can't remember exactly what cutoff point notebookcheck uses for deeming flickering non-problematic, but I'm pretty sure it's in the thousands of Hz.

Edit: I was a little off – they use 500 Hz as the cutoff point, and claim that many users report anything below 250 Hz as problematic. I'd tend to believe them since it's a darned good notebook review site.
 
I actually got hold of a monster computer a few days ago (an old Toshiba Qosmio – seriously, this "laptop" is wider than my 19" LCD monitor!) with apparently every port imaginable (seriously, it's got something like 6 USB ports, VGA, HDMI, FireWire, CardBus, an SD card slot, for some reason 6 headphone jacks(!), a PS/2 port because why not I guess... maybe that is S-Video though?, and yes, even an IrDA port on the front). It's an old Core 2 Duo, but I'm going to throw Ubuntu on there and revive it anyway for playing videos because it's so over the top – it appears to have 4 speakers and a sub, it has a big jog wheel for volume and one for fast forward/rewind, and a huge screen. It must have had wonderful battery life; it also has an Nvidia GeForce 8800M or something like that and dual hard drives. It weighs like 11 pounds though, so nobody would actually want to carry it around and run it off the battery anyway.

I don't think flicker will be a problem though – if you're running multi-Gbps data rates, I can't imagine it would be varying the light at anything close to as low as 1,000 Hz. I also expect it may encode data through variations in brightness as opposed to just "on" or "off." Of course, I have not looked at the LiFi spec, so I can't tell you this for sure.
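For what it's worth, here is a rough, hypothetical Python sketch of that brightness-variation idea: four light levels carrying two bits per symbol, in the style of PAM-4. It is only a guess at the concept, not the modulation the 802.11bb spec actually defines:

# Sketch of multi-level brightness encoding: four LED brightness levels
# carry two bits per symbol instead of one. Hypothetical illustration only.
LEVELS = [0.0, 0.33, 0.66, 1.0]    # normalized brightness levels

def encode_pam4(data: bytes) -> list[float]:
    """Map every pair of bits to one of four brightness levels."""
    out = []
    for byte in data:
        for shift in (6, 4, 2, 0):              # four 2-bit groups per byte
            out.append(LEVELS[(byte >> shift) & 0b11])
    return out

def decode_pam4(samples: list[float]) -> bytes:
    """Pick the nearest brightness level and rebuild the original bytes."""
    out = bytearray()
    for i in range(0, len(samples) - 3, 4):
        byte = 0
        for s in samples[i:i + 4]:
            nearest = min(range(4), key=lambda k: abs(LEVELS[k] - s))
            byte = (byte << 2) | nearest
        out.append(byte)
    return bytes(out)

msg = b"light"
assert decode_pam4(encode_pam4(msg)) == msg   # twice the bits per flicker vs. plain on/off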

It's interesting – "what's old is new again." The ORIGINAL WiFi spec (the 1 and 2 Mbps-only 802.11; 802.11b added the 5.5 and 11 Mbps rates) actually had the DSSS (direct sequence spread spectrum) option (which is what everything ended up using), an FHSS (frequency hopping spread spectrum) option, AND an option for running optically! I saw a review of one of these setups – they even included... they gave them a fancy name, but they were mirrors... to reflect the light around corners. The obvious problem they found back then was that if you plug the adapter into the back of a desktop shoved under a desk, as people tended to do, there's no line of sight to the back, so it would not be able to get a signal. Mainly (back then) it was just a matter of 802.11b supplanting both the optical and FHSS WiFi, though.
 
"Users shouldn't notice the flickering because it occurs at frequencies above 60Hz" so you are saying my 144Hz and 240Hz monitor are useless? I arent think that. Is this console ad?

First off, shouldn't, not won't. Many of our body's systems are adaptive, so a person might notice the flickering initially but eventually no longer notice it as they grow used to it. But OTOH, like with the flickering of 60 Hz CRTs, some might not ever get used to it and develop eye strain issues, possibly leading to migraines.
There will be no way of knowing how many people are affected until it sees real-world adoption.

Secondly, high refresh monitors aren't meant for better visuals so much as they're meant to reduce lag. The human eye can create smooth motion with as little as 24 FPS – that's what movies are shot in. We kind of fill in the missing frames, which is the big problem with high-intensity gaming.
While we fill in the missing information to create smooth motion, that missing information can make hitting a moving target, especially a fast-moving one, much harder, since the target is jumping slightly from frame to frame instead of actually moving smoothly.

So the higher the FPS, the truer the smoothness of the movement. One problem, though: an LCD running at 60 Hz will only display 60 FPS properly, so the extra frames are wasted and, more importantly, can cause artifacts – which is why you need a high refresh monitor if your GPU is exceeding 60 FPS.
IMHO most people are fine with a 60 Hz LCD monitor, and many games don't need to run higher than 60 FPS. But for the games that do, as a player gets used to the lower lag of a high-Hz monitor there's a real improvement in target hits. This starts to fall off around 120 Hz and doesn't seem to make any difference at all once you top 200 Hz.
 