802.11ax: What’s Next In Wi-Fi?

This series from wireless expert Craig Mathias of Farpoint Group will provide the background and context you need to understand the features, benefits, and positioning of 802.11ax, what it means to organizational users, why it will eventually achieve dominance, and – very importantly – when.

The specification and standardization of the next generation of IEEE 802.11 wireless LAN (WLAN) physical layers (PHYs) are now under development, with publication of the final standard expected by early 2018. This is 802.11ax, and it brings with it the promise of 10 Gbps throughput. 802.11ax will, no doubt, eventually replace 802.11ac Wave 2 – but when?

It’s also quite fair to ask whether any given user really needs 10 Gbps, implying that a turnover of the installed base will take many years. But that is, in fact, the wrong question. Like many other contemporary advances in wireless technologies, 802.11ax is really all about improving system-wide capacity, not just throughput alone. That distinction is especially critical given the limited radio spectrum available and the fact that this spectrum is already heavily used in many locations.


Another good question: How will 802.11ax compete with that other very-high-throughput technology, 802.11ad (around 7 Gbps), and its follow-on, 802.11ay (aiming at, believe it or not, 20 Gbps)?

Don’t look now, but the nothing-short-of-dramatic and very long list of advances and improvements historically embodied in the 802.11 wireless LAN standard is about to take another great leap forward. Perhaps you remember the original 802.11 standard, released in 1997 after more than six years of very difficult work. Products based on this effort reached a whopping 1 and 2 Mbps (yes, you read that right), but users were thrilled – the promise of mobility was on the cusp of realization.

802.11b followed in 1999, and, with up to 11 Mbps (faster than most wired Ethernet installed at the time) and the establishment of the Wi-Fi Alliance (the most successful trade association in history, IMHO), the WLAN market was cracked wide open.

The swift pace of progress continued – 54-Mbps 802.11g in 2003 (but based on somewhat abortive 802.11a from 1999) and 600 Mbps 802.11n in 2009 (but well-established thanks to non-standard-but-close products appearing well in advance of the official standard), and today’s 802.11ac, commonly delivering between 433 and 2167 Mbps, depending upon – well, a whole bunch of factors. We’ll return to this in a moment.

But despite the fact that Wi-Fi is going to celebrate its 20th anniversary this year (and that wireless itself is now over 120 years old!), new advances in the basic technologies that underpin wireless are now being integrated into the under-development IEEE 802.11ax standard, which seeks to crack yet another obvious barrier: 10 Gbps – yes, 10 gigabits per second, 5,000-10,000 times the performance of the original standard!

After more than a quarter-century in wireless myself, even I’m impressed – never mind that cynics might note we’re only seeing a maximum improvement of roughly 3-10X over common .11ac throughput today.

Geez, why bother?

The answer is simple, even if the underlying technologies (which we’ll review next time) are not: While few users and applications actually require gigabit+ throughput, more is always better.

That individual user isn’t going to be unhappy, and, much more importantly, all of the users simultaneously sharing a given Wi-Fi channel are going to be very happy indeed. It’s no longer about throughput; it’s about capacity, and capacity is a function of how rapidly and reliably I can get my bits on and off the air, thereby leaving more time (and thus capacity) for yours.
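The airtime argument above can be sketched with a little arithmetic. This is a minimal illustration, not a model of real 802.11 behavior: the payload size and the 0.6 efficiency factor (covering preambles, ACKs, contention, and other protocol overhead) are assumed round numbers, and the PHY rates are simply the figures quoted in this article.

```python
# Illustrative sketch: capacity as shared airtime on one Wi-Fi channel.
# A faster PHY rate finishes each transfer sooner, freeing the channel
# for everyone else -- that freed airtime IS the extra capacity.

def airtime_seconds(payload_mbits: float, phy_rate_mbps: float,
                    efficiency: float = 0.6) -> float:
    """Time one transfer occupies the channel.

    `efficiency` hedges for protocol overhead; 0.6 is an assumed
    round number for illustration, not a standard value.
    """
    return payload_mbits / (phy_rate_mbps * efficiency)

payload = 100.0  # a hypothetical 100-megabit transfer

for rate in (433.0, 2167.0, 10000.0):  # .11ac low end, .11ac high end, .11ax target
    t = airtime_seconds(payload, rate)
    print(f"{rate:>7.0f} Mbps PHY -> {t * 1000:6.1f} ms of airtime")
```

The point isn’t the specific numbers; it’s that the same transfer ties up the channel for a fraction of the time at higher rates, which is exactly what “capacity, not just throughput” means.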

While 10 Gbps sounds great, and it is, don’t focus on what that means to individual users – think about its significance to an IT department trying to deal with ever-greater numbers of users with ever-more devices and ever-more-demanding applications.

The increasing volume of streaming video is often cited as a key driver for upgrading to newer Wi-Fi technologies, and that makes at least some sense. In addition to being a large data object, streaming video is time-bounded, meaning that latency must be kept low. Delays in transmission due to network congestion, or retransmissions required by problems with prevailing radio conditions, can result in the dreaded “buffering” message, or worse. But, to be fair, most HD video traffic requires only between 2 and 20 Mbps, as video can be highly compressed during transmission.

4K video represents, of course, a bigger challenge: Ultra HD Blu-Ray runs at about 82-128 Mbps, but is still well within the copious bandwidth provisioned by even low-end 802.11ac. So with gigabits of performance instantaneously available on each Wi-Fi channel, and with Ultra HD video not all that common (or even required in most organizational settings), the biggest challenge will simply be handling growth as the number of users, devices, and applications drives the need for as much throughput as possible to address the consequential aggregate demand.
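A quick back-of-the-envelope calculation makes the point. The stream bitrates below come from the figures cited above; the channel rates and the 0.6 efficiency factor are assumptions for illustration only.

```python
# Rough estimate: how many simultaneous video streams fit on one channel?
# Per-stream bitrates are from the article; efficiency is an assumed 0.6
# to hedge for protocol overhead and shared-channel losses.

def max_streams(channel_mbps: float, stream_mbps: float,
                efficiency: float = 0.6) -> int:
    """Whole number of streams a channel can carry at the given bitrate."""
    return int(channel_mbps * efficiency // stream_mbps)

hd_mbps = 20.0    # high end of the 2-20 Mbps HD range cited above
uhd_mbps = 128.0  # high end of the Ultra HD Blu-ray range cited above

for channel in (433.0, 2167.0, 10000.0):
    print(f"{channel:>7.0f} Mbps channel: "
          f"{max_streams(channel, hd_mbps):4d} HD streams, "
          f"{max_streams(channel, uhd_mbps):3d} UHD streams")
```

Even with pessimistic assumptions, a single stream is never the problem; the aggregate demand from dozens of users per channel is.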

IEEE 802.11 standards, like most standards, have the clear benefit of providing a robust and well-specified technology base, particularly with respect to the clear enumeration of interface points, upon which to build an industry. And, thanks in large part to the extensions, enhancements, and interoperability work of the Wi-Fi Alliance (Wi-Fi is now synonymous with both 802.11 and wireless LANs in general), we have some amazing capabilities and products available today.

802.11ax continues an evolution that, I’ll argue, is essential as demand for wireless bandwidth continues unabated.

The keys to meeting that challenge center on radio spectrum, which, by analogy, is the highway our wireless devices run on. While we might get more unlicensed spectrum in the future, that’s unlikely, and the spectrum currently available is being consumed not just by Wi-Fi, but by other unlicensed products and services, from high-definition video cameras to unlicensed LTE.

The key to success with 802.11ax lies in making better use of the spectrum we already have, from both a throughput and a reliability perspective, along with implementations that have the cost, form factor (size and weight), and power-consumption characteristics essential to mobility.

Keep in mind that numbers like 10 Gbps represent an upper bound for PHY-layer traffic under ideal conditions; thanks to what is commonly called rate adaptation, throughput will in fact vary quite a bit under normal operating conditions. And with so much consequential variability in throughput, does it even make sense to focus on throughput alone? Nope – not at all. Let’s move on.
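Rate adaptation can be pictured as a simple lookup: as radio conditions degrade, the radio steps down to a slower, more robust rate. The SNR thresholds and rates below are invented round numbers for illustration, not figures from any 802.11 specification.

```python
# Minimal illustration of rate adaptation: realized PHY rate steps down
# as signal quality degrades, so throughput sits well below the headline
# number most of the time. Thresholds and rates are hypothetical.

RATE_TABLE = [  # (minimum SNR in dB, PHY rate in Mbps) -- assumed values
    (35, 10000.0),
    (25, 4800.0),
    (15, 1200.0),
    (5, 150.0),
]

def select_rate(snr_db: float) -> float:
    """Pick the fastest rate whose SNR threshold is met; else the lowest."""
    for min_snr, rate in RATE_TABLE:
        if snr_db >= min_snr:
            return rate
    return RATE_TABLE[-1][1]  # fall back to the most robust rate

for snr in (40, 28, 18, 8, 2):
    print(f"SNR {snr:2d} dB -> {select_rate(snr):7.0f} Mbps")
```

Real rate-adaptation algorithms are far more sophisticated, but the shape of the behavior is the same: the headline rate is a best case, not a promise.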

Next time, we’ll look at the technologies involved in 802.11ax. Sneak peek: think evolution, not revolution. And that’s a good thing.


Craig J. Mathias is a Principal with Farpoint Group, an advisory firm specializing in wireless networking and mobile IT. Founded in 1991, Farpoint Group works with technology developers, manufacturers, carriers and operators, enterprises, and the financial community. Craig is an internationally-recognized industry and technology analyst, consultant, conference and event speaker, and author. He currently writes columns for Boundless, Connected Futures, CIO.com, and various sites at TechTarget. Craig holds an Sc.B. degree in Computer Science from Brown University, and is a member of the Society of Sigma Xi and the IEEE.
