What Real World Wi-Fi Performance Looks Like

At Aerohive, we’ve always been focused on Wi-Fi performance. We were first to market with the industry’s only truly cooperative-control Wi-Fi solution. The reason behind our fundamental shift in architecture? Our founders saw a fundamental flaw in the way the other wireless vendors’ networks were architected: they wouldn’t be able to keep up with the growing demands of clients as new and faster wireless standards emerged. And you know what? We were right. Over the years we’ve seen most of the major competitors attempt to deliver some sort of controller-less option to help increase performance, reduce latency, and push more intelligence to the edge. And as newer, faster wireless standards have emerged, the way we evaluate that performance has had to change too.

First, some background: network performance tests have long been the subject of much discussion and debate in the industry. Sponsored tests sometimes come back with amazing results – usually skewed toward the vendor paying the bill. Independent tests are more useful, but the list of unknowns becomes monumental: without vendor support, were the devices configured optimally for performance? What versions of code were in use? Can we trust the test bed to produce consistent results?

At the end of the day, though, wireless performance tests are particularly difficult to benchmark, since so much of wireless performance relies on a pesky little thing called physics. Unlike wired tests, wireless performance also depends on far more variables – for example, the difference between maximum data rate and actual throughput, the number of available spatial streams (for SU-MIMO or MU-MIMO), channel width, modulation (QAM), whether application visibility is enabled… the list goes on. And who really cares how fast a single client can go in a perfectly clean-air environment? Does that represent any real-life scenario? What it all comes down to is that benchmark tests don’t always give an accurate representation of what you REALLY care about: can your users do what they need to do, at an acceptable performance level, and make it through their day without having to call you for support?
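To make the data-rate-versus-throughput point concrete, here’s a minimal sketch (not anything from our test harness) that computes an 802.11ac PHY rate from those knobs. The subcarrier counts and symbol timing come from the 802.11ac spec; the 60% MAC-efficiency figure is an assumed ballpark, not a measurement.

```python
# Rough sketch: 802.11ac (VHT) PHY data rate from the knobs discussed above.

DATA_SUBCARRIERS = {20: 52, 40: 108, 80: 234, 160: 468}  # per channel width (MHz)

def vht_phy_rate_mbps(streams, width_mhz, qam_bits, coding, short_gi=True):
    """Peak 802.11ac PHY data rate in Mbps."""
    symbol_us = 3.6 if short_gi else 4.0  # OFDM symbol time incl. guard interval
    bits_per_symbol = DATA_SUBCARRIERS[width_mhz] * qam_bits * coding * streams
    return bits_per_symbol / symbol_us  # bits per microsecond == Mbps

# 3 spatial streams, 80 MHz, 256-QAM (8 bits/subcarrier), rate-5/6 coding (MCS 9):
peak = vht_phy_rate_mbps(streams=3, width_mhz=80, qam_bits=8, coding=5/6)
print(f"PHY data rate: {peak:.0f} Mbps")                # -> 1300 Mbps
print(f"Realistic goodput: ~{peak * 0.6:.0f} Mbps")     # ~60% after MAC overhead (assumed)
```

That gap between the 1300 Mbps on the data sheet and what actually crosses the network is exactly why single-client, clean-air numbers tell you so little.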

So what does a real-world performance test look like? Well, first we have to start with a large number of clients all using the same type of device, as would happen in the most challenging environments (mixed-client environments are common in enterprises, but then you get into questions about the maximum performance of each type of client, rather than whether similar clients can all achieve maximum performance concurrently). So let’s use 40 clients, all with the same high-end device, expecting to achieve maximum throughput at the same time: MacBook Pros with 3 spatial streams and 802.11ac support. Now, “clean air” might be a myth, but thankfully there are certain channels in the 5 GHz spectrum that must be shared with government radar (and vacated when radar is detected), and as such, they are often cleaner than their more public counterparts. Let’s assume the DFS channels are available and choose channel 100. Let’s make another assumption and give everyone the chance to show off some top performance – we’ll use 80 MHz channels to achieve the higher 11ac data rates (even though I’m sure we all recognize that no one would really use such wide channels in a high-density environment).
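For a rough sense of what that setup can deliver before anyone runs a test, here’s a back-of-the-envelope sketch of the scenario. The 60% MAC-efficiency figure and the assumption of evenly shared airtime are illustrative guesses, not measured results.

```python
# Back-of-the-envelope: 40 identical 3-stream 11ac clients on one 80 MHz channel.

phy_rate_mbps = 1300      # 3SS, 80 MHz, MCS 9 (from the calculation above)
mac_efficiency = 0.60     # assumed: contention, ACKs, headers, rate adaptation
clients = 40

aggregate = phy_rate_mbps * mac_efficiency
per_client = aggregate / clients  # airtime shared roughly equally (assumed)

print(f"Aggregate goodput: ~{aggregate:.0f} Mbps")   # ~780 Mbps
print(f"Per-client share:  ~{per_client:.1f} Mbps")  # ~19.5 Mbps each
```

In other words, even in this best-case single-AP scenario, each of those 40 MacBook Pros is contending for a slice of the same channel – which is the whole point of testing many clients at once.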

This is where it gets exciting – here are the results of that test among the major Wi-Fi vendors. In an effort to give some of the controller vendors a fair shot, we even tested with local forwarding enabled or with unencrypted tunnels to the controller, and with all application visibility and control (AVC) disabled – for everyone except Aerohive. For Aerohive, we left AVC enabled and still came up with this:

So what does this show? If you’re going to rely on a performance test rather than on how a solution works in your own environment, choose a test that demonstrates something you ACTUALLY care about: multi-client performance in a high-capacity environment with as many features enabled as possible. And then, my friends, the choice of vendor becomes very clear – because, let’s face it, there’s only one truly cooperative-control Wi-Fi solution :-).

Hive On!

 

~~~~



Abby is VP of Product Management and Marketing at Aerohive, where she defines market strategy and vision for the Aerohive products and applications portfolios. Previously, she led product strategy and development for the routing, authentication, and education-focused products and platforms. Abby focused on building and supporting network security and routing products at companies such as Concentric, XO Communications, and Juniper Networks before joining Aerohive in 2008.
