Wi-Fi Stress Test #2 Postponed

For the past 18 months, ever since the successful completion of the first Wi-Fi Stress Test, we have been looking for another opportunity to do more testing.

This testing needed to meet some specific criteria, mostly due to our business model and available funds. Things like:

  • Stay under a $30,000 budget
  • Provide independent testing of industry-leading Access Points
  • Garner results that would be useful additional information for the community
  • Be repeatable by anyone with the same equipment
  • Work within an allowed one-week testing timeframe
  • Allow for review by third parties capturing over-the-air data as backup

After much thinking and planning, we came up with a testing idea to gather the following bits of information in a new Wi-Fi Stress Test.

A total of 20 devices – each in a different orientation and position – would be tested as a turntable rotated in 45° increments. This would test 802.11ac Access Points against both 802.11ac and 802.11n clients.

This week we ran some preliminary tests to check feasibility and to lock in test specifics so we could share a detailed test plan with the various WLAN vendors. These tests were meant to confirm certain assumptions and allow for tuning some of the test parameters.

Regrettably, many of our assumptions proved to be wrong, or at least a bit off from what we had planned. This testing scenario no longer meets our goals, so we are postponing the second Wi-Fi Stress Test until early spring 2015, after we can develop a more stable test plan.

My apologies to all those who wanted to be part of this test plan the week of November 17th, 2014. But the current test plan is not technically sound or feasible at this time.

Below are the reasons and thought processes we had based on these preliminary tests.

Keith R. Parsons

October 22, 2014

Assumption – Client Device Orientation Matters

Part of the start of this test plan was to come up with an answer to the question, “Is there one orientation for a client device that is better than others?”

We had thought that if we ran enough tests across a variety of client devices, each in a locked orientation, and rotated them through a 360° arc in 45° increments, we might have a “winner” in terms of which orientation is consistently better than the others.

In fact, after doing more than 50 tests, we found there aren’t any statistically significant differences between portrait, landscape, 0°, 30°, 45°, 60° and 90°. So running thousands of these tests would have been a waste of time.

Actually, we found the opposite to be true. There is, at times, a position that is much worse than the others. But that “loser” position isn’t consistent. Change the distance, the AP rotation, or the position in the lazy-susan arc, and the “loser” device moves to a different orientation. This is consistently inconsistent.
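For context on how a conclusion like that gets checked: the comparison comes down to a simple one-way analysis of variance across the orientation groups. Here is a minimal sketch of that kind of check; the throughput lists in it are hypothetical placeholders, not our measured data.

```python
# Illustrative only: one-way ANOVA across orientation groups.
# The throughput lists below are hypothetical placeholders, not measured data.
from scipy import stats

throughput_by_orientation = {
    "0°":  [210.4, 198.7, 205.1, 201.9],
    "45°": [207.2, 211.8, 196.5, 203.3],
    "90°": [199.8, 208.4, 202.7, 206.0],
}

f_stat, p_value = stats.f_oneway(*throughput_by_orientation.values())
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
# A large p-value (e.g. > 0.05) means we cannot claim any orientation is better.
```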

Assumption – Test using 802.11ac 80MHz Channels

We thought one of the best ways to show differences between client devices was to use 802.11ac in 80MHz channels with 3×3:3 Access Points. Then the differences in client Wi-Fi capabilities would show up more prominently.

In fact, since the FCC limits the total aggregate EIRP across the entire channel width, using 80MHz channels works, and works well, at close distances. But when we increased the distance, the clients and APs chose to drop to 40MHz or even 20MHz, picking up additional RSSI and thus higher SNR to maintain their MCS rates, rather than staying at a lower RSSI with the 80MHz channel width.

This behavior was also inconsistent and different between various vendor implementations.
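For a rough sense of why narrowing the channel helps at distance: the thermal noise floor scales with channel width (about -174 dBm/Hz plus 10·log10 of the bandwidth in Hz), so dropping from 80MHz to 20MHz lowers the noise floor by roughly 6 dB and buys that much SNR at the same signal level. A back-of-the-envelope sketch, using a hypothetical RSSI value:

```python
import math

def noise_floor_dbm(width_mhz: float) -> float:
    """Thermal noise floor: -174 dBm/Hz + 10*log10(bandwidth in Hz)."""
    return -174 + 10 * math.log10(width_mhz * 1e6)

rssi_dbm = -70  # hypothetical received signal level at the longer distance

for width in (80, 40, 20):
    snr = rssi_dbm - noise_floor_dbm(width)
    print(f"{width} MHz: noise floor {noise_floor_dbm(width):.1f} dBm, SNR {snr:.1f} dB")

# Each halving of the channel width lowers the noise floor by ~3 dB,
# so 80 MHz -> 20 MHz buys roughly 6 dB of SNR at the same RSSI.
```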

Assumption – Test all clients on a “Lazy Susan” to rotate devices

The goal was to keep the orientation – portrait vs. landscape, and 0° through 90° – constant, then rotate the table in 45° steps through an entire 360° circle, taking both Upload and Download test measurements in 1-minute chunks at each stop.
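For clarity, the measurement loop we had in mind looks roughly like the sketch below; rotate_table() and run_throughput_test() are hypothetical stand-ins for the manual turntable moves and the IxChariot upload/download scripts.

```python
ROTATION_STEPS_DEG = range(0, 360, 45)  # eight stops around the lazy susan
TEST_CHUNK_SECONDS = 60                 # 1-minute measurement per direction

def run_rotation_pass(rotate_table, run_throughput_test):
    """rotate_table() and run_throughput_test() are hypothetical helpers
    standing in for the manual turntable moves and the IxChariot scripts."""
    results = []
    for angle in ROTATION_STEPS_DEG:
        rotate_table(angle)
        for direction in ("download", "upload"):
            mbps = run_throughput_test(direction, duration_s=TEST_CHUNK_SECONDS)
            results.append({"angle": angle, "direction": direction, "mbps": mbps})
    return results
```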

But what we found was very little difference as the table rotated – all the devices were still mere centimeters apart and showed very little differentiation. We did, however, find huge differences as the devices were moved around to various different locations.

So the idea of keeping them together for a constant test proved not worth the amount of effort.

We also found that by spreading the clients around, we got statistically significant losses compared with the client devices all together – drops of 30%-40% just by moving the clients around the Access Point rather than keeping them all on a single table.

Assumption – We could run this test in the WLAN Professionals Offices

We could easily run the 5m, 10m, and 10m-through-a-wall tests in our small offices. But we found that to get good differentiation we would need to add 20m and 30m+ distances and do more NLOS testing. Our offices aren’t big enough for those tests.

Assumption – Testing at 5m, 10m and 10m through a wall would differentiate

We thought the 5m test would allow all vendor Access Points, as well as all 802.11ac client devices, to work at 256-QAM – probably at 10m as well. Then, by adding an extra wall in between, 64-QAM would kick in.

Instead we found Access Points and client devices auto-negotiated around these changes in environment in totally different ways. Sometimes, to maintain higher SNR and thus keep 256-QAM, a client would drop to 40MHz or 20MHz rather than stay at 80MHz and downgrade its modulation. Sometimes the direct opposite.

These inconsistencies made it difficult to quantify and compare one situation to another. Devices mounted on the same table reacted differently from each other at the exact same distance. Again, we found the data in this type of test to be consistently inconsistent.
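To make the tradeoff concrete, here is a simplified sketch of the two paths a client can take as SNR falls. The data rates are standard single-spatial-stream VHT (800 ns GI) figures, but the minimum-SNR thresholds are rough, hypothetical rule-of-thumb values, not vendor specs.

```python
# Data rates: standard VHT single-spatial-stream (800 ns GI) figures.
# Minimum-SNR thresholds: rough, hypothetical rule-of-thumb values.
RATE_TABLE = [
    # (width MHz, modulation, data rate Mbps, assumed minimum SNR dB)
    (80, "256-QAM (MCS 9)", 390.0, 34),
    (80, "64-QAM (MCS 7)",  292.5, 28),
    (20, "256-QAM (MCS 8)",  78.0, 28),
    (20, "64-QAM (MCS 7)",   65.0, 22),
]

def best_rate(snr_at_80mhz_db: float):
    """Pick the fastest entry whose SNR requirement is met. Narrowing from
    80 MHz to 20 MHz lowers the noise floor ~6 dB, so the 20 MHz rows
    effectively see snr + 6."""
    candidates = []
    for width, modulation, rate_mbps, min_snr in RATE_TABLE:
        effective_snr = snr_at_80mhz_db + (6 if width == 20 else 0)
        if effective_snr >= min_snr:
            candidates.append((rate_mbps, width, modulation))
    return max(candidates) if candidates else None

for snr in (36, 30, 24):
    print(f"{snr} dB -> {best_rate(snr)}")
```

With these assumed thresholds, the same logic produces both behaviors we saw: at 30 dB it stays at 80MHz and downgrades to 64-QAM, while at 24 dB it drops to 20MHz to keep 256-QAM.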

Assumption – Use of 80MHz in the “Real World”

This was a logical fallacy from the very beginning. We would never recommend that one of our customers design their 5GHz network using 80MHz channels. So why did we even start the test plan with 80MHz as the baseline? I think it was because we wanted to see how to get the most out of 802.11ac clients and to show they really could put up some big numbers.

This was just our own mistake. Since we’d never recommend that an enterprise or K-12 customer use 80MHz channels – why test them?

Assumption – We could tune the test so Frequency Capacity wasn’t the cap

We wanted to tune the IxChariot parameters so running out of frequency capacity wasn’t going to be a bottleneck.

At the first Wi-Fi Stress Test, we found that by watching the spectrum analysis we could predict when any specific vendor would fail, based on when there were no more available time slots in the spectrum.

So this time around we wanted to change parameters to try to keep the capacity of the frequency out of the equation as much as possible. Sure, those vendors who used the frequency most efficiently would do better than those who didn’t. But we didn’t want to hit this limit right at the beginning.

What we found in our preliminary tests is that 802.11ac Access Points and clients – even with just a single client – would nearly max out the capacity of the channel, even at 80MHz, provided you push enough packets over the network.

As we added more clients, they all just shared this same amount of “full” capacity in the spectrum.
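A toy model of that behavior: once a single client saturates the channel, additional clients mostly just split the same airtime between them. The capacity figure below is a hypothetical placeholder, not a measurement.

```python
# Toy model: a single 802.11ac client nearly saturates the channel, and
# additional clients mostly just split that same airtime between them.
# The capacity figure is a hypothetical placeholder, not a measurement.
SATURATED_AGGREGATE_MBPS = 500.0  # assumed "full" channel capacity

for n_clients in (1, 2, 5, 10, 20):
    per_client = SATURATED_AGGREGATE_MBPS / n_clients
    print(f"{n_clients:2d} clients -> ~{per_client:.0f} Mbps each, "
          f"aggregate stays ~{SATURATED_AGGREGATE_MBPS:.0f} Mbps")
```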

Assumption – We could find a “Sweet Spot” for number of clients per Access Point

The initial test plan called for the following devices to all be associated to the test Access Point:

  • 5 – Galaxy S5 – 802.11ac smartphones
  • 5 – iPhone 6 – 802.11ac smartphones
  • 5 – Galaxy Tab S 8.4 – 802.11ac tablets
  • 5 – iPad v4 – 802.11n tablets
  • 1 – MacBook Pro 13” – 802.11ac 3×3:3 laptop as a baseline
  • 1 – MacBook Air 13” – 802.11ac 2×2:2 laptop as a baseline
  • 1 – MacBook Pro 15” – 802.11n 3×3:3 laptop as a baseline

What we found was that after the first five clients were online and being tested, adding additional devices significantly lowered the aggregate throughput. By significant, I mean each additional device lowered aggregate throughput by over 5%. EACH! So adding five more devices lowered aggregate throughput by 25%.

There was no way to have all our planned 23 devices on the same radio, at the same time, running both upload and download scripts, without getting numbers so low that no one would believe them – way below anything in the marketing specs for 802.11ac devices.
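Putting rough numbers on that: a simple model of the per-device penalty we measured, extrapolated linearly out to 23 devices purely for illustration. The baseline aggregate below is a hypothetical placeholder.

```python
# Rough model of the per-device penalty we measured: beyond the first five
# clients, each additional device cost a bit over 5% of aggregate throughput.
# The baseline aggregate is a hypothetical placeholder, and the linear
# extrapolation out to 23 devices is purely illustrative.
BASELINE_AGGREGATE_MBPS = 400.0   # assumed aggregate with the first 5 clients
PENALTY_PER_EXTRA_DEVICE = 0.05   # observed: >5% of aggregate lost per device

def estimated_aggregate(num_clients: int) -> float:
    extra = max(num_clients - 5, 0)
    return BASELINE_AGGREGATE_MBPS * max(1 - PENALTY_PER_EXTRA_DEVICE * extra, 0)

for n in (5, 10, 15, 23):
    print(f"{n:2d} clients -> ~{estimated_aggregate(n):.0f} Mbps aggregate")
```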

Assumption – Client Device Consistency

We initially thought that by having different types of devices, from different manufacturers, with different form factors and different Wi-Fi chipsets, we could gather information about a pseudo-real-world scenario. But our assumption was that same-type devices would be consistent within their group.

But what we found was that same-type devices, even in the same orientation and within mere centimeters of each other, would show 10%-20% differences in throughput.

Again – consistently inconsistent. All with the same type of device, in the same location, at the same distance, to the same Access Point, on the same channel, at the same time.

After that realization, it was pretty tough to feel very confident about this test plan, or that the data would be something worth reporting.