High-Density WLAN Design Part 2

Cell Capacity – How much throughput can we offer in a cell?

So, in the last post I looked at determining how much bandwidth we would need to supply a given number of users in an area.   Now we are going to look at how we can begin to approach cell sizing and capacity planning to support those requirements.

Real-world throughput is what we are really interested in.   Forget the marketing numbers and theoretical maximums quoted by every vendor; what we care about is what we can deliver to the end users most of the time, consistently and reliably.

I’ve borrowed the following table of figures from a Cisco book but I would agree with these numbers under reasonable real-world operating conditions.

Protocol                        Throughput (Mbps)
802.11b                         7.2
802.11b/g mix                   13
802.11g                         20
802.11a                         20
802.11n (1ss, HT20, MCS7)       25
802.11n (2ss, HT20, MCS15)      70
802.11n (2ss, HT40, MCS15)      160
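
For the back-of-the-envelope maths later in this post, here is a minimal sketch capturing those figures in code; the dictionary and helper names are my own, not from the Cisco source:

```
# Realistic per-cell throughput figures (Mbps) from the table above.
REAL_WORLD_THROUGHPUT_MBPS = {
    "802.11b": 7.2,
    "802.11b/g mix": 13,
    "802.11g": 20,
    "802.11a": 20,
    "802.11n (1ss, HT20, MCS7)": 25,
    "802.11n (2ss, HT20, MCS15)": 70,
    "802.11n (2ss, HT40, MCS15)": 160,
}

def per_client_mbps(protocol: str, clients: int) -> float:
    """Rough per-client share if the cell's airtime is split evenly."""
    return REAL_WORLD_THROUGHPUT_MBPS[protocol] / clients

# e.g. 25 x 11g clients sharing one 2.4 GHz radio get roughly 0.8 Mbit/s each
print(per_client_mbps("802.11g", 25))   # 0.8
```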

But are these data rates reliably and consistently available in the real world?

We can clearly see that the more efficient 802.11n protocol is what we really want to see our clients using.

Unfortunately that isn’t always going to be the case.   Whilst the majority of smart devices and laptops that people use these days are 802.11n capable, there is still a significant volume of legacy 11b/g and some 11a clients out there.

You also have to bear in mind that the majority of mobile devices (iPhone/iPod/iPad) are single-stream devices, although this is changing gradually.

Access to historical data for a site’s WLAN can be very useful for planning purposes: how many devices are 5GHz capable vs 2.4GHz only, how many support 11n, and so on.   I work in events, so generally such luxury isn’t afforded to me!   Typically I will design with the lowest common denominator in mind, which in most cases is your average mobile device.   In 9 out of 10 cases I will also outright disable the legacy 11b data rates and most of the lower 11g rates to get the best use out of the available airtime.   In a high-density network I believe that every available advantage, tweak and trick should be used if it will help you deliver the desired levels of throughput and performance.

So, let’s go back to our design goal from Part 1 of 50x 1Mbit/s connections.   That shouldn’t be too hard even with a mixture of 11g/n 2.4GHz devices, and easier still if some of them support 5GHz.   A couple of good dual-radio APs, all on non-overlapping channels, with an even distribution of clients across them can deliver that kind of connectivity relatively easily, as illustrated below.

[Figure: Cell Capacity – 2.4GHz & 5GHz]
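
To put rough numbers on the illustration above, here is a quick sanity check; this is a sketch using my own assumptions (three dual-radio APs on non-overlapping channels, and the conservative figures from the table):

```
# Design goal from Part 1: 50 clients at 1 Mbit/s each.
clients = 50
demand_mbps = clients * 1.0                    # 50 Mbps aggregate

# Worst case: every client lands on 2.4 GHz as a mixed 11g/n population.
aps = 3                                        # dual-radio APs, 2.4 GHz radios on channels 1/6/11
per_24ghz_radio_mbps = 20                      # mixed 11g/n figure from the table
capacity_24ghz_only = aps * per_24ghz_radio_mbps              # 60 Mbps, already enough

# Clients that move to the 5 GHz radios (11n 2x2:2, HT20) only add headroom.
per_5ghz_radio_mbps = 70
capacity_dual_band = capacity_24ghz_only + aps * per_5ghz_radio_mbps   # 270 Mbps

print(capacity_24ghz_only >= demand_mbps, capacity_dual_band >= demand_mbps)   # True True
```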

But this is a blog about high-density design; 50 users in a room is not high density in my book, and it certainly should not be a challenge for modern WLAN gear.

Let us say we now want to scale that up to an auditorium of 500 users, each with two devices, giving us a total of 1,000 connections / devices.   If we make the reasonable assumption that the two devices people wish to connect are a smartphone and a tablet or laptop, we can also assume a roughly 50/50 split between 2.4GHz and 5GHz clients in this scenario.   The vast majority of the 5GHz capable devices will be 11n 2x2:2, whilst the 2.4GHz devices will probably be a 40/60 mixture of single-stream 11g/n, based on what I see day in, day out in the real world.

This requirement brings us nicely towards what I will be writing about in the next part of this series: the need to look at channel reuse planning and spectrum management.   Instead of 3 wireless access points being sufficient to support the demand with headroom, we are now looking at more like 15 access points (based on trying to maintain 30-ish clients per AP).
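
A quick sketch of the scaling maths behind that AP count; the assumptions here are my own reading of the scenario (the 30-ish client target applied per radio of a dual-radio AP, and a clean 50/50 band split):

```
import math

# Auditorium: 500 users x 2 devices = 1,000 connections, roughly 50/50 across bands.
devices_24ghz = 500
devices_5ghz = 500
clients_per_radio = 30        # planning target, applied to each radio of a dual-radio AP

aps_needed = max(
    math.ceil(devices_24ghz / clients_per_radio),   # 17 APs driven by the 2.4 GHz load
    math.ceil(devices_5ghz / clients_per_radio),    # 17 APs driven by the 5 GHz load
)
print(aps_needed)   # 17, i.e. "more like 15" access points instead of 3

# With only three non-overlapping 2.4 GHz channels (1/6/11), that many co-located
# 2.4 GHz radios force careful channel reuse and spectrum management, which is
# the subject of the next part of this series.
```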
