So You Want Wi-Fi?
Written by: Jen Sarto
Jen is the Vice President of Sales at LSR. Prior to her work at LSR, Ms. Sarto worked for a manufacturer's representative in the New York Metro Area. Jen has a bachelor's degree in Engineering from Stevens Institute of Technology.
Developers implementing Wi-Fi in their products are presented with a selection of 802.11 standards, including 802.11a, b, g, n or a combination of standards. So which one to choose? Let’s review the advantages and tradeoffs of these technologies to help you make an educated selection.
Background on Wi-Fi
Wi-Fi, used by more than 700 million people, is one of the fastest growing technologies. It was invented in 1991 by NCR Corporation/AT&T in the Netherlands. “Wi-Fi” is not a technical term but is commonly used to refer to the 802.11 standards, which were established by the IEEE starting in 1997. The standard has evolved over the past 13 years to support more applications in a rapidly growing market. In 1999, the Wi-Fi Alliance was formed to promote the growth of Wi-Fi and ensure industry success by certifying products using Wi-Fi technology.
802.11b Speed, Data Rates, and Frequency
802.11b was established by the IEEE in 1999 to improve the data rate of the original 802.11 standard created in 1997. The original 802.11 standard supported a maximum data rate of 2Mbps; it is no longer supported by the industry and therefore is not relevant to this analysis. All 802.11/Wi-Fi standards use the unlicensed radio spectrum. 802.11b supports only the license-free ISM band around 2.4GHz. The maximum data rate of 802.11b is 11 Mbps, the slowest of the 802.11 standards. Currently, the main advantages of 802.11b are low chipset cost and robust signal integrity, a consequence of the better receiver sensitivity achievable at lower data rates.
802.11a Speed, Data Rates, and Frequency
802.11a was ratified around the same time as 802.11b, but the market took much longer to adopt it. 802.11a supports data rates up to 54 Mbps and uses the license-free band around 5GHz. The higher frequency shortens the range of the RF signal and does not penetrate obstructions such as walls well. However, the 5GHz band is less crowded, so 802.11a excels in interference performance. 802.11a solutions are also typically more expensive than those using 802.11b, because 802.11a components cost more.
802.11g Speed, Data Rates, and Frequency
The 802.11g standard was released in 2003. The IEEE intended 802.11g to combine the benefits of 802.11a and 802.11b: it uses the 2.4GHz band and supports data rates up to 54Mbps. The higher data rate at the lower frequency offers the best of both worlds, enabling high-bandwidth systems to work over a longer range, and 802.11g is backward-compatible with 802.11b. Because it remains in the crowded 2.4GHz band, however, 802.11g is susceptible to interference from other devices, such as mobile phones, microwave ovens, and Bluetooth headsets.
802.11n Speed, Data Rates, and Frequency
802.11n is the newest standard, ratified by the IEEE in 2009. It was designed to deliver more bandwidth than the existing 802.11 standards by utilizing a technology called MIMO (multiple input, multiple output). With four spatial streams in a 40MHz channel, the theoretical maximum data rate increases to 600Mbps. In reality, most implementations come nowhere near 600Mbps; they are closer to 65Mbps without MIMO and 130Mbps with 2x2 MIMO.
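The figures above follow directly from the 802.11n PHY parameters. As a rough sketch (using standard 802.11n values for data subcarriers, 64-QAM modulation, 5/6 coding, and guard intervals), the rate per stream is the bits carried per OFDM symbol divided by the symbol time, multiplied by the number of spatial streams:

```python
def phy_rate_mbps(streams, data_subcarriers, bits_per_subcarrier,
                  coding_rate, symbol_time_us):
    """PHY rate = streams * (subcarriers * bits * coding rate) / symbol time."""
    bits_per_ofdm_symbol = data_subcarriers * bits_per_subcarrier * coding_rate
    return streams * bits_per_ofdm_symbol / symbol_time_us  # bits/us == Mbps

# 1 stream, 20MHz (52 data subcarriers), 64-QAM 5/6, 800ns guard interval
print(phy_rate_mbps(1, 52, 6, 5 / 6, 4.0))    # ~65 Mbps
# 2x2 MIMO, same 20MHz channel
print(phy_rate_mbps(2, 52, 6, 5 / 6, 4.0))    # ~130 Mbps
# 4 streams, 40MHz (108 data subcarriers), short 400ns guard interval
print(phy_rate_mbps(4, 108, 6, 5 / 6, 3.6))   # ~600 Mbps
```

Note that these are raw PHY rates; actual throughput is lower once protocol overhead and retransmissions are accounted for.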
802.11n also operates in both the 2.4GHz and 5GHz bands; however, most available chipsets support only the 2.4GHz band, since dual-band support adds solution cost. Another advantage of MIMO is better range performance in a multi-path environment due to increased signal integrity. The cost to implement an 802.11n solution is typically higher than that of 802.11b/g.
So Which Wi-Fi Standard to Use?
The major silicon providers, including Atheros, Broadcom, Marvell, and Texas Instruments, support multiple IC options; however, they are clearly promoting the two most comprehensive solutions: 802.11a/b/g/n and 802.11b/g/n. The lack of promotion for 802.11b/g-only chips appears to be driven by demand for applications requiring 802.11n. Perhaps the better questions are whether to use the 5GHz or 2.4GHz band and whether MIMO is required to meet the data rate requirements.
Module versus integrated design
Once you have settled on a standard, you are faced with how to implement it. Silicon providers commonly require very high minimum order quantities to support a chip-level design, and given the history and scale of these companies, there are likely good reasons for the large MOQ requirement. A true make-versus-buy analysis warrants its own article, but to leave you with a few thoughts, here are the main points to consider:
- Compliance: EMC testing and certification can cost tens of thousands of dollars. By using a pre-certified module, you can eliminate the need for testing the product as an “intentional radiator.”
- Test equipment: Designing an 802.11 radio requires the appropriate test equipment to validate conformance to the standard and the RF performance. This equipment can cost more than $50,000, and more than one set may be required.
- Risk and time to market: Designing in a discrete solution can take months to complete and validate. Using a pre-certified and validated module drastically reduces your risk and development time.
LSR offers a fully certified 802.11b/g/n and BT 2.1 module based on Texas Instruments’ WL1271. The TiWi™ module is part of a complete line of certified RF modules.