Wireless technology is quickly becoming the most popular way to connect to a network. Wi-Fi is one of the many available technologies that offer us the convenience of mobile computing. The prospect of working anywhere, sending data to and from a device without a physical connection, is increasingly attractive to consumers and businesses alike (Haag 2007). In this report, I will define what Wi-Fi technology is, briefly explain how it works, and outline its advantages and disadvantages.

The Technology
In 1999 a new technology called AirPort was introduced by Apple Computer. It enabled a mobile user to establish and maintain a connection to a network without being physically linked to it by a cable. The technology was then adopted and developed by the rest of the IT industry and renamed to the term we are all familiar with today, Wi-Fi (Seibold 2007). 'Wi-Fi stands for wireless fidelity' (Dynamic Web Solutions 2007). The name has little relation to the technology itself and was simply a marketing idea.

The technology, in short, uses radio waves to send data over the 5 GHz and 2.4 GHz radio frequency bands. For comparison, mobile phones use the 800 MHz band and FM radio uses around 100 MHz (Dynamic Web Solutions 2007). Higher frequencies are used for Wi-Fi because more data can be carried per second at a higher frequency (Brain 2001). Wi-Fi currently comes in the 802.11a, 802.11b and 802.11g standards (Hatcher 2007), with more continually being developed, such as the recent 802.11n and WiMAX (Brain 2001). The 802.11a standard uses the 5 GHz band with the assistance of a technique called orthogonal frequency-division multiplexing (OFDM), transmitting data at up to 54 Mbit/s. 802.11b uses 2.4 GHz and is the cheapest and slowest standard, transmitting at up to 11 Mbit/s. Lastly, the 802.11g standard uses 2.4 GHz like 802.11b, though it uses OFDM, similar to 802.11a, to reduce interference...
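To make the differences between the standards above concrete, the following is a minimal Python sketch (not part of the original report) that records each standard's band and nominal link rate, and estimates the ideal time to transfer a file at that rate. The figures come from the text above; real-world throughput is considerably lower than these nominal rates.

```python
# Nominal bands and headline PHY rates for the standards described above.
STANDARDS = {
    "802.11a": {"band_ghz": 5.0, "rate_mbps": 54},
    "802.11b": {"band_ghz": 2.4, "rate_mbps": 11},
    "802.11g": {"band_ghz": 2.4, "rate_mbps": 54},
}

def transfer_seconds(file_mb: float, rate_mbps: float) -> float:
    """Ideal time to send file_mb megabytes at the nominal link rate.

    Multiply by 8 to convert megabytes to megabits before dividing
    by the rate in megabits per second.
    """
    return file_mb * 8 / rate_mbps

for name, spec in STANDARDS.items():
    t = transfer_seconds(10, spec["rate_mbps"])
    print(f"{name}: {spec['band_ghz']} GHz, {spec['rate_mbps']} Mbit/s, "
          f"10 MB file in ~{t:.1f} s (ideal)")
```

For a 10 MB file, the sketch shows why 802.11b feels noticeably slower: at 11 Mbit/s the ideal transfer takes roughly five times as long as at 54 Mbit/s, before any interference or protocol overhead is accounted for.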