Wi-Fi is a technology that uses radio waves to provide network connectivity.
A Wi-Fi connection is established between a device's wireless adapter and a wireless router; the area around the router in which devices can reach the network and access Internet services is often called a hotspot.
Once configured, Wi-Fi provides wireless connectivity to your devices by transmitting in the 2.4 GHz or 5 GHz radio band. Which band is used depends on the standard and hardware in play, not on the amount of data on the network.
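Each band is divided into numbered channels with regularly spaced center frequencies. As a concrete illustration, here is a minimal Python sketch of the standard channel-numbering formulas (2.4 GHz channel n is centered at 2407 + 5n MHz, except channel 14; 5 GHz channel n at 5000 + 5n MHz):

```python
def channel_center_mhz(channel: int, band_ghz: float) -> int:
    """Return the center frequency (MHz) of a Wi-Fi channel.

    2.4 GHz band: channels 1-13 are centered at 2407 + 5*n MHz;
    channel 14 (Japan only) is a special case at 2484 MHz.
    5 GHz band: channel n is centered at 5000 + 5*n MHz.
    """
    if band_ghz == 2.4:
        if channel == 14:
            return 2484
        return 2407 + 5 * channel
    if band_ghz == 5:
        return 5000 + 5 * channel
    raise ValueError("unsupported band")

# The three non-overlapping 2.4 GHz channels commonly used: 1, 6, 11
print(channel_center_mhz(1, 2.4))   # 2412
print(channel_center_mhz(6, 2.4))   # 2437
print(channel_center_mhz(36, 5))    # 5180
```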
If you have wireless Internet access at home, you probably have a little box called a router that plugs into your telephone socket.
This kind of router is a bit like a sophisticated modem: it’s a standalone computer whose job is to relay connections to and from the Internet.
At home, you might use a router to connect several computers to the Internet at once (saving on the need for several separate modems).
In other words, the router does two jobs: it creates a wireless computer network, linking all your computers together, and it also gives all your machines a shared gateway to the Internet.
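One consequence of that shared-gateway arrangement is that the machines behind a home router typically receive private addresses (such as 192.168.x.x), while only the router itself faces the public Internet. A small sketch using Python's standard ipaddress module shows the distinction (the specific addresses are illustrative):

```python
import ipaddress

# An address a home router typically hands out to a local machine:
lan_device = ipaddress.ip_address("192.168.1.23")
print(lan_device.is_private)     # True

# A well-known public address (Google's DNS server), for contrast:
internet_host = ipaddress.ip_address("8.8.8.8")
print(internet_host.is_private)  # False
```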
What Does Wi-Fi Stand For?
Many people assume Wi-Fi is an abbreviation, and the expansion most often repeated in the tech community is Wireless Fidelity.
In fact, Wi-Fi is a trademarked brand name created for the Wi-Fi Alliance by the branding firm Interbrand, and it does not officially stand for anything; "Wireless Fidelity" was a slogan attached to the name after it was chosen.
An Introduction to Wi-Fi
Wireless technology has spread widely in recent years, and you can now get connected almost anywhere: at home, at work, in libraries, schools, airports, hotels, and even some restaurants.
Wireless networking is known as Wi-Fi or 802.11 networking because it is based on the IEEE 802.11 family of standards. A major advantage of Wi-Fi is that it is supported by almost every operating system, game console, and modern printer.
How Does Wi-Fi Work?
Like mobile phones, a Wi-Fi network makes use of radio waves to transmit information across a network.
The computer includes a wireless adapter that translates outgoing data into a radio signal. That signal is transmitted, via an antenna, to the wireless router, which decodes it and sends the data on to the Internet through a wired Ethernet connection.
The process also works in reverse: data arriving from the Internet passes through the router, which encodes it into a radio signal that is picked up by the computer's wireless adapter.
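The two-way encode/decode round trip can be modeled very loosely in code. This is a deliberately simplified sketch, not real modulation: it treats the "radio signal" as a plain bit string, with the adapter and router as the encode and decode steps:

```python
def adapter_encode(data: bytes) -> str:
    """Stand-in for the wireless adapter: turn bytes into a bit stream.
    (Real Wi-Fi modulates these bits onto a radio-frequency carrier.)"""
    return "".join(f"{byte:08b}" for byte in data)

def router_decode(signal: str) -> bytes:
    """Stand-in for the router: recover the original bytes from the bit stream."""
    return bytes(int(signal[i:i + 8], 2) for i in range(0, len(signal), 8))

# Outbound: computer -> adapter -> (radio) -> router -> Internet
outbound = adapter_encode(b"GET / HTTP/1.1")
# Inbound traffic runs the same machinery in the opposite direction,
# so decoding must exactly invert encoding:
assert router_decode(outbound) == b"GET / HTTP/1.1"
print(outbound[:16])  # first 16 bits of the encoded request
```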
A wireless network transmits in the 2.4 GHz or 5 GHz frequency band; which band is used depends on the 802.11 standard and the hardware, not on how much data the user is sending. The main 802.11 standards differ as follows.
802.11a transmits at 5 GHz.
It uses Orthogonal Frequency-Division Multiplexing (OFDM), which improves reception by splitting the radio signal across many narrower subcarriers transmitted in parallel. It supports a maximum of 54 megabits of data per second.
802.11b transmits at 2.4 GHz and is the slowest of these standards, with a maximum of 11 megabits of data per second.
802.11g also transmits at 2.4 GHz but, because it uses OFDM coding as well, can transmit a maximum of 54 megabits of data per second.
The more advanced 802.11n can transmit on either the 2.4 GHz or the 5 GHz band and, by using multiple antennas (MIMO), supports theoretical data rates of up to 600 megabits per second, though typical consumer equipment reaches considerably less.
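To make these rate differences concrete, here is a small Python sketch comparing transfer times at each standard's commonly quoted theoretical maximum (for 802.11n, the 600 Mbps figure assumes four spatial streams; real-world throughput is considerably lower for every standard):

```python
# Commonly quoted theoretical maximum data rates, in megabits per second.
# These are signalling rates; actual throughput is considerably lower.
MAX_RATE_MBPS = {
    "802.11a": 54,   # 5 GHz, OFDM
    "802.11b": 11,   # 2.4 GHz
    "802.11g": 54,   # 2.4 GHz, OFDM
    "802.11n": 600,  # 2.4 or 5 GHz, MIMO (four spatial streams)
}

def seconds_to_transfer(size_megabytes: float, rate_mbps: float) -> float:
    """Time to move a file at a given rate (1 byte = 8 bits)."""
    return size_megabytes * 8 / rate_mbps

# How long would a 100 MB download take, in the best case?
for standard, rate in MAX_RATE_MBPS.items():
    print(f"{standard}: {seconds_to_transfer(100, rate):.1f} s")
```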