WiFi
In 1997, the Institute of Electrical and Electronics Engineers (IEEE) developed a set of standards required for implementing wireless communication between computers and pre-existing networks, including home and office networks, and the Internet. It was an awesome idea, and in the decade and a half since its release, WiFi has matured and grown immensely in capability, changing the face of computing and communication in the world at large. However, WiFi has also become a large, confusing jumble of standards, tethered to the past by necessary backwards compatibility. What is WiFi and how does it work? Is WiFi 802.11ac worth upgrading to? Why does WiFi use multiple frequencies? If 802.11a WiFi using 5 GHz was so bad when it first came out, why are new routers pushing 5 GHz so hard? I will try to answer these questions in this article, in as plain English as I can use without oversimplification. I will focus on home WiFi so I can avoid some unnecessary discussion (like basic WiFi before 802.11a), but this article is applicable to WiFi in all use cases.
WiFi for the home began in earnest in 1999 with the release of routers, or wireless access points, that used technology based on the first two commercial wireless standards: 802.11a and 802.11b. Computer networking by wire was already standardized under the code IEEE 802, so WiFi, as a subset of computer networking, became IEEE 802.11. Deciding to start at the beginning of the alphabet, the IEEE called the first commercial WiFi protocol 802.11a. It was supposed to be simple! Unfortunately, physics got in the way of simplicity.
The reason for this WiFi schism between 802.11a and 802.11b right out of the gate is that WiFi doesn’t actually operate using magic: it uses electromagnetic waves that run at a specific frequency, or band, on the electromagnetic spectrum (like radio waves). Unfortunately, you can’t just pick any frequency on the electromagnetic spectrum – some frequencies don’t carry data well, some frequencies are harmful to humans (like X-rays and UV rays), and some frequencies are already used for other purposes.
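To make the band discussion concrete, here is a minimal Python sketch (my own illustration, not taken from any standard document) of how the 2.4 GHz WiFi channels map onto center frequencies: channels 1 through 13 sit 5 MHz apart starting at 2412 MHz, with channel 14 (Japan only) at 2484 MHz. The function name is just for this example.

# Illustrative sketch: center frequencies of the 2.4 GHz WiFi channels, in MHz.
def channel_center_mhz(channel):
    if channel == 14:                      # Japan-only channel
        return 2484
    if 1 <= channel <= 13:
        return 2412 + 5 * (channel - 1)    # 5 MHz spacing from 2412 MHz
    raise ValueError("not a 2.4 GHz channel")

for ch in (1, 6, 11):                      # the usual non-overlapping trio
    print("channel", ch, "->", channel_center_mhz(ch), "MHz")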
Speed
Wireless router performance varies by standard, with 802.11b providing the slowest speeds at up to 11Mbps. Wireless “g” routers deliver a maximum speed of 54Mbps, while devices based on the 802.11n standard are fastest, topping out at 300Mbps.
If you’re thinking faster is better, there’s one thing to keep in mind: a wireless router can’t go any faster than your Internet connection allows. So, under most conditions, a wireless “n” device may only perform at speeds up to 100Mbps. Still, a faster wireless router will increase the speed of your network, allowing employees to access the data they need and making them more productive.
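As a rough illustration of that bottleneck, the Python sketch below (with a made-up 100 MB file and a made-up 20 Mbps Internet line) treats download time as governed by the slower of the wireless link and the Internet connection; real throughput is lower than these nominal link rates.

# Illustration: download time is set by the slower of the WiFi link and the Internet line.
def download_seconds(file_megabytes, wifi_mbps, internet_mbps):
    bottleneck = min(wifi_mbps, internet_mbps)   # the slower link wins
    return file_megabytes * 8 / bottleneck       # megabytes -> megabits

# A 100 MB file over each standard's nominal rate, sharing a 20 Mbps connection:
for standard, rate in (("802.11b", 11), ("802.11g", 54), ("802.11n", 300)):
    print(standard, round(download_seconds(100, rate, 20), 1), "seconds")

Past the point where the wireless link outruns the Internet connection (here, 20 Mbps), the faster standards all finish in the same time; the extra speed only pays off for traffic that stays inside the local network.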
Bluetooth
Bluetooth is a wireless technology standard for exchanging data over short distances (using short-wavelength UHF radio waves in the ISM band from 2.4 to 2.485 GHz) from fixed and mobile devices, and building personal area networks (PANs). Invented by telecom vendor Ericsson in 1994, it was originally conceived as a wireless alternative to RS-232 data cables. It can connect several devices, overcoming problems of synchronization.
Bluetooth is managed by the Bluetooth Special Interest Group (SIG), which has more than 20,000 member companies in the areas of telecommunication, computing, networking, and consumer electronics. Bluetooth was standardized as IEEE 802.15.1, but that standard is no longer maintained. The SIG oversees the development of the specification, manages the qualification program, and protects the trademarks. To be marketed as a Bluetooth device, a product must be qualified to standards defined by the SIG. A network of patents applies to the technology, which is licensed only to qualifying devices.
Basically, Bluetooth is a high-speed, low-power wireless link that was originally designed to connect phones, laptops and other similar equipment with no hassle caused to the consumer. Bluetooth is also the name of the short-range radio frequency (RF) technology used to wirelessly transmit voice and data.
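To picture how Bluetooth shares that 2.4 to 2.485 GHz slice, here is a toy Python sketch: classic Bluetooth divides the band into 79 channels of 1 MHz centered at 2402 through 2480 MHz and hops rapidly between them. The random choice below is only a stand-in; the real hop sequence is pseudo-random, derived from the master device's clock and address.

import random

# Toy model of Bluetooth's 79 one-MHz channels in the 2.4 GHz ISM band.
CHANNELS_MHZ = [2402 + k for k in range(79)]

def toy_hop_sequence(hops):
    # Stand-in for the real clock/address-derived hopping pattern.
    return [random.choice(CHANNELS_MHZ) for _ in range(hops)]

print(toy_hop_sequence(5))   # e.g. [2417, 2455, 2403, 2470, 2431]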
Infrared
Infrared (IR) is invisible radiant energy, electromagnetic radiation with longer wavelengths than those of visible light, extending from the nominal red edge of the visible spectrum at 700 nanometers (frequency 430 THz) to 1 mm (300 GHz) (although people can see infrared up to at least 1050 nm in experiments). Most of the thermal radiation emitted by objects near room temperature is infrared.
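Those wavelength and frequency figures are tied together by c = wavelength × frequency; the short Python check below (illustrative only, with the function name made up for this example) recovers the endpoints quoted above.

# Check: frequency = speed of light / wavelength
C = 3.0e8                              # speed of light in vacuum, m/s

def freq_thz(wavelength_m):
    return C / wavelength_m / 1e12     # convert hertz to terahertz

print(freq_thz(700e-9))   # 700 nm -> ~428 THz (red edge of visible light)
print(freq_thz(1e-3))     # 1 mm   -> 0.3 THz, i.e. 300 GHz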
Infrared radiation was discovered in 1800 by astronomer William Herschel, who found a type of invisible radiation in the spectrum beyond red light by means of its effect upon a thermometer. Slightly more than half of the total energy from the Sun was eventually found to arrive on Earth in the form of infrared. The balance between absorbed and emitted infrared radiation has a critical effect on Earth's climate.
Infrared energy is emitted or absorbed by molecules when they change their rotational-vibrational movements. Infrared energy elicits vibrational modes in a molecule through a change in the dipole moment, making it a useful frequency range for studying these energy states for molecules of the proper symmetry. Infrared spectroscopy examines absorption and transmission of photons in the infrared energy range.[6]
Infrared radiation is used in industrial, scientific, and medical applications. Night-vision devices using active near-infrared illumination allow people or animals to be observed without the observer being detected. Infrared astronomy uses sensor-equipped telescopes to penetrate dusty regions of space such as molecular clouds, to detect objects such as planets, and to view highly red-shifted objects from the early days of the universe. Infrared thermal-imaging cameras are used to detect heat loss in insulated systems, to observe changing blood flow in the skin, and to detect overheating of electrical apparatus.
Thermal-infrared imaging is used extensively for military and civilian purposes. Military applications include target acquisition, surveillance, night vision, homing and tracking. Humans at normal body temperature radiate chiefly at wavelengths around 10 μm (micrometers). Non-military uses include thermal efficiency analysis, environmental monitoring, industrial facility inspections, remote temperature sensing, short-ranged wireless communication, spectroscopy, and weather forecasting.
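The "around 10 μm" figure follows from Wien's displacement law, which puts the peak wavelength of thermal radiation at b / T, with b ≈ 2.898 × 10⁻³ m·K. A quick check in Python (illustrative only):

# Wien's displacement law: peak wavelength = b / T
B = 2.898e-3                            # Wien's displacement constant, meter-kelvin

def peak_wavelength_um(temp_kelvin):
    return B / temp_kelvin * 1e6        # meters -> micrometers

print(peak_wavelength_um(310))   # human body, ~310 K -> about 9.3 um
print(peak_wavelength_um(293))   # room temperature   -> about 9.9 um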
USB
USB—or Universal Serial Bus—is a protocol for connecting peripherals to a computer. It features a standardized port designed to accommodate many different types of hardware devices. Most modern devices such as digital cameras, printers, scanners, flash drives, cell phones, iPods and other MP3 players use some variation of the USB port in their design.
USB technology began development in 1994, with Intel's Ajay Bhatt among its co-inventors, and is overseen by the USB-IF (USB Implementers Forum, Inc.). The organization is composed of industry leaders like Intel, Microsoft, Compaq, LSI, Apple and Hewlett-Packard, and it supports and adopts comprehensive specifications for all aspects of USB technology.
Before USB came into existence, computers used serial and parallel ports to connect devices and transfer data. Individual ports were used for peripherals such as keyboards, mice, joysticks and printers. Expansion cards and custom drivers were often required to connect the devices. Parallel ports transferred data at approximately 100 kilobytes per second, whereas serial ports ranged from 115 to more than 450 kilobits per second. Some ports could not run simultaneously.
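Since the text quotes parallel ports in kilobytes and serial ports in kilobits, a quick conversion puts the two in the same units (pure arithmetic on the figures above, no new measurements):

# Compare the two older ports in kilobits per second.
parallel_kbps = 100 * 8                    # ~100 kilobytes/s -> ~800 kbit/s
serial_kbps_low, serial_kbps_high = 115, 450

print("parallel:", parallel_kbps, "kbit/s")
print("serial:", serial_kbps_low, "to", serial_kbps_high, "kbit/s")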
The high volume of incompatibilities and the need to juggle multiple interfaces signaled the demand for a technology like USB, which could take half a dozen port types and streamline them down to one. USB also allows devices to interact with a host computer immediately, without disconnecting or restarting the machine, which makes operation more efficient. A single USB port can handle up to 127 devices while offering broad compatibility.
USB 1.0 debuted in late 1995 and transferred data at a rate of 12 megabits per second. A revised version of this standard, USB 1.1, not only transferred data at a full-speed rate of 12 megabits per second, but could also operate at a lower speed of 1.5 megabits per second for lower-bandwidth devices. Due to its more efficient operation, USB 1.1 was adopted far more widely by consumers than its predecessor.
In 1998, the iMac G3 was the first consumer computer to discontinue legacy ports (serial and parallel) in favor of USB. This implementation helped to pave the way for a market of solely USB peripherals rather than those using other ports for devices. The combination of the ease of use, self-powering capabilities and technical specifications offered by USB technology and devices helped it to triumph over other port options.
With a transfer rate forty times faster at 480 megabits per second, USB 2.0 debuted in 2000 and became an official standard the following year. In addition to this high-speed transfer rate, USB 2.0 was capable of operating at two slower speeds: 12 megabits per second (USB 1.1 full speed) and 1.5 megabits per second (low speed for devices like mice requiring less bandwidth). A USB 2.0 port will function with USB 1.1 devices, although a USB 1.1 port does not have the bandwidth to run a 2.0 device at its full speed.
USB 2.0 offers plug-and-play capabilities for various multimedia and storage devices. It also added user features that did not exist in the previous version: support for power sources with USB connectors, a new descriptor for multiple interfaces, and the capability for two devices to interact without a separate USB host (also referred to as USB On-The-Go).
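For a feel for those signaling rates, the back-of-the-envelope Python sketch below compares how long a 100 MB transfer (a made-up example size) would take at the nominal low, full, and high speeds; real-world throughput is noticeably lower because of protocol overhead. The 480/12 ratio is where the "forty times faster" figure comes from.

# Nominal USB signaling rates in megabits per second (protocol overhead ignored).
RATES_MBPS = {
    "USB 1.1 low speed":  1.5,
    "USB 1.1 full speed": 12,
    "USB 2.0 high speed": 480,
}

def seconds_for_megabytes(megabytes, rate_mbps):
    return megabytes * 8 / rate_mbps        # megabytes -> megabits, then divide by rate

for name, rate in RATES_MBPS.items():
    print(name, "->", round(seconds_for_megabytes(100, rate), 1), "seconds for 100 MB")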
Types of USB
HDMI (High-Definition Multimedia Interface)
HDMI (High-Definition Multimedia Interface) is a compact audio/video interface for transferring uncompressed video data and compressed or uncompressed digital audio data from an HDMI-compliant source device, such as a display controller, to a compatible computer monitor, video projector, digital television, or digital audio device.[1] HDMI is a digital replacement for existing analog video standards.
HDMI implements the EIA/CEA-861 standards, which define video formats and waveforms, transport of compressed, uncompressed, and LPCM audio, auxiliary data, and implementations of the VESA EDID. CEA-861 signals carried by HDMI are electrically compatible with the CEA-861 signals used by the Digital Visual Interface (DVI). No signal conversion is necessary, nor is there a loss of video quality when a DVI-to-HDMI adapter is used. The CEC (Consumer Electronics Control) capability allows HDMI devices to control each other when necessary and allows the user to operate multiple devices with one remote control handset.
Several versions of HDMI have been developed and deployed since the initial release of the technology, but all use the same cable and connector. Newer versions optionally support advanced features such as 3D, an Ethernet data connection, and improved audio and video capacity, performance and resolution.
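A little arithmetic shows why carrying uncompressed video takes so much bandwidth, and why later HDMI versions needed more capacity for higher resolutions: pixels per frame × bits per pixel × frames per second. The Python sketch below is illustrative only; it ignores blanking intervals and audio, so actual link rates are somewhat higher than these raw figures.

# Raw video bit rate: width * height * bits per pixel * frames per second, in Gbit/s.
def video_gbps(width, height, bits_per_pixel, fps):
    return width * height * bits_per_pixel * fps / 1e9

print(video_gbps(1920, 1080, 24, 60))   # 1080p at 60 fps, 24-bit color -> ~3.0 Gbit/s
print(video_gbps(3840, 2160, 24, 30))   # 4K at 30 fps                  -> ~6.0 Gbit/s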
Production of consumer HDMI products started in late 2003. In Europe, either DVI-HDCP or HDMI is included in the HD ready in-store labeling specification for HDTV sets, formulated by EICTA with SES Astra in 2005. HDMI began to appear on consumer HDTV camcorders and digital still cameras in 2006. As of January 8, 2013 (ten years after the release of the first HDMI specification), over 3 billion HDMI devices had been sold.