TECHNOLOGY IN CONTEXT
Military Radars and WLANs Wrestle with Interference Issues
Wireless LANs operate at frequencies that overlap military radar bands. Simulation and test schemes let designers compensate for interference.
DAVID LEISS, APPLICATIONS ENGINEER, AGILENT TECHNOLOGIES
Wireless networking ranks as one of today’s most dynamic areas of technology innovation and market growth. As wireless LANs proliferate, both military and commercial system designers should be aware of potential frequency overlaps, and in particular the co-interference between military radars and wireless LANs.
The IEEE 802.11a wireless local area network standard allocates frequencies in the range of 5150 to 5725 MHz for wireless communication between computers and WLAN hubs to facilitate localized networking. Unfortunately, these frequencies overlap the C-band military radar range of 5250 to 5925 MHz. This overlap can degrade WLAN performance and can also interfere with the radar’s ability to detect weak echoes.
To explore this issue, the WLAN and radar systems must be accurately modeled and simulated. The simulated performance can then be compared with measured results to verify the accuracy of the simulation model. Once the model is verified, the radar signal can be applied to the WLAN, using simulated signal sources, measured signal data or a combination of the two, to confirm the simulation results with actual measurements.
IEEE 802.11a Wireless LAN
The IEEE 802.11a wireless standard allows very high-speed data transfers, up to 54 Mbits/s, between computer systems at speeds comparable to a wired local area network. This data communications technology uses Orthogonal Frequency Division Multiplexing (OFDM). In OFDM, a high-data-rate digital signal is separated into 48 parallel low-data-rate orthogonal frequency spectral channels using an Inverse Fast Fourier Transform (IFFT). Each channel occupies 312.5 kHz and isn’t individually filtered. The channels are orthogonal because the null of each sin(x)/x spectral channel occurs at the peak of the adjacent channels (Figure 1).
In addition to the 48 data channels, there are four pilot tone channels and one null channel in the center of the band. The total signal bandwidth is 16.6 MHz. This wideband and low, individual sub-channel data rate signal, along with the built-in Forward Error Correction (FEC) and Interleaving, allow for a reliable high-speed data communications channel.
Consider a simplified description of the OFDM system with a 48 Mbit/s data bit stream coming in. The serial bit stream is divided into 48 parallel bit streams, each at a 1 Mbit/s data rate and occupying a bandwidth of 312.5 kHz. The null of each sin(x)/x response falls at the peak of the adjacent channels, and there is also a null at the center of the spectrum as specified in the IEEE 802.11a standard. Spreading the signal over a broad bandwidth significantly reduces the possible effects of fading and narrow-band interference, improving communication reliability.
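The subcarrier structure described above can be sketched in a few lines. This is a minimal illustration, not the article's ADS model: it assumes QPSK on every data subcarrier, unit-amplitude pilots, and omits coding, interleaving and the cyclic prefix.

```python
import numpy as np

# Minimal sketch of the 802.11a subcarrier grid for one OFDM symbol:
# 48 data + 4 pilot subcarriers on a 64-point IFFT at 20 Msample/s,
# giving the 312.5 kHz channel spacing described above.
N_FFT = 64                           # 20 MHz / 64 = 312.5 kHz per subcarrier
PILOT_IDX = {-21, -7, 7, 21}         # pilot tone positions per IEEE 802.11a
USED_IDX = [k for k in range(-26, 27) if k != 0]   # DC (k = 0) stays null

def ofdm_symbol(data_bits):
    """Map 96 bits (48 QPSK symbols) onto the subcarrier grid via IFFT."""
    assert len(data_bits) == 96      # 48 data subcarriers x 2 bits (QPSK)
    qpsk = ((1 - 2*data_bits[0::2]) + 1j*(1 - 2*data_bits[1::2])) / np.sqrt(2)
    spectrum = np.zeros(N_FFT, dtype=complex)
    it = iter(qpsk)
    for k in USED_IDX:
        spectrum[k % N_FFT] = 1.0 if k in PILOT_IDX else next(it)
    return np.fft.ifft(spectrum)     # time-domain samples at 20 Msps

rng = np.random.default_rng(0)
sym = ofdm_symbol(rng.integers(0, 2, 96))
print(len(sym))                      # 64 complex samples per symbol
```

The 52 occupied subcarriers span 52 x 312.5 kHz, consistent with the roughly 16.6-MHz signal bandwidth cited above, and the center (DC) bin is left empty to produce the mid-band null.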
OFDM Systems Simulation
The performance of the wireless LAN system can be simulated using Agilent’s ADS software. The block diagram representation is shown in Figure 2. It consists of an IEEE 802.11a-compliant source, a WLAN receiver and a bit-error-rate tester. The receiver consists of a mixture of behavioral and circuit-level components; central to it is an RFIC Gilbert-cell mixer.
The initial simulation is done at an RF power level, at the receiver front-end, of -75 dBm with a data rate of 36 Mbits/s. The initial results show an 802.11a Packet Error Rate (PER) of 0.01 (1 percent). The IEEE 802.11a standard states that at this data rate, the minimum sensitivity to achieve a 0.1 PER (10 percent) is -70 dBm. This design meets that standard with a PER of 0.01 (1 percent) at -75 dBm, so the model’s performance is considered acceptable.
The signal spectrum and time domain simulation results are shown in Figure 3. The spectrum corresponds to what is shown in Figure 1. Note the signal bandwidth is 16.6 MHz and that there is a null at the center of the spectrum as mentioned previously. The integrated spectral power is -75.05 dBm.
Another way to look at the WLAN signal is in the time domain (Figure 3). The signal power versus time is displayed. The WLAN signal framing can be clearly seen. The Y-axis is displayed in dBm with a line designating the nominal mean power level of -75 dBm. Note the peak power is almost 10 dB higher (around -65 dBm) than the average power of -75 dBm. These peak transients will be more likely to cause false alarms during radar operation because the detection threshold will be exceeded.
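The roughly 10-dB gap between peak and average power noted above is characteristic of OFDM, where many subcarriers can momentarily add in phase. The sketch below illustrates the effect by summing 52 unit-power subcarriers with random phases and measuring the peak-to-average power ratio (PAPR); it is a simplified illustration with no 802.11a framing or coding.

```python
import numpy as np

# Illustrative estimate of OFDM peak-to-average power ratio: generate many
# 52-subcarrier symbols with random phases and record the PAPR of each.
rng = np.random.default_rng(1)
N_FFT, n_syms = 64, 200
used = [k % N_FFT for k in range(-26, 27) if k != 0]   # DC bin stays null

papr_db = []
for _ in range(n_syms):
    X = np.zeros(N_FFT, dtype=complex)
    X[used] = np.exp(1j * rng.uniform(0, 2*np.pi, len(used)))  # random phases
    x = np.fft.ifft(X)
    p = np.abs(x)**2
    papr_db.append(10*np.log10(p.max() / p.mean()))

print(f"worst-case PAPR over {n_syms} symbols: {max(papr_db):.1f} dB")
```

Typical worst-case values land in the vicinity of the ~10 dB peak-to-average ratio visible in the time-domain plot, which is exactly the headroom that can trip a radar's detection threshold.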
Radar Interference Analysis
As the above results show, the simulated WLAN system meets the 802.11a standard for operation. Next, the effects of introducing an interfering radar signal can be simulated according to the ETSI EN 301 893 standard, which governs testing of 5-GHz WLAN systems. To test the WLAN’s susceptibility to military radar interference, a radar signal with a Pulse Repetition Frequency (PRF) of 700 pulses per second (pps) and a pulse width of 1 microsecond is used. In this example the radar signal has a chirp bandwidth of 10 MHz.
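The interfering waveform just described can be sketched at complex baseband. The sample rate and the placement of each pulse at the start of its repetition interval are illustrative assumptions, not parameters from the article.

```python
import numpy as np

# Baseband sketch of the interfering radar waveform: a linear-FM (chirp)
# pulse of 1 us width sweeping 10 MHz, repeating at a 700 pps PRF.
FS = 40e6                       # sample rate, Hz (assumed for illustration)
TAU = 1e-6                      # pulse width, s
B = 10e6                        # chirp bandwidth, Hz
PRI = 1 / 700                   # pulse repetition interval, s

def chirp_pulse():
    """One linear-FM pulse sweeping -B/2 .. +B/2 over TAU seconds."""
    t = np.arange(0, TAU, 1/FS)
    k = B / TAU                                 # chirp rate, Hz/s
    return np.exp(1j * np.pi * k * (t - TAU/2)**2)

def radar_train(n_pri):
    """n_pri repetition intervals with a pulse at the start of each."""
    pulse = chirp_pulse()
    pri_samps = int(round(PRI * FS))
    out = np.zeros(n_pri * pri_samps, dtype=complex)
    for i in range(n_pri):
        out[i*pri_samps : i*pri_samps + len(pulse)] = pulse
    return out

sig = radar_train(3)
duty = np.mean(np.abs(sig) > 0)
print(f"duty cycle: {duty:.2e}")   # ~1 us / 1.43 ms, i.e. about 7e-4
```

The very low duty cycle (a 1-µs pulse every ~1.43 ms) is worth noting here, because it drives the packet-error behavior seen in the sweeps that follow.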
For the initial Radar interference analysis the military radar source was set to the same center frequency as the WLAN source—5.7 GHz in this case. The WLAN signal level will initially be set to -70 dBm, where it was previously shown there were no errors. The radar signal level will then be swept to see what effect it has on the WLAN performance. The two RF signals will be summed together at the receiver input and interact as they process through the WLAN receiver.
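Summing the two signals at a controlled power each, as described above, amounts to scaling each complex-baseband waveform to a target average power before adding them. The helper below is an illustrative sketch in normalized units (0 dBm taken as unit power), not ADS code.

```python
import numpy as np

def set_power_dbm(x, p_dbm):
    """Scale x so its mean power equals p_dbm (0 dBm == 1.0 in these units)."""
    p_target = 10**(p_dbm / 10)          # target power, mW
    p_now = np.mean(np.abs(x)**2)
    return x * np.sqrt(p_target / p_now)

# Stand-in waveforms (complex noise) at the levels used in the simulation:
rng = np.random.default_rng(3)
wlan = set_power_dbm(rng.standard_normal(1000) + 1j*rng.standard_normal(1000),
                     -70.0)              # WLAN fixed at -70 dBm
radar = set_power_dbm(rng.standard_normal(1000) + 1j*rng.standard_normal(1000),
                      -74.0)             # one point of the radar power sweep
combined = wlan + radar                  # summed at the receiver input
print(round(10*np.log10(np.mean(np.abs(wlan)**2)), 1))
```

Sweeping the radar level then just means calling `set_power_dbm` with each value of the sweep before forming the sum.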
In this simulation the chirp radar power is swept from -80 to -60 dBm in 2-dB steps (Figure 4, top). Packet errors start occurring at about -78 dBm and exceed the 10 percent error criterion at approximately -74 dBm. Interestingly, the errors don’t continue to climb but level off at approximately 15 percent. That’s because the low-duty-cycle pulses disrupt only some of the data frames.
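A back-of-the-envelope calculation shows why the error rate plateaus: only packets whose airtime overlaps a radar pulse can be corrupted. The packet airtime below is an illustrative assumption, not a figure from the simulation.

```python
# Fraction of packets that overlap at least one radar pulse, assuming
# packets arrive at random times relative to the pulse train and the
# packet airtime is shorter than the pulse repetition interval.
PRI = 1 / 700          # pulse repetition interval, s (700 pps)
TAU = 1e-6             # radar pulse width, s
T_PKT = 200e-6         # assumed packet airtime, s (illustrative)

# A packet is hit if a pulse starts anywhere in a (T_PKT + TAU) window.
p_hit = min(1.0, (T_PKT + TAU) / PRI)
print(f"fraction of packets overlapping a pulse: {p_hit:.1%}")
```

With these assumed numbers only roughly one packet in seven can even see a pulse, which is consistent in spirit with an error rate that saturates around 15 percent rather than climbing toward 100 percent.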
The effects of offsetting the two signals in frequency can also be modeled (Figure 4, bottom). The WLAN packet error rate exceeds the 10 percent limit whenever the chirp radar center frequency is within approximately +/- 5 MHz of the WLAN center frequency. Beyond that offset, the WLAN IF filtering reduces the effects of the radar interference. Receiver AGC, desensitization and nonlinearity effects could also contribute to reduced WLAN performance.
Interference Simulation with WLAN
After analyzing the simulated results, the next step is to analyze the effect on a real WLAN network. The WLAN operates as normal, with network software monitoring the data rate and packet errors of the transfer between the two computers. The radar signal is supplied by uploading the chirp radar waveform directly from Agilent’s Advanced Design System (ADS) software into an Agilent programmable signal source such as the E4438C ESG or the E8267C PSG signal generator. These signals can be radiated in proximity to the WLAN. The over-the-air signal levels of the WLAN communications link and the radar signal can be monitored with a spectrum analyzer such as the Agilent E4440A PSA or one of the Agilent 89600 series vector signal analyzers.
This testing allows a rapid evaluation of WLAN performance as the radar frequency and power levels are changed. By using the ADS software connected to the test equipment, automated test plans can be executed to evaluate systems over a large set of signal conditions. Any other possible interference signal can just as easily be generated in ADS and uploaded to see its effect on wireless network performance.
WLAN Effects on Radar Operations
The widespread use of IEEE 802.11a WLANs in frequency ranges that overlap the military bands could also degrade the radar’s performance. By injecting the sampled WLAN signal, as measured on the vector signal analyzer, into the radar receiver input, the degradation of the chirp radar performance can also be analyzed.
For this test the WLAN signal average power was swept from -80 to -40 dBm as measured at the radar receiver’s input. The result is a response curve (Figure 5). The signal levels shown are all displayed after the pulse compression network of the radar receiver. In radar signal processing after pulse compression, the primary concern is short-duration impulses, so the peak values of the interference signal matter most. The plot shows that the WLAN’s peak-to-average ratio is approximately 9 dB, which corresponds well with the time-domain plot shown in Figure 3.
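The pulse-compression step referenced above can be sketched as a matched filter applied to the chirp. The echo amplitude, interference level and sample rate below are illustrative choices, not values from the article.

```python
import numpy as np

# Matched-filter (pulse compression) sketch for a 1 us / 10 MHz linear-FM
# chirp: compare the compressed peak with and without added interference.
FS, TAU, B = 40e6, 1e-6, 10e6
t = np.arange(0, TAU, 1/FS)
chirp = np.exp(1j * np.pi * (B/TAU) * (t - TAU/2)**2)
mf = np.conj(chirp[::-1])                  # matched filter = time-reversed conjugate

rng = np.random.default_rng(2)
echo = np.concatenate([np.zeros(100), 0.05*chirp, np.zeros(100)])   # weak echo
interf = 0.05 * (rng.standard_normal(len(echo))
                 + 1j*rng.standard_normal(len(echo))) / np.sqrt(2)  # noise-like

clean = np.abs(np.convolve(echo, mf))
noisy = np.abs(np.convolve(echo + interf, mf))
# Compression coherently sums the 40 pulse samples, so the echo peak grows
# by the pulse length while noise-like interference grows only as sqrt(40);
# strong impulsive interference at the filter output is what causes trouble.
print(f"peak (clean): {clean.max():.2f}  peak (with interference): {noisy.max():.2f}")
```

This is why the post-compression analysis focuses on the interference's peak values: the compressed echo must be distinguished from whatever short-duration spikes the WLAN signal produces at the filter output.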
As the interference signal increases, it adds to the “noise” level and also distorts the chirp radar signal, thus reducing the compressed signal value. At about -62 dBm the high false alarm rate would make the radar system almost totally ineffective. A family of curves relating false alarm rate or probability of detection to WLAN and radar return power levels could be generated quickly, so that the operational effects of this, or any other, type of interference can be accounted for.
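The false-alarm behavior described above follows the textbook relation for an envelope detector: for Rayleigh-distributed noise plus noise-like interference, the false-alarm probability is exp(-T^2 / (2*sigma^2)) for threshold T and noise level sigma. The sketch below shows how quickly Pfa rises as interference raises sigma at a fixed threshold; the numeric levels are illustrative.

```python
import numpy as np

# Classic envelope-detector false-alarm probability: Rayleigh noise of
# standard deviation sigma crossing a fixed threshold T.
def pfa(threshold, sigma):
    return np.exp(-threshold**2 / (2 * sigma**2))

T = 3.0                              # detection threshold (illustrative units)
for s in (1.0, 1.5, 2.0):            # growing noise-plus-interference level
    print(f"sigma = {s:.1f}  Pfa = {pfa(T, s):.2e}")
```

Because Pfa depends exponentially on the interference power, even a modest rise in WLAN interference at the radar input can push the false-alarm rate from negligible to operationally crippling, which is what the -62 dBm figure above reflects.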
The methods above illustrate how to simulate and test for possible co-interference effects between military radars and IEEE 802.11a wireless local area networks. The same simulation techniques could be used to determine possible interference from other sources, such as other radar or communications systems. These tests let a designer compensate for real-world effects early in the design cycle, rather than spending large amounts of time and money to “fix” interference problems after the system is operational in the field.
Palo Alto, CA.