Abstract
A theoretical comparison is presented of the sensitivity of commercial microwave links (MLs) in detecting fog when operating at the currently typical frequencies of 20 and 38 GHz versus the 80-GHz range, a band that is being increasingly deployed. To determine the effective detection threshold of each link, it is assumed that fog-induced attenuation can be sensed whenever the resulting signal loss is at least the quantization interval divided by the length L (km) of the intersection between the fog patch and the link. The first simulation was run using a real, already existing set of MLs operating at around 38 GHz. In the second simulation, the performance of the same set of links was tested with a simulated frequency of 20 GHz for all links. In the third simulation, the algorithm was run at a simulated frequency of 80 GHz, representing a future network designed to meet the growing demand for network-access expansion. The results suggest that as cellular network infrastructure shifts to higher-frequency bands, its use for fog monitoring becomes more reliable.
| Original language | English |
|---|---|
| Pages (from-to) | 1687-1698 |
| Number of pages | 12 |
| Journal | Bulletin of the American Meteorological Society |
| Volume | 96 |
| Issue number | 10 |
| DOIs | |
| State | Published - Oct 2015 |
| Externally published | Yes |