
Satellite Channel State Prediction based on Fade Slope and Artificial Intelligence (AI) Algorithms

Abstract

The ability to transfer large volumes of data between ground terminals on Earth and satellites in space requires systems that operate at higher frequencies (such as Ka and Q/V), where adequate bandwidth is available to achieve the required multi-gigabit-per-second data rates. These frequency bands, however, are susceptible to attenuation during rainfall, and the resulting rain attenuation sometimes exceeds what can be mitigated by providing a sufficient fixed fade margin.

Adaptive fade mitigation techniques (AFMTs) are therefore needed to compensate for the strong tropospheric impairments encountered in the Ka and Q/V frequency bands. For these techniques to work, adequate channel prediction is essential. This involves developing a better predictive approach to rain attenuation so as to have reliable knowledge of the transmission channel and, hence, adaptively optimise the signal to protect against link outages, ensuring the reliability and robustness of the transmission.

The objective of this work is to propose and develop an accurate short-term prediction algorithm that provides channel state information (CSI, the channel properties that affect the propagation of a signal from the transmitter to the receiver) for the next period ahead, so that AFMTs can be deployed to mitigate system outages during rain attenuation. To this end, an analysis of rain propagation and fade mitigation techniques (FMTs), the characterisation of satellite channels, satellite channel state prediction, and the application of Artificial Intelligence (AI) to prediction are presented. Simulations of rain attenuation prediction were carried out using two approaches, fade slope and Long Short-Term Memory (LSTM), on three months of rain attenuation data obtained from the Chilbolton observatory, UK, from May to July 2017.

Comparing the two approaches (fade slope and LSTM) with a defined metric, the maximum prediction error percentage (MPEP), showed that the fade slope approach performed better than the LSTM approach for the short-term prediction of rain attenuation. The fade slope approach yielded MPEP values of 12.38% and 26.62% for prediction term lengths of 1 s and 10 s respectively, whereas the LSTM approach yielded MPEP values of 18.15% and 29.90% for the same lengths. This work was, however, limited by the small training dataset; the performance of the LSTM approach is expected to improve with more training data.
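As a rough illustration of the fade slope idea described in the abstract, the sketch below extrapolates the next attenuation sample from the recent rate of change of the measured time series. The window length, sampling period, and function name are illustrative assumptions, not the implementation used in the thesis.

```python
import numpy as np

def fade_slope_predict(attenuation_db, horizon_s, sample_period_s=1.0, window=10):
    """Linear extrapolation of rain attenuation from the recent fade slope.

    attenuation_db : 1-D array of measured attenuation samples (dB).
    horizon_s      : prediction term length in seconds (e.g. 1 or 10).
    window         : number of recent samples used to estimate the slope
                     (an illustrative choice, not taken from the thesis).
    """
    recent = np.asarray(attenuation_db[-window:], dtype=float)
    t = np.arange(recent.size) * sample_period_s
    # Least-squares slope (dB/s) over the recent window.
    slope, _intercept = np.polyfit(t, recent, deg=1)
    # Extrapolate from the last measured sample to the prediction horizon.
    return recent[-1] + slope * horizon_s
```

Fade-dynamics models such as ITU-R P.1623 characterise fade slope statistically; the plain least-squares estimate above is only a minimal stand-in for whichever slope estimator the thesis uses.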
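The LSTM approach frames the same task as sequence regression: a window of past attenuation samples is mapped to the value expected at the prediction horizon. Below is a minimal Keras sketch under that assumption; the window length, horizon, and layer sizes are illustrative and not taken from the thesis.

```python
import numpy as np
import tensorflow as tf

WINDOW = 30   # past samples fed to the network (assumed)
HORIZON = 10  # samples ahead to predict, e.g. 10 s at 1 Hz sampling (assumed)

def make_windows(series, window=WINDOW, horizon=HORIZON):
    """Slice a 1-D attenuation series into (input window, target) pairs."""
    X, y = [], []
    for i in range(len(series) - window - horizon + 1):
        X.append(series[i:i + window])
        y.append(series[i + window + horizon - 1])
    # Add a trailing feature axis: shape (samples, WINDOW, 1).
    return np.asarray(X)[..., np.newaxis], np.asarray(y)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
# X, y = make_windows(attenuation_db)
# model.fit(X, y, ...)  # training settings depend on the available data
```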
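The abstract names MPEP but does not spell out its formula. One plausible reading, used below purely for illustration, is the largest absolute prediction error expressed as a percentage; the normalisation by the measured dynamic range is an assumption and the thesis may define it differently.

```python
import numpy as np

def mpep(predicted_db, measured_db):
    """Maximum prediction error percentage (assumed definition).

    Taken here as the largest absolute error relative to the dynamic
    range of the measured series; not necessarily the thesis's formula.
    """
    predicted = np.asarray(predicted_db, dtype=float)
    measured = np.asarray(measured_db, dtype=float)
    span = measured.max() - measured.min()  # assumes a non-constant series
    return 100.0 * np.max(np.abs(predicted - measured)) / span
```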

This paper was published in University of South Wales Research Explorer.
