KEYWORDS: Calibration, Neural networks, Receivers, Signal to noise ratio, Antennas, Signal processing, Error analysis, Data modeling, Phase compensation, Performance modeling
To address the problem of channel phase error, a channel phase calibration method based on deep learning is proposed. Replacing the traditional approach with data-driven learning not only improves the flexibility and stability of the method but also yields better results. First, the channel characteristics are modeled with a frequency response function, and a channel mismatch model is established to simulate channel errors. Second, the channel-induced error is introduced into the signal to generate a simulated data set. Through training and fitting, all-phase calibration is achieved. In addition, a variety of channel parameters are simulated, and the generalization ability across different channel parameters is verified. Finally, the network is evaluated in terms of the standard deviation on the test set. The results show that the standard deviation can be kept within 3°, which demonstrates the effectiveness of the method. In this work, Octave was used to generate and preprocess the simulated data set, the neural network was built on the PyCharm platform, and the model was trained with TensorFlow.
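The abstract's simulate-then-evaluate idea can be illustrated with a minimal sketch. The paper's actual pipeline (Octave-generated data, a TensorFlow network) is not reproduced here; instead, a classical correlation-based phase estimator stands in for the trained network, and the same evaluation metric (standard deviation of the phase error in degrees) is computed over Monte-Carlo trials. All function names, the SNR, and the signal parameters below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_channel(x, phase_err_deg, snr_db):
    """Apply a hypothetical channel phase error plus AWGN to a complex signal."""
    y = x * np.exp(1j * np.deg2rad(phase_err_deg))
    noise_pow = np.mean(np.abs(x) ** 2) / 10 ** (snr_db / 10)
    noise = np.sqrt(noise_pow / 2) * (
        rng.standard_normal(x.shape) + 1j * rng.standard_normal(x.shape)
    )
    return y + noise

def estimate_phase(ref, rx):
    """Correlation-based phase estimate (stand-in for the trained network)."""
    return np.rad2deg(np.angle(np.mean(rx * np.conj(ref))))

# Monte-Carlo evaluation: standard deviation of the estimation error in degrees
n_trials, n_samples = 200, 1024
errors = []
for _ in range(n_trials):
    true_phase = rng.uniform(-180.0, 180.0)
    x = np.exp(1j * 2 * np.pi * 0.1 * np.arange(n_samples))  # reference tone
    rx = simulate_channel(x, true_phase, snr_db=20)
    est = estimate_phase(x, rx)
    # wrap the residual into (-180, 180] before accumulating
    resid = (est - true_phase + 180.0) % 360.0 - 180.0
    errors.append(resid)

std_deg = float(np.std(errors))
print(f"test standard deviation: {std_deg:.3f} deg")
```

At a moderate SNR with this many samples, the resulting standard deviation falls well within the 3° threshold the abstract reports, which is the same pass/fail criterion the paper applies to its network.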