N.N. Olyunin, V.V. Sazonov, A.G. Vinogradov
Propagation of a radar signal through a regular inhomogeneity in the troposphere is considered. In the framework of the geometrical-optics description, depolarization of a signal in an isotropic medium (which the troposphere is) is caused by ray torsion. In a spherically symmetric model of the regular atmospheric inhomogeneity there is no torsion, and therefore no depolarization occurs. Besides the vertical gradient of the refractive index, however, the troposphere also has horizontal gradients. In their presence ray torsion is possible, and depolarization does take place. The article estimates the order of magnitude of the depolarization caused by horizontal atmospheric inhomogeneities.
A perturbation method is applied to the equation describing the evolution of the field vector along the ray, which follows from Rytov's law. A measure of depolarization caused by ray torsion is proposed. An exponential model of the height profile of the refractive index is used. The model of the horizontal gradient is chosen so that the direction of the refractive-index gradient changes appreciably along the ray. The estimate shows that the magnitude of depolarization in a regular troposphere is negligible, and therefore it need not be taken into account in practice.
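As a minimal illustration of the exponential height profile mentioned above: the abstract does not state the paper's numerical parameters, so the sketch below assumes the ITU-R reference exponential atmosphere (surface refractivity N_S = 315 N-units, scale height H = 7.35 km) purely for concreteness.

```python
import math

# Assumed parameters (ITU-R P.453 reference values, NOT from the paper):
N_S = 315.0   # surface refractivity, N-units
H = 7.35      # scale height, km

def refractivity(h_km):
    """Exponential refractivity profile N(h) = N_S * exp(-h/H), in N-units."""
    return N_S * math.exp(-h_km / H)

def refractive_index(h_km):
    """Refractive index n(h) = 1 + N(h) * 1e-6 (N-units to dimensionless)."""
    return 1.0 + refractivity(h_km) * 1e-6

def vertical_gradient(h_km):
    """Analytic vertical gradient dN/dh = -(N_S/H) * exp(-h/H), N-units/km."""
    return -N_S / H * math.exp(-h_km / H)

if __name__ == "__main__":
    for h in (0.0, 1.0, 5.0, 10.0):
        print(f"h = {h:5.1f} km  N = {refractivity(h):7.2f}  "
              f"n = {refractive_index(h):.6f}  dN/dh = {vertical_gradient(h):7.2f}")
```

In such a profile the gradient is purely vertical; a horizontal-gradient model of the kind the abstract describes would additionally make N depend on the horizontal coordinates, so that the direction of the full gradient rotates along the ray.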