Remote sensing books pdf



Collection of reference data consists of either time-critical or time-stable measurements. Time-critical measurements are those made where ground conditions, such as vegetation state or water pollutants, change rapidly with time. Time-stable measurements, such as the geology of the area of interest, involve materials under observation that do not change with time.

The rate of transfer of radiant energy is called the flux and is measured in watts. Flux density implies a distribution of radiant energy over the surface on which it falls. If radiant energy falls upon a surface, the term irradiance (E) is used in place of radiant flux density. If the flow of energy is away from the surface, as in the case of thermal energy emitted by the earth or incoming solar energy reflected by the earth, then the term radiant exitance or radiant emittance, measured in W m⁻², is used (Mather). Radiance (L) is defined as the radiant flux density transmitted from a small area on the earth's surface and viewed through a unit solid angle.

It is measured in watts per square metre per steradian (W m⁻² sr⁻¹). The concepts of the radian and steradian are illustrated in Fig. Another important term we come across in remote sensing technology is 'reflectance', denoted by ρ. It is defined as the ratio of the radiant exitance of an object to the irradiance upon it. When remotely sensed images collected over a period of time are to be compared, it is most appropriate to convert the radiance values recorded by the sensor into reflectance, in order to eliminate the effects of variable irradiance over the seasons of the year.
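As a numerical illustration of this radiance-to-reflectance conversion, a commonly used form is the top-of-atmosphere reflectance formula for a Lambertian surface. The sketch below uses that standard form; the solar irradiance value and inputs are hypothetical, not taken from any particular sensor handbook.

```python
import math

def toa_reflectance(radiance, esun, sun_elevation_deg, earth_sun_dist_au=1.0):
    """Convert at-sensor spectral radiance (W m^-2 sr^-1 um^-1) to
    top-of-atmosphere reflectance, assuming a Lambertian surface.
    `esun` is the band's mean solar exoatmospheric irradiance
    (W m^-2 um^-1); the value used below is illustrative only."""
    theta_s = math.radians(90.0 - sun_elevation_deg)  # solar zenith angle
    return (math.pi * radiance * earth_sun_dist_au ** 2) / (esun * math.cos(theta_s))

# Hypothetical example: 80 W m^-2 sr^-1 um^-1 radiance, Esun = 1550,
# sun 45 degrees above the horizon:
rho = toa_reflectance(radiance=80.0, esun=1550.0, sun_elevation_deg=45.0)
```

Dividing by the solar irradiance term removes the seasonal and diurnal variation in illumination, which is exactly why multi-date comparisons are done in reflectance rather than radiance.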

There are 2π radians (360 degrees) in a circle, and 4π steradians in a sphere; a solid angle need not refer to a uniform shape (Mather). The reflectance characteristics of earth-surface features may be quantified by measuring the portion of incident energy that is reflected; reflectance is a dimensionless quantity. The quantities described above are very often used to refer to particular narrow wavebands rather than to the whole spectrum. The terms are then preceded by the word 'spectral', as in 'spectral radiance for a given waveband is the radiant flux density in the waveband per unit solid angle per unit wavelength' (Curran). The sun's light is the form of electromagnetic radiation most familiar to human beings.

The light reflected by physical objects travels in a straight line to the observer's eye. On reaching the retina, it generates electrical signals which are transmitted to the brain by the optic nerve. These signals are used by the brain to construct an image of the viewer's surroundings.

This is the process of vision, and it is closely analogous to the process of remote sensing; indeed, vision itself is a form of remote sensing. The set of all electromagnetic waves is called the electromagnetic spectrum, which ranges from long radio waves, through microwave and infrared wavelengths, to visible light waves and beyond them to the ultraviolet and the short-wave X-rays and gamma rays (Fig.).

To be precise, in some situations electromagnetic energy behaves like waves, while in others it displays the properties of particles. The nature of electromagnetic energy was a controversial question for many years. Wave theory proved sufficient to explain the nature of visible light, even though it was not originally realised that light is a form of electromagnetic energy.

Electromagnetic wave theory, formulated by Maxwell, succeeded in characterising the electric and magnetic fields and their relation to charges and currents, expressing these relationships in a set of partial differential equations now known generally as Maxwell's equations. Maxwell demonstrated that it is possible to have wave-like configurations of electric and magnetic fields. Maxwell's equations explain a great variety of phenomena relating to the propagation, dispersion, reflection, refraction, and interference of electromagnetic waves; but they do not explain the interaction of electromagnetic energy with matter at the atomic and molecular level.

Planck found that, in order to derive the correct distribution of energy emitted by a black body, he could not assume that the constituent oscillators gain and lose energy continuously. He was instead forced to assume that a particular oscillator of frequency ν can exist only in discrete states whose energies are separated by the interval hν, where h is known as Planck's constant. Planck's ideas were applied and extended shortly afterwards. However, it was Schrödinger who formulated wave mechanics in terms of a wave equation.

The Schrödinger wave equation for atomic- and molecular-scale problems is not really derivable, and should be regarded as the counterpart of Newton's laws of motion for macroscopic bodies. It is used and accepted not because a derivation shows its validity, but because, when properly applied, it yields correct results consistent with observation and experiment.

The Schrödinger wave equation directly yields the allowed energy levels of an atomic or molecular system. Based on the historical development of our understanding of the nature of electromagnetic energy, it is presently possible to furnish a consistent and unambiguous theoretical explanation for all optical phenomena using a combination of Maxwell's electromagnetic wave theory and modern quantum theory.

Maxwell's theory deals primarily with the propagation and macroscopic optical effects of electromagnetic energy, while quantum theory is concerned with the atomic and molecular absorption and emission aspects of radiation. Maxwell's Theory. The four differential equations that form the basis of electromagnetic theory are generally referred to as "Maxwell's equations," and they are expressed in mathematical terms.

The electric and magnetic fields may exist in regions where no electric charges are present. When the fields at one point in space vary with time, then some variation of the fields must occur at every other point in space at some other time, and consequently, changes in the fields propagate throughout space.

The propagation of such a disturbance is called an electromagnetic wave. According to Maxwell, the electromagnetic state at a point in a vacuum can be specified by two vectors: E, the electric field in volts per metre, and H, the magnetic field in ampere-turns per metre. These vector quantities are completely independent of each other in the static case, and are determined by the distribution of all charges and currents in space.

In the dynamic case, however, the fields are not independent; their space and time derivatives are interrelated as expressed by the curl (∇×) equations. These four equations are "Maxwell's equations" for a vacuum.

Both fields satisfy the same partial differential equation. The major implication of this equation is that changes in the fields E and H propagate through space at a speed equal to the constant c, known as the speed of light, with a measured value of about 2.998 × 10⁸ m s⁻¹.

This is the fundamental solution to the wave equation, representing a plane harmonic wave; the solution for the magnetic field has the same form. It can be shown that the magnetic and electric components are perpendicular to each other, and that these plane waves are perpendicular to the direction of propagation (Fig.). In summary, all electromagnetic radiation is energy in transit. It consists of inseparable oscillating electric and magnetic fields that are always mutually perpendicular to each other and to the direction of propagation, the rate of propagation being constant in a vacuum.
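The speed of propagation that falls out of Maxwell's equations can be checked numerically: c = 1/√(μ₀ε₀). A minimal sketch using the standard vacuum constants:

```python
import math

# Maxwell's equations predict that electromagnetic waves propagate in
# vacuum at c = 1 / sqrt(mu_0 * epsilon_0).
MU_0 = 4 * math.pi * 1e-7          # vacuum permeability, H/m
EPSILON_0 = 8.8541878128e-12       # vacuum permittivity, F/m

c = 1.0 / math.sqrt(MU_0 * EPSILON_0)
# c comes out to about 2.998e8 m/s, the measured speed of light
```

That the purely electromagnetic constants μ₀ and ε₀ reproduce the measured speed of light was Maxwell's key evidence that light is an electromagnetic wave.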

Textbook of Remote Sensing and Geographical Information Systems (M.Anji Reddy, 2e, 2008) - Book.pdf

These are called quanta or photons. The dilemma of the simultaneous wave and particle natures of electromagnetic energy may be conceptually resolved by considering that energy is not supplied continuously throughout a wave, but rather is carried by photons.

The classical wave theory does not give the intensity of energy at a point in space, but gives the probability of finding a photon at that point. Thus the classical concept of a wave yields to the idea that a wave simply describes the probability path for the motion of the individual photons.

The particular importance of the quantum approach for remote sensing is that it provides the concept of discrete energy levels in materials. The values and arrangement of these levels are different for different materials. Information about a given material is thus available in electromagnetic radiation as a consequence of transitions between these energy levels. A transition to a higher energy level is caused by the absorption of energy, while a transition from a higher to a lower energy level is caused by the emission of energy.

The amounts of energy either absorbed or emitted correspond precisely to the energy difference between the two levels involved in the transition. Because the energy levels are different for each material, the amount of energy a particular substance can absorb or emit differs from that of any other material. Consequently, the positions and intensities of the bands in the spectrum of a given material are characteristic of that material.

The wavelength, denoted by λ, is the distance between adjacent intensity maxima (for example) of the electromagnetic wave, and consequently it may be expressed in any unit of length. The frequency, denoted by ν, is the number of maxima of the electromagnetic wave that pass a fixed point in a given time. Frequency is commonly expressed in reciprocal centimetres (wave numbers, cm⁻¹) or in cycles per second (cps), also called hertz (Hz).
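The relations between these quantities (c = λν for wave propagation, and Planck's E = hν for photon energy) can be evaluated directly. A short sketch for a wavelength in the visible range:

```python
# Relations between wavelength, frequency, wavenumber, and photon
# energy used in this chapter: c = lambda * nu and E = h * nu.
C = 2.998e8        # speed of light, m/s
H = 6.626e-34      # Planck's constant, J s

def frequency_hz(wavelength_m):
    return C / wavelength_m

def wavenumber_cm1(wavelength_m):
    # reciprocal centimetres, the spectroscopist's "wave number"
    return 1.0 / (wavelength_m * 100.0)

def photon_energy_j(wavelength_m):
    return H * frequency_hz(wavelength_m)

# Green light at 0.55 um: frequency ~5.45e14 Hz,
# wavenumber ~18182 cm^-1, photon energy ~3.6e-19 J
lam = 0.55e-6
```

Note that photon energy grows as wavelength shrinks, which is why short-wavelength ultraviolet interacts with matter far more energetically than radio waves do.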

The wavelengths may assume any value, although for most practical purposes the spectrum is usually presented between about 10⁻¹⁶ and 10⁷ m, from the cosmic-ray to the audio range. However, wavelengths as long as 10¹¹ m have been detected by sensitive magnetometers. No matter what the wavelength of the electromagnetic radiation, it is all generated by electrically charged matter. However, there is no universal radiation generator that provides a useful intensity of radiation at all wavelengths for practical purposes, and there is no universal wavelength-resolving instrument or universal detector.

Consequently, the spectrum has been divided into regions that bear names related to the sources that produce the radiation (the "ray" regions), or that are extensions of the visible range (the ultraviolet and infrared regions), or that reflect the way in which wavelengths in a range are used (radio and television).

The extents of the wavelength ranges corresponding to these names were fixed arbitrarily, and the decision as to where the divisions should fall was made mostly on the basis of the limits of the human eye (the visible range), the properties of optical materials, and the response limits of various sources and detectors. In brief, the electromagnetic spectrum is the continuum of energy that ranges from metres to nanometres in wavelength, travels at the speed of light, and propagates through a vacuum such as outer space (Sabins). All matter radiates a range of electromagnetic energy, with the peak intensity shifting toward progressively shorter wavelengths as the temperature of the matter increases.

In general, wavelengths and frequencies vary from short-wavelength, high-frequency cosmic waves to long-wavelength, low-frequency radio waves. The wavelengths of greatest interest in remote sensing are those of visible and near-infrared radiation.

Spectral Wave Bands. Visible light is electromagnetic radiation with wavelengths between 0.4 and 0.7 μm. The eye is not uniformly sensitive to light within this range and has its peak sensitivity at about 0.55 μm. This peak in the response function of the human eye corresponds closely to the peak in the sun's radiant emittance distribution. Electromagnetic radiation with wavelengths shorter than those of visible light is strongly scattered and absorbed by the atmosphere; because of these effects, none of these bands is used in satellite remote sensing.

The infrared waveband extends from 0.7 μm to about 1 mm; short-wavelength or near-IR lies between about 0.7 and 3 μm. Infrared radiation with a wavelength up to 3 μm is reflected by the surface of the earth; beyond a wavelength of 3 μm, IR radiation emitted by the earth's surface can be sensed in the form of heat. The region of the spectrum composed of electromagnetic radiation with wavelengths between 1 mm and several tens of centimetres is called the microwave band, and radiation at these wavelengths can penetrate clouds. The microwave band is thus a valuable region for remote sensing.

Beyond the microwave region is the radio band of very long wavelengths, used in certain radar applications. The electromagnetic wavebands and their utility in remote sensing are described in Table 2.

All stars and planets emit radiation. Our chief star, the sun, is an almost spherical body with a diameter of about 1.39 × 10⁶ km. The continuous conversion of hydrogen to helium, the sun's main constituents, generates the energy that is radiated from its outer layers.

Table 2. Electromagnetic wavebands and their utility in remote sensing:

X-ray: Not employed in remote sensing.
Ultraviolet (0.03–0.4 μm): Strongly scattered and absorbed by the atmosphere.
Photographic UV (0.3–0.4 μm): Detectable with film and photodetectors, but atmospheric scattering is severe.
Visible (0.4–0.7 μm): Includes the reflected-energy peak of the earth at about 0.5 μm.
Infrared (0.7–300 μm): Atmospheric transmission windows are separated by absorption bands.
Reflected IR (0.7–3 μm): The band from 0.7 to 0.9 μm is detectable with film and is called the photographic IR band.
Thermal IR (3–5 μm and 8–14 μm): Images at these wavelengths are acquired by optical-mechanical scanners and special vidicon systems, but not by film.
Microwave (0.1–30 cm): Penetrates clouds; images may be acquired in the active or passive mode.
Radar (0.1–30 cm): Active form of microwave remote sensing.
Radio (> 30 cm): Some classified radars with very long wavelengths operate in this region.

If the sun were a perfect emitter, it would be an example of an ideal black body. A black body transforms heat energy into radiant energy at the maximum possible rate consistent with Planck's law, which defines the spectral exitance of a black body (Henderson). The wavelength at which the maximum spectral exitance is achieved decreases as the temperature increases.

The dashed line in Fig. shows Wien's displacement law, which gives the wavelength of maximum spectral exitance λm in the form λm = A/T, where A is a constant (approximately 2898 μm K) and T is the absolute temperature. The maximum of solar radiation occurs at about 0.5 μm. Wavelength-dependent mechanisms of atmospheric absorption alter the solar irradiance that actually reaches the surface of the earth. Generally, the selection of wavebands depends on (a) the characteristics of the radiation source, (b) the effects of atmospheric absorption and scattering, and (c) the nature of the target.
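Wien's displacement law can be evaluated numerically to show why solar (reflected) and terrestrial (emitted) radiation occupy such different parts of the spectrum:

```python
# Wien's displacement law: the wavelength of maximum spectral exitance
# shifts to shorter wavelengths as temperature rises,
# lambda_max = b / T, with b ~ 2898 um K.
WIEN_B = 2898.0  # Wien's displacement constant, um K

def peak_wavelength_um(temp_k):
    """Wavelength of maximum black-body exitance, in micrometres."""
    return WIEN_B / temp_k

sun_peak = peak_wavelength_um(5800.0)    # ~0.50 um, in the visible
earth_peak = peak_wavelength_um(300.0)   # ~9.7 um, thermal infrared
```

The two results explain the chapter's waveband choices: reflected-energy sensors work near 0.5 μm where solar output peaks, while thermal sensors use the 8–14 μm window bracketing the earth's emission peak.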

This passage alters the speed, frequency, intensity, spectral distribution, and direction of the radiation; as a result, atmospheric scattering and absorption occur (Curran). These effects are most severe at visible and infrared wavelengths, the range most crucial to remote sensing. During the transmission of energy through the atmosphere, light interacts with gases and particulate matter in a process called atmospheric scattering. The two major classes of scattering are selective and non-selective scattering.

Rayleigh, Mie, and Raman scattering are of the selective type. Non-selective scattering is independent of wavelength. It is produced by particles whose radii exceed 10 μm, such as water droplets and ice fragments present in clouds. This type of scattering reduces the contrast of the image. While passing through the atmosphere, electromagnetic radiation is scattered and absorbed by gases and particulates. Besides the major gaseous components, molecular nitrogen and oxygen, other constituents such as water vapour, methane, hydrogen, helium, and nitrogen compounds play an important role in modifying the incident and reflected radiation.

This causes a reduction in image contrast and introduces radiometric errors. Regions of the electromagnetic spectrum in which the atmosphere is transparent are called atmospheric windows. The atmosphere is practically transparent in the visible region of the electromagnetic spectrum, and therefore many satellite-based remote sensing sensors are designed to collect data in this region. Commonly used atmospheric windows include the visible and near-IR region and the thermal-IR windows at 3–5 μm and 8–14 μm.

The characteristics of the four types of scattering, in the order of their importance in remote sensing, are given in Table 2. For example, Mie scattering is caused by spherical particles whose size is about the same as the wavelength of the radiation; it affects all visible wavelengths, particularly under overcast conditions.

Therefore, the remaining signal can be interpreted in terms of suspensions only after careful correction for the atmospheric contribution. For this reason, the varying optical parameters of the atmosphere must enter the radiative transfer calculations (Fischer). Before we study the effects of solar radiation and atmospheric properties, we shall consider the main quantities which determine the spectral upward radiance.

The source of the shortwave radiation field in the atmosphere is the Sun, emitting over a broad spectral range. The extraterrestrial irradiance at the top of the atmosphere, the solar constant, depends on the black-body emission of the Sun's photosphere and on the scattering and absorption processes in the Sun's chromosphere. Important Fraunhofer lines, caused by strong absorption in the Sun's chromosphere, show up as prominent drops in the spectral distribution of the solar radiation.

The Chappuis band of ozone in the visible spectrum is the only ozone band used to detect oceanic constituents from space. The transmission of the chlorophyll fluorescence to the top of the atmosphere is hindered by the absorption of water vapour and molecular oxygen in their vibration-rotation bands.

In order to study the selective gaseous absorption in the radiative transfer calculations, the transmission functions of O₂ and H₂O are computed from absorption-line parameters using Lorentz's theory of collision broadening.

The contribution from resonance broadening is negligible in the spectral region considered. The Doppler line broadening, which is small compared with Lorentz line widths, is also neglected, since most of the absorption takes place in the atmosphere below 40 km (Barrow). Fig. shows the reduction in the solar flux due to absorption and scattering by a clear mid-latitude summer atmosphere. Sensitivity studies of the temperature and pressure dependence of the transmission function have been performed and show only a weak temperature effect.

The pressure impact is not negligible and has to be accounted for. Air molecules are small compared with the wavelength of the incoming sunlight; hence the extinction through molecular scattering can be determined with Rayleigh theory. Since molecular scattering within the atmosphere depends mainly on pressure, the scattering coefficient can be estimated from climatological measurements. Atmospheric spectral turbidity variations are caused by variations in aerosol concentration, composition, and size distribution.

The vertical distribution of the aerosols is taken from Adler and Ken. The phase functions of aerosols are nearly wavelength-independent within the visible and near infrared. For the radiative transfer calculations, the scattering functions are estimated by Mie theory. The range of atmospheric turbidity values used to study the effects of aerosol scattering on the measured spectral radiances corresponds to horizontal visibilities at the surface of between 6 and 88 km.

As shown in Fig., these two atmospheric effects are expressed mathematically as follows (Lillesand and Kiefer). The amount of irradiance depends on seasonal changes, the solar elevation angle, and the distance between the earth and the sun.

Applying the principle of conservation of energy, the relationship can be expressed as EI(λ) = ER(λ) + EA(λ) + ET(λ), where EI(λ) is the incident energy and ER(λ), EA(λ), and ET(λ) are the reflected, absorbed, and transmitted components respectively. In remote sensing, the amount of reflected energy ER(λ) is of greatest interest, so it is more convenient to rearrange the terms as ER(λ) = EI(λ) − EA(λ) − ET(λ). From this equation, two important points can be drawn.

Simply, the measure of how much electromagnetic radiation is reflected off a surface is called its reflectance, which lies between 0 and 1; a value of 1 means that all incident energy is reflected. The reflectance characteristics are quantified by the spectral reflectance ρ(λ). According to Kirchhoff's law of physics, the absorptance may be taken as the emissivity ε, and the equation can be rewritten accordingly.

The classical example of a perfectly reflecting (white) object is snow, while a black body, such as lamp smoke, is an example of a perfect absorber.

Therefore it can be seen that the reflectance varies from 0 (black body) to 1 (white body). When we divide both sides of the energy-balance equation by the incident energy, we obtain the proportions of energy reflected, absorbed, and transmitted, which vary for different features of the earth depending on the material type. These differences provide a clue for differentiating between features of an image.
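Dividing the balance equation through by the incident energy gives fractions that must sum to one, so reflectance follows directly from absorptance and transmittance. A minimal sketch, with illustrative fractions:

```python
# Conservation of energy for radiation incident on a surface:
# the reflected, absorbed, and transmitted fractions sum to one,
# so reflectance rho = 1 - alpha - tau.
def reflectance(absorptance, transmittance):
    rho = 1.0 - absorptance - transmittance
    if not 0.0 <= rho <= 1.0:
        raise ValueError("fractions must sum to at most 1")
    return rho

# A hypothetical surface absorbing 85% and transmitting 5% of the
# incident energy reflects the remaining 10%:
rho = reflectance(0.85, 0.05)
```

Because each material partitions energy differently among the three fractions, and differently in each waveband, this simple balance underlies the spectral contrasts exploited in image interpretation.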

Thus two features which are indistinguishable in one spectral range may exhibit a marked contrast in another wavelength band. Because many remote sensing systems operate in the wavelength regions in which reflected energy predominates, the reflectance properties of terrestrial features are very important.

The manner of interaction is described by the spectral response of the target. Spectral reflectance curves describe the spectral response of a target in a particular wavelength region of the electromagnetic spectrum, which in turn depends upon certain factors, namely the orientation of the sun (solar azimuth), the height of the sun in the sky (solar elevation angle), and the direction in which the sensor is pointing relative to nadir (the look angle) (Fig.).

Elevation is measured upwards from the horizontal plane, and azimuth is measured clockwise from north. The zenith angle is measured from the surface normal N and equals 90° minus the elevation angle.

The spectral reflectance curve for vigorous vegetation manifests the "peak-and-valley" configuration. The valleys in the visible portion of the spectrum are indicative of pigments in plant leaves.

Dips in reflectance (Fig.) also occur in the water-absorption bands of the infrared. The soil curve shows a more regular variation of reflectance; factors that evidently affect soil reflectance are moisture content, soil texture, surface roughness, and the presence of organic matter. The term spectral signature can also be used for spectral reflectance curves: a spectral signature is a set of characteristics by which a material or an object may be identified on any satellite image or photograph within a given range of wavelengths.
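The idea of identifying a material from its spectral signature can be sketched as a nearest-signature comparison. The signature values and band layout below are invented for illustration, not measured data, and real classifiers are considerably more sophisticated:

```python
import math

# Illustrative sketch: match a measured reflectance vector against
# reference spectral signatures by Euclidean distance.
# All numbers below are hypothetical.
signatures = {
    "vegetation": [0.05, 0.08, 0.45, 0.40],  # low visible, high near-IR
    "water":      [0.08, 0.05, 0.02, 0.01],  # reflectance drops in IR
    "dry soil":   [0.15, 0.20, 0.25, 0.30],  # gradual rise with wavelength
}

def classify(measured):
    """Return the reference signature closest to the measured spectrum."""
    def dist(ref):
        return math.sqrt(sum((m - r) ** 2 for m, r in zip(measured, ref)))
    return min(signatures, key=lambda name: dist(signatures[name]))

label = classify([0.06, 0.07, 0.42, 0.38])  # closest to "vegetation"
```

The example mirrors the text's point: two materials with similar visible reflectance can still separate cleanly once the infrared bands enter the comparison.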

The characteristic spectral reflectance curve for water is shown in Fig. However, the spectral reflectance of water is significantly affected by the presence of dissolved and suspended organic and inorganic material, and by the depth of the water body.

Experimental studies in the field and in the laboratory, as well as experience with multispectral remote sensing, have shown that specific targets are characterised by an individual spectral response. Indeed, the successful development of remote sensing of the environment over the past decade bears witness to its validity. In the remaining part of this section, typical and representative spectral reflectance curves for characteristic types of surface materials are considered.

Imagine a beach on a beautiful tropical island. In Fig. (after Vincent and Hunt), solid lines represent incident rays; dashed lines 1, 2, and 3 represent rays reflected from the surface that have never penetrated a sand grain, while dashed lines 4 and 5 are volume rays. The former are called specular rays by Vincent and Hunt, and surface-scattered rays by Salisbury and Wald; these rays result from first-surface reflection from all grains encountered.

For a given reflecting surface, all specular rays are reflected in the same direction, such that the angle of reflection (the angle between the reflected ray and the normal, i.e. the perpendicular to the reflecting surface) equals the angle of incidence (the angle between the incident ray and the surface normal).

The measure of how much electromagnetic radiation is reflected off a surface is called its reflectance, a number between 0 and 1. In the case of first-surface reflection, this measure is called the specular reflectance, designated here as rs(λ).

The λ in parentheses indicates that specular reflectance is a function of wavelength. The reason rs(λ) is a function of wavelength is that the complex index of refraction of the reflecting surface material is itself wavelength-dependent.

The term complex means that the index of refraction has a real and an imaginary part. Every material has a complex index of refraction, though for some materials at some wavelengths only the real part may be nonzero.

For instance, the specular reflectance of the beach surface, RS(λ), is the average of the specular reflectances of all the individual grains. Rays of electromagnetic radiation that have been transmitted through some portion of one or more grains are called volume rays.

These are shown as dashed lines 4 and 5 in Fig. The equation for the volume reflectance rv(λ) of a sand grain is complicated, because it depends on both the transmittance of the grain and the interface reflectance between the top of that grain and the underlying grain(s). The average rv(λ) for all the grains in the beach from which electromagnetic radiation is reflected is defined as the volume reflectance of the beach, RV(λ).

The total reflectance of the beach, RT(λ), is the sum of the averaged specular and volume reflectances: RT(λ) = RS(λ) + RV(λ). Three important observations can be summarised from the above discussion. Note that when we use the terms transparent or opaque to describe optical behaviour, we must designate both a wavelength region and the material, because the complex index of refraction of any material is generally non-constant over a large range of wavelengths.

To consider the effect on reflectance of mixing several minerals together, let us take the simpler case of a particulate medium consisting of several mineral constituents, with air filling the interstices between particles. It is possible to estimate the spectral reflectance of a mixed-mineral particulate sample by using a linear combination of the reflectance spectra of its mineral constituents, weighted by the percentage of the sample's surface area covered by each constituent.

The following equation demonstrates this estimation for the total spectral reflectance of a mixed particulate sample at wavelength λ. So far we have distinguished volume reflectance and specular reflectance on the basis of whether electromagnetic rays did or did not penetrate one or more grains in a soil or rock surface.
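The areal linear-mixing estimate described above can be sketched directly: the mixed spectrum is the area-weighted sum of the constituent spectra, R_T(λ) = Σᵢ fᵢ Rᵢ(λ). The mineral fractions and reflectance values below are hypothetical, chosen only to show the arithmetic:

```python
# Linear spectral mixing: reflectance of a mixed particulate sample is
# the areal-fraction-weighted sum of its constituents' reflectances.
def mixed_reflectance(fractions, spectra):
    """fractions: areal fraction per constituent (must sum to 1).
    spectra: one reflectance list per constituent, same band order."""
    assert abs(sum(fractions) - 1.0) < 1e-9, "areal fractions must sum to 1"
    n_bands = len(spectra[0])
    return [sum(f * s[band] for f, s in zip(fractions, spectra))
            for band in range(n_bands)]

# Hypothetical three-band spectra for two constituents:
quartz   = [0.60, 0.65, 0.70]
hematite = [0.10, 0.30, 0.45]
mix = mixed_reflectance([0.7, 0.3], [quartz, hematite])
# band-wise result: [0.45, 0.545, 0.625]
```

This areal weighting is only a first approximation; it ignores the multiple scattering between grains of different minerals that the volume-reflectance discussion above describes.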

Now we need to define some reflectance terms that relate to the manner in which the soil or rock surface is illuminated, as well as how the reflected energy from its surface is measured. The most fundamental reflectance term used in this book is the spectral hemispherical reflectance, or diffuse reflectance.

As "a" decreases, the intensity of energy increases, and vice versa; this common effect in photographic images is called vignetting. The microwave band is a valuable region for remote sensing in view of two distinctive features: (i) microwaves are capable of penetrating the atmosphere under almost all conditions.

Depending on the wavelengths involved, microwave energy can 'see through' haze, light rain, snow, clouds, and smoke; (ii) microwave reflections or emissions from earth materials bear no direct relationship to their counterparts in the visible or thermal portions of the spectrum.

Surfaces that appear rough in the visible may be smooth at microwave wavelengths. Remote sensing techniques in the microwave region of the electromagnetic spectrum can be classified into two categories (Reeves): active systems provide their own illumination, whereas passive systems record the energy of thermal origin emitted by materials.

Active microwave sensing systems are of two types: imaging sensors and non-imaging sensors. Radar is an acronym derived from Radio Detection and Ranging. Imaging radars are divided into two categories: real aperture and synthetic aperture systems. In a real aperture system, resolution is determined by the actual beam width and antenna size. A synthetic aperture system utilises signal-processing techniques to achieve a narrow beam width in the along-track direction, which provides better resolution.
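The resolution difference between the two categories can be illustrated with the standard first-order approximations: a real aperture's azimuth resolution is range × wavelength / antenna length, while a focused synthetic aperture achieves roughly half the physical antenna length, independent of range. The orbital and antenna numbers below are hypothetical:

```python
# Azimuth (along-track) resolution, first-order approximations:
#   real aperture:      r_a = slant_range * wavelength / antenna_length
#   synthetic aperture: r_a = antenna_length / 2   (range-independent)
def real_aperture_res(slant_range_m, wavelength_m, antenna_m):
    return slant_range_m * wavelength_m / antenna_m

def synthetic_aperture_res(antenna_m):
    return antenna_m / 2.0

# Hypothetical spaceborne case: 850 km slant range, 5.6 cm (C-band)
# wavelength, 10 m antenna:
real_res = real_aperture_res(850e3, 0.056, 10.0)   # ~4760 m
sar_res = synthetic_aperture_res(10.0)             # 5 m
```

The contrast (kilometres versus metres) is why all spaceborne imaging radars are synthetic aperture systems; a real aperture fine enough for orbital use would need an impractically long antenna.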

Non-imaging remote sensing radars are either scatterometers or altimeters. Any calibrated radar that measures the scattering properties of a surface is called a scatterometer. Passive microwave sensors, called radiometers, measure the emissive properties of the earth's surface.

A radar altimeter sends out pulses of microwave signals and records the signals scattered back from the earth's surface. The height of the surface can be measured from the time delay of the return signals.
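The altimeter's height measurement is a simple two-way travel-time calculation, range = c·t/2. A minimal sketch with an illustrative delay:

```python
# A radar altimeter infers range from the two-way travel time of a
# pulse: the pulse covers the distance twice, so range = c * t / 2.
C = 2.998e8  # speed of light, m/s

def range_from_delay(two_way_time_s):
    return C * two_way_time_s / 2.0

# A pulse returning after ~5.3 ms corresponds to a range of ~794 km
# (an illustrative satellite-altitude figure, not a specific mission):
r = range_from_delay(5.3e-3)
```

In practice the precision comes from timing: resolving surface height to centimetres requires measuring the delay to a small fraction of a nanosecond, plus corrections for atmospheric path delay.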

A wind scatterometer can be used to measure wind speed and direction over the ocean surface. It sends out pulses of microwaves along several directions and records the magnitudes of the signals that are backscattered from the ocean surface. The magnitude of the backscattered signal is related to the ocean surface roughness, which in turn depends on the sea-surface wind conditions, so the wind speed and direction can be derived. Imaging radars are side-looking rather than nadir-looking instruments, and their geometry is complicated by foreshortening, to the extent that the top of a mountain can appear closer to the sensor than its foot, and by shadow, since the far side of a mountain or hill is invisible to a side-looking radar sensor.

A microwave radiometer is a passive device which records the natural microwave emission from the earth. It can be used to measure the total water content of the atmosphere within its field of view. The application potential of radar remote sensing for disciplines such as soil moisture, agriculture, geology, hydrology, and oceanography has been demonstrated through various ground-based, aircraft and spacecraft experiments.

This chapter provides the principles of radar remote sensing. The microwave portion of the spectrum includes wavelengths within the approximate range of 1 mm to 1 m.

In active microwave remote sensing, the radar antenna transmits short bursts (pulses) of energy to the target, and the echoes from these targets carry information about the position, range and quality of the illuminated objects. The radar equation relates the influence of the system and terrain parameters to the power received by the antenna (Reeves). A resolution cell contains many individual scatterers, so the power received from it is the combined power obtained by adding the powers from these scatterers.
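The radar equation itself is not reproduced in this extract. In its standard monostatic point-target form (the notation here is ours, not necessarily Reeves') it reads:

```latex
P_r = \frac{P_t \, G^2 \, \lambda^2 \, \sigma}{(4\pi)^3 \, R^4}
```

where $P_r$ is the received power, $P_t$ the transmitted power, $G$ the antenna gain, $\lambda$ the wavelength, $\sigma$ the radar cross section of the target, and $R$ the range. For a distributed target, $\sigma$ is replaced by the backscattering coefficient integrated over the resolution cell.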

The above equation can be expressed in terms of the backscattering coefficient, which, according to Elachi, is defined as the ratio of the energy received by the sensor to the energy that the sensor would have received if the surface had scattered the energy incident on it in an isotropic fashion.

This is expressed in decibels (dB). The backscattering coefficient describes the terrain's contribution to radar image tone: it is the radar cross section per unit area of the resolution cell, and it results from the sensor-target interaction (Fig. ). The backscattering coefficient can be a positive number (energy focused in the back direction) or a negative number (energy scattered away from the back direction). The factors governing it fall into two groups:

System parameters:
1. Wavelength, frequency
2. Polarization
3. Angle of incidence
4. Flight parameters (flight direction, altitude)

Target parameters:
1. Electrical target characteristics
2. Surface inhomogeneities: microrelief (resonant components) and mesorelief (surface roughness)
3. Sub-surface structures

The returns are also governed by the physical and electrical properties of the target.
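The decibel form of the backscattering coefficient mentioned above is a simple logarithmic conversion; a minimal sketch (function names are illustrative):

```python
import math

# Conversion between the linear backscattering coefficient (sigma-0, a
# dimensionless power ratio per unit area) and its decibel form, which is
# the quantity usually quoted for radar image tone.

def sigma0_to_db(sigma0_linear: float) -> float:
    return 10.0 * math.log10(sigma0_linear)

def db_to_sigma0(sigma0_db: float) -> float:
    return 10.0 ** (sigma0_db / 10.0)

# A surface returning -6.5 dB corresponds to a linear ratio of about 0.224.
print(round(db_to_sigma0(-6.5), 3))
```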

The electromagnetic property of materials is expressed by the complex relative permittivity (dielectric constant). For a conducting medium, the amplitude of a wave propagating in it is attenuated exponentially with distance. Smooth surfaces reflect the incident energy specularly, so strong backscatter is observed only in the nadir direction. Rough surfaces tend to reradiate uniformly in all directions (diffuse scattering), so they give relatively strong radar returns in all directions.

For a smooth surface, where the surface roughness scale is much shorter than the wavelength, incident energy is reflected off specularly, as illustrated in Fig. As the roughness scale approaches the same dimension as the wavelength, the scattered energy is dispersed, and when the roughness scale exceeds the wavelength of the incident energy, scattering is nearly uniform over the hemisphere. A more exact classification of surface roughness, considering surface slopes, is given by Fung, who distinguishes the slightly rough surface, the smooth undulating surface, and the two-scale composite rough surface. Surface scattering normally occurs at the air-ground interface, whereas volume scattering is caused by dielectric discontinuities within a volume.
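The smooth/rough distinction above is often quantified with the Rayleigh criterion, a common rule of thumb (not the stricter classification by Fung referred to in the text):

```python
import math

# Rayleigh criterion: a surface is treated as radar-smooth when its RMS
# height variation h satisfies h < lambda / (8 cos(theta)), where theta is
# the incidence angle measured from the vertical.

def is_radar_smooth(rms_height_m: float, wavelength_m: float, incidence_deg: float) -> bool:
    threshold = wavelength_m / (8.0 * math.cos(math.radians(incidence_deg)))
    return rms_height_m < threshold

# A 3 cm RMS surface looks smooth to L-band (23.5 cm) at 30 degrees
# incidence, but rough to X-band (3 cm) at the same angle.
print(is_radar_smooth(0.03, 0.235, 30.0), is_radar_smooth(0.03, 0.03, 30.0))
```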

The surface scattering mechanism is an important component of the radar scattering process. In general, surface scattering occurs at the air-ground interface. For a perfectly smooth surface, the incident wave will excite the atomic oscillators in the dielectric medium at a relative phase such that the reradiated field consists of two plane waves, namely a reflected wave and a refracted (transmitted) wave (Fig. ).

For a rough surface, energy is scattered in all directions depending upon the roughness of the surface as well as the dielectric properties of the surface. The roughness can be statistically characterised by its standard deviation relative to the mean flat surface. The surface correlation length is the separation after which two points are statistically independent, that is, the length after which the autocorrelation function falls below 1/e. The description of models like the point scattering model, the facet model, and the Bragg model is beyond the scope of this handbook.

Table 3. Although a radar signal does not detect colour or temperature information, it does detect surface roughness and electrical conductivity, the latter being strongly influenced by soil moisture conditions.

Hence, the wavelength, depression angle, and polarisation of the signal are important properties. There are two categories of side-looking airborne radar (SLAR) systems, namely, real aperture and synthetic aperture. The latter is the focus here, but real aperture SLAR systems may be briefly considered so as to understand why synthetic aperture radar (SAR) systems have been developed.

This pulse moves out radially from the antenna and results in a beam being formed which is vertically wide but horizontally narrow. The time taken by a pulse to travel from the antenna to an object and back is measured. From this time measurement, it is possible to determine the slant-range distance between the antenna and the object.

An image product is generated in a film recorder by using the signal to control the intensity of the beam on a single-line cathode ray tube (CRT), and recording this line on film. The film is advanced at a rate proportional to the aircraft's motion.

In this way the combined response of many radar pulses is used to generate an image in which each line is the tonal representation of the strength of the signals returned to the radar antenna from a single pulse (Lillesand and Kiefer). The ground resolution cell size of a SLAR system mainly depends on pulse length and antenna beam width. The pulse length is defined as the length of time that the antenna emits its energy.

Pulse length determines the spatial resolution in the direction of propagation (Fig. ); this is called range resolution. The other resolution direction lies along the flight line. The resolution of a radar system is therefore measured in two directions: along the track (azimuth resolution) and across the track (range resolution).

The effective resolution is the minimum separation that can be determined between two targets with echoes of similar strength. In the azimuthal direction the resolution is determined by the angular beam width of the antenna and the slant-range distance, because the radar beam 'fans out' with increasing distance from the aircraft. This results in a deterioration of the azimuthal resolution with increasing range, so objects that are separable close to the flight line are not distinguished further out.
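The two resolution relations discussed above can be sketched numerically. This assumes the standard real-aperture expressions (ground-range resolution from pulse length and depression angle; azimuth resolution as beamwidth times slant range); the numerical values are illustrative:

```python
import math

# Real-aperture SLAR resolution sketch. Ground-range resolution depends on
# the pulse length and the depression angle; azimuth resolution is the
# antenna beamwidth (wavelength / antenna length) times the slant range,
# so it degrades with distance from the flight line.

C = 299_792_458.0  # speed of light, m/s

def ground_range_resolution(pulse_s: float, depression_deg: float) -> float:
    return C * pulse_s / (2.0 * math.cos(math.radians(depression_deg)))

def azimuth_resolution(slant_range_m: float, wavelength_m: float, antenna_m: float) -> float:
    beamwidth_rad = wavelength_m / antenna_m
    return slant_range_m * beamwidth_rad

# 0.1 microsecond pulse at a 45 degree depression angle: ~21 m resolution.
print(round(ground_range_resolution(1e-7, 45.0), 1))
# X-band (3 cm) with a 5 m antenna: the azimuth cell grows from 60 m at
# 10 km slant range to 600 m at 100 km.
print(round(azimuth_resolution(10_000, 0.03, 5.0), 1),
      round(azimuth_resolution(100_000, 0.03, 5.0), 1))
```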

At spacecraft altitudes, the azimuthal resolution of a real aperture system becomes too coarse. To obtain a fine range resolution, shorter pulses would need to be transmitted, and these would require a high peak power. Radar systems in which the beam width is controlled by the physical antenna length are called brute-force, real aperture or non-coherent radars. The microwave energy scattered back to the spacecraft is measured.

The SAR makes use of the radar principle to form an image by utilising the time delay of the backscattered signals (Fig. ). In real aperture radar imaging, the ground resolution is limited by the size of the microwave beam sent out from the antenna.

Finer details on the ground can be resolved by using a narrower beam. The beam width is inversely proportional to the size of the antenna, that is, the longer the antenna, the narrower the beam. It is not feasible for a spacecraft to carry a very long antenna that is required for high resolution imaging of the earth surface.

The antenna's footprint sweeps out a strip parallel to the direction of the satellite's ground track. With a SAR system there is the advantage that the azimuth or along track resolution is improved by making it independent of the range. This is because with a SAR system a physically short antenna is made to behave as if it were much longer. This aperture synthesis is achieved by recording not only the strength of a returned signal from an object on the ground, but also the signal's frequency.

With this extra information the beam width can be effectively narrowed when the Doppler shift is used (Lillesand and Kiefer). It is therefore possible for a single moving antenna to successively occupy the element positions of an array of length L. Under these conditions, it is possible in principle to combine observations from the moving antenna to synthesise an array of that length. Because the SAR system is a coherent system and incorporates two-way signal propagation, it can be shown that such an array provides a much finer effective angular resolution. Unlike the real aperture SLAR, this calls for the use of a smaller antenna if a fine resolution is to be achieved.
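The resolution expression elided above can be reconstructed from standard SAR theory (our notation, under the usual approximations): a real antenna of length $D$ at range $R$ illuminates a strip of length $L_s = R\lambda/D$, which is the maximum synthetic aperture, and because of two-way propagation the synthetic beamwidth is half that of a real array of the same length:

```latex
L_s = \frac{R\,\lambda}{D}, \qquad
\beta_s = \frac{\lambda}{2 L_s}, \qquad
R_a = R\,\beta_s = \frac{D}{2}
```

The azimuth resolution $R_a$ is thus independent of range and improves with a smaller antenna, as the text notes.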

This means that short pulses, which require a high peak power, need not be used. The pulse bandwidth can be made quite large by using chirp techniques without excessive peak power requirements.

The Radar system uses either a slant range or ground range presentation for the across track coordinates. The raw image suffers from geometrical and radiometric errors which are discussed in the following sections.

Because SAR is a coherent imaging system, it suffers from the effects of speckle noise, which arises from the coherent summation of the signals scattered from ground scatterers distributed randomly within each pixel. Because of this wave interference, random fluctuations from an extended target appear as speckles, so a radar image appears noisier than an optical image. The speckle noise is sometimes suppressed by applying a speckle-removal filter to the digital image before display and further analysis.

The random variation in the brightness of individual pixels can be large if the pixels are observed once, whereas if they are observed many times, speckle can be reduced by averaging. It is for this reason that speckle is more severe in synthetic aperture radar images as compared to real aperture radar images. Fading of the radar signal, which causes speckle, complicates image interpretation. It is often necessary to reduce the effect of speckle using knowledge of the spatial grey-level distribution.
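The statistics behind multi-look speckle reduction can be illustrated with a small simulation. Fully developed speckle intensity is modelled here as exponentially distributed (a standard assumption, not a claim from this text); averaging N independent looks reduces the relative fluctuation by roughly the square root of N:

```python
import random
import statistics

# Multi-look speckle reduction sketch: average N independent exponential
# intensity samples per pixel and compare the coefficient of variation
# (std / mean) of single-look and 4-look images.

random.seed(42)

def looks(n_looks: int, n_pixels: int = 20_000, mean_intensity: float = 1.0):
    return [
        statistics.fmean(random.expovariate(1.0 / mean_intensity) for _ in range(n_looks))
        for _ in range(n_pixels)
    ]

single = looks(1)
multi = looks(4)
cv1 = statistics.stdev(single) / statistics.fmean(single)
cv4 = statistics.stdev(multi) / statistics.fmean(multi)
print(round(cv1, 2), round(cv4, 2))  # approximately 1.0 and 0.5
```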

The intensity of each pixel represents the proportion of microwave energy backscattered from that area on the ground, which depends on a variety of factors: the type, size, shape and orientation of the scatterers in the target area; the moisture content of the target area; the frequency and polarisation of the radar pulses; and the incidence angle of the radar beam.

Interpretation very often requires some familiarity with the ground conditions of the areas imaged. As a useful rule of thumb, the higher the backscattered intensity, the rougher the surface being imaged. Flat surfaces, such as paved roads, runways, or quiet water, normally appear as dark areas in a radar image, since most of the incident radar pulses are specularly reflected away.

Calm sea surfaces appear dark in SAR images. However, rough sea surfaces may appear bright, especially when the incidence angle is small. Oil films smooth out the sea surface, so under certain conditions, when the surrounding sea surface is sufficiently rough, oil films can be detected as dark patches against a bright background. Trees and other vegetation are usually moderately rough on the wavelength scale.

Hence, they appear as moderately bright features in the image. The tropical rain forests have a characteristic backscatter coefficient between -6 and -7 dB, which is spatially homogeneous and remains stable in time. For this reason, the tropical rainforests have been used as calibration targets in performing radiometric calibration of SAR images.

Very bright targets may appear in the image due to the corner-reflector or double-bounce effect, where the radar pulse bounces off the horizontal ground or the sea towards the target and is then reflected from a vertical surface of the target back to the sensor, as in the case of cargo containers.

Built-up areas and many man- made features usually appear as bright patches in a radar image due to the corner reflector effect. The brightness of areas covered by bare soil may vary from dark to very bright depending on its roughness and moisture content. Typically, rough soil appears bright in the image. For a similar soil roughness, the surface with a higher moisture content will appear brighter.

The interpretation of radar images depends on the geometrical properties and other characteristics of imaging systems. The geometrical characteristics that affect microwave remote sensing data are discussed in the following paragraphs. In a radar image, broadly four geometrical characteristics are observed, namely, (i) slope foreshortening, (ii) aspect, (iii) radar shadow, and (iv) layover.

Slopes are often made to appear shorter than they really are; that is, they are foreshortened. A slope facing the radar antenna is recorded at less than its true length, that is, truncated in its slant-range presentation, to a degree that depends on its angle relative to the incident wavefront.

Foreshortening thus occurs when a slope is less steep than a line perpendicular to the wavefront, with the base of the slope intercepting the wavefront first (Lillesand and Kiefer), as shown for slopes 1 and 2 in Fig.

If, however, the terrain slope is steeper than a line perpendicular to incident wavefront, so that its top intercepts the radar beam wavefront before the base of the slope, then layover occurs. This is illustrated in Fig.

It was shown above that foreshortening occurs when a slope facing the radar is not as steep as the wavefront. With layover, however, a slope facing the radar appears to be steeper than it really is, as its slope is steeper than that of the wavefront.

From the above discussion of layover and foreshortening, and the fact that the depression angle varies across the image swath, it can be seen that similar terrain slopes at different positions from the flight line will be recorded differently. Layover is most likely to occur in the near range, where the depression angle is large. Layover and foreshortening are important factors, and should be considered while interpreting radar imagery.

This is not only because of the distortion they introduce into the image but also because the amount of energy received per unit area varies with the angle at which the energy is received, and so with the slope. The angle at which the energy is received at each surface is critical to the distribution of the energy backscattered from that surface.

For flat terrain the angle at which the energy arrives varies across the swath, and further complexity is introduced by variations in the surface slope.

This can greatly affect the backscatter recorded and so the interpretability of the imagery. Slope orientation relative to the radar look direction is therefore important. This is because the slope facing the radar prevents the beam from illuminating the backslope. Areas of radar shadow are generally more common in the far range, because in the near range few backslopes are steep enough to be obscured from the radar beam.

The term 'remote sensing' is used to refer to the aircraft-mounted systems developed for military purposes during the early part of the 20th century. Airborne camera systems are still a very important source of remotely sensed data (Lillesand and Kiefer). Although photographic imaging systems have many uses, this chapter is concerned with image data collected by satellite sensing systems which ultimately generate digital image products.

Space borne sensors are currently used to assist in scientific and socioeconomic activities like weather prediction, crop monitoring, mineral exploration, waste land mapping, cyclone warning, water resources management, and pollution detection.

All this has happened in a short period of time. The quality of analysis of remote sensing data, and the variety of applications to which the science of remote sensing is being put, are increasing enormously as new and improved spacecraft are placed into the earth's orbit. The primary objectives, characteristics and sensor capabilities of the plethora of remote sensing satellites circling this planet are discussed in this chapter.

An attempt is made to classify the satellites into three types, namely, earth resources satellites, meteorological satellites, and satellites carrying microwave sensors. This classification is not rigid. For instance, most of the meteorological satellites are also capable of sensing the resources of the earth. Before turning to the individual satellites' descriptions and the corresponding sensors and capabilities, a brief overview of satellite system parameters is presented in the following paragraphs.

Broadly, the system parameters are of two types: instrumental parameters and viewing parameters. The principal instrumental parameters, namely wavelength or frequency, polarisation, and sensitivity (radiometric resolution), are determined by the design of the transmitter, receiver, antenna, detectors, and data handling system. The principal viewing parameters are determined by both the instrument design and the orbital parameters of the satellite.

In practice, however, the transparency or otherwise of the earth's atmosphere limits the possible wavelength ranges to about 0.

Sensors designed to detect atmospheric constituents utilise spectral bands between the atmospheric 'windows' Robert Massom, Naturally occurring radiation from the earth's surface is found in all these ranges of wavelength.

In the TIR band, on the other hand, the main source of radiation is, as the name suggests, the blackbody thermal mechanism by which all objects above absolute zero emit radiation. For an object at a typical terrestrial temperature, most of this radiation is emitted at wavelengths around 10 µm.

The Remote Sensing Data Book

Detected radiation essentially contains information on two parameters, namely the temperature of the target material and its effectiveness in emitting radiation in this waveband, called the emissivity. Emissivity is a unique characteristic of the target material and its state or condition (Wolfe and Zissis).

Again, the detected signal is governed by both the target temperature, and its type and condition. The other main use of the microwave region is for active remote sensing, by which radiation is emitted by the remote sensing instrument and detected after its reflection from the target material.

Active remote sensing in the microwave region is called radar, and the main observable parameters are the range to the target from the time delay of the returned signal and the reflectivity of the material which in turn is determined by many of its physical properties.

This concept was discussed in detail in Chapter 3. The reflective and emissive properties of a material are different for different polarisations, that is, orientations of the electrical field vector in the electromagnetic radiation, where H is horizontal and V is vertical, so that further information on the physical properties of the target material may be obtained by observing different polarisations.

The sensitivity of a remote sensing system measures the response produced by the radiation of a given intensity and wavelength.

Other things being equal, it should be as large as possible, but because the output data are usually digitised, they can only span a finite range of values, so that a high sensitivity (a low value of the minimum detectable signal) implies a low value for the maximum signal that can be detected.

This then requires some kind of optimisation, and what is optimal for one kind of target material may not be optimal for another. This may often cause saturation of the detecting system. The spectral resolution and radiometric resolution, which are the measures of sensitivity of a satellite sensing system, are discussed in detail in the following sections.

Satellites used for remote sensing are generally of two types: geostationary satellites, and near-earth satellites in circular, near-polar, sun-synchronous orbits. Geostationary satellites are stationary with respect to the earth, at an altitude of about 36,000 km above a point on the equator; that is, they maintain a fixed location with respect to the earth's surface. Conversely, a satellite in a low polar orbit traces out a curving path over the earth's surface, as a consequence of the satellite's orbital motion and precession and of the earth's rotation about its axis.

This path (the subsatellite track) wraps itself round the earth from east to west like a ball of string, oscillating between equal north and south latitudes in a pattern set by the inclination of the orbit. The path may close up on itself if the orbital parameters (inclination, height and eccentricity) are suitably chosen, in which case the satellite will revisit a given location at regular intervals.
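The orbital geometry described above follows from Kepler's third law; a sketch using standard values of Earth's gravitational parameter and mean radius (constants are textbook values, not figures from this text):

```python
import math

# Circular-orbit period from altitude, and the geostationary altitude
# recovered by requiring a period of one sidereal day.

GM = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0    # mean Earth radius, m
SIDEREAL_DAY = 86_164.1  # s

def period_s(altitude_m: float) -> float:
    a = R_EARTH + altitude_m  # semi-major axis of a circular orbit
    return 2.0 * math.pi * math.sqrt(a ** 3 / GM)

def geostationary_altitude_m() -> float:
    a = (GM * (SIDEREAL_DAY / (2.0 * math.pi)) ** 2) ** (1.0 / 3.0)
    return a - R_EARTH

print(round(period_s(800_000.0) / 60.0, 1))        # roughly 101 minutes
print(round(geostationary_altitude_m() / 1000.0))  # roughly 35,800 km
```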

This interval may in general be any integral number of days, though other constraints on the orbital parameters may limit the choice. The point where the satellite, travelling northwards, passes directly over the equator is called the ascending node. The descending node describes the southward crossing (Stewart; Rees). ALISSA (L'Atmosphère par Lidar sur Saliout) French backscatter lidar, for measurement of cloud and aerosol structure, carried on Mir-1.

Pulse repetition frequency: Almaz-I Russian satellite, launched in March , lifetime 1. Global land, ocean, atmosphere observations. Period 92 minutes; inclination 73°. Principal instruments: Almaz-IB Russian satellite, scheduled for launch in with a nominal lifetime of 3 years.

Circular LEO at km altitude. Period 90 minutes; inclination 73°. Cartography, environmental monitoring, hazard monitoring. Circular Sun-synchronous LEO. Equator crossing time Ku band Pulse length uncompressed: Range precision: Beam-limited footprint: Pulse-limited footprint: Dual frequency operation allows for correction of ionospheric delays. Altitude, orbital See height, orbital.

Height range: AMAS is designed to measure temperature and pressure profiles, ozone, water vapour, N2O and other chemical constituents of the middle and upper atmosphere. Incidence angle: Radiometric resolution: In scatterometer mode, the AMI generates beams in directions 45°, 90° and 135° (zero is along-track, 90° to the right).

The near edge of the swath is km to the right of the sub-satellite track; the far edge is km from it. It is based on the design of the TMR. Frequencies: Polarisation: H and V. Absolute accuracy: Temperature resolution: K. MHS. AMSU uses channels for temperature profiling, using the oxygen absorption band between 50 and 60 GHz.

The other AMSU channels provide corrections for surface temperature and atmospheric water and water vapour. Angstrom relation The assumption that the optical thickness τ of an atmospheric aerosol varies with wavelength λ according to τ ∝ λ^(−α), where α is the Angstrom exponent. Angular frequency see frequency.
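The Angstrom relation above can be used to scale aerosol optical thickness between wavelengths; a minimal sketch with illustrative values (the default exponent of 1.3 is a typical figure, not one given in this text):

```python
# Angstrom relation sketch: optical thickness scaled from a reference
# wavelength using tau proportional to lambda**(-alpha).

def optical_thickness(wavelength_um: float, tau_ref: float,
                      ref_wavelength_um: float = 0.55, alpha: float = 1.3) -> float:
    return tau_ref * (wavelength_um / ref_wavelength_um) ** (-alpha)

# With tau = 0.2 at 0.55 um and alpha = 1.3, the optical thickness at
# 0.87 um drops to about 0.11.
print(round(optical_thickness(0.87, 0.2), 2))
```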

Antenna A transducer that couples a guided electromagnetic wave on a transmission line to an unguided wave in the medium surrounding the antenna, and vice versa. Antennas are used by both passive microwave radiometers (see passive microwave radiometry) and radar systems. The directional pattern of an antenna, specified by its power pattern, consists of a main beam surrounded by a number of sidelobes, with the effective width of the beam β radians in a given plane being approximately equal to λ/D, where λ is the wavelength and D is the width of the antenna in that plane (see beamwidth).

The antenna may consist of a single element, such as a parabolic dish, a horn, or a microstrip radiating patch, or it may consist of an array of such elements. A synthetic aperture may be formed by combining the signals received by a real antenna as it travels in space and then processing them together as if they had been received by a long array of individual elements, thereby attaining high angular resolution in the plane containing the synthetic array.

This is the basis of synthetic aperture radar. Antenna temperature The antenna temperature represents the power received by an antenna and transferred to the receiver. For a lossless antenna with a power pattern consisting of only a single main beam with no sidelobes, the antenna temperature TA is equal to the brightness temperature Tb incident upon the antenna through its main beam, where Tb represents the power radiated by the scene observed by the antenna.

A real antenna, however, is characterised by a radiation efficiency η and a main-beam efficiency ηm, where η accounts for ohmic losses in the antenna structure and ηm accounts for the fraction of the total antenna power pattern contained in its main beam.

It is possible to achieve values of η as high as 0. It is also possible to design the antenna such that its side-lobe levels are very low, thus achieving values of ηm in the range 0.
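A minimal sketch of the antenna temperature budget described above, combining main-beam and sidelobe brightness temperatures through the main-beam efficiency ηm, with an ohmic-loss term at the antenna's physical temperature weighted by the radiation efficiency η. The exact budget in a real radiometer is instrument-specific; the example values are illustrative:

```python
# Antenna temperature sketch: beam-averaged brightness temperature plus
# self-emission from ohmic losses in the antenna structure.

def antenna_temperature(tb_main: float, tb_side: float, t_phys: float,
                        eta: float = 0.95, eta_m: float = 0.9) -> float:
    beam_avg = eta_m * tb_main + (1.0 - eta_m) * tb_side
    return eta * beam_avg + (1.0 - eta) * t_phys

# A 200 K scene in the main beam, 280 K sidelobe pickup, 300 K antenna.
print(round(antenna_temperature(200.0, 280.0, 300.0), 1))
```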

Archaeological site detection The mastery of nature by man results in a transformation of the landscape through agriculture and construction. In arid parts of the world it is often easy to see the remains left by human activity which survive for long periods above the ground. But in heavily occupied and farmed regions most surface traces have been obliterated.

Traces remaining in the ground just below the surface are usually studied through excavation, a tool of last resort, since it destroys the evidence. Techniques for non-destructive investigation have been developed to protect sites by leaving them undisturbed. Buried remains are detected and mapped by measuring or recording differences in water content, magnetic, thermal and mechanical properties, and displaying these with modern processing methods as two- or three-dimensional coloured images.

Low level oblique aerial photographs have been used since the s and are still the most economical approach. Electrical, magnetic, electromagnetic, radar and sensitive remote temperature measurements add techniques for use under widely varying geological, climatic and manmade conditions.

Buried structures are recognised by skilled archaeologists via their characteristic shapes in properly processed images made from such data. Argon 1. A minor constituent of the Earth's atmosphere. 2. Series of U.S. satellites. The satellites carried panchromatic (Keyhole) cameras with a resolution of m.

The data are now declassified. These transmissions are downlinked to ground stations for processing (position determination) and data formatting. Positions can be determined to ±300 m (±30 m on some days).

C band 5. VV and HH. Incidence 15° to 45° at mid swath; image, wave and alternating polarisation modes. All modes can operate at either HH or VV polarisation.

The 'alternating polarisation' mode interleaves observations in each polarisation state. Beam angles: Near edges km from sub-satellite track; far edges km. Wavebands and spatial resolution: Atmosphere 1. A unit of pressure, equal to 101 325 Pa. 2. The gaseous envelope surrounding the Earth.

The main constituents of the atmosphere are nitrogen, oxygen, water vapour, argon and carbon dioxide, though other gases present at low concentrations can also have a significant effect on the Earth radiation budget. The composition and physical properties of the atmosphere vary considerably with height, and also with geographical position especially with latitude, but also with proximity to land or sea, or 11 to industrial or rural areas.

The table below shows the mean composition of the atmosphere at sea level, in terms of the fraction by volume, and the mean column-integrated quantity of gases, expressed in moles per square metre. Gas Volume fraction N2 0. The troposphere, in which the temperature usually decreases with height, extends from sea level to a height of about 10 km.

This is the most turbulent layer, in which most meteorological phenomena occur. The lowest region of the troposphere, the boundary layer, is approximately 1 km deep. Above the troposphere is the stratosphere, in which the temperature increases with height (as a result of increasing absorption of solar ultraviolet radiation by ozone), extending up to about 50 km.

The molecular concentration of ozone increases with height in the lower stratosphere, reaching a maximum at about 25 km. Between about 50 km and 85 km is the mesosphere, in which the temperature decreases with height, and above this is the thermosphere, in which it increases.

Figure A summarises the variations of temperature, pressure and density between sea level and the lower part of the thermosphere, although it is representative only since the heights of the layer boundaries vary considerably with latitude. Conversely, observations made at the wavelength of an absorption line give the possibility of atmospheric sounding of temperature, pressure and molecular composition of the atmosphere.

Figures B and C summarise the typical atmospheric attenuation spectrum for a one-way vertical path through the atmosphere, although these values are again dependent on latitude and on the concentrations of different molecular species. The main peaks are as follows (µm): The absorption below 20 GHz is due to nonresonant absorption by oxygen.

The peaks above GHz are due to water vapour. Atmospheric chemistry See chemistry, atmospheric. Atmospheric correction The useful information about a target area of the land, sea or clouds is contained in the physical properties of the radiation that leaves that target area, whereas what is measured by a remote sensing instrument are the properties of the radiation that arrives at the instrument.

It is therefore necessary to correct the satellite- or aircraft- received data to allow for the effects on the radiation as it passes through the atmosphere. In theory the passage of the radiation through the atmosphere is described by the radiative transfer equation; however, the values of the various parameters that appear in the radiative transfer equation are not known sufficiently accurately to make direct and explicit solution of the radiative transfer equation a feasible approach.

Thus in practice more empirical methods are used depending, among other things, on the wavelength of the radiation concerned. For microwave radiation, at the higher microwave frequencies the atmosphere is an absorbing medium which attenuates the energy emitted from the surface. The atmosphere also emits energy as described by Kirchhoff's law. If we consider the atmosphere as a stratified medium in which the temperature and the microwave absorption coefficient are functions of height, then it is possible to represent the temperature describing the radiation received at a sensor mounted on a spacecraft as consisting of the following four components: It is the third contribution which constitutes the required signal; therefore it is necessary to estimate the contributions from (i), (ii) and (iv) in order to determine the temperature of the surface of the Earth from the signal generated at the instrument on board the spacecraft.

For thermal infrared radiation, there are three main methods of atmospheric correction. While one can use a model (or average locational and seasonal) atmosphere as input to the LOWTRAN computer package, this is not particularly satisfactory because the water vapour content of the atmosphere is highly variable, both spatially and temporally, and it is really necessary to use the values of the parameters that apply to the actual atmospheric conditions at the time that the remotely sensed data were gathered.

Such a profile can be obtained either from radiosonde data or from a satellite-flown sounding instrument (e.g. ). The equivalent black-body temperature, T0, for the radiation leaving the surface of the Earth, is then determined as a linear combination of these two brightness temperatures. Again, two brightness temperatures are determined for the two views of a given area on the surface of the Earth, and a similar relation to that in (ii) is used to determine T0, but with its own set of values of A0, A1 and A2.
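The linear combination described above (the split-window form T0 = A0 + A1·T1 + A2·T2) can be sketched directly; the coefficients below are illustrative placeholders, not values from any published algorithm:

```python
# Split-window atmospheric correction sketch: surface temperature estimated
# as a linear combination of brightness temperatures in two thermal
# channels. The 11-12 um difference grows with atmospheric water vapour,
# so the correction scales with it.

def split_window(t_11um_k: float, t_12um_k: float,
                 a0: float = 1.0, a1: float = 3.3, a2: float = -2.3) -> float:
    return a0 + a1 * t_11um_k + a2 * t_12um_k

# Brightness temperatures of 290 K and 288 K yield a corrected surface
# temperature above either channel value.
print(round(split_window(290.0, 288.0), 1))
```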

When an electromagnetic wave strikes a particle, whatever the size of the particle, part of the incident energy is scattered in all directions. This scattered energy is called diffuse radiation. An expression for the energy scattered by spherical particles can be obtained theoretically by solution of Maxwell's equations of electromagnetism, the result being characterised by the size parameter α. The simplest atmospheric correction algorithms for optical and near-infrared observations assume that the darkest pixels in an image (usually arising from deep shadows or from areas of deep clear water) correspond to zero radiance, and that the effect of atmospheric scattering of radiation into the line of sight is purely additive, so that the radiance detected from the darkest pixels can simply be subtracted from all pixels in the image.
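The dark-pixel assumption can be sketched in a few lines (a minimal illustration, assuming the image is a simple array of radiances):

```python
import numpy as np

def dark_pixel_subtraction(image):
    """Subtract the darkest pixel value from the whole image,
    assuming that value represents purely additive path radiance
    (atmospheric haze) scattered into the line of sight."""
    image = np.asarray(image, dtype=float)
    dark = image.min()
    return image - dark
```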

More sophisticated algorithms use models of atmospheric absorption and scattering, taking into account five contributions to the radiance measured at the satellite. The at-satellite radiance L can be expressed as an equivalent reflectance ρ', defined as ρ' = πL/(Eμ0), where E is the exoatmospheric irradiance and μ0 is the cosine of the solar zenith angle. This equivalent reflectance can then be written as the sum of terms representing the five contributions. The greatest uncertainty in these algorithms is the effect of atmospheric aerosols.
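The definition of the equivalent reflectance can be expressed directly in code (a sketch; the radiance and irradiance must be in consistent units):

```python
import math

def equivalent_reflectance(radiance, e_sun, solar_zenith_deg):
    """Equivalent (at-satellite) reflectance rho' = pi * L / (E * mu0),
    where L is the at-satellite radiance, E the exoatmospheric solar
    irradiance and mu0 the cosine of the solar zenith angle."""
    mu0 = math.cos(math.radians(solar_zenith_deg))
    return math.pi * radiance / (e_sun * mu0)
```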

Atmospheric sounders The table summarises the characteristics of the main optical and infrared nadir-looking or scanning instruments used for atmospheric sounding. Several instruments have been carried on more than one satellite. See also electro-optical sensors, imaging radiometer, limb-sounding, multispectral imager, passive microwave radiometry.

Sounding instruments measure the radiation at a number of wavelengths where most of the radiation has been emitted by the atmosphere itself (see atmospheric sounders). By looking at wavelengths at which the atmosphere has stronger or weaker transmission of radiation, the data from different wavelengths can be used to probe various levels of the atmosphere (see radiative transfer equation).

With a knowledge of the distribution of the gases it is possible to invert these measurements from satellite sensors to retrieve the temperature profile or, if the temperature structure is known, the distribution of the gases. Both of these inversion problems are ill-posed, and more information is required before they can be solved, the additional data coming from climatology or from a numerical model of the atmosphere.

Sounding takes place with data from infrared or microwave radiometers on both polar (LEO) and geostationary satellites. Currently, sounding is carried out using a combination of infrared and microwave measurements, with the infrared data providing soundings in cloud-free or partly cloudy conditions and the microwave observations giving soundings where there is thick cloud. Operational meteorological sounding is carried out from the polar-orbiting satellites using nadir-viewing instruments that scan across a wide swath.

A number of sensors on research satellites in polar orbit have used a limb-sounding technique, in which an oblique view of the Earth's atmosphere is obtained. Such a technique allows temperature, water vapour and minor constituents to be retrieved.

See also chemistry, atmospheric and ozone. The ATSR uses a conical scanning technique which provides data from both nadir and 52° forward of nadir. The ATSR also includes a nadir-viewing passive microwave radiometer (two channels). The principal function of this is to determine the column-integrated atmospheric liquid water and water vapour content for correction of SST measurements and radar altimeter ranges.

Attenuation is also called extinction. See absorption coefficient, Lambert–Bouguer law, radiative transfer equation, scattering.

Automatic linear enhancement See contrast enhancement. AVHRR data cover a wide swath. GAC (Global Area Coverage) data are also stored on board for downloading once per orbit, but at a reduced spatial resolution of 4 km.

APT (Automatic Picture Transmission) data have a spatial resolution of about 4 km (2 bands only) and are transmitted continuously. Continuously transmitted data may be received from the S-band downlink and used freely.

It is also planned that this instrument will be carried on Metop-1. Azimuth shift The displacement of the image of a moving target in a synthetic aperture radar (SAR) image, relative to the position at which it would have been imaged had it been stationary. The shift depends on the imaging geometry and the target velocity. The diagram shows the geometry of azimuth shift in a Cartesian coordinate system that can be used to represent either a flat Earth geometry or a curved Earth surface.

At some instant the target is located at (x, 0, 0) and moves with velocity u. Azimuth shift is useful in some applications, for example determining ship velocities by comparing the moving ship with its stationary wake.
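The entry's own equation is not reproduced here; a common first-order approximation (assumed for this sketch) is Δx = −(R/V)·ur, where R is the slant range, V the platform velocity and ur the target's slant-range (radial) velocity component:

```python
def azimuth_shift(slant_range, platform_velocity, radial_velocity):
    """Azimuth displacement (m) of a moving target in a SAR image,
    using the standard first-order relation dx = -(R/V) * u_r.
    This relation is assumed here, not quoted from the entry."""
    return -slant_range / platform_velocity * radial_velocity
```

A target moving towards the radar at a few metres per second can be displaced by several hundred metres in azimuth, which is why ship images often appear detached from their wakes.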

However, it is a significant source of uncertainty in determining ocean wave spectra from SAR images. Backscatter Scattering of radiation back towards the direction from which it came; more specifically, scattering through an angle of 180°. Backscatter coefficient A dimensionless quantity, represented by the symbol σ0, denoting the effectiveness of a surface at scattering radiation incident upon it. As a dimensionless quantity, the backscatter coefficient is often specified in decibels.

The backscatter coefficient is dependent on the frequency, observation geometry and polarisation states of the incident and scattered radiation, as well as on the properties of the scattering surface. For a given frequency and geometry, it is common to express the backscatter coefficient as σ0pq, where p denotes the polarisation state of the incident radiation and q that of the scattered radiation.
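Expressing σ0 in decibels is a one-line conversion:

```python
import math

def sigma0_to_db(sigma0_linear):
    """Express a (dimensionless) backscatter coefficient in decibels."""
    return 10.0 * math.log10(sigma0_linear)

def db_to_sigma0(sigma0_db):
    """Inverse conversion, from decibels back to linear units."""
    return 10.0 ** (sigma0_db / 10.0)
```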

Backscatter lidar See lidar. Banding A periodic error in image brightness that can be caused by scanning systems that use more than one detector to acquire the image.

Banding can be particularly problematic in Landsat MSS images, since the MSS scanning system used six different detectors to image adjacent scan lines. The same detector thus viewed only every sixth scan line, so that any uncorrected calibration errors resulted in a spurious six-pixel periodicity in the along-track direction of the image.
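One common destriping approach, sketched below under the assumption of six detectors, is moment matching: each detector's scan lines are rescaled so that their mean and standard deviation match those of the whole image. This is one standard technique among several, not necessarily the one the book has in mind.

```python
import numpy as np

def destripe(image, n_detectors=6):
    """Moment-matching destriping: adjust each detector's scan lines
    so their mean and standard deviation match those of the whole
    image. Rows 0, 6, 12, ... belong to detector 0, and so on."""
    image = np.asarray(image, dtype=float)
    out = image.copy()
    global_mean, global_std = image.mean(), image.std()
    for d in range(n_detectors):
        lines = image[d::n_detectors]
        m, s = lines.mean(), lines.std()
        if s > 0:
            out[d::n_detectors] = (lines - m) * (global_std / s) + global_mean
    return out
```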

Banding, also called striping, can be at least partially removed by destriping algorithms. Band interleaved by line See image format. Band interleaved by pixel See image format. Band sequential format See image format. Baseline The separation between two positions of a sensor from which slightly different images of the same area are obtained.

The differences contain information on the topography of the area. See interferometric SAR, stereophotography. Bathymetry Measurement of the depth of a body of water. Bathymetry from satellite remote sensing data is possible in some limited circumstances. In very clear water, depths up to about 10 m can be inferred from visible-wavelength imagery, through the effect of absorption by the water column on the light reflected from the bottom, or from synthetic aperture radar imagery, through the effects of bottom topography on the refraction and diffraction of surface waves.

Limited information on deep-ocean bathymetry can be obtained from radar altimeter observations, since the topography of the ocean surface follows the geoid (provided that there are no ocean currents and that the effects of tides, waves and atmospheric pressure variations have been removed), and variations in the geoid height reflect variations in the bottom topography. Beam-limited Term used to describe the operation of a radar altimeter when the effective spatial resolution is determined by the power pattern of the antenna rather than by the duration of the compressed pulse.

See pulse-limited. Beamwidth The effective angular width of the power pattern of an antenna, measured in a given plane. In practice, the effective beamwidth is usually defined as the width of the beam at half the peak value, and referred to as the half-power beamwidth. The beamwidth is usually specified in each of two principal planes, the azimuth (horizontal) plane and the elevation plane. If the direction of the beam is denoted by the z-axis, the horizontal direction by the y-axis and the vertical direction by the x-axis, the beamwidth in the xz-plane is denoted by βxz, and that in the yz-plane by βyz.

If the antenna dimensions are Dx and Dy along x and y respectively, then βxz = kx λ/Dx and βyz = ky λ/Dy, where kx and ky are illumination factors that characterise the electric field distribution across the antenna aperture, λ is the wavelength, and β is measured in radians. Tapered illumination is often used to reduce the side-lobe levels of the power pattern, which leads to larger values of kx and ky. For steep tapers, with correspondingly very low side-lobe levels, kx and ky may be as large as 2.
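The beamwidth relation β = kλ/D can be sketched directly:

```python
def half_power_beamwidth(wavelength, aperture, k=1.0):
    """Half-power beamwidth (radians) of an antenna of dimension D:
    beta = k * lambda / D. The illumination factor k is about 1 for
    uniform illumination, rising towards 2 for steeply tapered
    illumination with very low side-lobes."""
    return k * wavelength / aperture
```

For example, a 10 m antenna at C band (5.6 cm wavelength) with uniform illumination has a beamwidth of about 5.6 milliradians.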

Bhaskara-1, -2 The first experimental remote-sensing satellites launched by India. Bhattacharyya distance See separability. Bidirectional reflectance distribution function A function characterising the amount of electromagnetic radiation reflected (scattered) from a surface, as a function of the directions of the incident and reflected radiation. The bidirectional reflectance distribution function (BRDF) is defined as the ratio of the reflected radiance to the incident irradiance.

The BRDF is symmetric with respect to interchange of the incident and reflected directions (the reciprocity principle). The reflectivity of the surface is a function only of the incidence direction, and is defined as the ratio of the radiant exitance to the irradiance. Bidirectional reflectance factor The bidirectional reflectance factor (BRF) is defined as the ratio of the flux scattered into a given direction by a surface under given illumination conditions, to the flux scattered in the same direction by a perfect Lambertian scatterer under the same conditions.

BIL See image format. Bilinear interpolation See resampling. Detection and quantification of biological productivity, for estimation of its contribution to the global carbon cycle and for the identification of potential fishing areas, can be performed from satellite sensor observations of ocean colour.

The above-ground biomass of terrestrial vegetation is commonly estimated using vegetation indices. Bistatic See radar equation. Black body A body that absorbs all the radiation incident upon it, reflecting none.

Such a body also has an emissivity of 1. A black body at an absolute temperature T emits radiation whose properties are characterised only by the value of T (see Planck distribution). This radiation is called black-body radiation. Bouguer's law See Lambert–Bouguer law. Boundary layer, atmospheric The lowest region of the atmosphere, of the order of 1 km thick, in which transport processes are dominated by wind turbulence and by convection.

The majority of molecular and particulate species (particularly water vapour and aerosols) found in the boundary layer are generated by surface interactions. Box classifier See supervised classification. Bragg scattering A surface scattering model based on coherent scattering from periodic structure in the surface. The figure shows that this is equivalent to requiring that twice the path difference, 2Λ sin θ, should be equal to one incident wavelength.

The factor of two arises because the wave must travel over the same path twice. BRDF See bidirectional reflectance distribution function. Brewster angle If electromagnetic radiation propagating in vacuo is incident on a loss-free medium having a planar surface, the Fresnel coefficient for parallel-polarised radiation is zero when the incidence angle is equal to the Brewster angle θB.
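For radiation incident from vacuum onto a medium of refractive index n, the Brewster angle is θB = arctan n, which can be sketched as:

```python
import math

def brewster_angle(refractive_index):
    """Brewster angle (degrees) for radiation incident from vacuum
    onto a loss-free medium: theta_B = arctan(n)."""
    return math.degrees(math.atan(refractive_index))
```

For n = 1.5 (a typical glass) the Brewster angle is about 56°.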

BRF See bidirectional reflectance factor. Brightness 1. See IHS display. 2. See tasselled-cap transformation. Brightness temperature The brightness temperature of a body that is emitting thermally generated radiation is the temperature that the body would need to have if it were a black body emitting the same amount of radiation. The brightness temperature of radiation is defined in terms of its radiance through the Planck distribution.
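The definition can be made concrete by inverting the Planck distribution numerically. The sketch below works in SI units throughout:

```python
import math

# Physical constants (SI)
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
K_B = 1.380649e-23   # Boltzmann constant, J/K

def planck_radiance(wavelength, temperature):
    """Spectral radiance of a black body (W m^-2 sr^-1 m^-1)
    at the given wavelength (m) and temperature (K)."""
    a = 2.0 * H * C**2 / wavelength**5
    b = H * C / (wavelength * K_B * temperature)
    return a / (math.exp(b) - 1.0)

def brightness_temperature(wavelength, radiance):
    """Invert the Planck distribution: the temperature (K) a black
    body would need in order to emit the given spectral radiance."""
    a = 2.0 * H * C**2 / wavelength**5
    return H * C / (wavelength * K_B * math.log(a / radiance + 1.0))
```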


Building materials, electromagnetic properties Typical thermal-infrared emissivities are tabulated for polished aluminium, asphalt, brick, concrete, glass, paint, plaster and wood.

The purpose of BUV was to provide vertical profiles of atmospheric ozone concentrations. It was the forerunner of the SBUV instrument.

Calibration 1. The process of relating the measured output (spatial, radiometric, spectral, polarimetric etc.) of a sensor to the corresponding physical quantities. 2. The results of that process. Candela The unit of luminous intensity, equal to one lumen per steradian. See photometric quantities. Canonical components The canonical components of a multi-band image are linear combinations of the bands, chosen in such a way that classes (clusters) of data are maximally separable using the first canonical component. The degree of separability decreases from the first canonical component to the second, from the second to the third, and so on.

Canonical component analysis is similar to principal component analysis (PCA) but, unlike PCA, takes account of the clustering of the data in the N-dimensional feature space. Carbon dioxide Chemical formula CO2. Profiling of atmospheric carbon dioxide can be carried out using its infrared absorption lines. C band Subdivision of the microwave region of the electromagnetic spectrum, covering frequencies of a few gigahertz.

CBERS Global (especially Brazilian) land, ocean and atmosphere observations. Exactly repeating orbit (repeats every 26 days). CCD (Charge-coupled device) An imaging electro-optical sensor.

A CCD can record radiation from a ground resolution element (see rezel) for representation within a pixel in an image. In its imaging mode, line after line of radiation can be built up to form an image as the aircraft or satellite moves forward. In comparison with a scanner, the linear CCD array has more sensors to calibrate and is restricted in the range of wavelengths it can sense.

However, it possesses many advantages over the scanner; in particular, a larger signal is assured, with the potential to increase the signal-to-noise ratio, the spatial resolution or the spectral resolution. Today many optical sensors use one or several one-dimensional or two-dimensional CCD arrays.

These arrays have made possible improvements in image data quality and the development of fine spatial resolution sensors, imaging radiometers and high-performance spectroradiometers.

See also step-stare imager. CCT (Computer-compatible tape) Magnetic tape used for storage of image and other data; now largely superseded by 8 mm cartridge (exabyte) types and compact disc storage. CEOS (Committee on Earth Observation Satellites) International body established to coordinate international activity in spaceborne remote sensing.

Chemistry, atmospheric Trace gases in the atmosphere can be detected and measured by the amount of electromagnetic radiation that they absorb, emit or scatter. Atmospheric sounding of most trace gases requires high spectral resolution in order to resolve individual absorption or emission lines, although lower spectral resolution suffices if only column-integrated values are required.

Lower spectral resolution also suffices for profiling of ozone concentrations, since ozone has particularly strong absorption lines. Profiling of concentrations in the troposphere is more difficult than in the stratosphere.

Currently, most of the important instruments for atmospheric chemistry measurements are carried by the UARS satellite. Chlorophyll The most important photosynthetic pigment in plants (see leaf). Chlorophyll-a has absorption maxima in the blue and red regions of the visible spectrum. Chlorosis See geology. Circular polarisation See polarisation. CLAES was used to deduce concentrations of various atmospheric gases (CO2, H2O, CH4, O3, and several nitrogen and chlorine compounds) from measurements of thermal emission.

It ceased operation in May 1993. Proof-of-concept mission for small, low-cost spacecraft: circular Sun-synchronous LEO. The satellite will also carry a solar-terrestrial physics package and a laser retroreflector for ground-based lidar measurements of the atmosphere.

The output of the classification stage may be regarded as a thematic map rather than an image. Its accuracy is normally assessed using an error matrix. Classification techniques can be broadly divided into two types: supervised and unsupervised. In supervised classification, information about the distribution of ground-cover types in part or parts of the image is used to initiate the process.

In unsupervised classification, the entire image is first analysed by clustering to find distinguishable classes of pixels present within it. After this stage has been completed, the classes present within the image are associated with classes present on the ground by comparison with training data.

In essence, therefore, supervised classification forces the image classification to correspond to user-defined ground-cover classes, but does not guarantee that the classes will be separable; whereas unsupervised classification forces the classes to be separable but does not guarantee that they will correspond to the ground-cover classes required by the user.

Hybrid classification algorithms combine features of both approaches. Climate The climate of a location is the synthesis of the day-to-day values of the main meteorological elements that affect the site.

Factors that affect climate include precipitation, temperature, cloud cover, wind speed and direction, sunshine and humidity. Satellite remote sensing systems can provide data on many of these quantities and the data are a valuable supplement to the in situ observations, which tend to be distributed very unevenly around the globe with a bias towards the heavily populated areas of Europe and North America.

Satellite data also have the advantage of being collected on a global basis by a single sensor and the data can be processed in a consistent way for all locations.

Some satellite data sets, such as the global cloud statistics, sea ice coverage and global sea surface temperature data sets, are very important for climate studies and are maturing into reliable products, although the data sets are of short duration compared to in situ observations.

Other data, such as climatological fields of precipitation over the ocean, are in an early stage of development, and there is still a great deal of development work taking place on the processing algorithms. Some quantities, such as surface heat and moisture flux and near surface air temperature, cannot be derived in a reliable manner from observations from satellite sensors at present.

Cloud classification Clouds are visible aggregations of water droplets or ice crystals with a base above ground level. The nature of the atmospheric structure, the physical form of the cloud (including the shape of individual cloud elements) and the phase of the water (liquid or solid) which makes up the cloud provide the most common basis for cloud classification. Methods of classifying clouds on the basis of height of formation and shape are used by surface observers. In the case of high-resolution satellite imagery, human classification of cloud types may be possible.

In most types of imagery, the resolution is insufficient to permit the clear identification of such cloud types. Automatic classification of clouds in remote sensing can only be on the basis of characteristics which can be extracted from the radiance field measured by the satellite. The classification generally follows the cloud detection process. The simplest form of automated classification from satellite imagery is therefore based on the brightness of the clouds at visible wavelengths and the temperature of the cloud as indicated by the brightness at thermal infrared wavelengths.

Two radiance channels are generally considered to be the minimum requirement. The detail of classification from remote sensing imagery is limited by the complexity of the algorithms used.

More detailed information on cloud character can be obtained by examining the spatial and temporal variance of the radiance field. The altitude of cloud-tops can be determined directly by laser profiling, and the water content by lidar or passive microwave radiometry. In most situations, clouds are brighter than the background surface at visible wavelengths and colder than the background surface at thermal infrared wavelengths.

Infrared imagery is usually viewed as a negative image so that clouds appear white. Several problem areas for cloud detection exist, and cloud detection is difficult if the image pixel size is significantly larger than the size of individual cloud elements. This is particularly problematic for thin cirrus, low stratiform cloud decks, polar regions, and multilayer cloud systems. Cloud detection is difficult at night in areas where the thermal contrast between the surface and the cloud is low, for example in the case of marine stratocumulus regions and areas of fog.

In regions of thin cirrus, the observed radiance is a combination of the radiances from the cloud and the ground. In the polar regions and other areas of snow cover, the contrast is low at visible wavelengths and the thermal gradient in the atmosphere is often weak.

In the polar winter, where a strong surface inversion forms, the thermal contrast is inverted. Detection of clouds in problem areas can be enhanced by the use of multispectral imagery to detect the signatures of cloud types. Predominantly, cloud detection relies on optical and thermal infrared observations, but wavelength regions in the near infrared have been used to distinguish between clouds and snow.

Such multispectral analysis offers the opportunity to develop cloud type signatures for improved classification of clouds. Pattern recognition techniques have also been applied to problem areas such as wintertime polar clouds. Cloud masking Cloud masking is essentially the same process as cloud detection, except that the focus is on the elimination of cloud-affected pixels from further analysis. When determining surface parameters such as surface temperature and vegetation indices from satellite radiance data, the presence of clouds can seriously contaminate the results.

A number of techniques have been used for cloud masking. A simple threshold on brightness temperature (e.g. AVHRR channel 5) can be used to indicate cloud-affected pixels. Some techniques have used the difference in brightness temperature between AVHRR channels 4 and 5 as an indicator of the presence of cloud. The spatial coherence of cloud-affected areas is generally larger than that of cloud-free areas, and this can also be used for cloud masking, although the technique is not useful where the background variance is large.
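A toy version of such a threshold scheme might look like the following. The threshold values are placeholders for illustration only; operational schemes tune them empirically for each region and season.

```python
import numpy as np

def cloud_mask(bt4, bt5, bt_threshold=270.0, diff_threshold=2.0):
    """Illustrative two-test cloud mask: a pixel is flagged cloudy if
    its channel-5 brightness temperature (K) is below a cold
    threshold, or if the channel 4 minus channel 5 difference exceeds
    a limit (a common thin-cirrus signature). Thresholds are
    placeholders, not operational values."""
    bt4 = np.asarray(bt4, dtype=float)
    bt5 = np.asarray(bt5, dtype=float)
    return (bt5 < bt_threshold) | ((bt4 - bt5) > diff_threshold)
```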

Most cloud-masking schemes involve the derivation of empirical thresholds, and their success depends on the geographical area of application. Clouds Clouds are collections of ice crystals, water droplets or a mixture of the two that have their base above the surface of the Earth. They form when air is cooled sufficiently that its temperature falls below the dew point, i.e. the temperature at which the air becomes saturated with water vapour.

Under these conditions the excess water condenses as water droplets or ice crystals. The cooling of the air usually occurs because of expansion as it rises through the atmosphere. This can take place for a number of reasons and affects the appearance of the clouds. Large-scale atmospheric motion, such as at a frontal surface, usually gives layer cloud, while convection above a relatively warm surface often results in more isolated clouds.

Forced ascent of air over high ground is also an important mechanism in the formation of cloud. The water droplets making up a cloud are often supercooled, but they must have a diameter of less than a few hundred micrometres, otherwise they are classed as rain or drizzle.

The vast majority of clouds are found below the tropopause, which is at a height of about 11km in mid-latitudes, although some important cloud types occur in the stratosphere. Although the cloud-climate feedback is insufficiently understood, it is clear that clouds play an important part in determining the Earth radiation budget.

There are two main effects. At optical wavelengths they act to reduce the amount of heating by reflecting radiation back to space.

Since clouds are collections of small refracting particles, they are effective scatterers of solar radiation. This scattering is well described by the Mie theory.

Cloud properties are generally discussed in terms of their liquid water path, optical depth or liquid water content. At thermal infrared wavelengths, clouds with the exception of thin cirrus behave as black bodies. Most analyses of cloud radiation interactions assume that the clouds are plane parallel and that the effects of the sides can be ignored. In many environments, where broken cloud fields are common, the contribution of radiation reflected and radiated from cloud sides must be considered.

Cloud sides act to enhance both infrared and optical-band radiation with respect to the linear combination of clear and overcast conditions. Most analyses of the effects of clouds are performed in terms of the concept of cloud radiative forcing. In a region large enough to consist of both clear and cloud-covered areas e.

The cloud forcing is then H − Hclr, where Hclr is the net heating under clear-sky conditions.


Cloud statistics Prior to observations of clouds from satellite sensors, a range of climatologies describing cloud distribution, derived from surface observations, were available. Since low clouds tend to obscure high-level clouds from a surface observer, and high clouds obscure low clouds when viewed from a satellite, the two observations rarely agree on the character of cloud cover over the globe.

The range of viewing angles which make up the surface observer's view of the sky and the range of nadir angles which are sampled in a single satellite image result in very different sets of statistics from the two methods. Differences in satellite frequency response and in the nature of the algorithm used in the cloud detection and cloud classification stages mean that agreement between different satellite climatologies is very variable.

It relies primarily on variable thresholds determined from statistical analyses of radiances retrieved for individual locations.


The mean annual cloud amount is greater over the ocean than over the land. As well as deriving statistics for the distribution of cloud amount, some attempts have been made to characterise cloud cover using higher-order statistics, for example the β distribution, the Burger distribution and fractal analysis.

Cloud temperature The tops of cloud layers are, in the case of thick clouds, the emitting surfaces from which infrared radiation is received by satellite radiometers. The temperature of the top of the cloud therefore affects the amount of radiation received at satellite level. This quantity allows the height of the cloud top to be determined, provided that the profile of temperature through the depth of the atmosphere is known.

Medium-level cloud is found between about 2 and 6 kilometres above the surface. Although mid-latitude cloud-top temperatures at these levels are typically below the freezing point of water, the clouds are usually composed of supercooled water droplets. High-level cloud, between about 6 and 13 kilometres above the surface, is often composed of ice crystals, and the observed cloud-top temperatures are lower still.

Cloud top temperatures can be combined with measurements of cloud thickness to estimate precipitation. The median values of water content for low level layer cloud e. However, the variability of water content is large and in situ measurements have indicated values of up to five times these figures.

For convective cloud, water content can be much higher, with values of around 2 g m⁻³ being measured near the tops of stratocumulus and cumulus. The water content of individual clouds is dependent on a number of factors, including cloud-base temperature and the degree of vertical development of the cloud. Clustering The process of identifying groups of pixels in an image that have similar properties.

Commonly the first step in an unsupervised classification of an image. The properties (features) of a pixel can be specified by a vector x in N-dimensional feature space. The components of this vector will often be the digital numbers or reflectances in each of N spectral bands, but could also be, for example, radar backscatter coefficients in different polarisation states, texture parameters, or single-band digital numbers in a multi-date composite image.

The basis of all clustering algorithms is a measure of the similarity of two pixels with vectors x1 and x2. The simplest such measure is the Euclidean distance |x1 − x2|, although the square of this quantity is often used to reduce the computational requirement. Other methods include hierarchical clustering and single-pass clustering.
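The Euclidean similarity measure, and the assignment of a pixel to its nearest cluster, can be sketched as:

```python
import numpy as np

def euclidean_distance(x1, x2):
    """Euclidean distance between two pixel feature vectors."""
    x1 = np.asarray(x1, dtype=float)
    x2 = np.asarray(x2, dtype=float)
    return float(np.sqrt(np.sum((x1 - x2) ** 2)))

def nearest_cluster(pixel, cluster_means):
    """Index of the cluster whose mean is closest in feature space.
    The squared distance is used, avoiding the square root."""
    p = np.asarray(pixel, dtype=float)
    d2 = [np.sum((p - np.asarray(m, dtype=float)) ** 2)
          for m in cluster_means]
    return int(np.argmin(d2))
```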

Once the clusters have been defined, they are often modified by splitting, merging or deleting clusters. Splitting involves breaking into two or more clusters a single cluster that is excessively elongated in feature space, or that shows evidence of bi- or multi-modality.

Merging is the combining of clusters that show insufficient separability. Clusters that contain too few pixels typically less than 10N for further analysis are deleted. The property possessed by a system of waves usually electromagnetic radiation when there is a fixed phase difference between the signal measured at the same location but at different times temporal coherence or between the signal measured at the same time but at different locations spatial coherence. The availability of both amplitude and phase information from a detected signal, for example in synthetic aperture radar systems.

Columbus Polar Platform See Envisat. Commonwealth Scientific and Industrial Research Organisation The Australian organisation with responsibility for space research. Compression of data Data compression techniques are important in remote sensing because of the large volumes of data involved (a single Landsat TM image, for example, occupies hundreds of megabytes). Compression methods are either reversible, in which case all the information in the image is recoverable, or irreversible, in which case there is some loss of information.

The simplest irreversible (lossy) compression methods involve cropping images to remove uninteresting areas, or sub-sampling them to reduce their spatial resolution and size at the same time. Similarly, the Fourier transform or Hadamard transform of an image can be formed, and truncated to remove the higher spatial frequencies.

Reversible (lossless) data compression methods can be divided into two types: those that exploit the statistical distribution of the digital numbers, and those that exploit their spatial organisation. Huffman coding is a common example of the former; run-length encoding and tesseral addressing are examples of the latter, although tesseral addressing can also be used as an irreversible method.

Lossless compression is widely used for storage and transmission of image data. Computer-compatible tape See CCT. Confusion matrix See error matrix. Consumer's accuracy See error matrix. Contextual classification Modification of an image classification procedure to take into account the likelihood that neighbouring pixels should be assigned to the same class, for example by incorporating into the discriminant function of a supervised classification a 'cost function' that penalises pixel-to-pixel variation in classification.

Contrast enhancement Radiometric transformation of an image to improve its visual interpretability. The operation can be specified by a transfer function f such that Id(p) = f(Ii(p)), where Ii(p) is the digital number of pixel p in the image, and Id(p) is the digital number used to represent the pixel in the display.

The same transfer function f is used for all pixels in the image, or a specified part of it. The transfer function is chosen to maximise the use made of the radiometric resolution of the display unit, by expanding the width of the image histogram.

The simplest contrast enhancement is a linear enhancement or linear stretch, with transfer function f(x) = ax + b; the constants a and b are chosen so that specified minimum and maximum digital numbers in the image map to chosen display values. These values can be chosen to cover the entire range of available digital numbers, or even to exceed it at the lower or upper end of the display range (saturation of the display). Such saturation may be desirable from the point of view of increased radiometric separation of features of interest, provided the pixels about which information is lost through saturation are not themselves of interest. Many image processing systems can provide automatic linear enhancement.

With automatic enhancement, a and b are typically set from the mean μ and standard deviation σ of the image histogram, so that an interval of N standard deviations is mapped onto the display range. This transformation has the property that values of Ii(p) less than μ − Nσ/2 or greater than μ + Nσ/2 will lead to saturation of the display. Various forms of non-linear contrast enhancement are in common use.

The simplest of these is the two-part (or multi-part) linear enhancement, in which the transfer function f consists of a number of linear enhancements, with different values of a and b applying over different ranges of Ii(p). Other non-linear enhancements include logarithmic and exponential transforms. See also histogram matching.
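A linear stretch of the kind described above can be sketched in Python with NumPy (the function name, the 8-bit display range and the sample digital numbers are illustrative assumptions):

```python
import numpy as np

def linear_stretch(image, lo, hi, display_max=255):
    """Map digital numbers in [lo, hi] linearly onto [0, display_max].
    Values outside [lo, hi] saturate at the ends of the display range."""
    a = display_max / (hi - lo)                  # gain of the transfer function
    stretched = a * (image.astype(float) - lo)   # f(x) = a (x - lo)
    return np.clip(np.round(stretched), 0, display_max).astype(np.uint8)

band = np.array([[30, 40], [50, 70]])
print(linear_stretch(band, lo=40, hi=60))   # 30 saturates to 0, 70 saturates to 255
```

Choosing lo and hi inside the actual range of the data deliberately saturates the extreme pixels, trading information about them for better radiometric separation of the rest.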

Convolution operator A linear spatial operation performed on an image, in which the new digital number I'(i, j) assigned to the pixel with coordinates (i, j) is calculated as a weighted sum of the digital numbers I of the pixels in the neighbourhood of (i, j): I'(i, j) = Σk Σl w(k, l) I(i + k, j + l), where w(k, l) is the matrix of weights defining the operation.

The operation defined by the above equation is the convolution of the image with the function defined by the matrix w(k, l). An alternative method of implementing a convolution filter is to calculate the Fourier transform of the image, to multiply this by the Fourier transform of the matrix w(k, l), and then to perform the inverse Fourier transform.

In general it is more efficient to use direct convolution if the matrix w(k, l) has a small spatial extent. When w(k, l) has a large spatial extent, its Fourier transform occupies a smaller region of spatial frequency space, and it is more efficient to use the Fourier transform method.
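Direct convolution with a small weight matrix can be sketched as follows (a Python illustration; edge pixels, where the window would overhang the image, are simply left unchanged here):

```python
import numpy as np

def convolve(image, w):
    """New digital number at (i, j) = weighted sum of the neighbourhood of (i, j)."""
    m = w.shape[0] // 2                  # half-width of the square, odd-sized window
    out = image.astype(float).copy()
    for i in range(m, image.shape[0] - m):
        for j in range(m, image.shape[1] - m):
            window = image[i - m:i + m + 1, j - m:j + m + 1]
            out[i, j] = np.sum(w * window)
    return out

smooth = np.full((3, 3), 1 / 9)          # 3 x 3 mean (smoothing) filter
image = np.zeros((5, 5))
image[2, 2] = 9.0
print(convolve(image, smooth))           # the central spike spreads over a 3 x 3 block
```

For a symmetric weight matrix this weighted sum and the strict convolution (with the window indices reversed) coincide; a Fourier-domain implementation would give the same result more cheaply for large windows.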

See edge detection, line detection, shape detection, sharpening, smoothing. Co-polarisation A radar system operates in co-polarised mode if it detects radiation having the same polarisation as it transmits, for example HH-polarised or VV-polarised (see HH-polarisation, VV-polarisation). Compare cross-polarisation. Coriolis parameter See ocean currents and fronts. Corner-cube reflector See radar transponder.

Corona Series of U.S. military reconnaissance satellites. The satellites carried panchromatic cameras (Keyhole) giving resolutions of 8 m to December , 3 m August to October and 2 m September to May . Covariance matrix For a multi-band image, the matrix whose element Cij is the covariance of the digital numbers in bands i and j: Cij = <(Ii − mi)(Ij − mj)>, where mi is defined as the mean digital number in band i. An element Cii on the leading diagonal of the matrix is the variance of the digital numbers in band i. An off-diagonal term Cij is related to the correlation coefficient rij between bands i and j through rij = Cij / sqrt(Cii Cjj). Crop marks in archaeology The most sensitive method for detecting buried archaeological structures is based on the response of growing plants to differences in humidity.

Plant height and colour are affected when soil moisture is limited. Detection via crop marks has been responsible for the discovery of more archaeological sites than all other methods combined. Being a consequence of the interaction of growing vegetation, soil structure and climatic change, it is hard to analyse. Growth is either retarded or advanced when a dry spell causes depletion of soil moisture reserves and plants must acquire their moisture from lower levels.

Drainage is also a factor. Markings over buried ditches and pits are more common than those over walls, which have a negative effect on crop growth. Feature contrast is usually quite weak, and is visible only when photographed at optimum angles and low altitudes. When grain crops ripen, visible contrast rises if moisture stress conditions continue, and lines of deep green stand out sharply against a yellow background.

After ripening, growth changes may be permanent and features may be seen as shadows in oblique illumination. Sites are visible for a week or more, unless differences become permanent at the end of the growing season prior to harvesting. See also archaeological site detection. Crossover A point at which the sub-satellite track of a satellite orbit intersects itself. Crossovers are particularly important in radar altimeter observations since they permit the orbital parameters to be calculated more accurately.

Cross-polarisation A radar system operates in cross-polarised mode if the polarisation states of the transmitted and received radiation are different, for example HV-polarised or VH-polarised see HV-polarisation, VH-polarisation. Compare co-polarisation.

Cubic interpolation (cubic convolution) See resampling. CZCS was primarily intended for ocean colour (chlorophyll and suspended sediment) measurements. DCS is now included on many remote-sensing satellites. Debye equation The Debye equation describes the variation with frequency of the dielectric constant of a simple organic material containing polar molecules.

It is normally written as ε = ε∞ + (εs − ε∞)/(1 + iωτ), where εs is the static (low-frequency) dielectric constant, ε∞ is the high-frequency dielectric constant, ω is the angular frequency and τ is the relaxation time of the polar molecules. The Debye equation provides a good model of the dielectric constant of water in the microwave region of the electromagnetic spectrum. Decibel A logarithmic unit defining the ratio of two powers, intensities, radiances etc. A signal of power P1 exceeds one of power P2 by 10 log10(P1/P2) decibels.
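The decibel definition above amounts to a single line of code (an illustrative Python sketch):

```python
import math

def decibels(p1, p2):
    """Ratio of two powers expressed in decibels: 10 log10(P1 / P2)."""
    return 10 * math.log10(p1 / p2)

print(decibels(100, 1))   # a factor of 100 in power is 20 dB
print(decibels(2, 1))     # doubling the power is about 3 dB
```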

Dense medium model A volume scattering model that is valid for a dense discrete random medium, i. The scattering from nearby scatterers is thus correlated, and they act as a group rather than individually. The phase relation between scatterers and the average spacing between adjacent scatterers are important considerations for such media. The radiative transfer model originally developed for sparse media can be adapted to a dense medium by deriving a phase function for a unity volume of scatterers and allowing for near-field interactions among the scatterers.

In a natural random medium the phase relation among scatterers is usually destroyed to a large extent by variations in the size, shape and orientation of the scatterers. However, the average spacing between the scatterers is not affected and hence must be taken into account. Density slicing A very simple image classification procedure, applied to a single-band image or to one band of a multi-band image, in which ranges ('slices') of digital numbers are assigned to particular digital numbers in the image display.

Density slicing can be regarded as a contrast enhancement or as a one-dimensional parallelepiped classifier. Density slicing is commonly used where the image digital numbers have a direct relationship to a physical parameter of interest for example, thermal infrared radiance may correspond directly to sea-surface temperature. It is also used to mask out regions of an image from further processing, and to reduce the effects of noise in an image.
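Density slicing reduces to a table look-up on binned digital numbers; a Python sketch (the slice boundaries and display values below are invented for illustration):

```python
import numpy as np

def density_slice(band, boundaries, display_values):
    """Assign each pixel the display value of the slice its digital number
    falls in; boundaries must have one fewer entry than display_values."""
    indices = np.digitize(band, boundaries)        # which slice each pixel falls in
    return np.asarray(display_values)[indices]

band = np.array([[271, 280], [290, 301]])          # e.g. brightness temperatures
sliced = density_slice(band, boundaries=[275, 285, 295],
                       display_values=[0, 85, 170, 255])
print(sliced)   # [[  0  85]
                #  [170 255]]
```

Because each output value stands for a whole range of input values, the result can be read either as a crude classification or as a step-function contrast enhancement.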

Desertification is the anthropogenic change of potentially productive land to give desert-like conditions. Remote sensing-based methods for monitoring desertification normally make use of vegetation mapping techniques, but the use of passive microwave methods has also been shown to be effective.

See also erosion. Destriping Correction for the effects of banding in a poorly calibrated image acquired by a scanning system, especially Landsat MSS. Most destriping algorithms work by adjusting the digital numbers of a single strip of pixels so that their mean and standard deviation match the mean and standard deviation of a reference strip.
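The mean and standard-deviation matching just described can be sketched in Python (an illustrative implementation; the six-line banding period follows the Landsat MSS example below, and real algorithms work block by block rather than on whole strips):

```python
import numpy as np

def destripe(image, period=6, ref=0):
    """Adjust each strip of scan lines (every period-th line) so its mean and
    standard deviation match those of the reference strip."""
    out = image.astype(float).copy()
    ref_lines = image[ref::period]
    m_ref, s_ref = ref_lines.mean(), ref_lines.std()
    for phase in range(period):
        lines = out[phase::period]
        m, s = lines.mean(), lines.std()
        if s > 0:                                  # leave constant strips untouched
            out[phase::period] = (lines - m) * (s_ref / s) + m_ref
    return out

img = np.tile(np.arange(4.0), (12, 1))
for p in range(6):
    img[p::6] += 10 * p                            # simulated banding: per-detector offsets
fixed = destripe(img)                              # strip means now agree with strip 0
```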

For example, a typical destriping algorithm applied to Landsat MSS imagery, which has a banding period of six pixels, processes the image in blocks six pixels deep in the along-track direction. Detector A device for converting electromagnetic radiation into an electrical signal. Detectors can be classified on the basis of the physical mechanisms that cause the conversion of radiation to signal. Photon detectors produce a signal when the mobility or number of free charge-carriers is changed by incident photons, and thermal detectors produce a signal when their temperature is changed by incident radiation.

At optical and infrared wavelengths, the commonly used detectors are lead sulphide, indium antimonide, mercury cadmium telluride, photoelectric detectors (see photodiode), thermopiles and thermistor bolometers. These all provide a near-linear relationship between radiance and electrical signal, but vary in their sensitivity to different parts of the spectrum, their ruggedness, and their response time.

Developing countries Some of the most successful, or potentially successful, applications of remote sensing are in developing countries. For some areas of some countries, topographic maps are either non-existent, grossly lacking in detail, or marred by serious errors.

Satellite remote sensing therefore provides an important source of data that enables maps to be made or updated reasonably quickly, where conventional field survey or even aerial photography would be slow, tedious and expensive. There is, however, a problem in terms of major cost if the work is done outside the country.

There may be a problem of training and technology transfer if the work is to be done within the country. A similar situation applies to geological maps as to topographic maps. Similar considerations apply to applications of remote sensing beyond the field of mapping, such as resources monitoring, disaster monitoring, change detection and yield prediction. It is impossible to generalise about remote sensing in developing countries. Some developing countries have sophisticated technical installations for receiving and handling satellite data, some are building and launching their own satellites, and some have extensive remote sensing applications programmes in hand, in some cases without sophisticated equipment or software but relying simply on photo-interpretation techniques applied to hard-copy images.

On the other hand, there are instances where people in a developing country have been sold expensive technology that is either inappropriate, or for which the infrastructure and skilled indigenous manpower are not available to enable it to be used properly. There are regional activities, such as those of the Asian Remote Sensing Society, and activities of some international organisations (FAO and the UN Outer Space Affairs Division, for example), which work very hard to ensure that developing countries are aware of the possibilities of remote sensing and have good opportunities to exploit the techniques.

See also legal and international aspects. D FA Dual-frequency altimeter French dual-frequency radar altimeter, proposed for inclusion on Jason satellites. Pulselimited footprint: DFT See Fourier transform.