Lectures 9-12 Geo

How many wavelengths can the Landsat 7 satellite sense simultaneously?

7

GIS components:

A GIS has six components: hardware, software, procedures, data, people, and network. Figure 12 introduces a great way to think about these components (or the anatomy) of a GIS. HARDWARE is the foundation of GIS. GIS hardware used to consist of mainframe computers; today we can use smartphones and other mobile devices in a GIS. The most common GIS hardware configuration consists of a desktop computer or workstation that performs most of the functions of a GIS. A client-server arrangement is also very common, meaning that a simple computer or mobile device (the client) is intentionally paired with more sophisticated, powerful computer hardware (the server) to create a complete working system. SOFTWARE allows a GIS to perform location-based analysis. GIS software packages include commercial software purchased from a vendor such as ESRI, Bentley, or Intergraph, as well as scripts, modules, and macros that extend the capabilities of GIS software. An important part of GIS software today is the web browser and its associated web protocols, which are used to create novel GIS web applications. DATA is the essence of GIS. GIS data is a digital representation of selected aspects of a specific area on the Earth's surface or near-surface, built to serve problem solving or scientific research. GIS data is typically stored in a database, and the size of a typical GIS database varies widely: a dataset for a small project might be 1 Megabyte; an entire street network for a small country might be 1 Gigabyte; elevation data for the entire Earth at 30 m intervals can take up terabytes of storage, and more detailed global datasets can be many times larger. Because a GIS, by nature, stores detailed information about the Earth, its datasets can be VERY large.
This means that a GIS professional needs to be careful and deliberate about data storage and the time required for data processing. In addition to these components, a GIS also requires PROCEDURES for managing all GIS activities, such as organization, budgets, training, customer mapping requests, and quality assurance and quality control. Finally, a GIS is useless without the PEOPLE who design, program, maintain, and use it. Among the six components of a GIS, which do you think is the most expensive? It's data.
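The "terabytes of storage" figure for a global 30 m elevation grid can be sanity-checked with a quick back-of-the-envelope sketch. The constants here are assumptions, not from the lecture: Earth's surface area of about 510 million km² and one 2-byte integer elevation value per cell, with no compression.

```python
# Rough estimate of global 30 m elevation-grid storage.
# Assumptions (not from the lecture): Earth's surface ~510 million km^2,
# one 2-byte integer elevation value per 30 m x 30 m cell, no compression.
EARTH_SURFACE_KM2 = 510_000_000
CELL_SIZE_M = 30
BYTES_PER_CELL = 2

cells = EARTH_SURFACE_KM2 * 1_000_000 / CELL_SIZE_M ** 2
terabytes = cells * BYTES_PER_CELL / 1e12
print(f"{cells:.2e} cells, ~{terabytes:.1f} TB")
```

This gives on the order of half a trillion cells and roughly a terabyte, consistent with the text's claim once multiple bands or higher-precision values are stored.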

Please select the question(s) which can be answered using GIS: How will the path of Superstorm Sandy affect voter turnout in the 2012 US presidential election? Where are the famous dishes at restaurants and bars in Belo Horizonte, one of the host cities for the 2014 World Cup? What are the factors leading to the habitat loss of elephants in Africa? How does technical, educational, and financial assistance to land managers in Wisconsin help reduce soil degradation?

ALL CAN BE ANSWERED USING GIS

Please select the necessary component(s) of GIS: Hardware Software People Data

ALL! -data is most expensive though

The type of remote sensing in which the sensor generates its own energy, casts it at a target, and then measures the return of that form of energy, is:

Active remote sensing

Please match the following terms to the corresponding contents of GIS:

GIScience - concept and theory GISystem - techniques and technology GIStudy - social implications

Geospatial technology components:

Geospatial technologies include three major components: GPS, RS, and GIS. While GPS and RS are useful in collecting data, a GIS is used to store and manipulate the acquired geospatial data. • Global Positioning Systems (GPS) A system of satellites which can provide precise (100 meter to sub-centimeter) locations on the earth's surface. • Remote Sensing (RS) Use of satellites or aircraft to capture information about Earth's surface. • Geographic Information Systems (GIS) Software systems with the capability for input, storage, manipulation/analysis, and output/display of geographic (spatial) information.

Which portion of the electromagnetic spectrum is located just beyond the visible wavelengths (the wavelengths are longer than those of visible light):

Infrared light

MODIS has _____ spatial resolution and _____ temporal resolution.

Moderate, High

In a color infrared photo, the near infrared signal is displayed in the _____ channel.

Red

The information handled by GIS is primarily:

Spatial

Which of the following is a satellite that was designed to monitor the Earth's terrestrial environment?

Terra

Match these types of remote sensing bandwidth with their appropriate uses

Thermal infrared: To show temperature differences between land and water Orthophotos: To generate a map-like image while retaining detail Microwave sensing: To show subsurface characteristics Color infrared sensing: To show dead or withering vegetation

T/F A satellite in geostationary orbit rotates at the same speed as Earth.

True

T/F In passive remote sensing, the sensor simply measures reflected or emitted energy.

True

T/F The vast majority of the electromagnetic spectrum is invisible to the human eye.

True

T/F Vegetation reflects most of the near infrared radiation from the sun.

True

geographic problems

problems that involve an aspect of location, either in the information used to solve them, or in the solutions themselves.

image interpretation:

the process of extracting qualitative and quantitative information from a photo or image using human knowledge or experiences.

Geographic Information Studies:

to understand the social, legal and ethical issues associated with the application of GI Systems and GI Science.

T/F Suomi NPP's overall mission is to examine global environmental phenomena and to advance knowledge and understanding of Earth's systems as a whole. It is the first satellite launched to build the next-generation satellite system that will succeed the Earth Observing System (EOS).

true

Geographic Information Systems (GIS):

-Software systems with the capability for input, storage, manipulation/analysis, and output/display of geographic (spatial) information

GIS Workflow

-The workflow for a GIS is represented as a loop when applied in a real-world application: 1. we collect and edit spatial data; 2. we visualize/display the data and perform spatial/statistical analysis to understand phenomena and patterns; 3. we design and produce maps as reported results for decision makers. The decisions made will produce effects in the real world, so we collect feedback and restart the loop to further improve our decisions and policies.

Geographic Information Science:

-a new interdisciplinary field built out of the use and theory of GIS, which studies the underlying conceptual and fundamental issues arising from the use of GIS and related technologies, such as: spatial analysis, map projections, accuracy, and scientific visualization

Which of the following is NOT a Suomi NPP instrument? A. ATMS B. VIIRS C. MODIS D. OMPS

C. MODIS

Which of the following is NOT true of satellites? a) They are constantly orbiting the Earth b) They can image much larger areas than single aerial photos can c) They are restricted to geographic boundaries, much the same way that aircraft are d) They are, in general, superior to aircraft in terms of their ability to capture images on Earth

C. They are restricted to geographic boundaries, much the same way that aircraft are

The Suomi NPP instrument used to study clouds and Earth's climate and temperature, is

CERES

Which one is not an advantage of remote sensing?

Can extract any information you want from any location on the Earth

visible wavelengths

- The light which our eyes can detect is part of the visible spectrum. -the wavelengths between 0.4 and 0.7 micrometers; energy at these wavelengths is visible to human eyes -- Blue, green, and red are the primary visible wavelengths, and most remote sensors are equipped to detect energy at these wavelengths. Visible wavelengths are useful in remote sensing to identify different objects

Camera Axis

- an imaginary line that defines the path light travels to hit the camera lens or sensor. Depending on the camera's position and the camera axis angle with respect to the ground, aerial photographs can be vertical, low oblique, or high oblique

infrared (Near and Thermal/Far)

- the portion of the electromagnetic spectrum from approximately 0.7 to 100 micrometers - more than 100 times as wide as the visible portion of the spectrum -Near IR radiation is used in ways very similar to visible radiation. The near IR covers wavelengths from approximately 0.7 um to 3.0 um. Near IR is particularly sensitive to green vegetation. Green leaves reflect most of the near infrared radiation they receive from the sun. Therefore, most remote sensors on satellites can measure near infrared radiation, which lets them monitor the health of forests, crops, and other vegetation. -The thermal/far IR covers wavelengths from approximately 3.0 um to 100 um. Thermal IR energy is more commonly known as "heat." Objects that have a temperature above absolute zero (-273 C) emit far IR radiation. Therefore, all features in the landscape, such as vegetation, soil, rock, water, and people, emit thermal infrared radiation. In this way, remote sensing can detect forest fires, snow, and urban areas by measuring their heat.
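The contrast between strong near-IR reflection and strong red absorption by green leaves is the basis of vegetation indices such as NDVI, the standard way sensors "monitor the health" of vegetation. A minimal sketch with made-up reflectance values (not measured data):

```python
import numpy as np

# Hypothetical reflectance values (0-1) for three surfaces:
# dense healthy vegetation, sparse vegetation, and bare soil.
red = np.array([0.08, 0.20, 0.45])
nir = np.array([0.50, 0.35, 0.48])

# NDVI = (NIR - Red) / (NIR + Red): near +1 for dense green vegetation
# (high NIR, low red), near 0 for bare surfaces.
ndvi = (nir - red) / (nir + red)
print(ndvi.round(2))  # [0.72 0.27 0.03]
```

The index falls monotonically from dense vegetation to bare soil, which is why NDVI maps are used to track crop and forest condition.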

remote sensing

- the science of obtaining information about objects or areas from a distance, typically from aircraft or satellites -satellite remote sensing means sensors aboard satellites capture "pictures" of Earth's surface

swath

- the strip of the Earth's surface from which geographic data are collected by a satellite - With a narrow swath, the sensor needs to orbit the earth many times to completely cover the whole globe, which means long revisit times and lower temporal resolution. At the same time, if the swath is narrow, it means the sensor scans a small area each time, which allows it to sense at a high spatial resolution. However, to achieve a higher temporal resolution, the swath size needs to be increased. In the same way, a large swath size means that spatial resolution has to be reduced as a compromise.
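As a rough illustration of the swath/coverage trade-off, dividing the equator's length by the swath width gives the number of adjacent passes needed for full coverage. This ignores overlap and orbital geometry; the 2,330 km and 185 km widths are the figures commonly quoted for MODIS and Landsat.

```python
import math

EQUATOR_KM = 40_075  # Earth's equatorial circumference

def passes_needed(swath_km):
    """Adjacent passes required to span the equator, ignoring overlap."""
    return math.ceil(EQUATOR_KM / swath_km)

print(passes_needed(2330))  # wide swath (MODIS-class): 18 passes
print(passes_needed(185))   # narrow swath (Landsat-class): 217 passes
```

At roughly 14 orbits per day, the wide swath covers the globe in a day or two, while the narrow swath needs on the order of its 16-day repeat cycle — consistent with the temporal resolutions quoted elsewhere in these notes.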

pixels

- tiny uniform regions that make up a remote sensing image, each with its own unique value

Terra

- was launched in 1999 as a collaboration between the United States, Canada and Japan, and continues its remote sensing mission today. "Terra" means Earth in Latin, which is an appropriate name for the satellite; its instruments measure the natural processes involved with Earth's land and climate

Interaction with the target

-After interacting with the atmosphere, some EM radiation finally makes it to Earth's surface and interacts with our targets. One of three things can happen to that energy (on a wavelength-by-wavelength basis): it can transmit through the target, it can be absorbed by the target, or it can be reflected off the target.

color composite image

-As there are only three color channels on a computer screen (red, green, blue), only three bands can be displayed at the same time
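A composite is just three single-band arrays stacked into the red, green, and blue channels. This sketch uses toy values, not real imagery, and builds the standard color-infrared assignment (NIR to red, red to green, green to blue):

```python
import numpy as np

# Toy 2x2 "bands" (brightness 0-255); real bands would come from a sensor.
green = np.full((2, 2), 60, dtype=np.uint8)
red   = np.full((2, 2), 40, dtype=np.uint8)
nir   = np.full((2, 2), 200, dtype=np.uint8)

# Assign one band per screen channel: here the color-infrared convention,
# NIR -> red channel, red -> green channel, green -> blue channel.
cir = np.dstack([nir, red, green])
print(cir.shape)  # (2, 2, 3): three bands, one per display channel
```

Swapping which band goes into which channel changes the composite (true color, color infrared, etc.) without changing the underlying data.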

Resolution trade-offs

-Because of technical constraints, there are always limiting factors in spatial, temporal, and spectral resolution in the design and utilization of satellite sensors. -One of the reasons is related to swath size: remote sensors are always designed with trade-offs. -Some sensors have high temporal resolution but low spatial resolution, like GOES, AVHRR, and MODIS; some have low temporal resolution and high spatial resolution. There are also some with medium spectral, temporal, and spatial resolution, like Landsat and SPOT. However, there are none with high resolution in all three aspects.

Other remote sensing satellites:

-Envisat is a satellite launched by ESA in 2002, which carries 10 instruments including ASAR (Advanced Synthetic Aperture Radar), an active microwave sensor mainly used for sea forecasting and monitoring sea ice, and MERIS (Medium Resolution Imaging Spectrometer) a multi-spectral imaging spectrometer with 300m spatial resolution and 15 spectral bands. -The ESA's Sentinel mission has a few satellites in operation: Sentinel-1 is a polar-orbiting, all-weather, day-and-night radar imaging satellite that tracks ocean processes. The first Sentinel-1 satellite was launched on April 3, 2014. Sentinel-2 is a polar-orbiting, multispectral high-resolution imaging mission for land monitoring that provides imagery of vegetation, soil and water cover, inland waterways and coastal areas. Sentinel-2A, the first of two Sentinel-2 satellites, launched on June 12, 2015. It carries sensors that can image in 13 spectral bands, with 4 bands at 10 m spatial resolution, 6 bands at 20m resolution and 3 bands at 60m resolution. This will provide even more Landsat-type data. -The Meteosat series of satellites are Europe's geostationary weather observation satellites. The first generation of Meteosat satellites, Meteosat-1 to Meteosat-7, provide continuous and reliable meteorological observations. They provide images of the Earth and its atmosphere every half-hour in three spectral channels (visible, infrared, and water vapor) via the Meteosat Visible and Infrared Imager (MVIRI) sensor. The latest generation satellite, MSG-3, was launched on July 5, 2012. The next satellite (MSG-4) is planned for launch in 2015.

Geostationary Operational Environmental Satellites (GOES) System: Your "Weather Guy" in Space

-In the United States, your local television or newspaper weather report probably uses one or more weather satellite images collected from the Geostationary Operational Environmental Satellites (GOES) system. The GOES series of satellites is the primary weather observation platform for the United States. -The GOES system, operated by the United States National Environmental Satellite, Data, and Information Service (NESDIS), supports weather forecasting, severe storm tracking, and meteorology research. Spacecraft and ground-based elements of the system work together to provide a continuous stream of environmental data. The National Weather Service (NWS) uses the GOES system for its United States weather monitoring and forecasting operations, and scientific researchers use the data to better understand land, atmosphere, ocean, and climate interactions. -The GOES system uses geostationary satellites, 35,790 km above the earth, which — since the launch of SMS-1 in 1974 — have been essential to U.S. weather monitoring and forecasting. The GOES satellites provide new imagery every 15 minutes. The sensors in GOES satellites detect 3 spectral bands: visible, infrared, and water vapor.
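The quoted geostationary altitude can be derived from Kepler's third law: a satellite whose orbital period equals one sidereal day must sit at one particular orbital radius. A quick check, using standard constants that are not from the text:

```python
import math

GM = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
T = 86_164             # one sidereal day, seconds
R_EQUATOR = 6_378_137  # Earth's equatorial radius, m

# Kepler's third law: T^2 = 4*pi^2*r^3 / GM  =>  r = (GM*T^2 / 4*pi^2)^(1/3)
r = (GM * T**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (r - R_EQUATOR) / 1000
print(round(altitude_km))  # ~35786 km, close to the 35,790 km quoted above
```

Only at this altitude does the satellite's orbital period match Earth's rotation, which is what lets GOES stare at the same region continuously.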

Suomi NPP: the next-generation Earth observation satellite:

-The Suomi National Polar-orbiting Partnership (NPP) is the first satellite launched as part of a next-generation satellite system which will succeed the Earth Observing System (EOS). Suomi NPP launched on Oct. 28, 2011 from Vandenberg Air Force Base in California. It is named after Verner E. Suomi, a meteorologist at the University of Wisconsin - Madison. Suomi NPP is in a sun-synchronous orbit 824 km above the Earth. It orbits the Earth about 14 times a day and images almost the entire surface. Every day it crosses the equator at about 1:30 pm local time. This makes it a high-temporal-resolution satellite.

Sensors used on Landsat Satellites:

-Landsat satellites carry sensors with medium spatial, temporal, and spectral resolution; their images are the most widely used of all satellite images. -MSS (Multi-Spectral Scanner) is the sensor that was used on Landsat 1 through 5. It takes measurements in 4 different bands: green, red, and two near infrared bands, at 80-meter spatial resolution. -Landsat 4 and 5 were also equipped with a new sensor called the TM (Thematic Mapper). TM is also a multispectral sensor, with seven bands: blue, green, red, three bands in the near infrared, and one band in the far infrared. TM has 30 meter resolution in the visible and near infrared bands, and 120 meter resolution in the thermal/far infrared band. -Landsat 7 was launched in 1999, carrying neither the MSS nor the TM, but a new sensor called ETM+ (Enhanced Thematic Mapper Plus). ETM+ senses the same 7 bands as the TM, but with improved spatial resolution in the thermal band: from 120 meter resolution to 60 meter resolution. It also includes a new band: a panchromatic band with a spatial resolution of 15 meters. Landsat 7's temporal resolution is still 16 days, but it can acquire about 250 images per day. -Landsat 8, launched in 2013, carries two brand new sensors: the Operational Land Imager (OLI) and the Thermal Infrared Sensor (TIRS). OLI collects data in 9 bands (Table 2), including four in visible wavelengths, one in near IR, two in shortwave IR, a panchromatic band in visible wavelengths, and an additional "cirrus" band. Compared to ETM+, OLI adds a band (Band 1) in the violet wavelengths for better observations in coastal areas and a band (Band 9, "cirrus") in the near IR to detect cirrus clouds. TIRS collects data in two bands in thermal/far infrared wavelengths.

MODIS

-MODIS, the Moderate Resolution Imaging Spectroradiometer, is probably the most widely used instrument in remote sensing and global environmental studies. MODIS produces over 40 separate data products related to environmental monitoring of both land and ocean. -MODIS imagery has moderate to coarse spatial resolution: 250 m - 1 kilometer. The swath width of MODIS is 2,330 kilometers, making it possible to examine broad-scale phenomena over large geographic regions. MODIS is carried on both the Terra and Aqua satellites and can image most of the globe in one day, with complete global coverage in two days. With very high temporal resolution, MODIS allows scientists to quickly assess changes in landscapes, such as sea surface temperature, snow cover, global forest cover, and large-scale active fires.

Are aerial Photographs maps?

-Many people think that these images are the same as maps, but they are NOT. Maps are representational drawings of Earth's features, while images are actual pictures of the Earth. -Maps have uniform scale, which means that the map scale at any location on the map is the same. -Maps are orthogonal representations of the earth's surface, meaning that they are directionally and geometrically accurate (at least within the limitations imposed by projecting a 3-dimensional object onto 2 dimensions). -Maps use symbols to represent the real world, while aerial photos, satellite images, and orthophotos show actual objects on Earth's surface. -Aerial photos have non-uniform scale, and can display a high degree of radial/relief distortion. That means the topography is distorted, and until corrections are made through rectification, measurements made from a photograph are not accurate. Nevertheless, aerial photographs are a powerful tool for studying the earth's environment since they show actual features and not just symbols. -Orthophotos and some satellite imagery have been geometrically "corrected", and therefore have uniform scales. However, while you may be able to see roads on these images, they are not labeled as roads. You must interpret what you see on an image because it is not labeled for you. Therefore, they are NOT maps.

EOS: NASA Earth Observation System:

-Over the last decade NASA has launched many satellites to help us understand how Earth's systems respond to natural and human-induced changes; this helps us predict and observe climate change, weather patterns, and natural hazards. One of the most well-known programs is the Earth Observing System (EOS), which is a constellation of satellites that measure the clouds, oceans, vegetation, ice, and atmosphere of the Earth. EOS has three flagship satellites, Aqua, Terra and Aura, which are equipped with fifteen sensors.

QuickBird: Commercial Very High Resolution Image

-QuickBird is a commercial satellite operated by DigitalGlobe (Figure 16, left). It was launched from Vandenberg Air Force Base in California on October 18, 2001. At the time of its launch it was the highest-resolution commercial satellite in operation. Now there are sensors with even higher spatial resolution, including the WorldView and GeoEye satellites also operated by DigitalGlobe. -QuickBird has two sensors --- a four-band (blue, green, red, and near-infrared) multispectral sensor with 2.4-meter spatial resolution and a panchromatic sensor with 0.61-meter spatial resolution. Both sensors provide off-nadir views (they can point in a direction other than straight down) and provide global coverage every 2 and 6 days, respectively. QuickBird circles the globe 450 km above the Earth in a sun-synchronous orbit.

Landsat Program: the longest continuous Earth-observation program

-The Landsat satellites were launched and are managed by NASA and the USGS for long-term continuous Earth surface observation. The first satellite in the program (Landsat 1) launched in 1972. Since then, Landsat satellites have collected information about Earth's surface for decades. The mission is to provide repetitive acquisition of medium resolution multispectral data of the Earth's surface on a global basis. The data from the Landsat spacecraft constitute the longest record of the Earth's continental surfaces as seen from space. It is a record unmatched in quality, detail, coverage, and value for global change research, and has applications in agriculture, cartography, geology, forestry, regional planning, surveillance, and education. -Landsat 5, launched on March 1st, 1984, provided data for 29 years before retirement. It provided the longest continuous Earth observation record, with 30 meter spatial resolution, a 16-day revisit time, and 7 spectral bands that are essential for monitoring changes on Earth's surface. -After the failure of Landsat 6, Landsat 7, equipped with a sensor similar to that on Landsat 5, was launched on April 15, 1999. However, after working together with Landsat 5 to provide even more frequent observations, Landsat 7 had an equipment failure: the Scan Line Corrector (SLC) broke in May 2003, which means any images taken after that date have data gaps. In spite of this failure, Landsat 7 still provides valuable images today. Figure 12 shows a comparison of images before and after the SLC failure. -Landsat 8, launched on February 11, 2013, joins Landsat 7 to continue capturing hundreds of images of the Earth's surface each day.

aerial photography

-the technique and process of taking photographs of the ground from an elevated position, by flying an aircraft along flight lines - Typically more than one flight line is required to cover the area to be mapped, and adjacent flight lines are given a 20%-30% side overlap to ensure no gaps in the coverage. Overlap between successive photographs also allows for 3D viewing of aerial photographs using the principle of stereopsis

Comparison between panchromatic, true color, and CIR photos

-Usually it's easier to interpret true color than black-and-white photos; color photos show features in the same colors we see with our eyes, and capture the colors uniquely associated with landscape features. -The most obvious difference between true color and color infrared photos is that in color infrared photos vegetation appears red (Figure 9). Red tones in color infrared aerial photographs are always associated with live vegetation, and the lightness or intensity of the red color can tell you a lot about the vegetation itself: its density, health, and how vigorously it is growing. Dead vegetation will appear as various shades of tan, while vivid, healthy green canopies appear bright red. Color infrared photographs are typically used to help differentiate vegetation types. -Panchromatic photographs are sensitive to light of all visible colors. They generally reflect how bright the surface is. Therefore, healthy deciduous vegetation is typically light (nearly white), roads are dark, and water bodies are usually black.

spatial resolution

-a measure of the smallest object or area on the ground that can be detected by the sensor - If a sensor has a spatial resolution of 10 meters, it means that one pixel in an image from the sensor represents a 10x10 meter area on the ground. -A sensor's spatial resolution will affect the detail you can see in an image -Images where only large features are distinguishable have low resolution, while in high resolution images, small objects can be detected. Generally speaking, the higher the resolution of an image, the more detail it contains. Commercial satellites provide imagery in resolutions ranging from less than a meter to several kilometers.
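The pixel-to-ground relationship described above is simple arithmetic; this small helper (a hypothetical name, not from any GIS library) makes it concrete:

```python
def ground_footprint_m2(resolution_m, width_px, height_px):
    """Ground area covered by an image: each pixel spans
    resolution_m x resolution_m on the ground."""
    return (width_px * resolution_m) * (height_px * resolution_m)

# One pixel of a 10 m sensor covers a 10 x 10 m = 100 m^2 ground area:
print(ground_footprint_m2(10, 1, 1))        # 100
# A 1000 x 1000 pixel scene at 30 m spans 30 km x 30 km:
print(ground_footprint_m2(30, 1000, 1000))  # 900000000 m^2
```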

band

-a range of wavelengths detected by a remote sensor (e.g. green band, near-infrared band)

panchromatic sensor

-a sensor measuring the visible portion of the spectrum, which treats the entire 0.4 to 0.7 micrometer range as if it was one band. -measure and treat all wavelengths in a single broad band

multispectral sensor

-a sensor that collects information across several bands, each of which are broad portions of the electromagnetic spectrum. -For example, we can divide the visible and near-infrared portions of the spectrum into four bands: blue, green, red and a near-infrared band. -divide and measure wavelengths into several broad bands

hyperspectral sensor

-a sensor that collects information across very many narrow, contiguous bands. A hyperspectral sensor may sense over 200 bands.

sun-synchronous orbit

-a type of near-polar orbit where satellites always pass the same location on Earth's surface at the same local time. - Most remote sensing satellites are in sun-synchronous orbits

polar orbit

-a type of orbit where satellites pass above or nearly-above both poles of the Earth and make several passes per day.

passive remote sensing

-a type of remote sensing technology where the sensor measures EM energy that is reflected by the target from an external source such as the sun, or emitted by the target itself. -Energy that is emitted directly from the object (such as thermal infrared energy) can be detected day or night, as long as the amount of energy is large enough to be recorded by the sensor.

active remote sensing

-a type of remote sensing technology where the sensor provides its own energy source for illuminating the target, and then detects the reflected energy. -Some examples of active sensors are laser range-finding, radar, and lidar (which is like radar but uses laser pulses instead of microwaves or radio waves).

orthophoto

-aerial photos which have been geometrically rectified, correcting and removing the effects of relief displacement - Images in which distortion from the camera angle and topography has been removed and corrected. - An orthophoto is a uniform-scale photograph. Since an orthophoto has a uniform scale, it is possible to measure the ground distance on it like you would on a map. An orthophoto can also serve as a base map onto which other map information can be overlaid.
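Because an orthophoto's scale is uniform, ground distance is just image distance times the scale denominator. A quick sketch (the 1:10,000 figure is a hypothetical example, not from the text):

```python
def ground_distance_m(image_distance_cm, scale_denominator):
    """On a uniform-scale image at 1:scale_denominator, 1 cm on the
    image equals scale_denominator cm on the ground."""
    return image_distance_cm * scale_denominator / 100  # cm -> m

# 3 cm measured on a 1:10,000 orthophoto corresponds to 300 m on the ground:
print(ground_distance_m(3, 10_000))  # 300.0
```

On an unrectified aerial photo this calculation would be unreliable, because the scale varies across the frame with topography and tilt.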

geostationary orbit

-an orbit where the satellite travels at the same speed as Earth's rotation, which means it is always monitoring the same region of Earth. - Many weather satellites are in geostationary orbit so they can continuously collect information about the same area.

relief displacement

-effect seen in photos where tall objects (such as cliffs, towers) appear to lean outward from the principal point towards the edges of the photo -Typically, the higher the aircraft and the smaller the angle between the ground and the camera, the less severe the relief displacement.

color infrared photo

-false color image - captured using film or a digital sensor that is sensitive to both visible and infrared light -photos that display near infrared light in red, red light in green, and green light in blue. In these photos green plants look red. - Infrared energy is invisible to our eyes, but is reflected very well by green, healthy vegetation. In CIR photos near-infrared (NIR) energy is displayed in the color red, red light is displayed with the color green, and green light is displayed in the color blue. Blue light is not shown in the image, as it is filtered out by the sensor or film

Size

-information about the length and width of objects in the image

true color photo

-look similar to the photos you take with a camera or phone -photos that are displayed with red light in red, blue light in blue, and green light in green (how your eyes perceive color) - They capture three major wavelength of visible light - red, green, and blue. These colors are composited together in the digital imager or film in such a way that red light is displayed in red, green light is displayed in green, and blue light is displayed in blue.

temporal resolution

-the revisit period of a satellite's sensor for a specific location on the Earth. It is also the length of time it takes a satellite to complete one full coverage cycle of the globe. -For example, Landsat needs 16 days to revisit the same area, and MODIS needs one day. -If a satellite needs less than 3 days to revisit the same place, we say it has high temporal resolution. A high temporal resolution image is required to monitor conditions that can change quickly or require rapid responses (e.g. hurricanes, tornadoes, or wildfires). If a satellite needs 4-16 days to revisit the same place, it has medium temporal resolution; if it takes more than 16 days to revisit the same place, it has low temporal resolution.
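The thresholds above translate directly into a small classifier. Boundary handling at exactly 3 days is a judgment call, since the text jumps from "less than 3 days" to "4-16 days"; this sketch treats boundary values as the higher class.

```python
def temporal_resolution_class(revisit_days):
    """Classify revisit time using the thresholds given in the notes:
    roughly <=3 days high, 4-16 days medium, >16 days low."""
    if revisit_days < 4:
        return "high"
    if revisit_days <= 16:
        return "medium"
    return "low"

print(temporal_resolution_class(1))   # MODIS, daily revisit -> high
print(temporal_resolution_class(16))  # Landsat, 16-day revisit -> medium
print(temporal_resolution_class(30))  # -> low
```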

oblique aerial photo

-photo taken by a camera at an angle; the camera's axis is inclined away from vertical -Low oblique photographs are typically taken from 300-1,000 feet above the ground, at a 5-30 degree angle, through the open door of a helicopter. This is a good way to show the facade of a building without showing too much of the roof. The most detailed images of this type are low-altitude aerial photos. -High oblique photographs are taken from 8,000-13,000 feet above the ground from an airplane, at a 30-60 degree angle, from an open window of the airplane. This is a good way to show areas of 2-20 square miles. Photos taken at high altitudes provide less environmental detail since the image scale is much smaller, but high-altitude high-oblique photos have a distinct advantage: more ground area can be imaged in a single photo. -In a high oblique photo the apparent horizon is shown; in a low oblique photo it is not. Often, because of atmospheric haze or other obstructions, the true horizon of a photo cannot be seen

aerial photo

-photos or images taken by sensors on board aircraft or other platforms above Earth's surface -when taking an aerial photo with a camera or sensor on an airplane the shooting angle matters

vertical aerial photo

-photos taken from an aerial platform (either moving or stationary) where the camera axis is truly vertical to the ground -Typically, vertical photos are shot straight down from a hole in the belly of the airplane. Vertical photographs are mainly used in photogrammetry and image interpretation

panchromatic photo

-records electromagnetic energy in visible wavelengths and displays it in grayscale/black & white -"all-colors" -photos record or capture electromagnetic energy in the 300 to 700 nanometer (nm) wavelength range, including all visible portions of light (Figure 6). Panchromatic aerial photos are usually in grayscale (that is, they are black and white aerial photos). The more visible light gathered by the camera, the lighter the tone in the final photo

Association

-relation between an object and other nearby features in an image - Sometimes objects that are difficult to identify on their own can be understood from their association with objects that are more easily identified.

reflection

-the change in direction of a wave at an interface between two different media so that the wave returns into the medium from which it originated. In remote sensing, it is the process where EM energy is reflected back to the atmosphere instead of being absorbed or transmitted by the object. -Most of the radiation not absorbed is reflected back into the atmosphere, some of it towards the satellite. This upwelling radiation undergoes another round of scattering and absorption as it passes through the atmosphere before finally being detected and measured by the sensor. -The absorption and reflection of visible EM radiation explain why we see the world in various colors

channel

-the displayed color (red, green, or blue) on electronic screens. Different bands can be displayed in different channels, as in color composite images.
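The band-to-channel mapping can be sketched in code. This is a minimal illustration (not any particular software's API), using NumPy with made-up 2x2-pixel band values:

```python
import numpy as np

# Hypothetical 2x2-pixel bands (made-up brightness values, 0-255).
nir = np.array([[200, 210], [190, 205]], dtype=np.uint8)
red = np.array([[60, 65], [55, 70]], dtype=np.uint8)
green = np.array([[80, 85], [75, 90]], dtype=np.uint8)

# A false-color composite: NIR in the red channel, red in green, green in blue.
composite = np.dstack([nir, red, green])
print(composite.shape)  # (2, 2, 3): rows, columns, display channels
```

Stacking the bands along a third axis is how most raster libraries represent a displayable RGB composite.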

texture

-the frequency of change and arrangement of tones in particular areas of an image -a micro image characteristic - The texture of objects can be identified as fine (smooth) or rough (coarse). - Texture also refers to grouped objects that are too small or too close together to create distinctive patterns. Therefore, texture is a group of repeated small patterns - The difference between texture and pattern is largely determined by photo scale.

Shape

-the general form, structure or outline of individual objects - Shape can be a very distinctive clue in image interpretation - A vertical aerial photograph shows clear shapes of objects as viewed from above

principal point

-the geometric center point of the photo

Site

-the location characteristics of an item in the image

spectral resolution

-the number of spectral bands -- portions of the electromagnetic spectrum -- that a sensor can collect energy in. -The finer the spectral resolution, the narrower the wavelength ranges for a particular channel or band. • High spectral resolution: ~200 bands • Medium spectral resolution: 3~15 bands • Low spectral resolution: ~3 bands
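The band-count thresholds above can be expressed as a small classifier. The cutoffs are the approximate ones from the card, not a formal standard:

```python
def spectral_resolution(n_bands):
    """Rough classification by band count (approximate thresholds from the notes)."""
    if n_bands >= 200:
        return "high"    # hyperspectral, ~200 bands
    elif n_bands > 3:
        return "medium"  # multispectral, roughly 3-15 bands
    return "low"         # ~3 bands or fewer

print(spectral_resolution(224))  # high   (hyperspectral-class sensor)
print(spectral_resolution(11))   # medium (Landsat 8-class sensor)
```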

mosaicking

-the process of merging different photos from each flight line into one big aerial photo-done after capturing a series of aerial photos

Remote sensing

-the process of obtaining information "sensing" without physical contact "remote" - In geography and environmental sciences, remote sensing refers to technologies that measure objects on Earth's surface through sensors onboard aircraft or satellites. Sensors are instruments that receive and measure electromagnetic signals. - designed to measure electromagnetic signals at different wavelengths - Satellite imagery: Images taken from sensors mounted on satellites.

spectral signature

-the properties of an object as described by its absorption and reflection properties at different wavelengths
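A spectral signature is often stored as per-band reflectance values. The numbers below are made-up but follow the textbook pattern (vegetation reflects strongly in near-infrared; water absorbs it):

```python
# Hypothetical reflectance fractions per band for two cover types.
signatures = {
    "vegetation": {"blue": 0.05, "green": 0.10, "red": 0.05, "nir": 0.50},
    "water":      {"blue": 0.08, "green": 0.06, "red": 0.03, "nir": 0.01},
}

# The band where vegetation reflects most strongly distinguishes it from water.
best_band = max(signatures["vegetation"], key=signatures["vegetation"].get)
print(best_band)  # nir
```

Comparing signatures band by band like this is the basic idea behind distinguishing land-cover types in imagery.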

electromagnetic spectrum

-the range of all possible wavelengths of electromagnetic radiation/energy - Remote sensing measures the electromagnetic radiation reflected by objects - Sensors used in remote sensing are much more sensitive than our eyes. They can detect EM energy in both the visible and non-visible areas of the spectrum

microwave

-the region of the electromagnetic spectrum from approximately 1 to 1,000 millimeters. -This covers the longest wavelengths used for remote sensing. In your daily life you use microwave wavelengths to heat your food. In remote sensing, microwave radiation is used to measure water and ozone content in the atmosphere, to sense soil moisture, and to map sea ice and pollutants such as oil slicks.

tone

-the relative brightness or color of objects in an image - Generally, tone is the fundamental element for distinguishing between different targets or features - Color is more convenient for the identification of object details.

Aura

-was launched from Vandenberg Air Force Base on July 15, 2004. The name "Aura" comes from the Latin word for air. Aura carries four instruments, which obtain measurements of ozone, aerosols and key gases in the atmosphere

Aqua

-was launched in 2002 as a joint mission between the United States, Brazil and Japan. In Latin "Aqua" means water, indicating that the main purpose of the satellite is to examine Earth's water cycle. Aqua's main goal is to monitor precipitation, atmospheric water vapor, and the ocean. Terra and Aqua were designed to work in concert with one another. Both Aqua and Terra carry a MODIS and a CERES sensor, which in essence doubled the data collection of these sensors and increased their temporal resolution.

absorption

-when EM energy is trapped and held by an object rather than passing through or reflecting off it.

transmittance

-when an electromagnetic (EM) wave passes straight through an object -occurs when energy simply passes through a surface. Think of light passing through a windshield of a car.

Five Sensors aboard Suomi:

1. Advanced Technology Microwave Sounder (ATMS) ATMS is a multi-channel microwave radiometer. It is a passive sensor: rather than transmitting its own signal, it measures microwave radiation naturally emitted by Earth's surface and atmosphere. ATMS has 22 bands and is used to retrieve profiles of atmospheric temperature and moisture for weather forecasting. These measurements are important for weather and climate research. 2. Visible Infrared Imaging Radiometer Suite (VIIRS) VIIRS is a sensor that collects visible and infrared imagery of the land, atmosphere, cryosphere, and oceans. It has 9 bands in visible and near IR wavelengths, 8 bands in Mid-IR, and 4 bands in Long-IR, which makes it a high-spectral-resolution sensor. VIIRS has about 650 m - 750 m spatial resolution, and images the entire globe in two days. VIIRS data is used to measure cloud and aerosol properties, ocean color, sea and land surface temperature, ice motion and temperature, fires, and Earth's albedo. Climatologists use VIIRS data to improve our understanding of global climate change. For example, VIIRS captured a view of the phytoplankton-rich waters off the coast of Argentina (Figure 9). The Patagonian Shelf Break is a biologically rich patch of ocean where airborne dust from the land, iron-rich currents from the south, and upwelling currents from the depths provide a bounty of nutrients for the grass of the sea—phytoplankton. In turn, those floating sunlight harvesters become food for some of the richest fisheries in the world. 3. Cross-track Infrared Sounder (CrIS) CrIS is a Fourier transform spectrometer with 1,305 spectral channels in far infrared wavelengths. It can provide three-dimensional temperature, pressure, and moisture profiles of the atmosphere. These profiles are used to enhance weather forecasting models, and they will facilitate both short- and long-term weather forecasting. Over longer timescales, they will help improve understanding of climate phenomena such as El Niño and La Niña.
4.Ozone Mapping Profiler Suite (OMPS) OMPS measures the global distribution of the total atmospheric ozone column on a daily basis. Ozone is an important molecule in the atmosphere because it partially blocks harmful ultra-violet light from the sun. OMPS enhances the ability of scientists to measure the vertical structure of ozone (Figure 10), which is important in understanding the chemistry of how ozone interacts with other gases in the atmosphere. 5. Clouds and the Earth's Radiant Energy System (CERES) CERES is a three-channel sensor measuring the solar-reflected and Earth-emitted radiation from the top of the atmosphere to the surface. The three channels include a shortwave (visible light) channel, a longwave (infrared light) channel, and a total channel measuring all wavelengths. These measurements are critical for understanding climate change.

Please order the wavelengths of the spectral bands in an ascending order (short-->long) to form the electromagnetic spectrum

1. Blue 2. Green 3. Red 4. Near-infrared 5. Mid-infrared 6. Thermal infrared
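The ordering above follows directly from the bands' center wavelengths. A small sketch, using approximate textbook band-center values in micrometers (the exact centers vary by sensor):

```python
# Approximate band-center wavelengths in micrometers (typical textbook values).
bands = {
    "Blue": 0.48, "Green": 0.56, "Red": 0.66,
    "Near-infrared": 0.85, "Mid-infrared": 1.65, "Thermal infrared": 11.0,
}

# Sorting by wavelength reproduces the short-to-long ordering of the spectrum.
ordered = sorted(bands, key=bands.get)
print(ordered)
```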

Advantages of remote sensing

1. It is capable of rapidly acquiring up-to-date information over a large geographical area. 2. It provides frequent and repetitive looks of the same area. 3. It is relatively inexpensive and requires less labor. 4. It provides data from remote and inaccessible regions, like deep inside deserts. 5. The observations of remote sensing are objective.

The spatial resolution of Landsat 8 (in visible and near IR) is _____. The revisit time (temporal resolution) is _____.

30 m, 16 days

Air photos are taken with overlaps both along a flight line and between flight lines. Generally speaking, there are ____ % overlaps along a flight line and ____ % overlaps between two adjacent flight lines

60-80% along a flight line 20-30% between two adjacent flight lines
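The forward overlap determines how far the aircraft advances between exposures. A minimal sketch of that relationship, with a hypothetical 2 km ground footprint (the function name and values are illustrative, not from the notes):

```python
def air_base(ground_coverage_m, forward_overlap):
    """Distance flown between exposures along a flight line.
    With 60% forward overlap, each photo advances 40% of its footprint."""
    return ground_coverage_m * (1 - forward_overlap)

print(air_base(2000, 0.60))  # 800 m between exposures for a 2 km footprint
```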

Remote Sensing Process

A. Energy Source or Illumination - The first requirement for remote sensing is to have an energy source which illuminates or provides electromagnetic energy to the target of interest. • a. The main source used in remote sensing is the sun. Most visible light sensed by satellites is reflected solar radiation. • b. Sometimes the electromagnetic radiation is generated by the remote sensing platform. Some remote sensors emit microwave radiation to illuminate the object and measure the reflected microwave radiation; this is also known as radar. • c. The energy can also be emitted by the object itself. All objects on Earth emit thermal infrared radiation, which can be detected by remote sensors. B. Radiation passes through the Atmosphere - as the energy travels from its source (e.g. the sun) to the target, it will interact with the atmosphere it passes through. Another energy/atmosphere interaction takes place as the energy travels from the target to the sensor. C. Interaction with the Target - once the energy makes its way to the target through the atmosphere, it interacts with the target depending on the properties of both the target and the radiation. D. Recording of Energy by the Sensor - after the energy has been reflected by, or emitted from the target, the sensor onboard the satellite collects and records the electromagnetic radiation. E. Transmission, Reception, and Processing - the energy recorded by the sensor has to be transmitted to a receiving station where the data are processed into an image (hardcopy and/or digital). F. Interpretation and Analysis - the processed image is interpreted, visually and/or digitally, to extract information about the target. G. Application - the final element of the remote sensing process is achieved when we use the information extracted from the imagery to find new information or to solve a particular problem.

Which of the following is NOT a possible application of Landsat images? Monitoring deforestation rate in the Brazilian Amazon Detecting forest wildfire and burned area in Yellowstone National Park Mapping urbanization of the Pearl River area in China Conducting a real estate survey in a neighborhood in San Francisco by outlining the area and shape of each building or house

Conducting a real estate survey in a neighborhood in San Francisco by outlining the area and shape of each building or house

1. Which is NOT a component of visual image interpretation? a. shape b. tone c. shadow d. aspect

D. Aspect

T/F A regular aerial photo has a uniform scale over the area it covers

False

T/F An Orthophoto can be accurately used as a map

False

T/F Waves with longer wavelengths occur more frequently (have a higher frequency).

False

T/F Landsat 5 is still in operation as of today.

False

T/F Radar is a good example of passive remote sensing.

False

T/F Since panchromatic photo receives all wavelengths of visible light, it is colorful.

False

The geometric center of aerial imagery is

Principal point

In a multi-band composite image, where near infrared is shown in red channel, red band in shown in green channel, and green band in shown in blue channel, which color would the dense forest in the image show?

Red
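The answer follows from the channel mapping: healthy vegetation reflects strongly in near-infrared, and NIR is assigned to the red display channel. A sketch with made-up pixel values:

```python
# Hypothetical band values (0-255) for a dense-forest pixel; NIR is high.
pixel = {"nir": 220, "red": 40, "green": 70}

# Channel mapping from the question: NIR -> red, red band -> green, green band -> blue.
display_rgb = (pixel["nir"], pixel["red"], pixel["green"])

# The brightest display channel dominates, so the forest appears red.
dominant = max(zip(display_rgb, ("red", "green", "blue")))[1]
print(dominant)  # red
```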

Remote sensing is actually capturing:

Reflected light

The tendency for tall objects in aerial photos to lean away from a center point and toward the edges of the photo is:

Relief Displacement
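Relief displacement has a standard photogrammetric formula: d = r·h/H, where r is the radial distance of the object's top from the principal point, h is the object's height, and H is the flying height above the datum. The numbers below are made-up for illustration:

```python
def relief_displacement(r, h, H):
    """d = r * h / H (standard photogrammetric relation).
    r: radial distance from the principal point (photo units, e.g. mm)
    h: object height above datum; H: flying height above datum (same units)."""
    return r * h / H

# A hypothetical 50 m tower near the photo edge (r = 80 mm), flown at 2,000 m:
print(relief_displacement(80, 50, 2000))  # 2.0 mm of lean away from the center
```

The formula matches the observation on the card: displacement grows with distance from the principal point (r) and with object height (h), and shrinks at higher flying heights (H).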

T/F A satellite with a four-day temporal resolution passes over your house on January 2. The next day the satellite passes over your house will be January 6.

True
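The date arithmetic behind this card is simple enough to verify in code; a minimal sketch using the standard library:

```python
from datetime import date, timedelta

def next_pass(last_pass, revisit_days):
    """Date of the next overpass given the satellite's temporal resolution."""
    return last_pass + timedelta(days=revisit_days)

# Four-day temporal resolution, last pass January 2 -> next pass January 6.
print(next_pass(date(2024, 1, 2), 4))  # 2024-01-06
```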

T/F Hyperspectral imagery is made possible by a sensor capable of sensing hundreds of bands of energy simultaneously.

True

T/F IKONOS has lower spatial resolution compared with Quick-bird.

True

____ allows more ground area to be covered in a single photo _____ looks as if we are looking straight down from an aircraft ____ provides more detail about the target objects being viewed

Vertical aerial photo: looks as if we are looking straight down from an aircraft; Low oblique aerial photo: provides more detail about the target objects being viewed; High oblique aerial photo: allows more ground area to be covered in a single photo

eight elements/clues used in image interpretation

• Tone/color -- lightness/darkness/color of an object • Texture -- coarse or fine, such as in a corn field (distinct rows) vs. wheat field (closely-grown plants) • Shape -- square, circular, irregular • Size -- small to large, especially compared to known objects • Shadow -- objects like buildings and trees cast shadows that indicate vertical height and shape • Pattern -- many similar objects may be scattered on the landscape, such as oil wells • Site -- the characteristics of the location; for example, don't expect a wetland to be in downtown Chicago • Association -- an object's relation to other known objects -- for example, a building at a freeway off-ramp may be a gas station based on its relative location

