Remote Sensing


Color Theory (Image Display)

Additive: equal proportions of blue, green, and red light superimposed on one another create white light, i.e., white light is composed of blue, green, and red light. Subtractive: equal proportions of blue, green, and red pigments yield a black surface.

SAR Advantages and Disadvantages

Advantages
• SAR is a monochromatic, coherent system, as opposed to optical systems, which are polychromatic and incoherent
• All-weather, 24-hour operation
Disadvantages
• Single-wavelength signal; image speckling due to in-phase/out-of-phase interference
• Image distortion due to terrain variations
• High cost compared to optical imagery

Image File Formats

BSQ - Band Sequential Format: pixel values arranged sequentially by band; good for multispectral images. BIP - Band Interleaved by Pixel Format: bands interleaved pixel by pixel; good for hyperspectral images. BIL - Band Interleaved by Line Format: bands interleaved line by line; good for images with roughly 20-60 bands.
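A minimal NumPy sketch of how the three interleavings map onto array axis orders; the band count and image dimensions here are hypothetical, and the layouts are reconstructed from one flat buffer purely for illustration:

```python
import numpy as np

# Hypothetical raw image buffer: 4 bands, 100 rows, 120 columns of 8-bit DNs.
bands, rows, cols = 4, 100, 120
raw = np.random.default_rng(0).integers(0, 256, bands * rows * cols, dtype=np.uint8)

# BSQ: all of band 1, then all of band 2, ... -> axis order (band, row, col)
bsq = raw.reshape(bands, rows, cols)

# BIL: for each row, band 1's row, band 2's row, ... -> (row, band, col)
bil = raw.reshape(rows, bands, cols)

# BIP: for each pixel, its full spectrum stored together -> (row, col, band)
bip = raw.reshape(rows, cols, bands)

# Reading one pixel's spectrum (row 10, col 20) under each layout:
spectrum_bsq = bsq[:, 10, 20]   # strided reads spread across the whole file
spectrum_bil = bil[10, :, 20]   # strided reads within one row block
spectrum_bip = bip[10, 20, :]   # one contiguous read - why BIP suits hyperspectral data
```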

PCA Components

First Principal Component
-The length and direction of the widest transect of the ellipse are calculated using matrix algebra
*This transect, which corresponds to the major (longest) axis of the ellipse, is called the first principal component of the data
**The direction of the first principal component is the first eigenvector, and its length is the first eigenvalue
Second Principal Component
-The second principal component is the widest transect of the ellipse that is orthogonal (perpendicular) to the first principal component
*The second principal component describes the largest amount of variance in the data that is not already described by the first principal component
**The second principal component corresponds to the minor axis of the ellipse

Image Display/Quality Assessment

Histograms - evaluate features within an image based on their brightness values and the frequency at which those values occur
LUT Values - the brightness value of each pixel (each pixel sits at a specific row and column)
Image Fusion (combine bands) -> Resolution Merge (merge/stack bands) -> Pan-Sharpening (sharpening of a multilayered image)

Image Regression

In this method of change detection, the pixel values at date 1 are assumed to be linearly related to those at date 2. We can therefore build a linear relationship between the date 1 and date 2 images and use it to predict date 2 values from date 1. Taking the difference between the actual and predicted date 2 values yields a change/no-change detection map.
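A minimal sketch of this idea, assuming two co-registered single-band arrays; the function name and the residual threshold are illustrative choices, not a prescribed method:

```python
import numpy as np

def regression_change_map(date1, date2, threshold=2.0):
    """Image-regression change detection on two co-registered bands.

    Fits date2 ~ a + b*date1 by least squares, predicts date2 from date1,
    and flags pixels whose residual exceeds `threshold` standard deviations.
    """
    x = date1.ravel().astype(float)
    y = date2.ravel().astype(float)

    # Least-squares fit of the linear relationship between the two dates.
    b, a = np.polyfit(x, y, deg=1)
    predicted = a + b * date1.astype(float)

    residual = date2.astype(float) - predicted
    change = np.abs(residual) > threshold * residual.std()
    return change  # boolean change / no-change map
```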

Active RS Technology

LIDAR RADAR

Visual Enhancement Filtering

Meant to sharpen, smooth, enhance edges, and filter an image in different ways:
Neighborhood filtering *filter pixels to match the majority of the surrounding pixels
Convolution filtering *e.g., sharpening an image (see the sketch below)
Statistical filtering
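A sketch of convolution filtering using a common 3x3 sharpening kernel; the kernel choice and the SciPy call are illustrative, not a prescribed filter:

```python
import numpy as np
from scipy.ndimage import convolve

# A common 3x3 sharpening kernel (center-weighted Laplacian); other kernels
# smooth (e.g., a 3x3 mean filter) or enhance edges (e.g., Sobel).
sharpen = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]], dtype=float)

def sharpen_band(band):
    """Convolve a single band with the sharpening kernel."""
    return convolve(band.astype(float), sharpen, mode="nearest")
```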

Passive Microwave Remote Sensing

Microwave region: 1-200 GHz (0.15-30 cm)
• Uses the same principles as thermal remote sensing
• Multi-frequency/multi-polarization sensing
• Weak energy source, so a large IFOV and wide bands are needed
• Passive microwave brightness temperatures can therefore be used to monitor temperature as well as properties related to emissivity; the brightness temperature is linearly related to the kinetic temperature of the surface
Applications: soil moisture, snow water equivalent, sea/lake ice extent, concentration and type, sea surface temperature, atmospheric water vapor

Spatial Resolution

A practically effective measure of the (theoretical) geometric resolving power of a sensor
*Think of the different clarity depending on the sensor (hyperspectral vs. Landsat vs. MODIS)
*Instantaneous Field of View (IFOV): the solid angle through which a detector is sensitive to radiation
-For change detection, it is best if the remotely sensed data are acquired on each date by a sensor system with the same IFOV

Image Interpretation by Humans

Preparation *obtain images
Pre-work *read annotation data / orient image with map
Image Reading *read shape and pattern, then make a key
Image Measurement *measure length and height ratios between objects
Image Analysis *analyze objects or phenomena within images
Thematic Map *display interpreted results on a basemap

Unsupervised Classification Pros and Cons

Pros
• Takes maximum advantage of spectral variability in an image
• Very successful at finding the "true" clusters within the data if enough iterations are allowed
• Cluster signatures saved from ISODATA are easily incorporated and manipulated along with (supervised) spectral signatures
Cons
• The maximally separable clusters in spectral space may not match our perception of the important classes on the landscape
• Slowest (by far) of the clustering procedures

Passive RS Technology

Reflected Energy *panchromatic *multispectral *hyperspectral
Emitted Energy *Thermal *Microwave
**Optical RS (visible - near-IR) = objects' ability to reflect solar radiation
**Emissive RS (mid-IR - microwave) = objects' ability to absorb SWIR, NIR, and visible radiation and then emit energy

RADAR Remote Sensing

RADAR: RAdio Detection And Ranging; SAR: Synthetic Aperture RADAR
• Radar remote sensing involves active sensors
• Transmits microwave radiation as short pulses and receives the energy reflected from the target object
• Operational at all times because it does not depend on an external radiation source
• Able to operate in all weather conditions because its long wavelengths penetrate clouds
• Operates in the wavelength range 1 mm to 1 m
• Records reflections in a single band at a specific wavelength (X, C, L, etc.); the band letter names are a holdover from military days when they were secretly coded
Components *Transmitter *Receiver *One or more antennas *Computer processor
Example sensor: RADARSAT SAR

Change Vector Analysis

The reflectance values collected for a pixel can be thought of as the coordinates of a vector in multidimensional space. The vector difference between the two dates (the line connecting the two positions in spectral space) indicates the change for that pixel.
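A minimal sketch of computing the change-vector magnitude per pixel, assuming two co-registered band stacks; the threshold value is illustrative:

```python
import numpy as np

def change_vector(date1, date2, threshold=0.1):
    """Change vector analysis on two co-registered multiband images.

    date1, date2: arrays shaped (bands, rows, cols) with matching bands.
    Returns the change magnitude per pixel and a change/no-change mask.
    """
    diff = date2.astype(float) - date1.astype(float)   # vector difference per pixel
    magnitude = np.sqrt((diff ** 2).sum(axis=0))       # Euclidean length of the change vector
    return magnitude, magnitude > threshold
```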

Image Enhancement Categories

There are three categories of image enhancement:
Visual • General contrast enhancement • Image stretching • Filtering
Spatial • Resolution merge / pansharpening
Spectral • Different indices • Image transformation

Digital Image Processing

Meant to process images received, such as from satellites, and find meaning within them. Starts with receiving a (multispectral) image and digitally processing it into a GIS-level, color-coded image.
-The techniques necessary to obtain a map, such as a Land Use/Land Cover map, from the acquired satellite image

Pushbroom Satellites

• A pushbroom scanner (along-track scanner) is a technology for obtaining satellite images with optical cameras. It is used for passive remote sensing from space.
• In a pushbroom sensor, a line of detectors arranged perpendicular to the flight direction of the spacecraft is used. Different areas of the surface are imaged as the spacecraft flies forward.
*Covers the entirety of an area as it pans (bottom-up)
-Narrow swath width
• A pushbroom scanner can gather more light than a whiskbroom scanner (longer dwell time and exposure)
Examples: IKONOS, QuickBird, Hyperion, SPOT

Density Slicing

• A digital data interpretation method used in analysis of remotely sensed imagery to enhance the information gathered from an individual brightness band. • Done by dividing the range of brightness in a single band into intervals, then assigning each interval to a color to represent specific classes.
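A minimal sketch of density slicing with NumPy; the breakpoints shown are hypothetical intervals for an 8-bit band:

```python
import numpy as np

def density_slice(band, breakpoints):
    """Assign each pixel to a class (0..len(breakpoints)) by brightness interval."""
    return np.digitize(band, bins=breakpoints)

# Hypothetical slicing of an 8-bit band into four classes,
# which a display step would then map to colors.
# band = ...  # 2-D array of DNs
# classes = density_slice(band, breakpoints=[64, 128, 192])
```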

Whiskbroom Satellites

• A whiskbroom or spotlight sensor (across track scanner) is a technology for obtaining satellite images with optical cameras. It is used for passive remote sensing from space. • In a whiskbroom sensor, a mirror scans across the IFOV (ground track), reflecting light into a single detector which collects data one pixel at a time. -Wider swath width • The moving parts make this type of sensor expensive and more prone to wearing out. Examples: *Landsat, AVHRR, AVIRIS

Radiometric Resolution

• Ability to record many levels of values (number of DNs)
• How sensitive a sensor is to differences in energy, or brightness levels; differences in the DN range are due to the bit depth of the data
• Imagery data are represented by positive digital numbers which vary from 0 to (one less than) a selected power of 2
• This range corresponds to the number of bits used for coding numbers in binary format; each added bit doubles the number of levels (e.g., 1 bit = 2^1 = 2 levels)
• The maximum number of brightness levels available depends on the number of bits used to represent the recorded energy
-When the radiometric resolution of data acquired by one system (e.g., Landsat MSS 1 with 6-bit data) is compared with data acquired by a higher radiometric resolution instrument (e.g., Landsat TM with 8-bit data), the lower-resolution 6-bit data should be converted to 8 bits for change detection
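A sketch of the level count and of one simple (linear) way to rescale 6-bit DNs onto an 8-bit range; other normalization conventions exist, so treat this as an illustration rather than the standard procedure:

```python
import numpy as np

def levels(n_bits):
    """Number of brightness levels available for an n-bit sensor (2**n)."""
    return 2 ** n_bits        # e.g., 6 bits -> 64 levels, 8 bits -> 256 levels

def rescale_6bit_to_8bit(dn_6bit):
    """Linearly stretch 6-bit DNs (0-63) onto the 8-bit range (0-255)."""
    return np.round(dn_6bit.astype(float) * 255.0 / 63.0).astype(np.uint8)
```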

Maximum Likelihood

• Assume multivariate normal distributions of pixels within classes • For each class, build a discriminant function • For each pixel in the image, this function calculates the probability that the pixel is a member of that class • Each pixel is assigned to the class for which it has the highest probability of membership Pros -Most sophisticated; achieves good separation of classes Cons -Requires strong training set to accurately describe mean and covariance structure of classes
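A minimal sketch of the idea, assuming per-class mean vectors and covariance matrices have already been estimated from training pixels; the function name and data structures are illustrative:

```python
import numpy as np
from scipy.stats import multivariate_normal

def maximum_likelihood_classify(pixels, class_stats):
    """Assign each pixel to the class with the highest Gaussian likelihood.

    pixels:      array (n_pixels, n_bands)
    class_stats: dict {class_id: (mean_vector, covariance_matrix)}
                 estimated from training areas
    """
    class_ids = list(class_stats)
    scores = np.column_stack([
        multivariate_normal(*class_stats[c]).logpdf(pixels)  # discriminant value per class
        for c in class_ids
    ])
    return np.array(class_ids)[scores.argmax(axis=1)]
```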

Unsupervised Classification

• Based on spectral properties of different objects and statistics • An arbitrary categorization of data • Pixels are categorized by the statistics of the spectral properties of the features • User has controls over: *# of classes, # of iterations, Convergence thresholds • Two main algorithms: Iterative Self-Organizing Data (ISODATA) Clustering and K-means -It is a simple process to regroup (recode) the clusters gathered from the reflectance of the pixels into meaningful information classes (the legend) and use these to identify all the pixels.

Hybrid Classification

• Combination of supervised and unsupervised classification • Usually done to separate the features with overlapping spectral signatures

Spatial Enhancement Merging/Pansharpening

• Combine multiple images into composite products, through which more information than that of the individual input images can be revealed. • Specifically, pan-sharpening describes a process of transforming a set of coarse (low) spatial resolution multispectral (color) images to fine (high) spatial resolution color images by fusing a co-georegistered fine spatial resolution panchromatic (black/white) image. Typically, three low-resolution visible bands - blue, green, and red - are used as the main inputs to produce a high-resolution true-color image.
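The entry describes pan-sharpening in general terms; below is a sketch of one simple fusion method (a Brovey-style transform), assuming the color bands have already been resampled to the panchromatic grid. Other methods (e.g., IHS or PCA substitution) exist:

```python
import numpy as np

def brovey_pansharpen(red, green, blue, pan, eps=1e-6):
    """Brovey-transform pan-sharpening (one of several fusion methods).

    red, green, blue: low-resolution bands already resampled to the pan grid.
    pan:              co-registered high-resolution panchromatic band.
    """
    r, g, b, p = (a.astype(float) for a in (red, green, blue, pan))
    total = r + g + b + eps                  # avoid division by zero
    # Each band is scaled by the ratio of the pan intensity to the band sum.
    return r * p / total, g * p / total, b * p / total
```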

Geostationary Satellites

• Earth-orbiting satellite placed at an altitude of approximately 35,800 kilometers (22,300 miles) directly over the equator
• Revolves in the same direction the Earth rotates (west to east), so it appears stationary in the sky
• One orbit takes 24 hours, the same length of time the Earth requires to rotate once on its axis
-A single geostationary satellite is in line of sight with about 40 percent of the Earth's surface
*Three such satellites, each separated by 120 degrees of longitude, can provide coverage of the entire planet (except the poles)
**Weather satellites (GOES, METEOSAT) and telephone and television relay satellites are of this kind
**Good for storm management and natural disasters

Minimum Distance

• Find mean value of pixels of training sets in n-dimensional space • All pixels in image classified according to the class mean to which they are closest Pros -All regions of n-dimensional space are classified and allows for diagonal boundaries Cons -Assumes that spectral variability is same in all directions, which is not the case
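A minimal sketch, assuming the training-set mean vectors have already been computed; array shapes are illustrative:

```python
import numpy as np

def minimum_distance_classify(pixels, class_means):
    """Assign each pixel to the class whose mean vector is nearest.

    pixels:      array (n_pixels, n_bands)
    class_means: array (n_classes, n_bands) of training-set means
    """
    p = pixels.astype(float)
    m = class_means.astype(float)
    # Euclidean distance from every pixel to every class mean.
    dists = np.linalg.norm(p[:, None, :] - m[None, :, :], axis=2)
    return dists.argmin(axis=1)
```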

LiDAR Returns

• First *Ideal for surface models • Last *Ideal for generating bare-earth terrain models • Intermediate *Ideal for determining vegetation structure

Temporal Resolution

• Imaging revisit interval • How frequently repetitive coverage of an area can be obtained in imagery • Landsat: 16 days • MODIS: 2/day -Temporal characteristics should be held constant during change detection, with images acquired at the same time of day

Accuracy Assessment - Reference Data

• It is difficult to ground truth every pixel in the classified data. So, a set of reference data is used. • Reference pixels are points on the classified image for which actual data are known. • The reference pixels are randomly selected • By allowing the reference pixels to be selected at random, the possibility of bias is lessened or eliminated

Passive Microwave RS Emissivity Differences

• Microwave emissivity is a function of the "dielectric constant" • Most earth materials have a dielectric constant in the range of 1 to 4 (air=1, veg=3, ice=3.2) • Dielectric constant of liquid water is 80 • Thus, moisture content affects brightness temperature • Surface roughness also influences emissivity

SAR (RADAR) Basics

• Modern radar sensors are Synthetic Aperture Radar (SAR), in which a large antenna (aperture) is synthesized through the use of a small, transported antenna and digital signal processing • SAR made radar sensors more portable, enabling transport on aircraft and spacecraft Terrain Interaction and Radar Imaging -The response to radar energy depends on three factors: surface roughness, local incident angle, and the dielectric constant of the target materials *Surface roughness is measured as the average height variation in the surface cover (in cm) **A surface is smooth if the height variations are smaller than the radar wavelength

Sun-Synchronous Satellites

• Operates in an orbit that provides consistent lighting of the Earth-scan view (with respect to the sun) • The satellite passes the equator and each latitude at the same local time each day, which means the area the satellite flies over always gets the same sunlight angle • The incidence angle of sunlight on the land surface is always the same • Most of the currently operating optical satellites are of this category

LiDAR vs. RADAR

• Primary difference: *LIDAR uses much shorter wavelengths of the electromagnetic spectrum **Typically uses ultraviolet, visible, or near infrared. *RADAR uses much longer wavelengths **Uses microwave and thermal

Spectral Enhancement Principal Component Analysis (PCA)

• Principal components analysis (PCA) is often used as a method of data compression. • It allows redundant data to be compacted into fewer bands - that is, the dimensionality of the data is reduced. • The bands of PCA data are non-correlated and independent, and are often more interpretable than the source data
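A sketch of PCA on a band stack via the band-to-band covariance matrix; the array shapes and function name are assumptions for illustration:

```python
import numpy as np

def principal_components(image):
    """Compute principal components of a multiband image.

    image: array (bands, rows, cols). Returns the transformed component
    images plus the eigenvalues (variance explained by each component).
    """
    bands, rows, cols = image.shape
    x = image.reshape(bands, -1).astype(float)
    x -= x.mean(axis=1, keepdims=True)       # center each band

    cov = np.cov(x)                          # band-to-band covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvectors give the component directions
    order = eigvals.argsort()[::-1]          # sort by descending eigenvalue (variance)
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    components = eigvecs.T @ x               # project pixels onto the new axes
    return components.reshape(bands, rows, cols), eigvals
```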

SAR Imaging Factors (Terrain interaction/Reflection)

• Specular reflection = caused by a smooth surface; results in darker-toned areas on an image. • Diffuse reflection = caused by a rough surface, which scatters the energy equally in all directions; a rough surface appears lighter in an image. • Corner reflection = occurs when the target object reflects most of the energy directly back to the antenna, resulting in a very bright appearance of the object (usually large buildings/metal structures)

Accuracy Assessment - Kappa Statistics

• The Kappa coefficient expresses the proportionate reduction in error generated by a classification process compared with the error of a completely random classification. • For example, a value of 0.82 implies that the classification process is avoiding 82 percent of the errors that a completely random classification generates. *Possible values range from +1 (perfect agreement) to -1 (complete disagreement), and a value of 0 means no agreement above that expected by chance
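A minimal sketch of computing Kappa from an error (confusion) matrix of reference vs. classified counts; the function name is illustrative:

```python
import numpy as np

def kappa_coefficient(confusion):
    """Kappa from a confusion matrix (rows: reference classes, cols: classified)."""
    confusion = np.asarray(confusion, dtype=float)
    total = confusion.sum()
    observed = np.trace(confusion) / total                                 # overall agreement
    expected = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / total ** 2  # chance agreement
    return (observed - expected) / (1.0 - expected)
```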

Change Detection - Region of Interest

• The dimensions of the change detection region of interest (ROI) must be carefully identified and held constant throughout a change detection project. • It must be completely covered by the n dates of imagery. • Failure to do so results in change detection maps with data voids that are problematic when computing change statistics. -Successful remote sensing change detection requires careful attention to: • remote sensor system considerations • environmental characteristics

Parallelepiped

• The parallelepiped classifier uses the threshold of each class signature to determine whether a given pixel falls within the class or not. • The threshold specifies the dimensions (in standard deviation units) of each side of a parallelepiped surrounding the mean of the class in feature space. *If the pixel falls inside the parallelepiped, it is assigned to the class. Pros -Simple and makes few assumptions about the character of the classes Cons -Parallelepipeds are rectangular, but spectral space is "diagonal," so classes may overlap and some pixels in n-dimensional space may not be assigned to any class

Regression

• Usually conducted to extract information regarding a specific biophysical variable • Observed data are required to create a linear or non-linear relationship between the target feature/object (dependent variable) and the image data (independent variables) *Non-linear regression can be done with Artificial Neural Networks (ANN)

Spectral Resolution

- A measure of the width of a spectral band or channel; also, the "granularity" with which a sensor divides its spectral coverage into discretely observable parts (BANDS!!) • Spectral Coverage: the portion of the spectrum which a sensor detects. • Spectral Band: a well-defined continuous range (interval) of wavelengths in the electromagnetic spectrum individually sensed by a detector. -For multiple sensors, select bands that approximately match one another for change detection

Coordinate Transformation (Equation)

- To transform from one coordinate space to another we need a coordinate transformation equation -The statistical technique of least squares regression is used to determine the coefficients for the coordinate transformation equations *The equations require: **Ground control points (GCPs) **A suitable geometric model
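A sketch of fitting a first-order (affine) transformation from GCPs by least squares, matching the first-order polynomial model described later in this set (6 unknowns, at least 3 GCPs); function names are illustrative:

```python
import numpy as np

def fit_affine(image_xy, map_xy):
    """Least-squares affine (first-order polynomial) transformation from GCPs.

    image_xy: (n, 2) pixel coordinates (col, row) of the ground control points
    map_xy:   (n, 2) corresponding map coordinates (easting, northing)
    Needs at least 3 non-collinear GCPs (6 unknown coefficients).
    """
    n = len(image_xy)
    design = np.column_stack([image_xy, np.ones(n)])      # rows of [x, y, 1]
    coeffs, *_ = np.linalg.lstsq(design, map_xy, rcond=None)
    return coeffs                                          # (3, 2) coefficient matrix

def apply_affine(coeffs, image_xy):
    """Transform image coordinates to map coordinates with fitted coefficients."""
    design = np.column_stack([image_xy, np.ones(len(image_xy))])
    return design @ coeffs
```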

Electromagnetic Spectrum for Sensors

- Ultraviolet: < 0.4 µm
-Visible (on the order of 10^-6 m)
*Blue = 0.4 - 0.5 µm
*Green = 0.5 - 0.6 µm
*Red = 0.6 - 0.7 µm
-Infrared (10^-6 m (near) to 10^-4 m (far))
*Near IR = 0.7 - 1.3 µm
*Mid IR = 1.3 - 3.0 µm
*Thermal = > 3.0 µm
-Microwave: roughly 1 mm - 1 m (see the RADAR entry)

Image Retrieval

- obtain data (multispectral vs. panchromatic) -check image file format: export/resample if needed - compile bands / stack layers -additions and subtractions made

LiDAR

-(Light Detection And Ranging) is an optical remote sensing technology that measures properties of scattered light to find the range and/or other information about a distant target • Day or night operation, except when coupled with a digital camera • LiDAR provides a point cloud with X, Y, Z positions • Target must be visible within the selected EM spectrum (no rain or fog) **Must be taken below clouds • Able to "penetrate vegetation" **can penetrate openings in the vegetation cover but cannot see through closed canopies Components *GPS *Inertial Measurement Unit *Laser Range Finder

RMS Error

-Accuracy of the coordinates resulting from a transformation is normally expressed for each control point and overall as root mean square error (RMS error) -Acceptable RMS error is < 1 pixel

Remote Sensing

-"Acquiring information without being in physical contact" -Scientists define the term more narrowly, restricting remote sensing to the use of techniques involving radiation on the electromagnetic spectrum. -Remote sensing uses the visual, infrared, and microwave portions of the electromagnetic spectrum. - Remote sensing is generally conducted by means of remote sensors installed in aircraft and satellites

Warping (Geometric Models)

-After GCPs, we need to transform or warp the raster to map coordinates -Warping uses a mathematical transformation to determine the correct map coordinate location for each cell in the raster -Different geometric models exist to transform raster images to map coordinates

Ground Control Points

-Are the locations that are found both in the image and in the map -The GCP coordinates are recorded from the map and from the image and used to determine the coordinate transformation equations Key Points *Should be distinct features in both the map and the image needing georectification *Should have clear boundaries and regular geometry *Good GCPs would be house corners, road intersections, field

Tasseled Cap Transformation Indices

-Brightness—a weighted sum of all bands, defined in the direction of the principal variation in soil reflectance. -Greenness—orthogonal to brightness, a contrast between the near-infrared and visible bands. Strongly related to the amount of green vegetation in the scene. -Wetness—relates to canopy and soil moisture -Haze-the haze of the atmosphere within an image

Orthorectification

-Correction of the image for topographic distortion -This is important in mountainous areas, where distortions arise because elevation varies

Image Interpretation by Computers

-Different classification schemes -Change detection analysis -Estimation of physical quantities -Indices (image enhancements) -The area covered by a class can be calculated by multiplying the number of pixels with that class's DN value by the pixel size

Bilinear Interpolation

-Distance-weighted average of the DNs of the four closest pixels -More spatially accurate than nearest neighbor -Pixel DNs are averaged, smoothing edges and extreme values

Hyperspectral - EOS(-1)

-Earth Observation System (EOS) and EO-1 mission *EO-1 carries three sensors: ALI, Hyperion and AC *Hyperion, space-based imaging system, similar to AVIRIS, 220 bands, 10 nm band width, 30 m pixels *See sample image for ALI, Hyperion, AC and AVIRIS

Cubic Convolution

-Evaluates block of 16 nearest pixels, not strictly linear interpolation -Mean and variance of output distribution match input distribution, however data values are altered -Can both sharpen image and smooth out noise

Coordinate Systems

-A geographic coordinate system is a latitude and longitude coordinate system -A projected coordinate system represents the projection of a geographic coordinate system onto a plane -Coordinate systems based on Universal Transverse Mercator (UTM) projections are an example of a projected coordinate system -Each country developed its own coordinate system depending on its position on the globe and on map scale, for example MSTM

Georectification

-Georeferencing raster data allows it to be viewed, queried, and analyzed with other geographic data -The imagery must be transformed from the acquisition coordinate system (rows/columns) to that of the other digital map/data sets

Image Extraction

-Image classification (hard vs. fuzzy) -Parametric (real-world training data) *supervised classification **maximum likelihood or parallelepiped -Non-parametric (statistical, e.g., ISODATA) *unsupervised classification; no field data **density slicing -Non-metric (machine learning) *artificial neural networks

Image Ratioing

-Image ratioing for change detection is based on the fact that the ratio of the DN values for a stable feature over two dates would be unity, while changed pixels would have a ratio significantly different from unity.

PCA Ellipse Diagram

-In an n-dimensional histogram, an ellipse (2 dimensions), ellipsoid (3 dimensions), or hyperellipsoid (more than 3 dimensions) is formed if the distributions of each input band are normal or near normal. -The axes of the spectral space are rotated, changing the coordinates of each pixel in spectral space, as well as the data file values. The new axes are parallel to the axes of the ellipse.

Geologic Unit Mapping Data Requirements

-Mapping geologic units consists primarily of identifying physiographic units and determining the rock lithology or coarse stratigraphy of exposed units. -Data Requirements: 1. For site specific analysis, airphotos provide a high resolution product that can provide information on differential weathering, tone, and microdrainage. 2. Regional overviews require large coverage area and moderate resolution. An excellent data source for regional applications is a synergistic combination of radar and optical images to highlight terrain and textural information.

Image Pre-Processing

-Quality assessment *is there cloud coverage or missing data? -Image reduction *select AOI = subset/clip image

Image Rectification/Restoration

-Radiometric corrections *systematic = striping *non-systematic errors = topography, atmosphere, or cloud coverage issues -Geometric corrections *georectify image position with GPS coordinates *co-registration (rectifying images with GPS coordinates)

Geological Applications of RS

-Remote sensing has become a widely accepted research tool by almost all Geological Surveys in the world. -A synoptic view at regional scale is a much different perspective than point-based ground observation when trying to map structural elements -Allows a geologist to examine other ancillary reference data simultaneously and synergistically, such as geomagnetic information

Resolution

-Resolving power -Any space-borne remotely sensed data has four different aspects of resolution: • Spatial • Spectral • Radiometric • Temporal

Nearest Neighbor

-Substitutes the DN value of the closest pixel, transferring original pixel brightness values without averaging them -Suitable for use before classification

Second Order Polynomial Models

-Suitable for areas with high relief (terrain) -12 unknowns, need minimum of 6 GCP's -Nonlinear differences in scale in X & Y, true shape not maintained, parallel lines no longer parallel

Spectral Enhancement Tasseled Cap Transformation (TCT)

-The different bands in a multispectral image can be visualized as defining an N dimensional space where N is the number of bands. • Each pixel, positioned according to its DN value in each band, lies within the N-dimensional space. • This pixel distribution is determined by the absorption/ reflection spectra of the imaged material. This clustering of the pixels is termed the data structure

Image Classification

-The most common/popular method of feature extraction from satellite imagery • Continuous single- or multiple-band imagery is converted to thematic data to map the spatial distribution of specific land cover features or any biophysical variable Two Major Types: Unsupervised, Supervised

Univariate Image Differencing

-The most straightforward way to see whether a change has happened is to take the difference between two images collected over the same place at different times. *The variable to use can be NDVI for vegetation change detection, or a single-band reflectance
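A minimal sketch using NDVI as the differenced variable; the change threshold is an arbitrary illustration value:

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalized difference vegetation index from NIR and red bands."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / (nir + red + eps)

def difference_change_map(ndvi_date1, ndvi_date2, threshold=0.2):
    """Flag pixels whose NDVI changed by more than `threshold` between dates."""
    diff = ndvi_date2 - ndvi_date1
    return diff, np.abs(diff) > threshold
```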

First Order Polynomial Models

-The polynomial model is the simplest model and can handle most georeferencing -Also known as an affine transformation -Has 6 unknowns and needs a minimum of 3 GCPs -Differences in scale in X and Y; true shape is not maintained, but parallel lines remain parallel

Resampling

-The process works as follows: *Create output grid *Back transform from map to image *Resample to fill output grid

Thermal Remote Sensing

-Thermal infrared radiation refers to electromagnetic waves with a wavelength of between 3 and 20 micrometers. -Thermal infrared is emitted energy, whereas near-infrared is reflected energy -Sun-Target-Sensor System *Satellite electromagnetic sensors "see" reflected and emitted radiation **Reflected radiation travels from the sun to the ground at an incident angle and on to the sensor, whereas emitted radiation travels straight from the ground up to the sensor

Geometric Distortions

-These occur due to Earth rotation during acquisition and due to Earth curvature *Systematic distortions **predictable; corrected by mathematical formulas, often during preprocessing *Nonsystematic or random distortions **corrected statistically with ground control points, often by the end user

Supervised Classification

-This requires the analyst to select training areas where he/she knows what is on the ground and then digitize a polygon within that area Common classifiers: • Parallelepiped • Minimum distance to mean • Maximum likelihood

Image Enhancements

-image stretching (adding or removing values/blank pixels) -band ratioing -contrast/brightness *indices work here -spatial filtering *PCA *pan-sharpening

Hyperspectral

-Refers to the use of many narrow, continuous spectral bands (2-10 nm) that can cover 200-2450 nm *The technique allows us to identify diagnostic narrow-band spectral features *Two or three times the number of bands and pixels -Typical development stages of the imaging spectroscopy technique *The 2nd generation of airborne imaging spectrometer systems: AVIRIS, 1987, 10 nm bands, 224 bands, 0.4 - 2.5 µm (also CASI)

Pixel

-short for picture element, which is the smallest discrete component of a digital image and is the area on the ground represented by each DN. *Ground Sample Distance (GSD): Ground area represented by one pixel **IKONOS (1m)->Spot (10m)->Landsat (30m) * Ground Resolving Distance (GRD): Measure of the smallest object expected to be detected

Image Post-Processing

-spatial filtering (removing outliers, random pixels) -accuracy assessments

K-Means Clustering

1. A set number of cluster centers are positioned randomly through the spectral space. 2. Pixels are assigned to their nearest cluster. 3. The mean location is re-calculated for each cluster on the basis of all pixels in that cluster. 4. Steps 2 and 3 are repeated until the movement of the cluster centers falls below a threshold.
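A minimal sketch of these steps with NumPy; the initialization, iteration count, and tolerance are illustrative choices rather than fixed parameters:

```python
import numpy as np

def kmeans(pixels, n_clusters, n_iter=20, tol=1e-3, seed=0):
    """Plain k-means on pixel spectra, shaped (n_pixels, n_bands)."""
    pixels = np.asarray(pixels, dtype=float)
    rng = np.random.default_rng(seed)
    # 1. Start with randomly chosen pixels as cluster centers.
    centers = pixels[rng.choice(len(pixels), n_clusters, replace=False)]

    for _ in range(n_iter):
        # 2. Assign each pixel to its nearest center.
        dists = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # 3. Recompute each center as the mean of its assigned pixels
        #    (keep the old center if a cluster ends up empty).
        new_centers = np.array([
            pixels[labels == k].mean(axis=0) if np.any(labels == k) else centers[k]
            for k in range(n_clusters)
        ])
        # 4. Stop when center movement falls below the threshold.
        if np.linalg.norm(new_centers - centers) < tol:
            break
        centers = new_centers
    return labels, centers
```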

ISODATA Clustering

1. Allows a varying number of clusters, whereas k-means assumes the number of clusters is known. 2. Calculates the standard deviation of each cluster. 3. After step 3 of the k-means clustering process we can either: a. Combine clusters whose centers are close. b. Split clusters with a large standard deviation in any dimension. c. Delete clusters that are too small. 4. Then reclassify each pixel and repeat step 3. 5. Stop on maximum iterations or the convergence limit. 6. Assign class types to the spectral clusters.

Georectification Steps

1. Choosing a map coordinate system 2. Coordinate transformation -Taking ground control points (GCP) -Transforming the raster with a geometric model *planimetric model - transfer of points from map 3. Resampling the raster

Digital Image Processing Steps

1.) Image Retrieval 2.) Image Pre-Processing 3.) Image Rectification/Restoration 4.) Image Enhancements 5.) Feature Extraction 6.) Image Post-Processing

