Lab 3A – Intro to ERDAS IMAGINE and Digital Data
Overview
This week’s lab was my first deep dive into ERDAS IMAGINE, learning how to calculate properties of electromagnetic radiation (EMR) and how to view, manipulate, and subset raster imagery. I explored different datasets, including AVHRR and Landsat TM, and practiced preparing imagery for mapping in ArcGIS Pro.
I’ll admit, I struggled a bit at first — some of the tools in ERDAS IMAGINE have been renamed or rearranged since the lab instructions were written. It took some exploring to find the modern equivalents of certain commands, but once I got the hang of it, the workflow made a lot more sense.
Key Takeaways
- Understanding EMR: I reviewed how wavelength, frequency, and energy relate through Maxwell’s and Planck’s formulas. It’s amazing how these invisible properties directly affect what we see in satellite images. (A quick worked calculation appears after this list.)
- Viewer Basics: I learned to open and display raster layers, use the Viewer to switch between True Color and Pseudo Color, and even set up multiple 2D views for side-by-side comparison.
- Navigation & Preferences: I customized the Viewer for easier zooming and set defaults like Fit to Frame and Background Transparent — which made a big difference when layering multiple images.
- AVHRR vs. Landsat: Comparing coarse-resolution AVHRR data with high-resolution Landsat TM imagery really highlighted how spatial resolution impacts what you can actually see.
- Subsetting Data: Using the Inquire Box and Subset Image tools, I clipped out a smaller area (tm_subset.img) and calculated the area for each land-cover class before exporting it for mapping in ArcGIS Pro. (A rough sketch of that area math also follows this list.)
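To make the EMR relationships concrete, here is a quick back-of-the-envelope calculation in Python. The wavelength is an illustrative near-infrared value I picked, not a number from the lab data, but the constants and formulas are the standard ones (c = λν and E = hν):

```python
# Quick check of the wave equation and Planck's formula from the EMR review.
# The 0.83 micrometer wavelength is illustrative (roughly Landsat TM band 4, NIR).
c = 3.00e8        # speed of light (m/s)
h = 6.626e-34     # Planck's constant (J*s)

wavelength = 0.83e-6          # meters
frequency = c / wavelength    # c = wavelength * frequency
energy = h * frequency        # E = h * frequency (energy of one photon)

print(f"frequency:     {frequency:.3e} Hz")  # about 3.6e14 Hz
print(f"photon energy: {energy:.3e} J")      # about 2.4e-19 J
```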
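And here is a minimal sketch of the land-cover area calculation, assuming the subset has already been saved as tm_subset.img as a single-band thematic raster with square pixels. ERDAS IMAGINE reports these areas for you; rasterio and numpy are not part of the lab, just one way I could double-check the numbers:

```python
# Minimal sketch: count pixels per land-cover class in the subset and convert
# to hectares. Assumes tm_subset.img is a single-band thematic raster.
import numpy as np
import rasterio

with rasterio.open("tm_subset.img") as src:
    classes = src.read(1)                                    # class codes per pixel
    pixel_area_m2 = abs(src.transform.a * src.transform.e)   # pixel width x height

values, counts = np.unique(classes, return_counts=True)
for value, count in zip(values, counts):
    hectares = count * pixel_area_m2 / 10_000                # 1 ha = 10,000 m^2
    print(f"class {value}: {hectares:.1f} ha")
```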
Deliverable
I created a map of the TM Subset image in ArcGIS Pro with color-coded land-cover classes and area values in hectares.
Lab 3B – Intro to ERDAS IMAGINE and Digital Data Part 2
Overview
Lab 3B built on everything from 3A and introduced me to the analytical side of ERDAS IMAGINE. I learned to interpret image metadata, understand the different types of resolution, and analyze thematic data. Again, a few menu names didn’t match the lab sheet, so I spent some time experimenting to find where certain options had moved — but figuring it out gave me a better sense of how flexible the software really is.
Key Takeaways
- Layer Info & Metadata: I examined image details like file type, projection, pixel size, and brightness statistics for subset_tm_00.img — useful context when deciding how to process or symbolize data. (A small sketch of pulling the same details outside ERDAS follows this list.)
- Spatial Resolution: I compared a series of Pensacola images (sra – srd) with pixel sizes from 2 m to 16 m. It was eye-opening to see how smaller pixels capture more ground detail (down to visible cars!) while larger ones blur those features together.
- Radiometric Resolution: Comparing 1-bit, 4-bit, and 8-bit images helped me understand how more bits allow finer distinctions in brightness — even if our eyes can’t always tell the difference. (There’s a short illustration of this after the list, too.)
- Spectral & Temporal Resolution: I reviewed how multiple bands improve spectral detail and how satellite revisit frequency defines temporal resolution.
- Thematic Raster Analysis: I opened soils_95.img and hydro_00.shp to calculate area and percent coverage of soil types. Then, I built a query to highlight soils that were fine-textured and humus-rich — indicators of higher erosion potential. (The last sketch after this list walks through that kind of tally and query.)
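For the layer-info step, the same kind of details ERDAS shows in its metadata view can also be pulled programmatically. This is just a hedged sketch with rasterio, assuming subset_tm_00.img is sitting in the working directory; it is not part of the lab workflow itself:

```python
# Sketch: inspect file type, projection, pixel size, and band statistics
# for the subset image (the same details shown in IMAGINE's metadata view).
import rasterio

with rasterio.open("subset_tm_00.img") as src:
    print("file type: ", src.driver)    # HFA is GDAL's name for ERDAS .img
    print("projection:", src.crs)
    print("pixel size:", src.res)       # (x, y) size in map units
    band1 = src.read(1)
    print("band 1 min/max/mean:", band1.min(), band1.max(), round(band1.mean(), 2))
```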
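The radiometric-resolution comparison boils down to the fact that an n-bit image can only store 2^n brightness levels. This little illustration requantizes a made-up 8-bit array (not one of the lab images) to fewer bits to show how the gray levels collapse:

```python
# Illustration of radiometric resolution: 8 bits = 256 levels, 4 bits = 16,
# 1 bit = 2. The array is synthetic, just to show the effect of requantizing.
import numpy as np

rng = np.random.default_rng(0)
dn_8bit = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)  # fake 8-bit band

def requantize(dn, bits):
    """Collapse 0-255 values into 2**bits gray levels."""
    levels = 2 ** bits
    return (dn.astype(float) / 256 * levels).astype(int)

for bits in (8, 4, 1):
    print(f"{bits}-bit ({2 ** bits} levels):")
    print(requantize(dn_8bit, bits))
```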
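Finally, a hedged sketch of the thematic analysis on soils_95.img: tally pixels per soil class, convert to area and percent coverage, and flag the classes that would satisfy a "fine-textured and humus-rich" query. The attribute values in the dictionary below are made up for illustration; in the lab, the real texture and humus fields come from the raster attribute table in IMAGINE:

```python
# Sketch: area and percent coverage per soil class, plus a simple stand-in for
# the erosion-potential query. Attribute values below are hypothetical.
import numpy as np
import rasterio

with rasterio.open("soils_95.img") as src:
    soils = src.read(1)
    pixel_ha = abs(src.transform.a * src.transform.e) / 10_000  # m^2 -> ha

values, counts = np.unique(soils, return_counts=True)
total = counts.sum()

# Hypothetical (texture, humus) attributes keyed by soil class code.
attributes = {1: ("fine", "high"), 2: ("coarse", "low"), 3: ("fine", "low")}

for value, count in zip(values, counts):
    texture, humus = attributes.get(int(value), ("unknown", "unknown"))
    flag = "  <- fine-textured and humus-rich" if (texture, humus) == ("fine", "high") else ""
    print(f"class {value}: {count * pixel_ha:.1f} ha ({100 * count / total:.1f}%){flag}")
```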
🧭 Reflection
Together, Labs 3A and 3B gave me a solid foundation in ERDAS IMAGINE — from basic image viewing to raster analysis. While it was a little tricky adapting to tool names that didn’t quite match the instructions, the process really reinforced my understanding of image resolution and thematic mapping. These labs made me appreciate how much precision and preparation go into transforming satellite data into meaningful maps.
