Lab 5: Classification Accuracy Assessment

Goal

The goal of this lab is to learn how to evaluate the accuracy of classification results. An accuracy assessment is a mandatory step after performing an image classification and is part of the post-processing stage of working with remotely sensed data.

Methods

An accuracy assessment is performed on the supervised and unsupervised classifications from Labs 3 and 4 using the accuracy assessment tool in ERDAS Imagine. A high-resolution reference image of the same study area is used for the assessment: 125 random points are generated on the reference image, and each point is assigned a class by interpreting the high-resolution imagery, using the same classification codes for water, forest, agriculture, etc. (Table 1). Once the points are identified, a comparison report is produced (Figure 1), and an error matrix is then created by comparing the random points against the classified images.
Table 1: Values assigned to each land cover/land use class.

Figure 1: The accuracy report produced by the tool, opened here in Notepad so that it can be read more easily.
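The comparison itself amounts to tallying, for each random point, the class interpreted from the reference image against the class on the classified image. The short Python sketch below illustrates that tally outside of ERDAS; the class codes and the sample point values are hypothetical placeholders, since Table 1's actual codes are not reproduced in this post.

import numpy as np

# Hypothetical class codes in the spirit of Table 1 (placeholders, not the lab's actual codes)
classes = {1: "Water", 2: "Forest", 3: "Agriculture", 4: "Urban/built up", 5: "Soil"}

reference = np.array([1, 2, 2, 5, 3, 4])   # codes interpreted from the high-resolution reference image
classified = np.array([1, 2, 3, 5, 5, 4])  # codes sampled from the classified image at the same points

# Build the error matrix: rows = classified image, columns = reference data
n = len(classes)
error_matrix = np.zeros((n, n), dtype=int)
for cls, ref in zip(classified, reference):
    error_matrix[cls - 1, ref - 1] += 1

print(error_matrix)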




Results


The results from both accuracy reports were then transferred into error matrices (Tables 2 and 3), from which statistics on the accuracy of each classification could be calculated. These tables show how many points were classified correctly compared with the reference classes gathered from the randomly generated points. From the percentage of correctly identified points we obtain an overall accuracy, and from the agreement between the classified image and the reference data we obtain an overall Kappa statistic, which measures how much better the classification agreement is than what would be expected by chance.
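Both overall accuracy and the Kappa statistic fall directly out of the error matrix. The sketch below shows the calculation in Python; the matrix counts are made-up placeholders, not the actual counts behind Tables 2 and 3.

import numpy as np

em = np.array([            # rows = classified image, columns = reference data (placeholder counts)
    [12,  0,  1,  0,  2],
    [ 0, 27,  3,  1,  1],
    [ 1,  5, 16,  2,  8],
    [ 0,  2,  2,  6,  5],
    [ 0, 13,  3,  3,  8],
])

total = em.sum()
overall_accuracy = np.trace(em) / total          # correctly classified points / total points

# Kappa compares observed agreement with the agreement expected by chance,
# estimated from the row and column totals
row_totals = em.sum(axis=1)
col_totals = em.sum(axis=0)
chance_agreement = (row_totals * col_totals).sum() / total**2
kappa = (overall_accuracy - chance_agreement) / (1 - chance_agreement)

print(f"Overall accuracy: {overall_accuracy:.2%}")
print(f"Kappa: {kappa:.3f}")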
Table 2: This error matrix shows the final accuracy assessment of the unsupervised classification scheme.
Producer's accuracy (omission error)
Water = 92.31%
Forest = 56.25%
Agriculture = 50%
Urban/built up = 75%
Soil = 36.36%
User's accuracy (commission error)
Water = 80%
Forest = 84.38%
Agriculture = 50%
Urban/built up = 20%
Soil = 32%
Table 3: This error matrix shows the final accuracy assessment of the supervised classification scheme. 
Producer's accuracy (omission error)
Water = 100%
Forest = 70.21%
Agriculture = 35.556%
Urban/built up = 50%
Soil = 53.85%
User's accuracy (commission error)
Water = 100%
Forest = 100%
Agriculture = 51.61%
Urban/built up = 18.51%
Soil = 29.167%

Each error matrix above reports producer's accuracy (related to omission error) and user's accuracy (related to commission error), which give accuracy percentages for the individual classes. Producer's accuracy is the percentage of reference points of a given class that were classified correctly, while user's accuracy is the percentage of points labeled as a class on the classified image that actually belong to that class in the reference data. Based on overall accuracy, my supervised classification performed better than my unsupervised classification; however, both classifications are poor, since neither reaches an overall accuracy of 75 percent.
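For reference, the per-class figures also come straight from the error matrix. The short Python sketch below illustrates the calculation, again with placeholder counts and the same layout as above (rows = classified image, columns = reference data).

import numpy as np

em = np.array([            # placeholder error matrix, not the lab's actual counts
    [12,  0,  1,  0,  2],
    [ 0, 27,  3,  1,  1],
    [ 1,  5, 16,  2,  8],
    [ 0,  2,  2,  6,  5],
    [ 0, 13,  3,  3,  8],
])

correct = np.diag(em)
producers_accuracy = correct / em.sum(axis=0)  # correct / reference (column) totals
users_accuracy = correct / em.sum(axis=1)      # correct / classified (row) totals

for i, name in enumerate(["Water", "Forest", "Agriculture", "Urban/built up", "Soil"]):
    print(f"{name}: producer's = {producers_accuracy[i]:.1%}, user's = {users_accuracy[i]:.1%}")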

Sources

[Landsat 7 ETM]. (2000, June 9). Retrieved from https://eros.usgs.gov/usa

[NAIP High Resolution Imagery]. (2005, June). Retrieved from https://www.fsa.usda.gov/index
