Heather,
I'd make sure you provide those figures only with caveats, since the wilderness data is undocumented: you don't know how accurate or how old those data are. Is it possible the wilderness area boundaries have changed since the data were collected? It's unknown how accurate they were when developed, and it's possible the 2 acres of private land were converted to wilderness later. There is plenty of room for uncertainty. One way to handle it is to state in writing, alongside the results, that the figures are based on the specific source data sets. That means anyone using those same data sets and the same methodology could reproduce the same numbers. Also state that the source materials are of unknown date and accuracy. Then ask the agency what information or data they are using to substantiate that the wilderness area is 115 acres. If their data is recent and of known accuracy, one can put a certain amount of faith in it.
The underlying issue is that there needs to be a clear understanding that figures like these are only as good as the data from which they are generated. The importance of data quality is too often overlooked.
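To make that sensitivity concrete, here is a minimal sketch using Shapely (the rectangles and "acreages" are invented for illustration, and the geometry is deliberately toy-sized): a clip reports 2 acres of private land inside the wilderness, and an offset of only 2 units in one boundary makes those acres disappear entirely.

```python
# Minimal sketch, assuming the shapely package is available.
# All geometry and numbers are invented for illustration only.
from shapely.geometry import box

wilderness = box(0, 0, 114, 1)   # a wilderness polygon; computed area = 114
parcel = box(112, 0, 119, 1)     # a private parcel along its eastern edge

private_inside = wilderness.intersection(parcel).area   # 2.0
public_inside = wilderness.area - private_inside        # 112.0

# Nudge the parcel 2 units east, the kind of offset an undocumented
# digitizing error could introduce: the "private acres" vanish.
parcel_shifted = box(114, 0, 121, 1)
shifted_inside = wilderness.intersection(parcel_shifted).area  # 0.0

print(public_inside, private_inside, shifted_inside)  # 112.0 2.0 0.0
```

Neither answer is more "real" than the other; the split is purely a function of which boundary geometry went into the clip, which is exactly why the source data have to be cited with the result.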
Barb
Gordon, Heather C. wrote:
> Thank you to everyone who has responded so far. It's true, as Mr.
> Bannerman points out below, that many geophysical datasets have
> ephemeral or 'fuzzy' boundaries, but vector data deals with crisp lines.
>
> However, when you're dealing with human-defined conditions, how do you
> deal with accuracy? Here's an example of the point I'm trying to drive
> towards:
>
> Say you are working for a land management agency, and you are using two
> datasets: land ownership (collected at 1:24,000), and wilderness areas
> (unknown source materials and collection date: 'institutional memory'
> gone when the last person to do GIS for them retired). They ask you to
> quantify the land ownership by wilderness area. You do so, perhaps for
> over 100 wilderness areas, through a simple clip. Because you don't know
> the accuracy of the wilderness dataset, you don't report any significant
> digits like they had been in some of their previous documents, but for a
> given area, you find:
>
> 112 acres public ownership, 2 acres of private land.
>
> That agency responds by saying they are absolutely sure that there are
> no acres of private land, and they know for a 'fact' that the wilderness
> area is 115 acres.
>
> So, not only do you have a possible positional error in relation to the
> ownership, but a possible total area error.
>
> How do you handle this?
>
> Thanks again, Heather
>
> ________________________________
>
> From: Bruce.Bannerman@dpi.vic.gov.au
> [mailto:Bruce.Bannerman@dpi.vic.gov.au]
> Sent: Thursday, August 09, 2007 11:46 PM
> To: Gordon, Heather C.
> Cc: gislist@lists.geocomm.com
> Subject: Re: [gislist] Technical question: Areal accuracy standards?
>
> Heather,
>
> This really needs to be considered on a case by case basis.
>
> Some areal features may be describing geographic phenomena that can be
> accurately represented in polygonal form, e.g. cadastral boundaries.
> However, areal features are often used to describe phenomena that may
> not have crisp, well-defined boundaries, e.g. geology, soils, drainage,
> vegetation, areas subject to inundation, etc. Vector boundaries
> describing these features are usually the result of interpretation and
> educated guesswork. These types of features may be better modelled as
> raster surfaces.
>
> Often the original phenomenon is captured in vector form at a point in
> time, e.g. the edge of a lake. However, the lake shoreline will usually
> vary in location depending on season, rainfall, snowmelt, drought, etc.
>
> Considering the above, some data sets could be accurately defined, while
> many probably could not.
>
> In the end it will still come down to validating your data against
> 'well-defined' points.
>
> However, also consider how the location used to define the
> 'well-defined' points was originally determined.
>
> Considering the above, it may not be meaningful to place an arbitrary
> accuracy value on the dataset. An approach that many people use is to
> make an educated assessment of the accuracy of the data by relating it
> back to well-defined points, but also to include metadata that
> adequately describes the data, including: what it is intended to
> portray; how it was defined and captured; and its intended use. The end
> user will then be able to assess the suitability of the data for their
> particular use.
>
> Bruce
> ---------------------------------------
>
> Bruce
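Bruce's suggestion of relating a dataset back to well-defined check points is essentially what the NSSDA testing procedure formalizes. A minimal sketch (check-point coordinates invented for illustration; 1.7308 is the NSSDA multiplier for reporting horizontal accuracy at 95% confidence when x and y errors are independent and of similar magnitude):

```python
import math

# Hypothetical check points: (surveyed_x, surveyed_y, mapped_x, mapped_y).
# All coordinates are invented for illustration.
checks = [
    (100.0, 200.0, 101.1, 199.2),
    (300.0, 400.0, 298.7, 401.0),
    (500.0, 150.0, 500.9, 149.1),
]

# Squared horizontal error at each check point, then radial RMSE.
sq_err = [(mx - sx) ** 2 + (my - sy) ** 2 for sx, sy, mx, my in checks]
rmse = math.sqrt(sum(sq_err) / len(sq_err))

# NSSDA horizontal accuracy at 95% confidence: Accuracy_r = 1.7308 * RMSE_r.
accuracy_95 = 1.7308 * rmse
print(round(rmse, 3), round(accuracy_95, 3))  # 1.433 2.48
```

A statement like "tested to meet X map units horizontal accuracy at 95% confidence, per NSSDA" in the metadata gives the end user something concrete, which an undocumented wilderness layer can never provide.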