Mailing List Archives

Subject: Re: GISList: Reporting Data Quality
Date:  12/11/2002 04:01:41 PM
From:  David Nealey



On point 1, you may be right, Robert, but wouldn't it be good if your
geospatial database management system were able to warn you, through a pop-up
window or some other device, that you had exceeded the quality threshold of
your data? As it stands, users can zoom right down to a single vertex or line
segment. With such a tool, the GIS could inform users when they have gone too
far.

And yes, a GIS analyst can set minimum and maximum zoom factors to prevent
users from going too far, but how is that done? Usually it is done on the
basis of some arbitrary scale factor.

So wouldn't it be better if there were a link between the zoom operation and
the metadata document, one that prevented users from zooming down to a scale
of 1:2,400 when a geology data layer is in the table of contents, but allowed
zooming down to 1:100 when the map contains only, say, water infrastructure
data?
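
Here is a rough sketch, in Python, of the kind of check I have in mind.
Nothing below comes from any particular GIS package; the Layer class, the
1:24,000 limit I assume for the geology layer, and the 1:100 limit for the
water data are all invented to illustrate the idea:

class Layer:
    def __init__(self, name, min_scale_denominator):
        self.name = name
        # Smallest scale denominator the metadata says the layer supports,
        # e.g. 24000 means "do not use beyond 1:24,000".
        self.min_scale_denominator = min_scale_denominator

def strictest_limit(layers):
    # The most restrictive layer sets the limit for the whole map.
    return max(layer.min_scale_denominator for layer in layers)

def can_zoom_to(requested_denominator, layers):
    # Allow the zoom only if every layer's metadata permits that scale.
    return requested_denominator >= strictest_limit(layers)

# A geology layer limited to 1:24,000 blocks a zoom to 1:2,400 ...
layers = [Layer("geology", 24000), Layer("water_mains", 100)]
print(can_zoom_to(2400, layers))   # False -> warn the user instead

# ... but a map with only water infrastructure can go to 1:100.
print(can_zoom_to(100, [Layer("water_mains", 100)]))   # True

The rule is simply that the most restrictive layer in the table of contents
sets the zoom limit for the whole map.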

Right now there is a disconnect between the metadata and the map window. Data
producers create the data and write the metadata document; users may or may
not consult that metadata when selecting data for their maps. I propose that
GIS software continuously check the metadata document to prevent users from
misusing geospatial data. That way, we could prevent the "Garbage In,
Garbage Out" situation in geospatial analysis.

On point 2, I would tend to disagree with you. Data quality can be
quantified. The graphical data in a GIS are point data at some scale level;
each point has a true location, and its displacement from that true location
can be calculated. For an area of a given size, a sample of points from each
data layer can be measured and a statistic calculated, and that statistic
becomes the measure of positional data quality. This is how the USGS National
Map Accuracy Standard works: the NMAS says that if you measure a certain
number of points, a certain percentage of them will fall within XX distance
of their true locations.
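
To make that concrete, here is a small Python sketch of the statistic I am
describing. The check-point coordinates and the 12.2 m threshold below are
invented for illustration; a real test would use surveyed check points and
the threshold appropriate to the map scale:

import math

def displacement(measured, true):
    # Straight-line distance between the digitized and the true position.
    return math.hypot(measured[0] - true[0], measured[1] - true[1])

def positional_quality(pairs, threshold):
    # pairs: list of (measured_xy, true_xy) tuples; threshold in map units.
    errors = [displacement(m, t) for m, t in pairs]
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    within = sum(1 for e in errors if e <= threshold) / len(errors)
    return rmse, within

check_points = [((1000.0, 2000.0), (1003.1, 1998.4)),
                ((1500.0, 2500.0), (1495.8, 2502.0)),
                ((1800.0, 2100.0), (1801.2, 2099.5))]
rmse, share = positional_quality(check_points, threshold=12.2)
print("RMSE: %.1f m, %.0f%% of points within 12.2 m" % (rmse, share * 100))

The RMSE and the share of points inside the threshold are exactly the kind of
numbers that belong in the metadata document.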

Build and clean operations are another tool for data quality. If they cannot
validate the integrity of the graphical data, that information should be
reported to the user.

Attribute information can be quantified in at least two ways. The first is
completeness: if you do not allow null values in your database, a simple
count of all attributes will tell you whether every value is present. The
second measure is consistency: database developers can enforce a high level
of consistency through data entry forms, a la MS Access, and the person who
creates the metadata document can then report a statistic on the level of
data consistency.
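
A minimal Python sketch of those two measures, using an invented parcel table
with a zoning attribute and an assumed list of allowed zoning codes:

records = [
    {"parcel_id": "A-101", "zoning": "R1"},
    {"parcel_id": "A-102", "zoning": None},   # missing value
    {"parcel_id": "A-103", "zoning": "R9"},   # not in the allowed domain
]
ALLOWED_ZONING = {"R1", "R2", "C1", "I1"}

def completeness(rows, field):
    # Share of rows whose attribute is present (non-null).
    present = sum(1 for r in rows if r.get(field) is not None)
    return present / len(rows)

def consistency(rows, field, domain):
    # Share of populated values that fall inside the allowed domain.
    values = [r[field] for r in rows if r.get(field) is not None]
    return sum(1 for v in values if v in domain) / len(values)

print("zoning completeness: %.0f%%" % (completeness(records, "zoning") * 100))
print("zoning consistency:  %.0f%%" %
      (consistency(records, "zoning", ALLOWED_ZONING) * 100))

Both numbers could then be written into the metadata document alongside the
positional statistic.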

David

----- Original Message -----
From: "Robert Heitzman" <rheitzman@hotmail.com>
To: <tannas@vsnl.net>; <gislist@geocomm.com>
Sent: Wednesday, December 11, 2002 9:10 AM
Subject: Re: GISList: Reporting Data Quality


>
> >1. tools / facilities to report and communicate data quality metadata
>
> I guess I'm missing something here. "Tools" can't tell you much about data
> quality which really is a reflection of the methods used to collect and
> maintain the data.
>
> >2. options to provide fully automated tools / facilities to report and
> >communicate data quality
>
> Ditto. Data quality is not a metric that can be calculated in some general
> way.