GeoCommunity Mailing List
Mailing List Archives

Subject: RE: GISList: web services data (was cart and data viewer)
Date: 08/26/2003 06:40:01 PM
From: Dimitri Rotow
> One of the beauties of web service architecture is that it allows data
> providers (normally public sector agencies) to maintain/update the data
> at the source: users get access to the freshest data on-line instead of
> downloading gigabytes of roads files (for example) which are essentially
> out of date within an hour when the next traffic jam or construction
> project begins!
Let me get this straight... you are suggesting that using web service architecture will result in hourly updates to data to reflect the next traffic jam or construction project? ...and this is something that will be done by a public sector agency? ...You're kidding, right? :-)
You surely are not suggesting that something about how the data is distributed controls its freshness, are you? Why would, say, the Census Bureau be any faster at updating TIGER/Line data that is distributed via a web services site than it would be at updating a site distributing the data via FTP'd files? Frankly, I think that any agency is going to be really slow at updating data, and that the more cumbersome the distribution process, the slower the updates will be. That's the way it is.
Let's get back to reality. Just about all public sector data in the US (and most other places) is the result of a lot of grunt work by people sitting at ArcView, ArcINFO, Intergraph and similar legacy GIS products cranking out the updates feature by feature, etc. The *fastest* way of getting their work product out to the public is to simply drop the resulting files into some website where people can get them. *Anything* that interferes with that process of dissemination (such as a bureaucratic two-step to bring the data into some politically-correct, but slow, web services form) will slow down the process of getting fresh data to users.
I can hear the OGC crowd now... "but the solution is to do everything mediated by our standards." Fine, do that and the throughput of your workers will drop enormously due to the incredibly slow and inefficient technology OGC adopts. You'll end up spending a hundred times as much time to do less (which may be the objective of some of these bureaucracy builders...).
> Your assumption that "GIS users" will want to download everything just
> in case is flawed, I think. Gone are the days when the typical "GIS
> user" is a bored government researcher in a lab somewhere. General
> public users need quick, simple, on-line solutions.
Very true. And they have them in the form of quick, simple, on-line access to whatever data they want in a format that zillions of GIS packages can read. You don't need to invent a new bureaucracy to do what is already easily and quickly done by, for example, the various USGS FTP sites.
> Also, regarding your OGC analysis, again it is off. OpenGIS interface
> specs are (often) heavy documents, but their implementations need not be.
That's not true. If you write a poorly designed spec you cannot wring good performance out of it through a clever implementation... Implementations will be bound by the stupidity of the spec. OGC's GML is a perfect example. It is by design slow and stupid, so whether or not one dresses up the implementation with really well-crafted code, the result is still stupid and slow.
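To put a number on the verbosity complaint, here is a small sketch comparing the size of a single 2-D point in a GML 2-style XML encoding against the same point as two raw doubles (the way a shapefile stores coordinates). The element names follow the GML 2.x point encoding; the specific coordinates and srsName value are made up for illustration.

```python
# Size comparison: one point encoded as GML 2-style XML vs. raw binary.
import struct

# The point in a GML 2-style encoding (gml:Point / gml:coordinates).
gml_point = (
    '<gml:Point srsName="EPSG:4326">'
    '<gml:coordinates>-122.349,47.620</gml:coordinates>'
    '</gml:Point>'
)

# The same point as two IEEE-754 doubles, as a shapefile would store it.
binary_point = struct.pack('<dd', -122.349, 47.620)

print(len(gml_point), 'bytes as GML')     # tag overhead dominates
print(len(binary_point), 'bytes as binary')  # 16 bytes: two doubles
```

Even before any parsing cost, the XML form is several times larger per point than the binary form, and that ratio multiplies across the millions of vertices in a typical roads layer.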
Let's take an analogy that even non-technical people are likely to understand. Suppose some bureaucrat came up with the idea of throwing out Microsoft Word and centralizing word processing by storing all word-processing documents in a DBMS, let's call it Oracle Text, where each *word* was an individual DBMS record. The idea is that you could then edit documents with Text-Enabled Queries to fetch those words you wanted, and do elaborate word-topological things to keep words together in "sentences" and "paragraphs," with yet more extensions to allow formatting as one wanted. Oh, and here's the icing on the cake: interactions with Oracle Text would be through TML, Text Markup Language, an XML-based extensible language, and all interactions would take the form of web services, so that every exchange with the centralized Oracle Text repository would travel over the web.
Let's say IRS takes it into its mind to start publishing all tax documents using Oracle Text and TML, and people can no longer just download a PDF or a Word document to fill out their tax forms... they now need a live internet connection and an Oracle Text client. If the connection dies, they lose what they were working on. And, of course, with thousands of people hitting the server at the same time it runs dog slow, because every word in every document has to be fetched by a Text-Enabled Query. And, of course, because the Internet is a million times slower than local work, even writing a simple note, let alone editing a PhD thesis, becomes painfully slow.