Bravo on the research group; I would be interested in the results when completed. However, I respectfully disagree with your argument. The specification(s) behind these PnP tools (software and hardware) directly impact their performance, standalone or integrated with other PnP tools, in a wide variety of situations and scenarios. This is in fact why we in the US have such miserable and horribly insufficient wireless services for mobile phones while many countries (the rest of the world, really) have such incredible services, even countries with essentially no landlines, which have leapfrogged to a handphone in every hand with incredible leverage. The specifications and subsequent technologies implemented by the US-based carriers was/is the problem (not to discount the issue of the spectrum); we in the US decided to "do our own thing" with regard to wireless, and we continue to suffer for that short-sighted decision.

So, if you start with fatally flawed specifications, no amount of technological or financial "add-ons" later will improve that. OGC specs are not designed or intended for a vacuum, right, but for interoperable transactions. OGC specs are entirely focused on the internet/intranets, are they not? By definition, this requires interaction with other specification-based tools over the internet, and the complexity and almost unlimited combination of specification-based tools under almost unlimited network scenarios is almost certainly nonlinear. So, if the core issue is not the specifications themselves, and secondarily the subsequent implementations, what is to be measured in a meaningful way? How is the law of unintended consequences vis-a-vis OGC specification impacts on other OGC specifications measured?

Anthony
_____

From: Michael Gould [mailto:gould@lsi.uji.es]
Sent: Thursday, February 17, 2005 3:44 PM
To: Anthony Quartararo; gislist@lists.thinkburst.com
Subject: RE: [gislist] google maps
You seem to be misinterpreting the role of OGC. Think of OGC as developing the equivalent of plug-and-play specifications, but instead of for hardware (CD-ROM drives etc.) they are for (software) services.
If we substitute CD-ROM drives for OGC services, then perhaps you will see that what you are asking is not so relevant.
What is the speed or throughput of a plug-n-play CD-ROM drive versus a non-p-n-p model? The answer doesn't have much to do with being plug-n-play...one or the other could be faster irrespective....
Our research group began (today!) a small project to benchmark WMS and WFS (open source implementations) for speed in a wide range of situations (data volume, multi-user load, Linux vs. Windows, available RAM, etc.). The implementations themselves are at the root of the actual performance, however, not whether or how they support OGC specs. We are interested in the results so we can coach future organizations wishing to include these services in SDI projects: which combinations of machines, clients, services and databases are appropriate for selected situations or use cases. In all cases we assume OGC specs, as this provides the ability to combine the (heterogeneous) services in the first place.
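For the WMS side, a minimal sketch of the kind of per-request timing we have in mind looks something like the following (Python; the endpoint URL, layer name and bounding box are placeholders rather than any real service, and the actual benchmark will vary many more factors than this):

# Minimal sketch: time repeated WMS GetMap requests against one endpoint.
# The URL, layer and bbox below are hypothetical placeholders.
import time
import urllib.parse
import urllib.request

WMS_URL = "http://example.org/wms"   # placeholder WMS endpoint
PARAMS = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "test_layer",          # placeholder layer name
    "STYLES": "",
    "SRS": "EPSG:4326",
    "BBOX": "-180,-90,180,90",
    "WIDTH": "800",
    "HEIGHT": "400",
    "FORMAT": "image/png",
}

def mean_getmap_seconds(url, query, runs=10):
    """Average wall-clock seconds over `runs` GetMap requests."""
    full_url = url + "?" + urllib.parse.urlencode(query)
    total = 0.0
    for _ in range(runs):
        start = time.perf_counter()
        with urllib.request.urlopen(full_url) as resp:
            resp.read()              # read the whole image so transfer time counts
        total += time.perf_counter() - start
    return total / runs

if __name__ == "__main__":
    print("mean GetMap time: %.3f s" % mean_getmap_seconds(WMS_URL, PARAMS))

Repeating something like this across data volumes, concurrent clients and server configurations is the basic idea; the point being measured is the implementation and its environment, not the spec.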
M Gould
At 21:12 17/02/2005, Anthony Quartararo wrote:
Does anyone from OGC, or anyone who deals with them, have any publishable metrics on interoperability performance? Theory and concept are grand ideals, but the reality of the situation is what the user community experiences. The internet-bubble-era maxim still stands: if a user does not get prompt, first-time, accurate service from a website, that user is gone and very hard to get back. This goes to the heart of the OGC open standards initiative to promote the use of the technologies it fosters, but if the performance is circa-1990 WWW, then why bother? I would be interested in, and even challenge, anyone to make performance metrics available for general consumption on various scenarios of OGC-certified technology. I am talking bits and bytes, folks: network latency, refresh rates, network parameters, test data used (raster/vector), analytical vs. static interaction. To my knowledge, there is nothing like this published on the OGC website, nor by any companies involved in OGC standards. Why not? I cannot believe that anyone forgets to do this type of testing, so let's see the results, please....

Oh, and I know a lot of factors go into these metrics, but something, anything would be helpful to assess the veracity and efficacy of all this OGC work, no?

Anthony