GeoCommunity Mailing List Archives

Subject: RE: [gislist] google maps and OGC Testing
Date: 02/28/2005 08:00:02 AM
From: Jeff Harrison
Anthony,
Let's deal with your points one by one:
First, the point of the demonstration referenced was to assess the "performance of combined OGC technology under real LIVE conditions" which, as you indicated in your second note below, was what you were asking for. Let me tell you, the conditions were very LIVE in the demo cited, and actually more demanding than real-world deployments, where there is plenty of time for testing and integration and things can go through endless review boards and testing labs.

This thread brings up another good point on testing. One of the most important "performance factors" that you did not cite was: how well do OGC specifications support the rapid integration of distributed geospatial systems? Well, it took only three weeks to bring together over 20 separate system components for the demo. This was possible because of the level of interoperability provided by open specifications integrated into multiple commercial software products. The key thing that enabled success was the existence of a Common Services Framework that let different components plug in and exchange information. The Services Framework was based on OGC, W3C and ISO standards. The really important thing was that much of the software already had the interoperability built in, so things could roll forward quickly to success.
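To make the plug-in point concrete, here is a rough sketch of what that common interface buys you: any spec-conformant Web Map Server answers the same GetMap request, so a client written once can pull maps from any of the 20+ components. The endpoint URL, layer name and bounding box below are hypothetical placeholders, not values from the demo:

    import urllib.request, urllib.parse

    # Hypothetical WMS endpoint; any conformant server accepts this same query.
    base_url = "http://example.com/wms"
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": "roads",                # hypothetical layer name
        "STYLES": "",
        "SRS": "EPSG:4326",
        "BBOX": "-77.2,38.8,-76.9,39.0",  # lon/lat bounding box
        "WIDTH": "400",
        "HEIGHT": "300",
        "FORMAT": "image/png",
    }

    # Whoever built the server, the response is a rendered map image.
    with urllib.request.urlopen(base_url + "?" + urllib.parse.urlencode(params)) as resp:
        png_bytes = resp.read()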
Second, we are discussing how different vendors implement OGC (or any standard). The main point of discussion is twofold (I think): first, can different vendors do the same technical task differently, with performance differences resulting? And second, can different vendors interpret the specification differently? On the first point, the answer is yes. For example, anybody can write a map server or client these days, but only a few really know what they are doing, and they can distinguish themselves from others by doing the task faster and with more performance benefits for their customers. I can't help that some vendors don't put much time into making their interoperable software solutions perform well. Perhaps they should complain less and work more. On the second point, can different vendors interpret specifications differently, with some negative effect on interoperability resulting? The answer is yes to this as well. However, I have been able to test several dozen web map servers and web feature servers in recent weeks and access them all. Why? Because I use tools that perform very well in this "space", demand interoperability, and won't deal with organizations that don't deliver interoperable, performant products.
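That kind of cross-vendor testing can be scripted in a few lines, which is part of why the specifications matter. A rough sketch (the endpoints are hypothetical placeholders; my actual list held several dozen servers):

    import urllib.request
    from xml.etree import ElementTree

    # Hypothetical WMS endpoints from different vendors.
    servers = [
        "http://example.com/vendor-a/wms",
        "http://example.org/vendor-b/wms",
    ]

    query = "?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetCapabilities"
    for url in servers:
        try:
            # Every conformant server, regardless of vendor, must answer
            # this identical request with a parseable capabilities document.
            with urllib.request.urlopen(url + query, timeout=30) as resp:
                root = ElementTree.parse(resp).getroot()
            print(url, "OK, root element:", root.tag)
        except Exception as err:
            print(url, "FAILED:", err)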
Third, what you actually asked for was "Does anyone from OGC or anyone who deals with them have any publishable metrics on interoperability performance?" and you then proceeded to cite a narrow set of examples on interoperability performance, requesting that test results be published. What I responded with was a list of two types of testing that had been done: Interoperability Testing and Conformance Testing. I cited multiple demonstrations as examples of Interoperability Testing, where the products implementing the specifications have been shown to scale and perform under real-life conditions. I also cited the Conformance Testing suites (which are extremely extensive, as you can see). Now I'll cite some published "Performance" references you asked for (there are others as well):
As an example, IONIC Software has implemented an OGC-based framework that is the engine for the Hutchison3g Location Services Platform. This implementation has been commercially deployed, stress-tested to 300,000 hits per hour, and found to perform and scale very well. Here is that article:
http://spatialnews.geocomm.com/features/ogcexplained/
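For a sense of scale (my arithmetic, not a figure from the article): 300,000 hits per hour works out to a sustained load of 300,000 / 3,600, or roughly 83 requests per second.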
As another example, one of OGC's members, CubeWerx, has implemented an OGC-based framework that can scale with the growth in the number of Web users (especially those with broadband Internet connections) and handle that demand. The CubeWerx-based system had to meet a volume of map operations that could easily overwhelm other map services, resulting in unacceptable wait times for users.
From a user's point of view in this context, being "performant" meant that someone with a PDA had to be able to navigate through a vast store of data under the map server without suffering long delays (they can give you the data size behind the term "vast" if you like). The performance goal used in judging the system was to have a map completely displayed on a PDA in the field in 7-9 seconds, using an average PDA, an average cell phone connection, and existing wireless bandwidth.
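To see why 7-9 seconds was a demanding but realistic target, run the numbers; these are my own illustrative figures, not CubeWerx's. A wireless data connection of the day might sustain on the order of 30-40 kbps. A 30 KB map image is 240 kbits, so transfer alone takes 240 / 40 = 6 seconds even at the high end of that range, leaving only a second or three for the server to query the data store and render the map.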