Does anyone from the OGC, or anyone who deals with them, have any publishable metrics on interoperability performance? Theory and concept are grand ideals, but the reality of the situation is what the user community actually experiences. The internet-bubble-era maxim still stands: if a user does not get prompt, first-time, accurate service from a website, that user is gone and very hard to win back. This goes to the heart of the OGC open-standards initiative to promote the technologies it fosters, but if the performance is circa-1990 WWW, then why bother?

I would be interested in, and would even challenge anyone to make available, performance metrics for general consumption covering various scenarios of OGC-certified technology. I am talking bits and bytes, folks: network latency, refresh rates, network parameters, test data used (raster/vector), analytical vs. static interaction. To my knowledge, nothing like this is published on the OGC website, nor by any of the companies involved in OGC standards. Why not? I cannot believe that anyone forgets to do this type of testing, so let's see the results, please.

Oh, and I know a lot of factors go into these metrics, but something, anything, would be helpful in assessing the veracity and efficacy of all this OGC work, no?

Anthony
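P.S. To make concrete the kind of "bits and bytes" measurement I mean, here is a rough sketch of a latency probe against a WMS GetMap request. The endpoint and layer name are placeholders, not any real service; substitute whatever OGC-certified server you actually want to measure.

    # Minimal latency probe for an OGC WMS GetMap request.
    # WMS_BASE and the LAYERS value below are hypothetical placeholders.
    import statistics
    import time
    import urllib.parse
    import urllib.request

    WMS_BASE = "http://example.com/wms"   # placeholder endpoint
    PARAMS = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": "test_layer",           # placeholder layer name
        "SRS": "EPSG:4326",
        "BBOX": "-180,-90,180,90",
        "WIDTH": "512",
        "HEIGHT": "512",
        "FORMAT": "image/png",
    }

    def time_getmap(trials: int = 10) -> None:
        url = WMS_BASE + "?" + urllib.parse.urlencode(PARAMS)
        samples = []
        for _ in range(trials):
            start = time.perf_counter()
            with urllib.request.urlopen(url, timeout=30) as resp:
                body = resp.read()        # include full transfer time, not just first byte
            elapsed = time.perf_counter() - start
            samples.append(elapsed)
            print(f"{len(body):>8} bytes in {elapsed * 1000:7.1f} ms")
        print(f"min {min(samples) * 1000:.1f} ms  "
              f"median {statistics.median(samples) * 1000:.1f} ms  "
              f"mean {statistics.mean(samples) * 1000:.1f} ms")

    if __name__ == "__main__":
        time_getmap()

Run it against a raster layer, then a vector layer rendered to the same image size, on the same network, and you already have the beginnings of the comparison I am asking the OGC community to publish.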