Wednesday, June 11, 2008

Traffic Based Routing on VE!

In my Routing Algorithms: Where are You Taking Me Now post I discussed wanting a mapping application to route me based on current traffic conditions. Well, I am here to tell you that the good folks at VE listened. Who knew that my little blog would start shaping the direction of VE? :) Happy routing!

Wednesday, May 21, 2008

GeoWeb Grid Lock

James Fee's recent post Don't Give Away the Farm! regarding the news that "Google and ESRI will allow indexing of ArcGIS Server services by Google (and anyone who crawls the web)" got me thinking. I could not help but wonder whether those hosting ArcGIS Server solutions are prepared for the potential onslaught of traffic to their services once Google users begin scraping their data.

This reminds me of a presentation I attended several years ago at the ESRI Annual Users Conference in San Diego. A federal agency published an ArcIMS-powered site showing current wildfires in the US. The site received an enormous number of hits (for an ArcIMS web mapping site), far more than was expected and far more than the servers were capable of supporting. The agency worked with ESRI to scale the backend infrastructure to support the unexpectedly high traffic.

I can foresee this scenario happening to data providers once they allow their content to be crawled by Google. I would guess most small to medium sized organizations will be ill prepared for the potentially enormous amount of traffic their services will receive if they happen to have content the public wants.

Saturday, May 17, 2008

Government GIS: Going Public 2.0

We in government love our GIS data. We spend hours, days, weeks, months, years perfecting our data. Generally the audience for our data is the people within our jurisdictions: a controlled audience and a known way of disseminating the information. Let's face it, most of us are ESRI shops. Our data are in an RDBMS using ArcSDE/ArcGIS Server or on a file server. We share the data via desktop applications like ArcMap or via our intranet utilizing ArcIMS or ArcGIS Server.
What happens when our organizations decide to publish the GIS data to the public on the internet? Traditionally we have stuck to what we know: a watered down version of how we publish data internally. It turns out some new options have become available to us in the past few years. Google Maps, Google Earth, and Microsoft Virtual Earth have burst upon the scene. While these applications have been around for a few years now, many government agencies have not figured out how to incorporate their data within these freely available platforms.

Why is this? Are we scared? Do we not know how? For me it is a good bit of both: scared to put our data out there in a way in which we may lose control over the data, and unsure how to hop on this new wave of cloud mapping.

The idea of a government releasing its data into the collective internet universe, without retaining complete control over how that information is presented, represents a brave new world. The accuracy of a municipality's zoning data or the location of its fire stations is of utmost importance. We are tasked with providing the public with mapping information and hang our hats on the accuracy of that information. Now we are starting to give our data to companies like Google so they can integrate it into their products. Sure, the data is still copyrighted to Municipality X, or County Y, or State Z; however, Google is building a mapping wiki-like universe on the interwebs.

Here is the scenario: Municipality X provides some type of data to Google, say building models for the whole of the municipality. Then someone comes along and decides they have a better building model for an office building. Google approves the new office building created by John Q. Public mapper and the municipality's office building is replaced. So what's the problem? Is there one? Are future users of Google Earth bettered by the fact that the office building was replaced by John Q. Public? Sure, if the model published by the municipality was lacking information or was out of date. But not really if John Q. Public published something that is incorrect or of questionable quality. Google cannot be expected to thoroughly vet everything published by the public. Will the municipality have some ability to request the information be removed? Should they? Will they even try? Where is the QC? Let me ask that again: WHERE IS THE QC? Who is the ultimate decision maker?

On the flip side of the coin, is giving up a little control worth having our data reach a larger audience? How many people visit Municipality X's website versus Virtual Earth or Google Maps per day? Sure, we can build the most amazing, sexy public web mapping site filled with real-time data published by the municipality. But what if no one finds it? If no one visits the site, does it really exist?

Data providers have a far greater chance of having their data used if it is incorporated into these new wildly popular applications. If I would like people to utilize the parking structures within my municipality, where are people more likely to find them on the web prior to their visit? By combing through http://ci.municipalityx.ca.us/departments/parking/parkinglots.htm or by simply going to http://live.maps.com or http://maps.google.com? Clearly one would be more apt to find the structures within the same website they used to get directions from their house to wherever they are headed. By releasing the data from our steely grips, the general public might actually use it.

I realize I have more questions in this diatribe than answers. I wish I had answers. I wish there was more precedent on which to formulate answers. As I said earlier, this is a brave new world for some of us within the GIS community. Time will tell how well government data is integrated within Virtual Earth or Google Maps. The release of ArcGIS Server 9.3, with its ability to serve KML, will be interesting to watch. Stay tuned...the World is watching.
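As a thought experiment while we wait, here is a minimal sketch of what "serving KML" amounts to: turning rows from one of our databases into Placemarks that Google Earth or Google Maps can consume. The fire station rows and function name are made up for illustration; this is not how ArcGIS Server 9.3 implements it.

```typescript
// Minimal sketch: convert a table of features into a KML document.
// The Station rows are invented; in practice they would come out of
// the municipality's RDBMS.
type Station = { name: string; lat: number; lon: number };

function toKml(stations: Station[]): string {
  // KML wants coordinates as lon,lat,altitude.
  const placemarks = stations
    .map(
      (s) => `  <Placemark>
    <name>${s.name}</name>
    <Point><coordinates>${s.lon},${s.lat},0</coordinates></Point>
  </Placemark>`
    )
    .join("\n");
  return `<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
<Document>
${placemarks}
</Document>
</kml>`;
}

console.log(toKml([
  { name: "Fire Station 1", lat: 34.0522, lon: -118.2437 },
  { name: "Fire Station 2", lat: 34.0610, lon: -118.2500 },
]));
```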

Wednesday, October 24, 2007

Routing Algorithms: Where are You Taking Me Now?

The routing algorithm seems to be one of the most commonly used mapping algorithms among the general public. Google Maps, Yahoo Maps, Microsoft Virtual Earth, and MapQuest can all route between two locations. There is really no functional difference between any of these sites; based on a quick survey, all four sites route me the same way from my house to LAX. The resulting route is almost exactly the route I use to get to LAX (that is, if I drive and do not take the Gold Line to Union Station Fly Away, i.e. the only truly useful public transport service in Los Angeles).

Yet we can all cite many examples when we type in two addresses and are returned a route so obviously ridiculous the recommendation is thrown out without much consideration. For me this is typically a decision based on my local knowledge of the area: ‘That interchange is terrible.’ ‘That section of freeway is always backed up.’ ‘That route takes forever to get to the freeway.’ This local knowledge is great if you have it, but what if you are visiting Los Angeles and you don’t know the 405 should be avoided at all costs at almost any time of day, every day of the week? What if I need to get somewhere and there is a bad accident along the “optimal” route chosen by the routing algorithm? Just because a route is optimal in the vacuum of the map, and therefore should be the fastest route, does not mean it is optimal given the current local traffic conditions.

I would like to have more control over my routing. I am proposing the following functionality, which I feel will help produce more useful routing results.

Route through a user-defined location:
When I lived in Torrance I knew it was faster to take PCH to get to the 110 north. Every routing solution would invariably direct me to enter the freeway on Torrance Blvd. This route requires several miles of surface street driving, which is much slower than using PCH. In this case I would like to be able to enter a “must route through here” location, as in the sketch below.
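For the curious, a "must route through here" request can reduce to nothing more exotic than running the ordinary shortest-path search twice and splicing the legs together. Here is a minimal TypeScript sketch over a toy graph; the street names and travel times are invented for illustration, and real routing engines are of course far more sophisticated:

```typescript
// Toy road network: adjacency list of travel times in minutes.
type Graph = Record<string, Record<string, number>>;

// Plain Dijkstra; returns the node sequence from start to goal.
function dijkstra(g: Graph, start: string, goal: string): string[] {
  const dist: Record<string, number> = { [start]: 0 };
  const prev: Record<string, string> = {};
  const unvisited = new Set(Object.keys(g));
  while (unvisited.size > 0) {
    let u: string | undefined;
    for (const n of unvisited) {
      if (dist[n] !== undefined && (u === undefined || dist[n] < dist[u])) u = n;
    }
    if (u === undefined || u === goal) break;
    unvisited.delete(u);
    for (const [v, w] of Object.entries(g[u])) {
      const alt = dist[u] + w;
      if (dist[v] === undefined || alt < dist[v]) {
        dist[v] = alt;
        prev[v] = u;
      }
    }
  }
  const path = [goal];
  while (path[0] !== start) path.unshift(prev[path[0]]);
  return path;
}

// "Must route through here": solve start->via and via->goal separately,
// then splice the two legs (dropping the duplicated via node).
function routeVia(g: Graph, start: string, via: string, goal: string): string[] {
  return [...dijkstra(g, start, via), ...dijkstra(g, via, goal).slice(1)];
}

// On paper the Torrance Blvd on-ramp looks faster, so plain Dijkstra
// picks it; forcing the route through PCH overrides that.
const torrance: Graph = {
  Home: { TorranceBlvd: 3, PCH: 5 },
  TorranceBlvd: { Freeway110N: 10 }, // looks fast on the map, slow in practice
  PCH: { Freeway110N: 9 },
  Freeway110N: { Downtown: 15 },
  Downtown: {},
};
console.log(dijkstra(torrance, "Home", "Downtown"));
// -> [ 'Home', 'TorranceBlvd', 'Freeway110N', 'Downtown' ]  (the "vacuum" route)
console.log(routeVia(torrance, "Home", "PCH", "Downtown"));
// -> [ 'Home', 'PCH', 'Freeway110N', 'Downtown' ]           (forced through PCH)
```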

Route around a user-defined location:
Similar to the “must route through here” option, except “do not route through here”. If I know of a bad exit or construction in a certain area, I would like to be able to add a barrier preventing the routing solution from routing me through the problem area, as sketched below.
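The barrier case is even simpler to sketch: treat the user's "do not route through here" pick as a set of blocked nodes and drop every edge touching them before the search runs. Again, the network and names are invented:

```typescript
// Treat the user's barrier as a set of blocked nodes and remove every
// edge that touches one before the shortest-path search runs.
type Edge = { from: string; to: string; minutes: number };

function withoutBarrier(edges: Edge[], blocked: Set<string>): Edge[] {
  return edges.filter((e) => !blocked.has(e.from) && !blocked.has(e.to));
}

// Invented example: construction at "BadExit" means no route may use it.
const network: Edge[] = [
  { from: "Home", to: "BadExit", minutes: 4 },
  { from: "BadExit", to: "Freeway110N", minutes: 3 },
  { from: "Home", to: "PCH", minutes: 6 },
  { from: "PCH", to: "Freeway110N", minutes: 5 },
];

const usable = withoutBarrier(network, new Set(["BadExit"]));
// Any shortest-path search (e.g. the Dijkstra sketch above) run over
// `usable` can no longer produce a route through BadExit.
console.log(usable.map((e) => `${e.from} -> ${e.to}`));
```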

Google Maps has come the closest to achieving this Nirvana of routingdom. The ability to re-route on the fly based on user input is outstanding. WOW! How do they do that? It is amazing. Have Google Maps create a route between two locations. If you don't like part of it, simply select a portion of the route and drag it to an interchange, intersection, highway, street, etc. that you would like to travel through/on, and it will re-route on the fly. Good stuff in my book.

Route based on traffic speed:
VE, Google Maps, and Yahoo can all display traffic information. I want to be routed between two locations based on the current traffic conditions. If a route is given in the vacuum of the interwebs and current traffic conditions are at a crawl along the designated route, then the route provided is of little use.

Routing algorithms should use near-real-time traffic information to route me based on current conditions. All the information is there. Someone just needs to tie it all together.
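Conceptually the change is small: weight each road segment by its current travel time (length divided by observed speed) instead of its free-flow travel time, and the same shortest-path search will steer around congestion on its own. A hedged sketch, with invented numbers:

```typescript
// Weight a segment by current travel time when a live speed reading
// exists, falling back to free-flow speed when it does not. All the
// numbers and the segment name are invented.
type Segment = {
  id: string;
  lengthMiles: number;
  freeFlowMph: number; // the speed "vacuum" routing assumes
  currentMph?: number; // near-real-time traffic feed, if available
};

function travelTimeMinutes(s: Segment): number {
  const speed = Math.max(s.currentMph ?? s.freeFlowMph, 1); // guard against zero
  return (s.lengthMiles / speed) * 60;
}

const i405: Segment = { id: "I-405 N", lengthMiles: 2, freeFlowMph: 65, currentMph: 9 };
console.log(travelTimeMinutes(i405)); // ~13.3 minutes in today's traffic
console.log((i405.lengthMiles / i405.freeFlowMph) * 60); // ~1.8 minutes in a vacuum
// Feed these live weights into the shortest-path search and the
// "optimal" route automatically shifts away from the crawl.
```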

Friday, October 19, 2007

MS Virtual Earth COTS Anyone?

Will someone please make a Commercial Off The Shelf (COTS) application built on top of Microsoft's Virtual Earth API?

Microsoft's Virtual Earth is arguably the best web mapping API out there. As a nine-year GIS professional and ESRI user, I am very impressed with what I have seen from the MS product. Now I am no programmer and therefore have not gotten that deep under the hood of the MS VE API; however, I find the Live Local site far superior to Google Maps or any ESRI ArcIMS or ArcGIS Server site. I have also read and heard from others that the API is far more advanced than Google's.

I struggle with how to implement a new web mapping application based on the VE API. I am a GIS Coordinator in an ESRI shop. We are heavily invested in ESRI and ArcIMS. As we continue to drink the ESRI punch, I assume we will migrate to ArcGIS Server eventually. Don't get me wrong: I think ESRI is a great company and makes damn good GIS software. ArcIMS was revolutionary for its time...in 2000. Yet when I move from Live Local back to my organization's ArcIMS-powered web mapping application, it is painful: endless waiting for the back and forth between client and server for every operation.

My organization is in the process of migrating away from a very old, heavily customized, ArcIMS 3.1-era HTML viewer with a healthy dose of ASP to a COTS package that will be powered by ArcIMS 9.1. As I step back from this project I realize that ArcIMS has been around long enough for a whole host of third-party implementers to create applications that sit on top of ArcIMS, providing me with all the functionality I need. Back in the day, say 2000 or 2001, if one wanted to implement a web mapping application there were few choices, all of which involved heavy customization.

In 2001 the company I worked for implemented a beautiful-looking application. The interface was modeled after ArcView 3.x. There were drop-down menus and everything. Pretty cutting-edge stuff back then. The only problem was it required considerable amounts of in-house customization.

The organization I currently work for implemented our existing application in 2001. They hired a firm to build them a custom site utilizing the basic ArcIMS HTML viewer. There is code everywhere. It's like a rat's nest. Future development is a nightmare.

Now there are wonderful applications available for us to choose from. They can do everything for me: connect to my Oracle, SQL Server, and Access databases; produce beautiful-looking maps from the same templates I create in ArcGIS. I don't even have to open an AXL file anymore!!!

But what if I want to use the cool new Virtual Earth API? Yes, I get the nice-looking base data, the pretty darn good 1-foot imagery, the excellently implemented Pictometry imagery (oblique/Bird's Eye view), and the fast panning and zooming that comes with the VE tiling algorithm. But to actually implement an application that meets the needs of my organization, we are required to write tons of code. Granted, the code is .NET and not HTML, JavaScript, or ASP, but it is still tons of code.

I want to be able to integrate my existing spatial data residing in Oracle or SQL Server into a VE-powered application. I want to integrate my other tabular data within the site. I want a web-based application management tool that will allow me to configure and manage the delivery of my data within the site.
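To give a flavor of the hand-coding involved, here is a minimal TypeScript sketch of the kind of glue a COTS product would hide from us: pushing database rows onto the map as pushpins. The ambient declarations cover only a handful of real members of the classic Virtual Earth control (VEMap, VEShape, VEShapeType, VELatLong); the parking lot rows and everything else are my own inventions for illustration.

```typescript
// Assumes a page that loads the Virtual Earth 6 control script and
// contains a <div id="mapDiv">. Minimal ambient declarations, just
// enough to compile this example.
declare class VELatLong { constructor(latitude: number, longitude: number); }
declare enum VEShapeType { Pushpin }
declare class VEShape {
  constructor(type: VEShapeType, point: VELatLong);
  SetTitle(title: string): void;
  SetDescription(html: string): void;
}
declare class VEMap {
  constructor(divId: string);
  LoadMap(): void;
  AddShape(shape: VEShape): void;
}

// Hypothetical rows, standing in for features pulled out of an Oracle
// or SQL Server spatial table by some server-side page.
const parkingLots = [
  { name: "Lot A", spaces: 120, lat: 34.0522, lon: -118.2437 },
  { name: "Lot B", spaces: 45, lat: 34.0489, lon: -118.2510 },
];

const map = new VEMap("mapDiv");
map.LoadMap();

for (const lot of parkingLots) {
  const pin = new VEShape(VEShapeType.Pushpin, new VELatLong(lot.lat, lot.lon));
  pin.SetTitle(lot.name);
  pin.SetDescription(`${lot.spaces} spaces`);
  map.AddShape(pin);
}
```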

Is there someone working on this? If not then why?

Inaugural Post

Well, I finally gave in. I have created my first blog: GIS Cogitations.

GIS: Geographic Information System
If you don't know what a GIS is, move on; these aren't the droids you're looking for.
Cogitation: Cog-i-ta-tion
(kŏj'ĭ-tā'shən) n
1. Thoughtful consideration; meditation.
2. A serious thought; a carefully considered reflection.

I fully expect to have a minute existence in the global blogosphere. If by some chance others happen upon this blog, I hope you find some useful information. Heck, if you feel so inspired, please feel free to comment on my ramblings.

I do not expect posts to be regular or based on some compelling need to grace the world with my daily thoughts on GIS. Rather, I will make additions when I am inspired, which is typically when there is a cool project going on at work, after reading an interesting article, or after returning from a conference, training, or presentation.