The README is missing too many things, like how to get data in and out using existing tools.
Are there any plans for GIS standards support? Can I use it with PostGIS/ArcInfo/civil engineering CAD software, upload SHP files, WKT data, my muni's folio/block/lot tax data, the bus routes, population data from the census, or any other 100+ years' worth of very expensive existing technology (like satellite imagery)?
There is a very large and older-than-internet software ecosystem this could play in.
It's the other way around too. Once you can use this with PostGIS, the platform becomes immediately more useful because you can now interact with most GIS data, tools, and standards.
Think separation of concerns - use a storage engine for spatial data and a separate engine for rasterization and tile generation - and how this could be used in a market that still expects proprietary data AND software.
The spatial data already exists, and an entire industry is built on pretty much a couple of companies' tools (ESRI, Oracle, DigitalGlobe, GeoEye). You need to have the protocols or interfaces to interact with those storage engines. It will probably be another 50+ years before those databases change in any sort of major technological way.
In many cases tile generation, and access to those tiles (usually via a web interface), is lacking or extremely difficult / expensive to set up.
This is a really interesting project. I love the concept of a fast, in-memory geospatial database - kind of like Redis for Geo (interestingly Redis itself has grown some basic Geo features recently, but nowhere near as comprehensive as this - http://redis.io/commands/georadius ).
The way you can subscribe to geofenced areas is particularly clever, especially when combined with the WebSocket protocol option.
The docs don't cover how to communicate polygons/etc over the protocol just yet.
And the connection will be kept open. If any object enters or exits the 6 km radius around 33.462,-112.268, the server will respond in real time with a message such as:
{"command":"set","detect":"enter","id":"truck02","object":{"type":"Point","coordinates":[-112.2695,33.4626]}}
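Since that notification is plain JSON, a client can decode it directly. A minimal Go sketch, with struct fields modeled only on the example message above (any other fields the server might send are ignored here):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// fenceMsg mirrors the geofence notification shown above.
// Only the fields from the example are modeled.
type fenceMsg struct {
	Command string `json:"command"`
	Detect  string `json:"detect"` // "enter" or "exit"
	ID      string `json:"id"`
	Object  struct {
		Type        string    `json:"type"`
		Coordinates []float64 `json:"coordinates"` // [lon, lat]
	} `json:"object"`
}

// parseFence decodes one notification line from the open connection.
func parseFence(raw string) (fenceMsg, error) {
	var m fenceMsg
	err := json.Unmarshal([]byte(raw), &m)
	return m, err
}

func main() {
	raw := `{"command":"set","detect":"enter","id":"truck02","object":{"type":"Point","coordinates":[-112.2695,33.4626]}}`
	m, err := parseFence(raw)
	if err != nil {
		panic(err)
	}
	fmt.Printf("%s %s at %v\n", m.ID, m.Detect, m.Object.Coordinates)
}
```

Note that the coordinates inside the GeoJSON object are ordered [lon, lat], the reverse of the lat,lon order used in the radius query.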
Honest non-troll question: when you say high-performance, what's the measurement? I'm honestly curious what the right benchmark is for a DB like this, especially one that's doing a variety of spatial queries.
I'm specifically interested in how it performs with a lot of writes constantly happening.
You bring up a good point. This is a subjective statement that I made. While I feel that it's a feature to have good performance, it's certainly something that should have a benchmark tool to back it up.
I'm going to generate some benchmarks and post in the coming days. Thanks a ton for the suggestion.
The README is a work in progress and I'm working to have a complete set of documentation soon.
Regarding elevation: all coordinates may have an optional 'z' member. For example, you can add a simple point `set garden tomatoes point 33.5091 -112.7998 125`, which creates a point with 'z' equal to 125. Or you could have a complex GeoJSON object containing multiple coordinates or features. There are no limits on this value except that it must be a number.
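For the GeoJSON route, the 'z' value rides along as an optional third coordinate, and GeoJSON orders coordinates [lon, lat, z], the reverse of the lat-lon order in the `set ... point` command. A small Go sketch of building such an object (the `geoPoint` helper is my own, not part of the project):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// geoPoint builds a GeoJSON Point. Pass z as an optional third
// coordinate; GeoJSON coordinate order is [lon, lat(, z)].
func geoPoint(lat, lon float64, z ...float64) ([]byte, error) {
	coords := append([]float64{lon, lat}, z...)
	return json.Marshal(map[string]interface{}{
		"type":        "Point",
		"coordinates": coords,
	})
}

func main() {
	// Same garden/tomatoes point as above, with z = 125.
	b, err := geoPoint(33.5091, -112.7998, 125)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(b))
}
```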
At this time the only supported coordinate system is WGS84. Though I'm open to suggestions.
I generally have the same reaction. However, if it's an open source project then I suppose the language is a feature, because it tells you how easy it might be for you to modify it based on the languages you are most familiar with.
For closed source, though, I wouldn't class the language as a feature.
Since it's a Go binary, it should be pretty easy to run this as a standalone server (behaving a bit like Redis). It looks like it should be really easy to communicate with it from Ruby via the HTTP interface.
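If I'm reading the docs right, the HTTP interface takes a whole command as the request path with terms joined by '+', so any language with an HTTP client works, Ruby included. A Go sketch of building such a request URL (the default port 9851 and the `commandURL` helper are my assumptions; check the project's README):

```go
package main

import (
	"fmt"
	"strings"
)

// commandURL joins command terms with '+' into a request path,
// the form the server's HTTP interface appears to accept,
// e.g. GET http://localhost:9851/get+fleet+truck1
func commandURL(host string, args ...string) string {
	return fmt.Sprintf("http://%s/%s", host, strings.Join(args, "+"))
}

func main() {
	u := commandURL("localhost:9851", "set", "fleet", "truck1", "point", "33.462", "-112.268")
	fmt.Println(u)
	// Against a running server you would then issue
	//   resp, err := http.Get(u)
	// and decode the JSON reply.
}
```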