I recently caught up on some TED videos, having been meaning to look through their back catalogue (well worthwhile, by the way!). Anyway, Yann Arthus-Bertrand gave a captivating talk at TED earlier this year; he is best known for his stunning aerial imagery, with his photos representing both art and storytelling. He has taken this to the next level with the release in June of Home, the story of Earth told through his imagery and the narration of Glenn Close. It is a captivating 1 hr 35 mins and, whilst not everyone will agree with his ecological politics, he uniquely captures our moment on the planet and provides a forum for discussing the future. Unusually, the film is also copyright free and can be distributed widely. YouTube provided a download until last week, but this has now disappeared (for no apparent reason); however, Legal Torrents has the widescreen MP4 available (Firefox users can use Fire Torrent to download it). Note that if you burn this to a disc, many DVD players won’t support the screen resolution, meaning you’ll need to transcode it to DVD size.
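For that transcoding step, ffmpeg will do the job. Here is a minimal sketch (the filenames are just placeholders, and it assumes you have ffmpeg installed) that builds a command using ffmpeg’s built-in `pal-dvd` target, which sets the DVD-compliant resolution, frame rate and bitrates for you:

```python
import subprocess

def dvd_transcode_cmd(src, dst, standard="pal"):
    """Build an ffmpeg command that transcodes a widescreen MP4 down to a
    DVD-compliant MPEG-2 file (720x576 for PAL, 720x480 for NTSC).
    The -target preset handles resolution, frame rate and bitrates."""
    if standard not in ("pal", "ntsc"):
        raise ValueError("standard must be 'pal' or 'ntsc'")
    return ["ffmpeg", "-i", src,
            "-target", f"{standard}-dvd",
            "-aspect", "16:9",   # keep the widescreen aspect ratio
            dst]

# Example (filenames are hypothetical):
cmd = dvd_transcode_cmd("home_widescreen.mp4", "home_dvd.mpg")
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually run the transcode
```

You would then author the resulting MPG to DVD with your burning tool of choice.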
I attended the OGC UK and Ireland Forum today, which was an interesting experience. OGC (Open Geospatial Consortium) should be well known to all as the overarching industry body that defines geospatial standards (GML probably being the best known). Perhaps what many won’t realise is that OGC has been around since 1994 and has been very active in the geospatial arena, almost mirroring the W3C in terms of timeline but, arguably, being far more successful.
Anyway, the Forum rolled out the big guns in the form of Mark Reichardt (CEO) and David Schell (Chairman). And whilst the “advert” for the day didn’t really make it clear, this was a “big” relaunch of the Forum. OGC is clearly very keen to promote the fact that it is an international organisation and, whilst it exists to define and disseminate standards, it is driven by its members, who bring very specific cultural and legal perspectives. Local fora are therefore intended to provide this focus.
What does that actually translate into? Key areas were persistent test beds, accessibility of data/software for testing, translation of standards into “meaningful” real-world case studies, and persistence of information. The list can become quite large, yet it fails to recognise “consumers” and what they can use OGC standards for. There is clearly some further discussion needed to shape the Forum, but I think it was a good start. Steven Feldman offers a more pessimistic view, which I can understand: in the intervening four years since the last UK Forum, the consumer has really laid down the gauntlet in terms of what they want from geospatial web services. And the insatiable demand for (government) geospatial data is only increasing. With the inevitable IPR issues surrounding this whole area (particularly with commercial interests), this point in time may prove a tipping point (and I guess it’s telling that KML was taken on as an OGC standard for this very reason).
Smith, M.J. and Pain, C.F. (2009) Progress in Physical Geography, 33, 4, 568-582.
Remotely sensed imagery has been used extensively in geomorphology since the availability of early Landsat data, with its value measurable by the extent to which it can meet the investigative requirements of geomorphologists. Geomorphology focuses upon landform description/classification, process characterisation and the association between landforms and processes, whilst remote sensing is able to provide information on the location/distribution of landforms, surface/subsurface composition and surface elevation. The current context for the application of remote sensing in geomorphology is presented with a particular focus upon the impact of new technologies, in particular: (i) the wide availability of digital elevation models and (ii) the introduction of hyperspectral imaging, radiometrics and electromagnetics. Remote sensing is also beginning to offer capacity in terms of close-range (<200 m) techniques for very high resolution imaging.
This paper reviews the primary sources for DEMs from satellite and airborne platforms, as well as briefly reviewing more traditional multi-spectral scanners, and radiometric and electromagnetic systems. Examples of the applications of these techniques are summarised and presented within the context of geomorphometric analysis and spectral modelling. Finally, the wider issues of access to geographic information and data distribution are discussed.
It’s hardly a “new” story, but the whole issue of open source cropped up recently. I’m currently lead editor for a book on geomorphological mapping which will incorporate a DVD. We want to make best use of this and include as much imagery, data, software, etc. as is appropriate. I need to clarify the T&Cs of some “free” products as, whilst some are free (as in beer), they are not necessarily free (as in speech) to do with as you wish.
Many products use GNU (GPL) licensing, which makes things much simpler: the software can be redistributed freely (you may charge, though in practice charges tend to cover distribution costs), and if you modify and redistribute it, that modified source code must be made available.
Which of course brought us back to the issue of commercial software and this priceless quote which came my way:
“Seems like ESRI is the Microsoft of GIS (huge, bad, expensive, wrong, awkward, etc).”
Now commercial is not necessarily bad and Ryan Strynatka (ERDAS) gave a nice comment:
As for software: cost is always relative. In the commercial world the ROI for photogrammetry software is quite high, otherwise you wouldn’t see so many successful commercial mapping firms.
Clearly organisations believe they are getting good ROI from ArcGIS. The more generic GIS arena is somewhat different to the highly specialised photogrammetric one, and there are some credible open source alternatives starting to make inroads, in much the same way that OpenOffice has. MapServer, OpenLayers, uDig, QGIS and MapWindow are all proving popular. Of course, I’d be the first to say that ArcGIS is as close to the “one-stop shop” as you can get, but there is plenty that could be considerably better. And I wouldn’t be surprised if it is government that drives this area forward on the basis of open-specification (ISO) file formats. Google also understands the importance of “place”, so in the same way it is going after Microsoft, expect inroads on the geospatial front. Competition is good for the user!
ASTER GDEM is here! I blogged about this last year and there has been some eager anticipation over the arrival of the product (see GIM and the BBC). Whilst ASTER has been experimental, I don’t think any remote sensing scientist can claim that it has been anything other than a huge success; 14 spectral bands, relatively good spatial resolution and continuous stereo (and so the ability to generate DEMs) make it an almost ideal Earth imaging system. GDEM is the culmination of 10 years of stereo data collection, allowing the creation of a near-global DEM at 1 arc-second (~30 m) resolution. A real success.
The data are currently being distributed, freely, via ERSDAC and NASA. The ERSDAC site is much easier to use, but I never successfully downloaded anything, due (I assume) to the load on the server. The NASA WIST interface is (very!) convoluted but works extremely well. You must register on the site, ignoring the protestations that the data are offline and that you need to pay. Once through the full order process you will get a series of emails that eventually let you download the data via an FTP server. I used the map to define all of Ireland and then grabbed the data, which are stored as 1x1 degree tiles; all very easy to do. I then used ERDAS Imagine to mosaic the tiles together.
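If you would rather script the tile selection than click around a map, the 1x1 degree tiles are easy to enumerate, since each is named after its south-west corner. A minimal Python sketch (note the `ASTGTM_N51W011`-style naming here is an assumption based on the files I received; check the README for the authoritative scheme):

```python
import math

def gdem_tiles(lat_min, lat_max, lon_min, lon_max):
    """Enumerate the 1x1 degree ASTER GDEM tiles covering a bounding box.
    Each tile is identified by its south-west corner, e.g. N51W011.
    (The 'ASTGTM_' prefix is an assumption; adjust to match your download.)"""
    tiles = []
    for lat in range(math.floor(lat_min), math.ceil(lat_max)):
        for lon in range(math.floor(lon_min), math.ceil(lon_max)):
            ns = "N" if lat >= 0 else "S"
            ew = "E" if lon >= 0 else "W"
            tiles.append(f"ASTGTM_{ns}{abs(lat):02d}{ew}{abs(lon):03d}")
    return tiles

# All of Ireland, roughly 51-56N and 11-5W (west longitudes are negative):
ireland = gdem_tiles(51.0, 56.0, -11.0, -5.0)
print(len(ireland), "tiles, e.g.", ireland[0])
```

Handy for checking that the FTP delivery actually contained everything you ordered before you start mosaicking.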
So what about the data itself? Well, the README is well worth reading; the main points are: coverage of land surfaces from 83°N to 83°S, 1x1 degree tiles, 1 arc-second resolution, vertical accuracy of 20 m and horizontal accuracy of 30 m. The file also presents some preliminary QA for this initial release and notes:
“while the elevation postings in the ASTER GDEM are at 1 arc-second, or approximately 30 m, the detail of topographic expression resolvable in the ASTER GDEM appears to be between 100 m and 120 m.”
My initial viewing of the data seems to support this and, generally, I would say the product is of lower quality than SRTM (C/X band), both in terms of vertical accuracy and resolvable features; 30 m C-band or 25 m X-band remain generally better products. However, there are some things worth remembering: (1) GDEM extends to 83°, covering significantly more Arctic/Antarctic terrain than SRTM; (2) GDEM was collected over 10 years rather than the 11 days of SRTM, so environmental change will be an issue; and (3) SRTM is a consistent product that has been refined considerably since its first release, whereas the quality of GDEM will be spatially variable, depending upon the imaging conditions.
It’s great to see this data out; it should prove complementary to SRTM. The coming years should see refinement.
On the one hand you have a series of government-commissioned reports stating that releasing spatial data at marginal cost would generate significant economic benefits, including the Cambridge University study, which looked purely at the economic drivers. On the other hand you have an internal OS report with some obvious flaws, which Ed, rather bluntly, notes “reads like a poor MSc thesis.” Pulling no punches there, Ed! And who is the mysterious “internationally recognised expert”? After the way this report has been handled there is no hope in hell that they will want to be identified as sanctioning “a poor MSc thesis”, and without a name it remains just that: a poor MSc thesis.
It’s about time someone put their hand up and said: actually, we don’t really care about the country, the economy or the quality of our GI; what we simply want to do is reduce our published balance sheet. We don’t provide a service, we tick boxes. Now if only GI wasn’t quite so important…