During one of the lunch sessions at DMT the question of using Google Earth to make geological data easily viewable and more accessible was raised. Harvey Thorleifson (many thanks for the heads-up on this!) of the Minnesota Geological Survey organised a group meeting with the Google people to discuss just this, and they did actually go to Google headquarters to chat through the possibilities. The long and the short of it is that you can transparently drape point, line and polygon layers over the top of the surface, much as with any other data set. What you CANNOT do (and I quote from Harvey) is:
1. You cannot query polygons
2. You cannot place content below the earth surface image
3. You cannot place content below earth surface elevation
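To make the "draping" concrete, here is a minimal KML fragment for a polygon clamped to the terrain surface. This is purely illustrative: the placemark name and coordinates are invented, and real geological layers would of course carry many such features plus styling.

```xml
<!-- Illustrative sketch only: a single surface-draped polygon.
     Name and coordinates are made up for this example. -->
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Example geological unit</name>
    <Polygon>
      <!-- clampToGround drapes the polygon over the terrain -->
      <altitudeMode>clampToGround</altitudeMode>
      <outerBoundaryIs>
        <LinearRing>
          <!-- lon,lat,alt triplets; first and last vertex must match -->
          <coordinates>
            -81.03,34.00,0 -81.02,34.00,0 -81.02,34.01,0 -81.03,34.00,0
          </coordinates>
        </LinearRing>
      </outerBoundaryIs>
    </Polygon>
  </Placemark>
</kml>
```

Note there is nothing here to attach queryable attributes to the polygon itself, which is exactly the first limitation above.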
Apparently streaming speed is what drives web site visits, and therefore revenue, so anything that hits this without good reason is not allowed. Whilst the latter two restrictions largely affect geological/earth science phenomena, and so are possibly of limited scope, the first is immensely important to just about every application area (if you link to JoM's RSS feed you will see that you can click on maps presented as points, but not as polygons). Will Google revisit this? Well, the answer was possibly, with the BIG driver being real estate. This is one application area with enough users, and therefore potential revenue, to force a change from the current situation.
So the reason for going to the US was to attend DMT07, which is principally aimed at geologists and cartographers from state geological surveys, although this year there were representatives from the UK, the Czech Republic and Japan. What I hadn't appreciated was that whilst there is the overarching United States Geological Survey, much of the geological work is done at state level. As a result there are 50-odd mini countries all doing things slightly differently with different levels of resources. At the federal level the USGS also has to undertake topographic survey as well. In terms of resources, some states (e.g. Illinois) have over 200 staff, others have fewer than 5, whilst some (e.g. Hawaii) don't have a survey at all! So all in all it's an interesting dynamic.
And standards were very much a theme at the conference, with a desire to move to an entire end-to-end digital workflow, incorporating PDAs/laptops in the field through to map output using ArcGIS or Illustrator. To that end one of the groups has been working on "ArcGeology", an ArcMap data model (that has interested ESRI). Whilst geological maps present points, lines and polygons, the polygons are actually inferred from point and line data. The data model therefore only stores points and lines at the primary data level, with all polygons being generated (using the points/lines-to-poly command). Any change to the primary data requires new polygon creation. I thought this was a nice solution, and it echoes some current ideas on data models in GIScience (e.g. Goodchild et al, 200X).
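The "polygons derived from lines" idea can be sketched in a few lines of Python. This is not the ArcGeology implementation, just a toy illustration with hypothetical contact-line data: the primary store holds only line segments, and the polygon (here a single closed ring and its area) is rebuilt from them on demand.

```python
# Toy sketch of deriving a polygon from stored contact lines.
# The segments below are hypothetical data, not from any real map.

def build_ring(segments):
    """Chain (start, end) line segments into an ordered, closed ring of vertices."""
    remaining = list(segments)
    ring = list(remaining.pop(0))          # seed the ring with the first segment
    while remaining:
        for i, (a, b) in enumerate(remaining):
            if a == ring[-1]:              # segment continues the ring forwards
                ring.append(b)
                remaining.pop(i)
                break
            if b == ring[-1]:              # segment stored in reverse direction
                ring.append(a)
                remaining.pop(i)
                break
        else:
            raise ValueError("segments do not form a closed ring")
    if ring[0] != ring[-1]:
        raise ValueError("ring is not closed")
    return ring

def shoelace_area(ring):
    """Area of a closed ring via the shoelace formula."""
    area = 0.0
    for (x1, y1), (x2, y2) in zip(ring, ring[1:]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# Four contact lines bounding a unit square (primary data).
contacts = [((0, 0), (1, 0)), ((1, 0), (1, 1)),
            ((1, 1), (0, 1)), ((0, 1), (0, 0))]

polygon = build_ring(contacts)
print(shoelace_area(polygon))  # 1.0
```

Edit any contact line and the polygon is simply regenerated, which is exactly why the model treats polygons as derived rather than primary data.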
Another major thread of the conference was data transfer, with a particular focus upon conversion between different ontologies. GeoSciML is an XML schema, currently in development, that aims to address this issue. However, it stores data as points, lines and polygons, with no transfer of cartographic detail (e.g. line weight, colour etc.). This is clearly in contrast to the ArcGeology data model, and we are potentially seeing the loss of geological information at either the data model or the data transfer stage.
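A transfer document in this style might look something like the fragment below. To be clear, this is a hand-drawn sketch in the spirit of GeoSciML rather than a valid instance of the actual schema; the element names approximate its vocabulary and the unit name is invented. The point to notice is that only geometry and geological attribution travel, with no line weights or colours anywhere.

```xml
<!-- Illustrative sketch only, not valid against the real schema. -->
<MappedFeature>
  <observationMethod>field mapping</observationMethod>
  <shape>
    <gml:Polygon>
      <gml:exterior>
        <gml:LinearRing>
          <gml:posList>0 0 1 0 1 1 0 0</gml:posList>
        </gml:LinearRing>
      </gml:exterior>
    </gml:Polygon>
  </shape>
  <specification>
    <GeologicUnit>
      <name>Hypothetical Sandstone Formation</name>
    </GeologicUnit>
  </specification>
</MappedFeature>
```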
The final key in this digital workflow relates to 3D. Geology is about more than representing the surface geology on a map, hence the cross-sections so often included. Map authors will have some kind of mental 3D model of the structure of an area, so the 2D paradigm is actually not appropriate. What is needed is a way of capturing this 3D information digitally so that it can be appropriately modelled, something that Ian Jackson (BGS) really brought home. He suggested that a move to define a data model for ArcMap may well be 5-10 years too late, as it is purely 2D.
So whilst DMT is about digital mapping and cartography, this year it demonstrated how disciplines still struggle to model their subject matter, 25 years after GIS became a commercial product.
Well if the Chinese are wedded to their phones, then the Americans are wedded to their cars. I had forgotten just how much space there is in American cities, and Columbia is no different. They even have countdown timers on the pedestrian crossings because it takes so long to cross! Anyway, there are no shops downtown and, even if there were, they don’t open on a Sunday. Even walking around downtown itself is a challenge because things are so far apart. Really you do need a car to do anything meaningful. And everywhere you go you see constant reminders; drive-thru ATM, drive-thru pharmacy, drive-thru fast food. Should have brought my folding bike with me…..
(Footnote: I wonder what people think the Brits are wedded to….)
I’m actually presenting at Digital Mapping Techniques this week, hence the flight out of Gatwick, via Philadelphia, to Columbia (South Carolina). From my brief visit to the tourist attractions website, Columbia is one of only a few cities with four interstates running right through it. Not something I would shout about.
Anyway, one uneventful flight over to Philadelphia and a 10 minute wait at Immigration led me to the “Secondary Inspection Office”, where I was informed I had an outstanding warrant for my arrest. Now, being the law-abiding fellow that I am, I knew there has never been a warrant issued for my arrest; this, it turns out, is one of the less pleasurable aspects of being an anonymous Smith. Not having travelled to the US in over 10 years, the name on an unknown passport was flagged up. I have to admit to being just a little nervous, however the officer from Homeland Security was, I am positive (!), John Goodman (Roseanne, King Ralph etc. etc.) and (thankfully) rapidly cleared up the mistaken identity. Interestingly though, amongst all the “due process”, no one ever got around to taking my fingerprints and photo.
Even after all that, the bags still weren’t out, which left 1hr 20mins to re-check the bags, make a slow trawl through security and catch a bus across what seemed like half of Philadelphia. The moral of the story is that 2 hour changeovers can often be a tight call.
Pondering what to do at Gatwick airport whilst waiting to fly out, what better than to sit down and ponder life over a pleasant coffee? OK, I haven’t been to Gatwick for a while and, well, it’s a shambles of an excuse for an international airport. The only bonus is that you really do step off the train and into departures. I must also congratulate US Airways on an excellent web check-in service that genuinely did mean zero queueing at the check-in desk.
Anyway, back to that coffee. Besides there being no “real” coffee houses (Starbucks, Costa, Nero etc. etc.), Upper Crust generally seems to do, if nothing else, a satisfactory job. Well, not this time. A row of filthy coffee machines, pre-prepared “McDonald’s style” coffee (and that’s doing an injustice to McDonald’s coffee, which is half decent), rotting fruit and general mess. AVOID LIKE THE PLAGUE (which you might just catch there).
Well, it’s a good question and something many authors want to know. ScienceDirect, one of the largest distributors of online journals, now has a site dedicated to listing article downloads. Called ScienceDirect Top 25 Articles, it lists, not surprisingly, the top 25 articles at ScienceDirect. However, what’s powerful about the web-based front end (and the download statistics behind it) is that you can filter by journal and time period. Of course it doesn’t include all journals worldwide, just the ScienceDirect stable, but it really is a good place to start.
There are various ways that this could be improved though. The time periods are fixed, whereas it would be nice to set your own. It would also be nice to be able to filter by author or institution. This could help you answer questions like “which institution has the most downloaded earth science articles?”.
Those of you who read this blog will know that I use WorldKit at the Journal of Maps to display locations where we have published material. It’s lightweight, functional and ideal for a simple web map client. Well, Poly9 have upped the ante in this area with a preview release of their Flash Virtual Earth client. Not surprisingly it mimics the way Google Earth works, but does it in Flash Player. And it does it all very nicely, so definitely one to watch.
Swim the bloody Atlantic Ocean!! All 3,462 miles of it!!
I love these Google people, they really have a sense of humour. What’s funny though is that you don’t have to swim the Channel first (Dover to Boulogne), then go across to Le Havre and swim the Atlantic. Funny that you then land at Boston. Must be the best places for swimmers!
I regularly get students on my courses to give assessed presentations. Whilst we are primarily interested in academic achievement, we do utilise a variety of different assessment methods, of which presenting is one. And it’s also important because students will, at some point, have to present in their professional careers, so practice and experience now are well worthwhile.
The question is, how do you actually assess presenting? Well, we are not expecting thespian oratory from our students, that’s for sure. However, a presentation does need to be well organised, clear and generally delivered in a positive manner (as ever, there is always room for improvement). There is also the added complexity of assessing the academic content, even more so when this might run in tandem with a project and there is the possibility of assessing the content twice.
I first came across these issues when I was a Visiting Lecturer at the University of Luton (now the University of Bedfordshire, however we’ll leave branding for another blog entry) and, with the help of my wife, put together a marking sheet which has evolved over time. I would be interested to see other people’s experiences, both in geography and across into other disciplines. In this scheme 60% of the mark is for the presentation itself (with the remaining 40% for the academic content). This is split between 30% on presenting (visual aids, pacing, engagement) and 30% on structure/organisation (objectives, explanation, structure). It’s by no means perfect but does offer a consistent way to mark presentations and shows students what to focus upon.
As an avid Firefox user I have a whole load of extensions installed for various purposes. I’ve blogged on some of them, including Sage and AllPeers amongst several. When you install an extension in Firefox it loads the data into your profile and gives you no opportunity to back up any of the data. Whilst you can clearly run Portable Firefox and save the entire installation, that doesn’t help you when you need to reinstall everything from scratch. This is where FEBE (Firefox Extension Backup Extension) comes in; it does exactly what it says on the tin. If you want a backup of all your extensions this tool will be invaluable.
On its own, FEBE is a great little tool; however, the authors have gone further and released another extension called CLEO (Compact Library Extension Organizer). CLEO takes any number of extensions you select and produces a single, installable .xpi file. A brilliant way to produce a single package to reinstall after a crash, or to distribute to friends/colleagues. All genuinely useful!