More Editorial Musings

Tuesday, 28 November, 2006

Some more musings on the role of editors in academic journals, prompted by a paper I submitted to the Journal of the American Society for Information Science and Technology last year. In its original form, two reviewers highlighted both the strong and weak aspects of the paper. One suggested it would sit well as a short communication if re-submitted (and actually noted that the topic was “excellent”. Warm glow!), and on this basis the editor recommended a re-write. After six months (yes, I really should have done it sooner) I sat down, shortened the paper and re-submitted it. It then took another six months for the review to be completed before it was finally rejected. What surprised me was that the paper was reviewed from scratch and the original reviewers’ comments were abandoned, with one of the new reviewers stating it was “not likely to be of interest to the readers” (definitely not a warm glow on that count!).

So what is going on in all of this? Well, I would normally re-submit a paper, addressing the points raised by the reviewers in an attached letter. An editor would be expected to check that these points had been properly addressed and then either accept or reject on that basis. For whatever reason, the paper went out for a second review, which was not favourable. Clearly this placed the editor in a difficult position: two sets of reviews, the first generally positive and the second generally negative. Which are “better”? In the end the paper was rejected, but the episode clearly highlights both the role of the editor in the whole process and, more importantly, the careful selection of referees (something also highlighted by the IJRS article retractions). It is referees that are both the strong and the weak link in the whole review process. You need “experts” in a field of study, but can you find them? Are they expert across the scope of a whole paper? Are they biased? And will they do it?! Ultimately, all of these things need to be balanced.

Online Backup

Friday, 24 November, 2006

I blogged last week about having a reliable backup routine for data on a PC. In this I mentioned that I have five copies of my data, including archives and offsite backup. Whilst it is relatively simple to set up a backup routine to another hard disk drive (internal or external, or indeed both!) using something like Second Copy or Microsoft’s free SyncToy (a sketch of what such a routine boils down to is below), offsite backup is a little more complicated. This would traditionally have been done to tape, which would have been taken away at night. People have more recently used CDs and DVDs and the new generation of Blu-ray discs.
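For anyone curious what a one-way mirroring routine of the Second Copy/SyncToy variety boils down to, the core job is only a few lines: copy anything new or newer from the source to the destination. The following is a minimal, illustrative Python sketch under my own assumptions (the source and destination paths are placeholders, and it is not a substitute for proper backup software):

```python
import shutil
from pathlib import Path

# Minimal one-way mirror: copy new or modified files from SOURCE to DEST.
# Both paths are placeholders -- point them at your data folder and your
# external (or second internal) drive.
SOURCE = Path(r"C:\Data")
DEST = Path(r"E:\Backup\Data")

def mirror(src: Path, dst: Path) -> None:
    for item in src.rglob("*"):
        target = dst / item.relative_to(src)
        if item.is_dir():
            target.mkdir(parents=True, exist_ok=True)
        elif not target.exists() or item.stat().st_mtime > target.stat().st_mtime:
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(item, target)  # copy2 preserves timestamps

if __name__ == "__main__":
    mirror(SOURCE, DEST)
```

Schedule something like that nightly and you have the “another hard disk” part covered; it is the offsite part that takes more thought.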

There are two problems with the disc-based approach: the first is actually remembering to do it, making sure there is a disc in the drive, and taking it offsite! The second is disc capacity. I have 40GB of data, which won’t easily fit on a DVD (and I don’t want to be sat at a PC slotting discs in and out).

The simplest method is actually online backup. This ranges from free space at places like Box.net through to the rather neat Firefox extension called GSpace that allows you to dump files into the free space within a Gmail account. Ultimately, though, these are only of the order of 1-2GB, so they are suitable for some users but not a total solution.

The cheapest of the online storage solutions is Carbonite (my referral URL), offering unlimited space for $5 a month. That alone is great, but for me it’s the software that totally sells the solution. Operating as an extension within Windows Explorer, it simply monitors your chosen directories and automatically backs up your data, compressed and encrypted, to Carbonite. You don’t have to think about it. And once it’s done the initial upload it simply copies file changes. Restoring files is painlessly simple, again using Windows Explorer to access your remote data and marking the files you want to restore. All in all it’s a brilliant solution that I can heartily recommend.

Firefox EXIF Extension

Wednesday, 22 November, 2006

Following on from my blog about the potential use of EXIF headers in JPEGs, I came across an extension for Firefox called EXIF Viewer. It does what it says on the tin in that it allows you to view EXIF information for JPEGs. This is an early version, so it’ll be interesting to see how it develops.
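If you want the same information outside the browser, the EXIF headers are easy enough to read programmatically too. A minimal Python sketch using the Pillow imaging library (the filename is just a placeholder):

```python
from PIL import Image, ExifTags  # Pillow

# Print the EXIF tags of a JPEG -- roughly the information the
# EXIF Viewer extension displays. "photo.jpg" is a placeholder.
def dump_exif(path: str) -> None:
    exif = Image.open(path).getexif()
    for tag_id, value in exif.items():
        name = ExifTags.TAGS.get(tag_id, tag_id)  # map numeric tag IDs to readable names
        print(f"{name}: {value}")

dump_exif("photo.jpg")
```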

Exporting References from Endnote

Friday, 17 November, 2006

I was putting together a relational database recently that needed to contain a table of references. The references themselves were sat in Endnote, so I thought it would be straightforward to export them into something like a CSV or tab-delimited file. It’s not though! The “Export” feature doesn’t do what you expect: it supports TXT, RTF or XML, exporting using the currently selected output style.

The solution is to create an output style using the (text) format you prefer. I like working with CSV files because they are straightforward to manipulate. Whilst a tab-delimited output style is supplied as part of Endnote, a CSV one is not, so I created a very simple CSV output style. With this output style selected, you make sure all the references are highlighted and then go to Export in the File menu. A new TXT file will be generated that is a CSV and can be dumped straight into Excel or a database.

Note: I only created the output style for “Reports” and “Journals” and, for some strange reason, Endnote wouldn’t put a comma after the author field (but did after all the others). I changed this to a * and then did a simple find and replace in my text editor to put the commas back in.
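That find-and-replace step is easy enough to script as well. A quick Python sketch (the filenames are placeholders, and it assumes no record genuinely contains a * character):

```python
import csv

# Swap the stand-in "*" separator back to a comma, save the result,
# then read it back as CSV to check the rows parse cleanly.
with open("endnote_export.txt", encoding="utf-8") as f:
    fixed = f.read().replace("*", ",")

with open("references.csv", "w", encoding="utf-8", newline="") as f:
    f.write(fixed)

with open("references.csv", encoding="utf-8", newline="") as f:
    for row in csv.reader(f):
        print(row)
```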

IJRS Journal Article Retractions

Thursday, 16 November, 2006

Being a journal editor, I am concerned about the quality of the articles we publish, but I have to balance this against maintaining a throughput of appropriate material. Add into the mix the management of one internal reviewer, one cartographic reviewer and two external reviewers, and it all makes for a lot of effort to publish one article.

The whole “quality” issue came starkly into focus recently with the publication of a “Statement of Retraction” by the International Journal of Remote Sensing. If you read the statement you will see that not one, but three, papers have been retracted from publication (sidenote: not sure you can physically retract something that’s already published; I guess it’s more like disowning) where the same (ish) group of authors substantially reproduced material that had already been published (i.e. plagiarised). This really does put the whole peer-review process under the spotlight. It’s not perfect by any means, but it does provide a good way of assessing the “worthiness” of research. So it is a case of selecting reviewers with care and passing a careful editorial eye over the results. What I find slightly strange is that several of the papers plagiarised were themselves published (earlier) by IJRS. Not quite sure what was going through the minds of the authors….

Backup your data ;)

Wednesday, 15 November, 2006

Ontrack Data Recovery have a rather amusing selection of “Top 10 Data Loss Disasters” from 2006. They include the usual suspects (dropped from a helicopter, run over, packed in a wash bag) that seem to crop up regularly. I found the “banana on the external HDD” rather good, whilst my favourite definitely has to be the university researcher who sprayed his HDD with WD40 to stop it squeaking!

Of course this is all designed to highlight Ontrack’s data recovery services, whilst suggesting that backup might actually be quite a good idea. Comedian Dom Joly had 5,000 photos, 6,000 songs and a half-written book on his dropped laptop. All I can say is more fool him for not having a backup. My own data has four automated backups, one of them off-site, and includes archiving so that I can access previous versions of files (I use the excellent Second Copy for most of this). With the proliferation of digital media (audio, video, photos), backup really is very important, but very few people (and companies!) actually do it.

LaTeX: LaTable

Monday, 13 November, 2006

One of the other bits of LaTeX “support” software I’ve come across in recent weeks is LaTable. Laying out tables is, to quote Robbie Coltrane in “Nuns on the Run” when explaining the holy trinity, “a bit of a bugger”. Whilst most things in LaTeX are generally straightforward, tables are not; I have sweated many hours trying to get a layout to look right. Anyway, LaTable is a small Windows utility that allows you to lay out a table in a spreadsheet fashion, inserting rows/columns, deleting rows/columns, merging cells and so on. It has a “code” view so that you can see the raw LaTeX code ready for copying straight into your document. It also supports importing CSV files, which is handy. It’s not the panacea for table layout, but it does take the pain out of the initial hard work.
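Since LaTable can import CSV, it is worth noting how little is involved in getting from a CSV file to a basic tabular environment. Here is a rough Python sketch of that conversion (the filename is a placeholder, every column is left-aligned, and special LaTeX characters are not escaped):

```python
import csv

# Turn a CSV file into a basic LaTeX tabular environment --
# roughly the job LaTable's CSV import does for you.
def csv_to_tabular(path: str) -> str:
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.reader(f))
    ncols = max(len(r) for r in rows)
    lines = [r"\begin{tabular}{" + "l" * ncols + "}", r"\hline"]
    for row in rows:
        lines.append(" & ".join(row) + r" \\")  # cells joined with &, row ended with \\
    lines += [r"\hline", r"\end{tabular}"]
    return "\n".join(lines)

print(csv_to_tabular("data.csv"))  # paste the output straight into your document
```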

Planetary Geosciences, Geological Society

Thursday, 9 November, 2006

I’ve just spent a couple of days at the Geological Society in Piccadilly at a conference on Planetary Geosciences. As a side note, the GS has the original (very large!) William Smith geological map of England and Wales. It’s the first ever geological map of a whole country and was published in 1815. Because of decoration work it’s been on full display in the lower library; rather pleasant during coffee!

Anyway, for “geosciences” read “geology”. However, the breadth of the conference was much, much wider than this. Dave Rothery (OU) should be congratulated on putting together a good conference package that was thorough, complete and relaxing! And whilst geology was clearly the major theme, there were delegates from remote sensing, GIS, astrobiology, astronomy and technology. The keynote speech was given by Steve Squyres (Cornell, and PI on the Mars Exploration Rovers), and there were talks from science teams on sensors such as C1XS, MIXS and the ExoMars PanCam. Somewhat gratifying from my perspective was the continual use of (and requirement for) topographic data by many speakers (e.g. Lionel Wilson, Lancaster, on magma-venting dikes). DEMs are now central to many process studies, with MOLA taking centre stage for planetary work. However, we are starting to see much more HRSC data coming on tap (thanks to some of the work from Jan-Peter Muller’s team), as well as the promise of HiRISE data.

My own talk (audio and slides) was a broad review of GIS and how data, software and technology are converging such that there is increasing usage within planetary geosciences. However, there are still barriers to “entry”, so many remain unable to use GIS. I finished up with the suggestion that, if it didn’t already exist, a GIS Special Interest Group might well be a good idea. Something I hope to follow up (although feel free to comment).

Copyright or database right… Does it matter?

Friday, 3 November, 2006

One of the follow-on outputs to the GRADE Project from my report on the Use Case Compendium of Derived Geospatial Data deals with the legal aspects relating to geospatial repositories. In particular there was a need to look at the framework for designing a licensing strategy for the sharing and re-use of data submitted to a repository. This part of the project was led, and reported on, by Charlotte Waelde at the AHRC Research Centre for Studies in Intellectual Property and Technology Law, Edinburgh University. Whilst my compendium highlighted the problems relating to data re-use, particularly with respect to the Ordnance Survey (as this is where UK HE has the greatest experience), Charlotte has taken a step back and assessed the basis for accepting the terms and conditions upon which the use of the data rests. And the conclusion that she has come to:

Geospatial data (generally) does not come under copyright, but rather database right. The former covers original, creative pieces of work (and includes things like photos and maps), whilst the latter is designed to protect databases that have been collated. Her argument (and you need to read the complete document) is that products such as MasterMap are covered by database right only.

This has some important implications, but don’t read it as a free-for-all grab at everyone’s geospatial data; it’s not. I would like to highlight the following point that Charlotte makes in her report (and I quote):

A lawful user of the database (e.g. the researcher or teacher in an educational institution) may not be prevented from extracting and re-utilising an insubstantial part of the contents of a database for any purposes whatsoever.

This has the following implications:

  • 1. If you’re a licensed user (e.g. a Digimap user) you can use an insubstantial part of the database as you see fit (although Charlotte explains that the term “insubstantial” is still a little vague, but possibly <50%).
  • 2. This includes re-distribution of that insubstantial part, creation and re-distribution of derivative data, and publication of figures directly relating to the utilisation of the data or any derivatives.
  • 3. Any terms and conditions applied in relation to the original license are null and void for the insubstantial part.

Of course, does any of this really matter to anyone? Well, on one level you could argue no. Those that are happy with the status quo, utilise geospatial data and publish within the “restrictions” are not affected in any way. There are those that find some of the current restrictions irritating and just want a “sensible” license. Finally, there are those that want to get well on the way to “freeing our data.”

Whatever our outlook, these are important and highly relevant conclusions to draw and they will affect us all in the geospatial community. Indeed, Europe as a whole (as this relates to the European Database Directive) is going to have to take a deep breath and work out the next step. Not least, the HE community’s two biggest bug-bears (with the JISC-OS license at least) are addressed in that, theoretically:

  • 1. No copyright subsists in derivative products and these would be freely distributable (as long as they are not “substantial”)
  • 2. For the same reason, academics would be more or less free to publish whatever diagrams they see fit

So really the big question is:

what happens next?

GIS File Formats: how do we distribute data?

Wednesday, 1 November, 2006

One of the other discussions that cropped up at the GRADE meeting was file formats for spatial data in relation to repositories. In the back of my mind I’m also thinking about the Journal of Maps, as there is interest in publishing data. So what file formats do we use for data distribution, particularly bearing in mind the need for ready accessibility as well as preservation for future use? You have your ECW, IMG, SHP, TIF and so on. What is good? Why? Will it work? What doesn’t it support?

In terms of formats we are really talking about raster, vector and attributes. At the lowest level these are all that are needed to import data for use in any processing system. But that is all they are: low level. There is no preservation of symbology, for instance. I think this is a good starting point (and it’s where I am going to start!), but I’m happy to be contradicted.

So, what formats?
Vector
Well, SHP is good because it’s well understood and there are plenty of tools to deal with it (and I think it’s important that support from projects like GDAL/OGR is maintained; see the short GDAL/OGR sketch below). OK, it’s not topological, but it is very flexible. The same case could be made for DXF as well; is this worth including? On the topological front E00 might well be worthwhile (in that it’s simple), but how widely supported is it? Are there any other formats worth considering?
Raster
GeoTIFF is really the dominant player for open formats, and JPG should be included because it is also a standard photographic format (along with its EXIF data).
Attributes
That deals with the spatial side; we then have attributes. Is DBF the most apt? It is well supported. It might also be useful to add CSV for tabular data, particularly as it’s ASCII-based.
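As a small illustration of why well-supported, low-level formats are attractive, the GDAL/OGR Python bindings make it only a few lines of work to pull the attribute table out of a shapefile and into a CSV. A sketch under my own assumptions (the filenames are hypothetical):

```python
import csv
from osgeo import ogr  # GDAL/OGR Python bindings

# Dump the attribute table of a shapefile to CSV -- one example of the
# sort of tooling that makes SHP and CSV easy formats to work with.
def shp_attributes_to_csv(shp_path: str, csv_path: str) -> None:
    ds = ogr.Open(shp_path)                 # open read-only
    layer = ds.GetLayer(0)
    defn = layer.GetLayerDefn()
    fields = [defn.GetFieldDefn(i).GetName() for i in range(defn.GetFieldCount())]
    with open(csv_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(fields)             # header row of field names
        for feature in layer:
            writer.writerow([feature.GetField(name) for name in fields])

shp_attributes_to_csv("parcels.shp", "parcels_attributes.csv")
```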

What I haven’t mentioned here at all is GML. It is still very early days, with limited support, but it has the potential to be the way forward. Even in the standard XML arena only limited inroads are being made, although with both OpenOffice and MS Office supporting it in their office apps, things could change rapidly.