Wikileaks posts “new” proposed OS business model

Thursday, 27 August, 2009

Wikileaks has posted a presentation purporting to originate from the Ordnance Survey (OS) outlining a change to their business model. Posted 20 August, the document itself is simply dated 2009 so it’s hard to know when it was drafted, but given the content it seems likely to be relatively recent (assuming it does come from the OS). It certainly makes for interesting reading as it outlines three possible modes of operation: full commercial, free data and a hybrid. Not surprisingly, given all the criticism, it suggests adopting a hybrid mode, dismissing the “utility” model out of hand. Whilst drafted in such a way as to mark a “step change” in the way they do business, it comes across more as applying patches to the trading fund model.

What are the largest areas of criticism of the OS? Arguably public access to data, derived data and cost. The “hybrid” model therefore tries to tackle these problems. In particular there is a focus upon easier public sector licensing (one license for England and Wales), non-commercial reuse of derived data (something the OS has been hammered for) and increased reuse of data through OpenSpace. Interestingly, a cost-cutting programme forms part of the package.

EO-1 Open for Tasking

Tuesday, 25 August, 2009

After the news last week of the death of TopSat, it is good to see that NASA have opened up EO-1 for tasking. EO-1 was “launched on November 21, 2000 as part of a one-year technology validation/demonstration mission.” It’s been very successful and has lasted considerably longer than most thought. It carries the ALI multi-spectral (10m pan and 30m MS) and Hyperion hyperspectral (220 bands at 30m) sensors. If you are in need of data then visit the Data Acquisition Request page and submit a request; this will be reviewed and, if deemed appropriate, tasked.

And as if by magic…..

Friday, 21 August, 2009

Twitter has announced the addition of geolocation to tweets. It’s currently being added to the API (with its implementation being made available to developers) and thereafter to the interface. To be honest, that’s all the announcement says, and I imagine the lat/long will come out of the 140 characters. No information on how location will be implemented, although All Points notes that it’s likely to use GeoRSS.
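If GeoRSS is indeed what they go with, the simplest encoding is GeoRSS-Simple: a single latitude/longitude pair attached to a feed entry. A hypothetical sketch (the entry content here is purely illustrative; only the georss:point element comes from the GeoRSS specification):

```xml
<entry xmlns:georss="http://www.georss.org/georss">
  <title>example tweet</title>
  <!-- GeoRSS-Simple point: latitude then longitude, space-separated -->
  <georss:point>51.5074 -0.1278</georss:point>
</entry>
```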

TopSat is dead. RIP.

Wednesday, 19 August, 2009

I blogged a while back about the availability of TopSat for academic research and, whilst the data I received was not great, it did provide very good data for many users. I had noticed that they had formally ceased tasking the satellite, but managed to get two late requests in. The first was completed but the data was not good. Whilst waiting for these to be re-collected I received the following news on 18th August:

Unfortunately QinetiQ have come to the decision to end TopSat operations by the end of the week. We would have continued to schedule up until tomorrow had it not been for an unexpected hardware failure here in the office.

The “end TopSat operations” seems pretty final. Let’s hope that the TopSat experiment will be followed up with something equally interesting.

Field spec processing scripts

Monday, 17 August, 2009

I’ve been involved with a project looking at the reflectance of loess and seeing how well this correlates with traditional measures, including magnetic susceptibility and grain size. The data sets rapidly grow so I’ve written several scripts in R to process them. To give you an idea of the problem, we used a GER1500 (400-1100nm) to collect 40 point samples in the field; each sample collects 700 data points (28,000 in total). A further 12 field samples were collected and re-measured in the lab using an SVC HR1024 (400-2500nm), giving a further 105,000 measurements. The samples were then powdered and re-measured using an ASD Field Spec Pro (400-2500nm), giving another 105,000 measurements. That’s a total of 238,000 measurements, and that’s before you move on to looking at first derivative or continuum removed spectra.

Clearly a good data processing environment is needed and R fits the bill very well, although Matlab is used in equal measure by many (and Alasdair MacArthur over at NERC FSF is currently porting many of their pre-processing scripts to it). Matlab has the benefit of being a dedicated data-processing environment with good graphing capabilities and a lot of bespoke, application-specific scripts. R is a statistical programming environment that is easily scriptable and good at the statistical analysis of massive data sets. It’s horses for courses, but R is open source, which is good for me. And there is a portable version to boot (and for those using Excel: yes, it does work, but as soon as you need to do anything iterative you are better off using something designed for the job).

In order to expedite the project I was working on, I used the standard NERC FSF Excel template to do all the initial pre-processing. I then needed to produce some initial plots of the raw and first derivative spectra at each data point on multi-graph plots, before generating correlogram plots (correlation line graphs). Most of this is relatively straightforward, just requiring importing and iterating over the data sets to produce nice looking graphs. I was particularly interested in using continuum removal as a technique for analysing the absorption features in the lab spectra and couldn’t find any software that did it. So one of the scripts specifically processes the data in pre-defined ranges and calculates correlograms. I’m hoping to get a general purpose import routine running for the HR1024 and Field Spec Pro sometime this year.
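For readers unfamiliar with the technique, continuum removal fits a convex hull over the top of the spectrum and divides the measured reflectance by it, so absorption features stand out as dips below 1.0. The actual scripts are in R; this is just a minimal Python sketch of the idea, with function and variable names of my own invention:

```python
def continuum_removed(wavelengths, reflectance):
    """Divide a spectrum by its convex-hull continuum so that
    absorption features appear as dips below 1.0."""
    pts = list(zip(wavelengths, reflectance))
    # Build the upper convex hull of the spectrum (monotone-chain sweep).
    hull = []
    for x, y in pts:
        while len(hull) >= 2:
            (x1, y1), (x2, y2) = hull[-2], hull[-1]
            # Drop the last hull point if it lies on or below the chord.
            if (y2 - y1) * (x - x1) <= (y - y1) * (x2 - x1):
                hull.pop()
            else:
                break
        hull.append((x, y))
    # Linearly interpolate the hull to get the continuum, then divide.
    out, seg = [], 0
    for x, y in pts:
        while seg < len(hull) - 2 and hull[seg + 1][0] < x:
            seg += 1
        (x1, y1), (x2, y2) = hull[seg], hull[seg + 1]
        continuum = y1 + (y2 - y1) * (x - x1) / (x2 - x1)
        out.append(y / continuum)
    return out
```

The endpoints of the spectrum always sit on the hull, so the continuum-removed values there are exactly 1.0; everything else is at or below 1.0, with the depth of each dip giving a measure of absorption-feature strength.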

All scripts are available on my webpage and include a description of the files and sample data. They are not “general purpose” in so far as you need to edit them to load your own data. However they should then work fine. I hope they prove useful and if you use them for any published academic work then please reference them as:

Smith, M.J. (2009) Reflectance Spectroscopy Scripts [Online]. Available from: http://www.kingston.ac.uk/gge/staff/smith.htm, [Last accessed: Access Date]

Cookie Cutter scripts

Thursday, 13 August, 2009

I copied in an earlier blog the abstract for a paper I had published earlier this year on calculating material volumes of landforms (drumlins in this case). The algorithm was scripted in Python for ArcGIS and is available on my webpage. There are three versions that can be downloaded; the first works with ArcGIS 9.2 and uses a workaround to reset the mask when processing landform outlines. This “bug” was removed with ArcGIS 9.3 and so the script has been modified to reflect this. The final “developmental” version does away with the need to run the script from ArcToolBox and is now “headless”: just set up the ini file with all the parameters and double-click on the script; it will run from a Python command-line interface and automatically call the relevant ArcGIS routines. It is faster and, I think, less prone to ArcGIS crashing.
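As a sketch of the headless pattern (the section and parameter names here are hypothetical, not the script’s actual ini format), Python’s standard configparser lets the script pull everything it needs from a plain ini file instead of an ArcToolBox dialog:

```python
import configparser

# Hypothetical ini contents; the real script defines its own parameters.
ini_text = """
[cookie_cutter]
dem = study_area_dem.tif
outlines = drumlin_outlines.shp
buffer = 50.0
"""

config = configparser.ConfigParser()
config.read_string(ini_text)  # a real script would use config.read(path)

params = config["cookie_cutter"]
dem_path = params.get("dem")
outline_path = params.get("outlines")
buffer_m = params.getfloat("buffer")

# In a headless run these values would then be handed to the relevant
# ArcGIS geoprocessing calls with no further user interaction.
print(dem_path, outline_path, buffer_m)
```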

I hope it proves useful and if you use it please reference the original paper:

Smith, M.J., Rose, J. and Gousie, M.B. (2009) “The Cookie Cutter: a method for obtaining a quantitative 3D description of glacial bedforms.” Geomorphology, 108, 209-218

Aeryon UAV

Saturday, 8 August, 2009

The UAV market continues to develop at a pace. The Aeryon Scout is an example of a neat quadcopter design for military and security applications. It is fully programmable as well as manually controllable using a tablet PC. The camera is their own design and is specced at 5MP stills, up to 1/1000s shutter, and real-time MPEG-4 compressed video (640 x 480), all on a gimballed mount. And the amazing bit: 112g.

And the UAV part is equally interesting: 3 km range, 20 min duration, 36 km/h, 500m altitude and 1kg weight. It uses a wireless modem or wifi for communication and is DGPS/WAAS capable. The comms are needed for security, but it would be interesting to know what bandwidth it needs and how much on-board storage there is. Wifi range has to be quite limited. The DGPS is an interesting option and again it would be interesting to know what chipset this is and how they intend it to be used (and, as a result, the levels of accuracy you can expect to get).

New JISC-OS License: devil is in the detail….

Friday, 7 August, 2009

EDINA proudly announced a license renewal of OS data for the Digimap collection, which includes a variation to the original agreement and some new clauses. Of most interest to academics are the changes to maximum allowances for internet publication, something I’ve banged on about at the Journal of Maps for some time (a maximum ~A5 map was essentially allowable for any academic journal publication). Thankfully they have now ditched the ludicrous maximum physical size/ground area rules (meaning you can now legally produce a map of the whole of the UK. Miracle!) and replaced them with, to be frank, an equally ludicrous “number of pixels” measure. All data must now be rasterized; no vector linework is allowed whatsoever, regardless of the impact upon quality. The limit is a maximum of 1,048,576 pixels. Yes, that’s 1 megapixel.

Let’s run through an example. Most people are familiar with pixels per (linear) inch (analogous to dpi for raster imagery), which means at 100ppi it takes 10,000 pixels to represent 1 square inch. A5 (148×210mm) weighs in at ~48 square inches, meaning you need ~480,000 pixels. However 300ppi is common for printing (and PDF viewing), and remember this scales with area, so that’s nine times the size of a 100ppi file. Yup, an A5 image at 300ppi is ~4,320,000 pixels, more than four times the OS limit.
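The arithmetic is easy to check for yourself; here is a quick Python sanity check of the numbers above:

```python
import math

MM_PER_INCH = 25.4
LIMIT = 1_048_576                   # the cap: 1024 x 1024 pixels

# A5 page: 148 mm x 210 mm.
area_sq_in = (148 / MM_PER_INCH) * (210 / MM_PER_INCH)   # ~48.2 sq in

for ppi in (100, 300):
    pixels = area_sq_in * ppi ** 2  # ppi is linear, so square it for area
    print(f"A5 at {ppi} ppi: {pixels:,.0f} px ({pixels / LIMIT:.2f}x the cap)")

# Largest square figure the cap allows at 300 ppi:
side_cm = math.isqrt(LIMIT) / 300 * 2.54
print(f"max 300 ppi square: {side_cm:.1f} x {side_cm:.1f} cm")
```

An A5 page scrapes under the cap at 100ppi but blows well past it at 300ppi, and the largest square figure a 300ppi file can be is under 9cm a side.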

Either I’ve made a horrible (and blindingly obvious) mistake (do correct me if I’m wrong and I’ll eat humble pie) or they must have been smoking something strong when they came up with this. This is actually worse than the previous agreement and by my reckoning means that the biggest square figure you can have at 300ppi is about 8.7×8.7cm. Now that really is great value for money…. and I thought things really couldn’t get any worse.

P.S. Not quite sure why the figure is set at 1,048,576, but that’s 1024×1024, which is of course 1k×1k.

Get your own satellite into orbit

Friday, 7 August, 2009

You’ve heard of personal computers; well, now it’s time to own your own personal satellite. Space Fellowship has a nice story on tube satellites. For $8,000, yes $8,000 (!), you get to place your own satellite into a decaying orbit. It lasts for a few weeks, but this is no toy. The kit includes the satellite’s structural components, safety hardware, solar panels, batteries, power management hardware and software, transceiver, antennas, and microcomputer, and as long as it stays within the 0.75kg weight limit you can design your own experiment including, for example, remote video monitoring. Plenty of scope for some innovative amateur work here.