GIS Skills for a GIS career

Friday, 20 November, 2009

Nice post on the essential skills needed to pursue a career in GIS. Worth seeing how this matches the GIS curriculum of any program you are looking at. It is very technology driven and perhaps emphasises the “technician” as opposed to the “analyst”, but nonetheless it’s an interesting list. Worth a look.

Government data to be freed

Tuesday, 17 November, 2009

The Guardian today covered the government announcement that it will release over 2,000 datasets for public consumption, including (some) Ordnance Survey data down to 1:10,000. As ever the detail will be interesting to read and it probably won’t include MasterMap. That said, the sheer scale of the announcement is remarkable and “possibly includes all legislation, as well as road-traffic counts over the past eight years, property prices listed with the stamp-duty yield, motoring offences with types of offence and the numbers, by county, for the top six offences.” Will it include address data for geocoding?? Who knows, but that’s a biggie.

The Guardian page, funnily, also has a picture of a surveyor out with a (Leica RTK?) GPS hooked up to a laptop with the following caption:
“A cartographer out and about while mapping for the Ordnance Survey”

Whilst at its broadest definition cartography does include this (“the science and art of map making”), I’m sure most cartographers wouldn’t think of this as their raison d’être. And perhaps more funnily, the DCLG page notes the date as “17 noviembre 2009.” What are people drinking today?!

WorldView-2 First Images

Wednesday, 21 October, 2009

Digital Globe’s WorldView-2 successfully launched last week and acquired its first imagery on 19th October over the US. And with US government restrictions on the resale of commercial high resolution satellite imagery set at a maximum of 50cm, Digital Globe has focused upon increasing the number of spectral bands and agility of the satellite. So there is a 0.5m panchromatic band, but then 8 multispectral bands at 1.8m (full details). In addition to the traditional VNIR (RGB, NIR), there is also:

- 400-450nm: used for coastal studies; blue light penetrates far deeper into water (some studies have shown up to 26m for MSS band 1) and so, amongst other things, this could be used for near-shore bathymetry
- 585-625nm: a “yellow” band targeted at vegetation studies
- 705-745nm: a “red edge” band, again targeted at vegetation studies
- 860-1040nm: a second NIR band

And predicted accuracy of geolocation is meant to be ~4.1m.

Degrees can cost less….

Monday, 12 October, 2009

An interesting article over at the BBC about how to get a debt-free degree. Really it’s about being sensible and tapping up potential sources of income during your 3 year stint. None of it is rocket science, but there are a lot of people that just don’t twig. Well worth a read.

USB3 drives out shortly

Tuesday, 29 September, 2009

PCP Pro briefly rounds up the latest news on the USB3 standard. This promises speeds of up to 600MB/s over the interface, and one area that could really benefit is external hard drives. Freecom and Buffalo both have drives for release by the end of the year, although whilst the USB3 headline speed is high, they will be aiming for ~125MB/s, apparently due to restrictions in drive speed (rather than the interface). That said, it’s considerably faster than current USB2 speeds and they will ship with a USB3 expansion card (so you can use them!!). Good news for those of us that carry all our apps around on USB drives.

Locate your postboxes

Monday, 21 September, 2009

The Guardian reports on something useful to come out of the “Show Us a Better Way” competition run by the Cabinet Office: a website to find your nearest postbox and its collection times. A genuinely useful application that, not surprisingly, fell foul of the Post Office saying “that’s our data.” Anyway, I’ll leave it up to more avid readers to skim through The Guardian’s article. The site itself is useful and is using crowdsourcing to get over the problem of recording the exact location of boxes; the postcode and the collection times are known, but not the 10 figure grid reference. So, through the power of people, about 25,000 of the ~100,000 boxes have been located. Just visit the site, enter the first part of your postcode and tie the list on the right to locations (or add a location) on the map on the left. It’s all done on top of OSM data to make life simpler.

It’s great to see genuinely useful services kick off so well, but isn’t it just insane that users are not allowed access to base maps, electoral boundaries, postcodes, postboxes, weather data, hydrography and so on? The list goes on and on, and so mad is this that users are having to generate the content from scratch again. Doesn’t say much for “joined up government.”

Win7 Student Offer

Thursday, 17 September, 2009

Microsoft are offering HE students Windows 7 for £30. That’s a damn good deal and includes either Home or Professional; the latter offering quite a few more features. It is download only (universities will be pleased with that!) and you must have an HE email address. You can pre-register from 1st October.

Wikileaks does it again…. this time with postcodes!!

Wednesday, 16 September, 2009

Well, Wikileaks is at it again. This time it’s supposedly the entire Royal Mail Postcode Address File (PAF) as a 241MB CSV text file. It’s a little smaller compressed and purports to be from July 8th. I’m sure this will upset a few people, not least the Royal Mail as it’s an income stream (and probably the OS as well). Of course postcodes change, so for “live” lookups (e.g. address filling on shopping sites) it won’t be good enough, but for everyone else…..

Portable GIS v2

Thursday, 3 September, 2009

Portable GIS v2 has just been released allowing you to take all your key GIS apps on a USB key. Great stuff and a very useful summary over at Mapperz.

Wikileaks posts “new” proposed OS business model

Thursday, 27 August, 2009

Wikileaks has posted a presentation purporting to originate from the OS outlining a change to their business model. Posted 20 August, the document itself is simply dated 2009, so it’s hard to know when it was drafted, but given the content it seems likely that it is relatively recent (assuming it does come from the OS). It certainly makes for interesting reading as it outlines three possible modes of operation: full commercial, free data and a hybrid. Not surprisingly, given all the criticism, it suggests adopting a hybrid mode, dismissing the “utility” model out of hand. Whilst drafted in such a way as to mark a “step change” in the way they do business, it comes across more as applying patches to the trading fund model.

What are the largest areas of criticism of the OS? Well, public access to data, derived data and cost. The “hybrid” model therefore tries to tackle these problems. In particular there is a focus upon easier public sector licensing (one license for England and Wales), non-commercial reuse of derived data (something the OS has been hammered for) and increased reuse of data through OpenSpace. Interestingly, a cost cutting programme forms part of the package.

EO-1 Open for Tasking

Tuesday, 25 August, 2009

After the news last week of the death of TopSat, it is good to see that NASA have opened up EO-1 for tasking. EO-1 was “launched on November 21, 2000 as part of a one-year technology validation/demonstration mission.” It’s been very successful and lasted considerably longer than most thought. It carries the ALI multispectral (10m pan and 30m MS) and Hyperion hyperspectral (220 bands at 30m) sensors. If you are in need of data then visit the Data Acquisition Request page and submit a request; this will be reviewed and, if deemed appropriate, tasked.

And as if by magic…..

Friday, 21 August, 2009

Twitter announces the addition of geolocation to tweets. It’s currently being added to the API (with its implementation being made available to developers) and thereafter to the interface. To be honest, that’s all the announcement says, and I imagine lat/long will come out of the 140 characters. No information on how location will be implemented, although All Points notes that it’s likely to use GeoRSS.
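If it does turn out to be GeoRSS, a geotagged entry in a feed might look something like the fragment below. This is purely illustrative of the GeoRSS “Simple” encoding; Twitter’s actual implementation hadn’t been published at the time of writing.

```xml
<!-- Illustrative only: a GeoRSS Simple point attached to an Atom entry -->
<entry xmlns="http://www.w3.org/2005/Atom"
       xmlns:georss="http://www.georss.org/georss">
  <title>Example geotagged tweet</title>
  <!-- GeoRSS Simple encoding: latitude then longitude, space separated -->
  <georss:point>51.501 -0.142</georss:point>
</entry>
```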

TopSat is dead. RIP.

Wednesday, 19 August, 2009

I blogged a while back about the availability of TopSat for academic research; whilst the data I received was not great, it has provided very good data for many users. I had noticed that they had formally ceased tasking the satellite, but managed to get two late requests in. The first was completed but the data was not good. Whilst waiting for these to be re-collected, I received the following news on 18th August:

Unfortunately QinetiQ have come to the decision to end TopSat operations by the end of the week. We would have continued to schedule up until tomorrow had it not been for an unexpected hardware failure here in the office.

The “end TopSat operations” seems pretty final. Let’s hope that the TopSat experiment will be followed up with something equally interesting.

Field spec processing scripts

Monday, 17 August, 2009

I’ve been involved with a project looking at the reflectance of loess and seeing how well this correlates with traditional measures, including magnetic susceptibility and grain size. The data sets rapidly grow, so I’ve written several scripts in R to process them. To give you an idea of the problem, we used a GER1500 (400-1100nm) to collect 40 point samples in the field; each sample collects 700 data points (28,000 in total). A further 12 field samples were collected and re-measured in the lab using an SVC HR1024 (400-2500nm), giving a further 105,000 measurements. The samples were then powdered and re-measured using an ASD Field Spec Pro (400-2500nm), giving another 105,000 measurements. That’s a total of 238,000 measurements, and that’s before you move on to looking at first derivative or continuum removed spectra.

Clearly a good data processing environment is needed and R fits the bill very well, although Matlab is used in equal measure by many (and Alasdair MacArthur over at NERC FSF is currently porting many of their pre-processing scripts). Matlab has the benefit of being known as a dedicated data processing environment with good graphing capabilities and a lot of bespoke, application specific scripts. R is a statistical programming environment that is easily scriptable and good at the statistical analysis of massive data sets. It’s horses for courses, but R is open source, which is good for me. And there is a portable version to boot (and for those using Excel: yes, it does work, but as soon as you need to do anything iterative you are better off using something designed for the job).

In order to expedite the project I was working on, I used the standard NERC FSF Excel template to do all the initial pre-processing. I then needed to produce some initial plots of the raw and first derivative spectra at each data point on multi-graph plots, before generating correlogram plots (correlation line graphs). Most of this is relatively straightforward, just requiring importing and iterating over the data sets to produce nice looking graphs. I was particularly interested in using continuum removal as a technique for analysing the absorption features in the lab spectra and couldn’t find any obvious software that did it. So one of the scripts specifically processes the data in pre-defined ranges and calculates correlograms. I’m hoping to get a general purpose import routine running for the HR1024 and Field Spec Pro sometime this year.
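For anyone curious what continuum removal actually involves, here is a minimal sketch in Python/NumPy (my scripts are in R; the function names here are my own, purely for illustration): the continuum is taken as the upper convex hull of the spectrum, and dividing the spectrum by it makes absorption features show up as dips below 1.0, independent of the overall brightness and slope.

```python
import numpy as np

def _cross(o, a, b):
    """2D cross product of vectors OA and OB (>0 means a left turn)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def continuum_remove(wl, refl):
    """Divide a reflectance spectrum by its upper convex hull (the continuum)."""
    pts = list(zip(wl, refl))
    hull = [pts[0]]
    for p in pts[1:]:
        # Keep only right turns as we sweep left to right: the upper hull.
        while len(hull) >= 2 and _cross(hull[-2], hull[-1], p) >= 0:
            hull.pop()
        hull.append(p)
    hx, hy = zip(*hull)
    # Interpolate the hull back onto every wavelength, then divide.
    continuum = np.interp(wl, hx, hy)
    return np.asarray(refl) / continuum

# Toy spectrum: flat continuum with a single absorption dip at 600nm
wl = np.array([500.0, 550.0, 600.0, 650.0, 700.0])
refl = np.array([0.9, 0.85, 0.6, 0.85, 0.9])
cr = continuum_remove(wl, refl)
```

The hull always touches the spectrum at its endpoints, so the continuum-removed values there are exactly 1.0, with absorption features dipping below.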

All scripts are available on my webpage and include a description of the files and sample data. They are not “general purpose” in so far as you need to edit them to load your own data. However they should then work fine. I hope they prove useful and if you use them for any published academic work then please reference them as:

Smith, M.J. (2009) Reflectance Spectroscopy Scripts [Online]. Available from: http://www.kingston.ac.uk/gge/staff/smith.htm, [Last accessed: Access Date]

Cookie Cutter scripts

Thursday, 13 August, 2009

In an earlier blog I copied the abstract of a paper I had published earlier this year on calculating material volumes of landforms (drumlins in this case). The algorithm was scripted in Python on ArcGIS and is available on my webpage. There are 3 versions that can be downloaded; the first works with ArcGIS 9.2 and uses a workaround to reset the mask when processing landform outlines. This “bug” was removed with ArcGIS 9.3 and so the script has been modified to reflect this. The final “developmental” version does away with the need to run the script from ArcToolBox and is now “headless.” Just set up the ini file with all the parameters and double-click on the script; it will run from a Python command-line interface and automatically call the relevant ArcGIS routines. It is faster and, I think, less prone to ArcGIS crashing.
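The “headless” pattern is roughly as below: a minimal Python sketch in which the section and parameter names are hypothetical, not those of the actual script, and the ArcGIS calls are only indicated in comments.

```python
import configparser

# Hypothetical ini contents; the real script's section and parameter
# names may differ.
INI = """
[cookie_cutter]
dem = c:/data/dem.tif
outlines = c:/data/drumlins.shp
output_dir = c:/data/out
buffer_m = 50
"""

config = configparser.ConfigParser()
config.read_string(INI)  # the real script would use config.read("params.ini")
params = dict(config["cookie_cutter"])

# This is where the geoprocessing calls would run (via the
# arcgisscripting/arcpy module), driven entirely by the parameters above,
# with no ArcToolBox dialog involved.
print(params["dem"], params["buffer_m"])
```

The appeal of the pattern is that the parameters live in a plain text file you can version and re-run, rather than being re-typed into a toolbox dialog each time.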

I hope it proves useful and if you use it please reference the original paper:

Smith, M.J., Rose, J. and Gousie, M.B. (2009) “The Cookie Cutter: a method for obtaining a quantitative 3D description of glacial bedforms.” Geomorphology, 108, 209-218

Aeryon UAV

Saturday, 8 August, 2009

The UAV market continues to develop at a pace. The Aeryon Scout is an example of a neat quadcopter design for military and security applications. It is fully programmable, as well as manually controllable, using a tablet PC. The camera is their own design and is specced at 5MP stills, up to 1/1000s, and real-time MPEG-4 compressed video (640 x 480), all on a gimballed mount. And the amazing bit: 112g.

And the UAV part is equally interesting: 3 km range, 20 min duration, 36 km/h, 500m altitude and 1kg weight. It uses a wireless modem or wifi for communication and is DGPS/WAAS capable. The comms are needed for security, but it would be interesting to know what bandwidth it needs and how much on-board storage there is. Wifi range has to be quite limited. The DGPS is an interesting option and again it would be interesting to know what chipset this is and how they intend it to be used (and, as a result, the levels of accuracy you can expect to get).

New JISC-OS License: devil is in the detail….

Friday, 7 August, 2009

EDINA proudly announced a license renewal of OS data for the Digimap collection, which included a variation to the original agreement and some new clauses. Of most interest to academics are the changes to maximum allowances for internet publication, something I’ve banged on about at the Journal of Maps for some time (a maximum of roughly an A5 map was essentially allowable for any academic journal publication). Thankfully they have now ditched the ludicrous maximum physical size/ground area rules (meaning you can now legally produce a map of the whole of the UK. Miracle!) and replaced them with, to be frank, an equally ludicrous “number of pixels” measure. All data must now be rasterized; no vector linework is allowed whatsoever, regardless of the impact upon quality. The limit is a maximum of 1,048,576 pixels. Yes, that’s 1 megapixel.

Let’s run through an example. Most people are familiar with pixels per (linear) inch (analogous to dpi for raster imagery), which means at 100ppi it takes 10,000 pixels to represent 1 square inch. A5 (148x210mm) weighs in at ~48 square inches, meaning you need ~480,000 pixels. However 300ppi is common for printing (and PDF viewing), and remember this scales by area, so that’s nine times the size of a 100ppi file. Yup, an A5 image at 300ppi is ~4,320,000 pixels, four times over the OS limit.
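Re-running the numbers above as a quick sanity check:

```python
# Checking the pixel-budget arithmetic against the 1,048,576-pixel cap.
MM_PER_INCH = 25.4
CAP = 1_048_576  # the new OS limit (1024 x 1024)

# A5 is 148 x 210 mm, i.e. ~48 square inches
area_sq_in = (148 / MM_PER_INCH) * (210 / MM_PER_INCH)

px_100 = area_sq_in * 100 ** 2  # ~482,000 pixels: inside the cap
px_300 = area_sq_in * 300 ** 2  # ~4,340,000 pixels: roughly 4x over the cap

# Largest square figure printable at 300ppi under the cap: 1024 px per side
side_cm = (CAP ** 0.5 / 300) * MM_PER_INCH / 10  # ~8.7 cm per side
```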

Either I’ve made a horribly (and blindingly) obvious mistake (and do correct me if I’m wrong and I’ll eat humble pie) or they must have been smoking something strong when they came up with this. This is actually worse than the previous agreement and by my reckoning means that the biggest figure you can have at 300ppi is ~8.7x8.7cm. Now that really is great value for money…. and I thought things really couldn’t get any worse.

P.S. Not quite sure why the figure is set at 1,048,576, but that’s 1024x1024, which is of course 2^20 pixels.

Get your own satellite in to orbit

Friday, 7 August, 2009

You’ve heard of personal computers; well, now it’s time to own your own personal satellite. Space Fellowship has a nice story on tube satellites. For $8,000, yes $8,000 (!), you get to place your own satellite into a decaying orbit. It lasts for a few weeks, but this is no toy. The kit includes the satellite’s structural components, safety hardware, solar panels, batteries, power management hardware and software, transceiver, antennas, and microcomputer, and as long as it stays within the 0.75kg weight limit you can design your own experiment including, for example, remote video monitoring. Plenty of scope for some innovative amateur work here.

Home

Friday, 24 July, 2009

I recently caught up on some TED videos, having been meaning to look through their back catalog (well worthwhile by the way!). Anyway, Yann Arthus-Bertrand gave a captivating talk at TED earlier this year; he is best known for capturing stunning aerial imagery, with his photos representing both art and story telling. He has taken this to the next level with the release in June of Home, the story of Earth told via his imagery and the narration of Glenn Close. It is a captivating 1 hr 35 mins and, whilst not everyone will agree with his ecological politics, he is able to uniquely capture our moment on the planet and provide a forum for discussing the future. Unusually, the film is also copyright free and can be distributed widely. YouTube provided a download until last week and this has now disappeared (for no apparent reason); however Legal Torrents has the widescreen mp4 available (Firefox users can use Fire Torrent to download). Note that if you burn this to a disc, many DVD players won’t support the screen resolution, meaning you’ll need to transcode it to DVD size.

OGC UK and Ireland Forum

Friday, 17 July, 2009

I attended the OGC UK and Ireland Forum today, which was an interesting experience. OGC (Open Geospatial Consortium) should be well known to all as the overarching industry body that defines geospatial standards (GML probably being the best known). Perhaps what many won’t realise is that OGC has been around since 1994 and has been very active in the geospatial arena, almost mirroring the W3C in terms of timeline but having been, arguably, far more successful.

Anyway, the Forum rolled out the big guns in the form of Mark Reichardt (CEO) and David Schell (Chairman). And whilst the “advert” for the day didn’t really make it clear, this was a “big” relaunch of the Forum. OGC are clearly very keen to promote the fact that it is an international organisation; whilst it is there as a group to define and disseminate standards, it is driven by its members and there are very specific cultural/legal perspectives. Local fora are therefore intended to provide this focus.

What does that actually translate into? Key areas were persistent test beds, accessibility to data/software for testing, translation of standards into “meaningful” real world case studies, and persistence of information. The list can become quite large and fails to recognise “consumers” and what they can use OGC standards for. There is clearly some further discussion needed to facilitate the Forum, but I think it was a good start. Steven Feldman offers a more pessimistic view, which I can understand as, in the intervening four years since the last UK Forum, the consumer has really laid down the gauntlet in terms of what they want from geospatial web services. And the insatiable demand for (government) geospatial data is only increasing. With the inevitable IPR issues relating to this whole area (particularly with commercial interests), this point in time may prove a tipping point (and I guess it’s telling that KML was taken on as an OGC standard for this very reason).