The Journal of Maps has hit many problems when trying to publish maps containing Ordnance Survey data. So much so that we recommend authors do not submit maps containing OS data licensed through JISC (distributed via Digimap). At the Journal of Maps I have also listed papers I have had published relating to the work we do, and one of the “common threads” is the problem of publishing third-party data. One partial solution to this problem is the “relaxing” of license conditions for the non-commercial use of copyright data (through a Creative Commons style license). This approach has been adopted by the forward-thinking Creative Archive Group, which includes the BBC, and allows limited free access to their data for non-commercial use.
Can the OS adopt a similar “style” of license? We can but hope; however, it’s gratifying to note that people within the OS clearly think this is a good idea (see Ed Parsons’ blog).
Nobody can have missed the massive explosion (supposedly the largest peacetime fire in Europe) that rocked the Buncefield oil depot in Hemel Hempstead, UK, this week. I actually live within 10 miles of the depot and felt the blast wave early on Sunday morning. The depot is sited close to a motorway within an industrial estate and, although there is residential housing close by, the area is primarily light industry. Safety concerns rapidly moved from those close by to the effects of the smoke as it rapidly covered London and the south-east. Sunday was an unusually windless day and, due to a winter temperature inversion, the effects of both the sound of the blast and the pollution were concentrated at lower altitudes and not dissipated.
Satellite imagery has played a key role in monitoring the spread of smoke and two key sensors have been employed:
MERIS - mounted on board the European Space Agency’s ENVISAT, MERIS is an imaging spectrometer with a 300m spatial resolution. With a rapid 3-day revisit capacity it is very good at monitoring a variety of environmental phenomena.
MODIS - mounted on board NASA’s TERRA and AQUA satellites, MODIS offers daily revisits at 250m spatial resolution and, because there are two satellites, two images per day are available (AM for TERRA and PM for AQUA). One of the key sites for environmental monitoring is the Rapid Fire system. Near-real time imagery is delivered to the site for download and use. If you look through the archives for Sunday (11th) and Monday (12th) there are images of the fire available.
I regularly need to log back in to my broadband (1Mbit) system at home and have used a variety of different bits of software to do this. I’ve brought together a quick summary of the more useful ones:
UltraVNC (open source) - a very good development of the VNC product that allows you to “remote control” your system by viewing the remote desktop screen. Typical configuration involves a server (on the remote machine) and a client that you use to connect to it (although there are variations on this theme). Of all the VNC developments this is one of the fastest and has added extra features, including file transfer. Simple to set up as well.
Barracuda Drive (freeware) - a web based (both http and https) server that allows you to log in to your file system and upload, download or delete files. Very handy for quick access to files; it’s simple to install the server and then use a web browser to access it.
OpenVPN (open source) - VPNs are the most grown-up solution to remote access. They require both a client and a server, and allow you to actually become part of the computer network of the remote machine, meaning you can access resources as if you were physically on the network. As a result they tend to be more expensive and require more knowledge to set up. OpenVPN is a robust open source VPN that works very well. It is only really suited to linking disparate machines together (e.g. home and office) rather than accessing your home machine on the hoof.
It’s worth noting that if you have a “home” broadband connection you need to know the IP address of your connection, and this is often prone to changing. A very good solution is to use a free service like DynDNS, which can monitor changes in your IP address and dynamically map them to a fixed URL (e.g. myurl.domainname.com).
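The idea behind these dynamic-DNS services can be sketched in a few lines: check your current public IP, compare it to the last one you saw, and only call the provider’s update API when it has changed. The URLs below are placeholders, not DynDNS’s real endpoints — check your provider’s documentation for the actual API.

```python
import urllib.request

# Hypothetical endpoints -- substitute your provider's real URLs.
IP_CHECK_URL = "http://checkip.example.com/"
UPDATE_URL = "https://updates.example-dyndns.com/nic/update"

def current_public_ip():
    """Ask an external checker what our public address looks like."""
    with urllib.request.urlopen(IP_CHECK_URL, timeout=10) as resp:
        return resp.read().decode().strip()

def needs_update(cached_ip, live_ip):
    """Only hit the update API when the address has actually changed."""
    return cached_ip != live_ip

def build_update_request(hostname, new_ip):
    """Build the provider's update URL (query format varies by service,
    and real services also require authentication)."""
    return f"{UPDATE_URL}?hostname={hostname}&myip={new_ip}"
```

A small script like this, run from a scheduled task, is essentially what the DynDNS client software does for you.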
As a footnote to the earlier blog on good places to buy fresh coffee, it goes without saying that coffee beans remain fresh far longer than ground coffee. Good coffee means coffee beans, which means a coffee grinder.
There are quite a lot of grinders (or add-ons for blenders) available around the £20 mark; however, they are all based around blades that chop the beans up. Although it works, it generally produces a poor ground coffee for two reasons. Firstly, it generates quite a lot of heat, which starts to “cook” the beans, releasing essential oils (you might notice the beans becoming oily). Secondly, it is actually quite difficult to get a consistent (and fine) grind. The (recommended) alternative is to get a burr-type grinder, which crushes the beans. These are (naturally!) more expensive but produce a reliable grind. I personally use a Dualit Grinder which has been faithful.
By the way, for a good layman’s guide to all things coffee I highly recommend The Joy of Coffee by Corby Kummer.
Of course you can’t always be near a quality espresso machine, which means the need for portable coffee solutions. My two current favourites (because they are neat in a “portable” way) are:
Swiss Gold Filter - this is an “on-cup” filter that is re-usable (rather than the disposable version used in restaurants). Drip filter coffee provides a good brew and the simplicity of the design, as well as its portability, makes this a great office device. It incorporates a gold-plated filter for a better brew.
Smart Cafe Travel Cafetiere - cafetieres also produce a nice brew, although it tends to be fuller and less smooth than filter or espresso. The Smart Cafe design is clever and ideally suited to hotels and cars.
We use GPS fairly regularly in the Centre for GIS at Kingston so tend to keep an eye on developments in kit, particularly those that are useful for human geography/geoscience. We currently use (in conjunction with PDAs running ESRI’s ArcPad) some Fortuna Slim Bluetooth GPS units, which are cheap and sufficient for many purposes.
Recent announcements up the ante for low-cost, accurate GPS. In particular the fully integrated Trimble GeoXH, which claims a <30cm post-processed accuracy. For a little bit more money you can acquire the Thales Promark 3, which boasts <1m real-time accuracy and <1cm post-processed.
I don’t know about other users, but I have a mixed reaction to the use of “virtual learning environments” (in my case Blackboard). On the one hand I am a proponent of the web based dissemination of learning materials. If you integrate this within an environment that incorporates enrolment, then the benefits can be clearly seen (e.g. summative and formative testing, access restrictions etc). However this is not a grown-up networking environment; just look at the rich facilities available to users of Microsoft Sharepoint. The overarching web-based, group-based environment is still the same, but the facilities and power are so much more apparent. That said, Sharepoint is clearly not directed at educational environments. So for the moment Blackboard it is and, to a large extent, it does a reasonable job. However one key area is the dissemination of learning materials, and in Blackboard this requires adding items one at a time; a painfully slow experience. Not only that, but if you come to download past modules, the XML-only export format is painful in the extreme to work with.
One of the features supported by Blackboard is the uploading of “package” files. These are standard ZIP files that you can use to contain mini web sites. For me, they are the ideal means to create your own easily updatable material that can be uploaded to Blackboard. Not only that, but they are easy to extract from the debris of an exported Blackboard XML file should you so need. You can also apply your own design, including the use of forms, to create a bespoke and, if you’re good enough, rich environment. A little bit of flexibility and lateral thinking is a good thing, particularly in the world of Blackboard.
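Since these package files are just standard ZIPs, building one is trivial to script. The sketch below (Python, with assumed folder and file names) zips a folder of HTML, CSS and images so that index.html sits at the root of the archive, which is where a mini web site’s entry page would normally live:

```python
import os
import zipfile

def package_site(site_dir, zip_path):
    """Zip a folder of HTML/CSS/images into a standard ZIP 'package'.

    Paths inside the archive are made relative to site_dir, so the
    site's index.html ends up at the root of the ZIP rather than
    buried under the full directory path.
    """
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(site_dir):
            for name in files:
                full = os.path.join(root, name)
                zf.write(full, os.path.relpath(full, site_dir))
```

Keeping the mini site as plain files on disk and re-zipping after each edit makes updating the material a one-step job.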
At Kingston University we use Blackboard as our learning management system for registering students on modules and allowing interaction with course material. Specifically I use Blackboard for the distribution of lectures/practicals (including data), assignment submission, sitting exams and group interaction.
Something I have been increasingly using is the exam mode where students take formative and summative exams. The benefits for me are a unified environment for sitting exams, automatic grading, instant feedback and easily downloadable marks. In addition I can monitor how students perform on different parts of the exam, which helps me write better exams in the future. This semester I have been experimenting with weekly formative multiple choice questions (MCQs) and have generally been pleased with the rapid feedback and progression of the students. Setting MCQs takes considerable time (I have written well over 100!), and they then need to be incorporated into the Blackboard environment. KU have therefore invested in Respondus for generating tests and uploading/integrating them into Blackboard. I have to say that I am converted to using Respondus; it is relatively simple to use, generates fully operational questions and has so far seamlessly integrated into Blackboard. And if you purchase the “campus” version it can publish to multiple courses.
For me the benefits are off-line exam composition, local storage of exams and easy upload/setup of exams. It’s not perfect though, as it uses an almost completely non-standard window environment that feels antiquated and loosely based upon an HTML page. It does try to present a wizard-style interface but I personally don’t like it. It also insists upon storing all exams in a “data” directory; yes, you can change this to any directory on your system (and I have changed it from its location in “Program Files” to “My Documents”), however this is irksome as I much prefer working in a file-based manner. I would normally store the exams with the other teaching materials for a module and then double-click on one to start Respondus. Not so: you have to start Respondus and then load a file from its data directory. I can only request that software companies try to stick to default Windows working methods, otherwise you end up with a horrible mush of usability add-ons that end up confusing the user. That said, it does what it says on the tin and does it very well.
As part of my project exploring the use of kite-based remote sensing I have been using a Nikon D70 to take aerial imagery. Initially I have been shooting images and storing them as RAW mode files. This is an interesting area as, for the D70, the images are initially captured in 12-bit mode (per channel). I was quite surprised by this as I had been expecting 24-bit colour images, with 8-bits for each of the red, green and blue channels. The Nikon is somewhat unusual as it stores them in the proprietary Nikon NEF format; this can be uncompressed or compressed. It would appear that the compressed format (the only one available for the D70) quantises the data down to 9.5-bit, although the dynamic range is maintained.
Anyway, once you have NEF files (or any other RAW mode files) you need to convert them to something that most bits of image processing or remote sensing software can understand. I had initially used Adobe Photoshop, however it applies a shed-load of post-processing to the file (to make it look nice). If you are interested in “raw” pixel values then you need something else. Thankfully DCRAW came to the rescue. This is a command line program that has reverse-engineered the structure of a whole swathe of commercial RAW formats, allowing you to convert them to PSD (Photoshop) or TIFF formats. Usefully for an impatient person like me, RAWDROP has provided a graphical frontend to this. The final result does not look nice, but you do get the raw pixel values to play with.
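For batch work it can be handy to drive DCRAW from a script rather than through RAWDROP. A minimal sketch, assuming dcraw is on your PATH (the flags are from dcraw’s documentation as I understand it — -T writes TIFF, -4 keeps linear 16-bit data, -D skips the colour processing — so do check them against your version):

```python
import subprocess

def dcraw_command(nef_path, raw_values=True):
    """Build a DCRAW invocation for converting a NEF to TIFF.

    -T : write a TIFF rather than a PPM
    -4 : keep linear 16-bit data (no gamma curve applied)
    -D : "document mode" -- skip demosaicing/colour processing so
         the pixel values stay as raw as dcraw can give them
    """
    cmd = ["dcraw", "-T", "-4"]
    if raw_values:
        cmd.append("-D")
    cmd.append(nef_path)
    return cmd

def convert(nef_path):
    """Run the conversion (requires the dcraw binary to be installed)."""
    subprocess.run(dcraw_command(nef_path), check=True)
```

Looping this over a directory of NEFs gives you a stack of TIFFs ready for the remote sensing package of your choice.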
Isn’t it always annoying when you find a web page with a Flash file embedded in it? You want to save it to look at later, but when you right click on it, all you are given are the Flash specific settings, which don’t include saving the file. Well, if you use Firefox there is a very simple solution. Right click on the main area of the page and go to “View Page Info.” Select the “Media” tab and you will see information on the media files used in the webpage, including all Flash files. If you select the Flash file of interest you can then hit the “Save As” button. Easy!
I regularly use a Palm Zire 72 to store my contact and diary information. Increasingly I find it great for referring to reference material and, through the excellent DocumentsToGo, natively editing Microsoft Word and Excel files. The Palm Zire takes SD storage cards and I have recently come across the Sandisk Ultra-II Plus. Not only is this a fast and spacious (0.5 or 1GB) SD card, it very neatly folds to reveal a USB connector, turning it into a standard USB memory stick. It means I can place all my Powerpoint files on the card, view/practice them on the Palm and then plug them straight into the USB port on the host computer.
1. NXPowerlite - this is a stand-alone program which optimises Powerpoint file sizes. Specifically it resizes and compresses graphics and “flattens” (into graphics) OLE objects. It generally does an amazing job of compressing a Powerpoint file.
2. I often want to distribute PDFs of my Powerpoint material. This prevents people stripping elements out of my presentations for re-use. I am not averse to this, but would prefer people to request it first. For a while I have been searching for a PDF print driver that can perform edge-to-edge printing in such a way that the PDF looks like the original Powerpoint. I had little success until recently, when I installed Open Office 2. Unlike Powerpoint, Open Office (in particular Impress) can export a Powerpoint file to a borderless PDF (or indeed a Flash animation). OO’s ability to import Powerpoint is excellent, so this is the route I use now.
Like your coffee? Well, you need freshly roasted beans in order to get the best cuppa. Whilst unroasted (“green”) beans retain flavour for months after they have been picked, roasted beans only last weeks, whilst ground coffee lasts days. Vacuum packs are designed to minimise this degradation (and they contain a one-way valve to allow de-gassing, rather than exploding the packet!), however you really need to get freshly roasted beans.
Having tried a few places I now order all my coffee from Hill and Valley Coffee in Aylesbury. They roast daily and post out immediately, so you should get your order within two days of roasting. I can recommend the Ethiopia Unwashed Harrar as being a particularly good medium coffee!
I’ve been thoroughly testing a Dualit espresso machine (84009) for the last year. Yes, it’s a little expensive, but it makes fantastic coffee. I really can’t fault the brewing of an espresso, which produces a great crema on top. It comes with a capacious 2 litre water container and a 15 bar pump, and the milk frother is first rate.
Beware that making good espresso is a very messy business. Be prepared for a lot of coffee and water getting thrown around. And if you live in a hard water area, the machine will need very regular (monthly) cleaning; this means running a vinegar solution through the machine and then flushing it.
OK, that’s the good news. Reliability-wise it’s been a bit hit and miss. I returned the first one after 4 months as the pump blocked. The second one has been fine so far, however I have been through six (yes, six!) brewheads. The plastic “splitter” that diverts the brewed coffee into two cups repeatedly snapped off after limited use. It would appear to be a manufacturing fault; to Dualit’s credit they simply sent out a new one each time, but they can’t replace just the plastic splitter. It has to be the whole brewhead.
That said I couldn’t live without a pumped espresso maker now and the Dualit certainly makes good coffee!
I will be providing an introductory seminar/practical at the AGI Conference, Chelsea Village, London (9-10 November 2005) entitled Inter-planetary remote sensing: landform mapping on Mars. These sessions take place in the trade exhibition space, which has free entry; however you do need to register in advance at the above website.
I have become increasingly interested in exploring some of the inter-planetary remote sensing data sets, partly as a result of the BSc in Earth and Planetary Science which we offer here at Kingston University. This is a large area, however I would recommend anyone interested in exploring the topic to visit the Astrogeology group at the USGS. They probably have the best selection of resources currently available for teaching yourself further on the subject, whilst making available a good selection of prepared data and software. A very good starting point (showing my interest in DEMs) is the near-global DEM of Mars:
One area of particular interest is the availability of dynamic remote data sets. The USGS/NASA are quite pro-active in making data available. This would traditionally have been done through an ArcIMS server, however with the establishment of web mapping standards (specifically WMS and WFS) from the OGC we are seeing non-proprietary formats made available. At the moment it helps if you already have a copy of ArcGIS in order to use these, however this will change with the availability of a new version of ESRI’s ArcGIS Explorer sometime next year. For inter-planetary enthusiasts, the servers you want to point ArcGIS at are:
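If you want to see what one of these OGC servers offers without firing up ArcGIS, the standard first step is a WMS GetCapabilities request, which any web browser can make. A minimal sketch of building one (the server URL below is a placeholder, not a real USGS/NASA address):

```python
from urllib.parse import urlencode

def wms_capabilities_url(base_url, version="1.1.1"):
    """Build a WMS GetCapabilities request URL.

    GetCapabilities is the standard OGC operation that asks a web map
    server to describe the layers, projections and formats it serves.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetCapabilities",
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint -- substitute the real Mars WMS server address.
print(wms_capabilities_url("http://mars-wms.example.gov/wms"))
```

The XML document that comes back lists every layer the server publishes, which you can then request via GetMap in a client of your choice.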
“Mapping Hacks”, part of the O’Reilly “Hacks” series, was published earlier this summer and has proved popular with a variety of readers (e.g. comment). I’ve finished reading the book recently and thought I would post a review:
This book follows in the spirit of the O’Reilly “Hacks” series, which tries to open up technology areas by making them more accessible to everyday PC users. Mapping Hacks contains 100 “hacks” that take freely accessible computer cartography software and data and make them do genuinely useful things. The examples draw heavily from the US and the UK, showing the general areas of expertise of the authors. However a US-centric focus on data is not surprising given extensive free access to federally produced data. There are also a variety of guest authors adding considerably to the breadth and depth of the topics covered. This is no short book: at over 500 pages, it covers many topics in considerable detail (and is therefore excellent value at the list price of US$29.95). It should be noted that the text is very up-to-date. This ultimately means that some web links noted in the book will change and, overall, the text will date. That said, the book’s strength lies in its ability to “push” the edge of “everyday” computer cartography. For this reason, it will remain “current” for quite a while. And in case there is concern over dating content, the authors maintain a website to support the book.
The book is organised along the lines of nine “themes”: Mapping Your Life; Mapping Your Neighborhood; Mapping Your World; Mapping (on) the Web; Mapping with Gadgets; Mapping on Your Desktop; Names and Places; Building the Geospatial Web; and Mapping with Other People.
Each theme contains a dozen or so hacks, some exploring current technologies and techniques, and others developing/extending technologies. The list of subjects is extensive, including GIS, GPS, wi-fi, geocaching, satellite imagery, georeferencing, census mapping, web mapping, XML, map servers, Perl, inter-planetary mapping, data clean-up and visualisation. As such it is not a course text, specialised reference or practical guide. Think of it as a cross between a practical course and an encyclopedia.
Mapping Hacks covers a wide variety of reader experiences, ranging from straightforward introductory notes through to extensive programming development exercises. At all stages the reader can closely follow the well-guided text, implementing the materials as they see fit, or simply dip in and out picking up tips and tricks. Even experienced professionals will learn about many new topics. The book therefore manages to achieve something quite difficult; it is more or less able to be all things to all people. I have no hesitation in recommending this as a read for those knowledgeable in the topics it touches upon, whilst it will be invaluable as a supplementary text for those taking undergraduate and postgraduate courses in subjects such as remote sensing and GIS.
I have come across this excellent web based mapping product. It is an incredibly simple to use Flash animation that allows you to pull in vector data (using GeoRSS files) and raster imagery (either on your own server or via WMS. Yes, it can pull in WMS data!). We have implemented it at the Journal of Maps to show where we have content. Excellent!
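GeoRSS itself is pleasingly simple: it is an ordinary RSS feed where each item carries a location, in the simplest form a georss:point element containing “latitude longitude”. A minimal sketch of generating one item (the title and coordinates are just illustrative):

```python
import xml.etree.ElementTree as ET

# The GeoRSS-Simple namespace used for the point element.
GEORSS_NS = "http://www.georss.org/georss"

def georss_item(title, lat, lon):
    """Build a minimal RSS <item> carrying a GeoRSS-Simple point.

    The point is written as "latitude longitude", space separated,
    which is the GeoRSS-Simple convention.
    """
    item = ET.Element("item")
    ET.SubElement(item, "title").text = title
    point = ET.SubElement(item, f"{{{GEORSS_NS}}}point")
    point.text = f"{lat} {lon}"
    return ET.tostring(item, encoding="unicode")

# Illustrative values only.
print(georss_item("Map of Kingston upon Thames", 51.41, -0.30))
```

Wrap a list of such items in a standard RSS channel and any GeoRSS-aware viewer can plot your content on a map.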