Foxit Reader 3 has recently been released and it’s a worthy upgrade. Again, you can download the ZIP and simply extract the EXE into any directory you want (meaning you can run it from a USB stick). At 7 MB, and 1 file, it’s a pleasantly small and simple bit of kit. As with previous releases the speed of rendering PDFs is excellent, with support for thumbnails and layers added in this release. The latter is particularly welcome as GeoPDFs (and layers) have become far more prevalent. Thoroughly recommended.
Yes it’s true, ESRI have finally ditched the requirement for a USB dongle (or old-style parallel port dongle) in favour of a keyless authentication system. OK, so this is not a “feature” related in any way to spatial processing, but it finally means the whole rigmarole of dongles is gone. Much of this patch is actually related to laptop usability. I’ve blogged before about the frustration of using a dongle-based system and I think this move probably reflects the increased use of ArcGIS on laptops. In fact many of the other fixes in this patch relate to laptop use, specifically the need to hibernate laptops and plug/unplug the dongle. I had previously set up a couple of batch scripts to manually start and stop the license server to get around this, but it’s nice to see it sorted out properly.
I’ve been fiddling around with GeoPDF generation in ArcGIS 9.3 this week and it’s been quite frustrating. The added functionality is really very good and, considering this is their first release, it works remarkably well. I have even had some maps submitted to the Journal of Maps that make good use of vector layers within the PDFs (and indeed some authors who don’t realise they are exporting GeoPDFs).
I am in the final stages of PDFing two large (in terms of file size) maps, one with a large raster backdrop and some complex vector layers, the other with a very large raster backdrop. The first map was being rendered almost exactly “as seen on screen” and incredibly quickly. However ArcGIS appears to be indiscriminately rasterizing some elements of the map and leaving others as vectors. It makes for a bit of a mess and, to be honest, it’s hopeless; really not fit for publication. I then switched to trying TerraGo’s MAP2PDF which uses a totally different rendering engine, is much slower, but has greater control over the final product. It also deals with vector layers correctly and the final output looks much better. However it doesn’t handle one of the clipped raster layers properly, although a fix is supposedly coming.
The second map is really just a very large raster and needs to look good, but retain a small file size. Whilst there is granular control in ArcGIS over the resolution, the JPEG “quality” is, well, pants! There are only 5 settings and only settings 3 or 4 are worth using. However the difference in quality can be significant. I wasn’t happy with the filesize/quality trade-off that ArcGIS 9.3 produces, so switched back to MAP2PDF, which took 5 minutes to render the map before it crashed ArcGIS. Solution? Go back to ArcGIS 9.2, which seems to produce a better raster product.
So there you have it, my totally unscientific and cursory exploration of producing PDFs of two complex/large vector and raster maps. This suggests that it is a far from mature product area. GeoPDFs are really starting to hit the big time and ESRI/Adobe appear to have big plans. But no one has got a robust processing routine… yet. I have no doubt that Adobe will get there, but it’s going to take time.
The reports just keep on rolling. This time we have the UK Location Strategy published by DCLG (and the GI Panel). It covers much old ground and gives a rather uninspiring action plan on p17. All Points Blog also discuss this and, somewhat fairly, note that “they do highlight how complex it can be to actually get them done!” However this has to be taken within the context of the 1987 Chorley report and the 1997 follow-up. That was only 21 years ago and, well, pretty much all the same points were covered back then.
The Pre-Budget Report published yesterday was obviously designed to get us to “spend, spend, spend”, but the small print contained much of interest to the debate on trading funds and publicly collected data (not just by the OS). Many feathers seem to have been ruffled in Whitehall, probably not least those of the Home Secretary; “crime maps” is probably rapidly becoming a rude phrase! Anyway, both Ed Parsons and Charles Arthur (FoD) have succinctly outlined the key parts of the document and some of the things they hint at.
The news over the last few days has been awash with items on the leaking of details of BNP members in the UK. And of course it didn’t take long for a Google mashup to be put together although, even with the caveats that this mapped postcodes, not individual addresses, it was largely misunderstood (Charles Arthur posted about this yesterday) and subsequently replaced with a “hot spot” map which is all very pretty but, well, largely meaningless. There was a nicely considered piece over at thinkwhere saying, in as many words, that on its own the map doesn’t say too much about BNP membership, as we are more interested in understanding the societal implications. This means correlating the list with measures of deprivation, population density, voting results etc. This can begin to identify clusters of membership and possibly why they are there (and that’s before you get into issues of membership by occupation).
For those not familiar with the CORONA Mission, it was the original US spy satellite programme that operated from 1958-1972 (and was declassified in 1995). Of many “firsts” in satellite remote sensing, it was notably the first mission to provide photos taken from space, through the launch of the camera into a pre-selected orbit and, after image capture, the return to Earth of a capsule containing the film for retrieval by aircraft or boat. Whilst the first successful image was not acquired until 1960, the programme went on to capture over 800,000 images at a range of spatial resolutions.
US spy satellites are designated with a KH (for KeyHole) acronym, with CORONA ranging from KH1 to KH4. Spatial resolutions are good, ranging from 7.5 m, with many images at 2.75 m and some at 1.8 m. Stereo imagery was also collected for some missions. Not only does this remain “competitive” with contemporary commercial systems, but it also provides an excellent historical archive (although note that it’s panchromatic photography, not multi-spectral imagery). Given that Landsat-1 didn’t launch until 1972, this provides a valuable archive for a variety of applications. All the imagery is available for purchase from the USGS at $30 per frame.
And of course it wouldn’t be appropriate to conclude this blog without a brief comment on the continuing US spy satellite programme. Again, Wikipedia has a nice summary of current known programmes. The successors to CORONA were initially film-based and then, with KH11, digital. Resolutions were commonly 15 cm (using what would appear to be a Hubble space telescope pointed at the Earth), with the expectation that the current KH13 is probably sub-5 cm.
Yup, that’s right, it’s World Toilet Day! Whilst we Brits have always had a tongue-in-cheek snigger about toilet humour, typified by Adam Hart-Davis’s wonderful Thunder, Flush and Thomas Crapper and, appropriately on WTD, a piece (or piss?!) from BBC News (via TearFund) titled Britons’ toilet pastimes revealed, there is a slightly more serious side that WaterAid/TearFund are trying to get across. Namely that 2.5 billion people worldwide don’t have proper sanitation. And of course:
“One gram of faeces can contain 10 million viruses, one million bacteria, 1,000 parasite cysts and 100 parasite eggs.”
At the very least have a go at playing TurdlyWinks. Go on, you know it makes sense!!
I’ve just returned from two days of training in writing Python code at ESRI UK, revelling in the delights of Aylesbury. As I’ve mentioned in numerous posts, scripting is finally “back in” at ESRI and Python is the language of choice (although you can script with other languages). So much so that there is a training course on it. Don’t expect it to be an introduction to Python, although Python is introduced. Rather it is an introduction to developing ArcGIS scripts with Python and how geoprocessing functionality is accessed. Our trainer, Rob McPherson, was excellent and very knowledgeable, having clearly done quite a bit of Python development. To get the most out of the course it’s worth having already used Python to do some scripting, but you don’t need to have done so.
Introductory material includes the geoprocessing environment, Python, tools/environment settings and the programming model. These lay the ground for using and manipulating the functionality exposed through describe, enumeration and cursor objects. The course concludes with integrating scripts back into ArcGIS and debugging.
Interestingly the course is delivered using PythonWin, which isn’t shipped with ArcGIS (IDLE is the standard IDE that accompanies Python). However it is easy to use, offers reasonable syntax highlighting and object auto-completion, and provides a debugging environment. One of the main downsides is that you can’t kill (at least I don’t think so) a Python script that is running.
Which brings me on to the next point, namely that it seems far better to run all your own scripts from PythonWin (or IDLE) without starting up ArcGIS at all. It appears faster and more stable. In fact, you can avoid the IDE entirely and run a script directly in Python by double-clicking on the .py file. Anyway, a good course that is well worth attending for anyone wanting to develop scripts or needing to build models. Apparently an “Advanced” course runs in the US, but is not currently available here. I hope ESRI UK can add this to their portfolio.
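For anyone who hasn’t seen one, a typical 9.x geoprocessing script is short: create the geoprocessor, set a workspace, then loop over datasets calling tools. Here is a minimal sketch from memory (the tool and property names are the 9.x API as I recall it, so treat them as indicative; the only part guaranteed correct is the pure-Python file-naming helper):

```python
import os

def clipped_name(shapefile, suffix="_clip"):
    """Derive an output name for a batch run, e.g. roads.shp -> roads_clip.shp."""
    base, ext = os.path.splitext(shapefile)
    return base + suffix + ext

# A typical 9.x script then drives the geoprocessor with names like these
# (from memory of the arcgisscripting API, so indicative only):
#   import arcgisscripting
#   gp = arcgisscripting.create()       # 9.3 also accepts create(9.3)
#   gp.workspace = r"C:\data"
#   for fc in gp.ListFeatureClasses():
#       gp.Clip_analysis(fc, "boundary.shp", clipped_name(fc))
```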
I’ve recently been on training courses at both Leica and ESRI UK and therefore thought it an opportune moment to compare them. And the verdict? Well, ESRI UK provide, by a significant margin, the best biscuits. No question!
Open Street Map data is getting better and better, with more and more detailed coverage. So much so that it is used, preferentially, for a variety of mapping and navigation options. The data is also now increasingly available and it’s worth pointing people to CloudMade, who now make the entire global dataset available for download in a variety of very useful formats, including SHP, XML, Garmin maps, POIs etc.
Another frustrating week at the Journal of Maps dealing with an excellent map that has used a (very) small amount of OS data licensed via JISC. Which means that the licensee is bound by these restrictions which I have described at length before. The JISC-OS license is not up for renewal for a while and therefore there is little to be done about what you can and can’t publish.
Which naturally led me to the question about whether OS data is “fit for purpose”. How can you have licensed users not allowed to publish their work? How can you have OS claiming IPR over an entire “product” regardless of the amount of their data included within it? How can non-profit/charitable users be essentially barred from map publication? How can such large sectors of society be so disenfranchised by a single organisation to the detriment of society as a whole? Is OS data “fit for purpose”? For many, I think not.
And to quote one user on the licensing conditions: “If I’d known the OS would be this rabid, I would not have paid for their data.”
As I’ve mentioned before, backup is the cornerstone of any serious use of IT and any good backup strategy will involve some form of file/folder synchronisation. My favourite in the past has been Second Copy, which works very well and supports FTP sync as well. Microsoft has released the freely available SyncToy, whilst there are also some notable releases of the open source (and originally UNIX-based) rsync in the form of DeltaCopy and cwRsync. Rsync is particularly clever in that not only does it work out which files have changed, but also which parts of those files have changed, copying only those sections. As a result it is very efficient.
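To illustrate the idea, here is a sketch of block-level change detection (a simplification of what rsync actually does: the real algorithm rolls a checksum window byte-by-byte and backs the weak checksum with a strong hash, but the principle of only copying blocks whose checksums differ is the same):

```python
def weak_checksum(block):
    """rsync-style weak checksum (Adler-32 flavour): two 16-bit sums,
    cheap to compute and, in real rsync, cheap to 'roll' one byte at a time."""
    a = sum(block) % 65536
    b = sum((len(block) - i) * byte for i, byte in enumerate(block)) % 65536
    return (b << 16) | a

def changed_blocks(old, new, size=4):
    """Compare two byte strings block-by-block and return the indices of
    blocks whose checksums differ -- only these would need copying."""
    diffs = []
    for i in range(0, max(len(old), len(new)), size):
        if weak_checksum(old[i:i + size]) != weak_checksum(new[i:i + size]):
            diffs.append(i // size)
    return diffs
```

So a one-byte edit in a large file flags a single block for transfer rather than the whole file, which is where the efficiency comes from.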
However I recently came across another solution called MirrorFolder which implements a real-time software RAID, meaning changes to files are mirrored as they happen. It actually installs itself at the I/O level, so any writes to files are duplicated and automatically applied to a copy. Very nice.
And since my last post it would appear that whilst Google still use the same tile encoding system, they are also using a new one which means it isn’t compatible with Super Googer. The good people at Super Googer have updated their code so that you just have to put in the lat/long (in decimal degrees) of the top left corner. Combine that with Google Earth Mapper (which lets you grab the lat/long of the GE cursor) and you have an easy way of grabbing BIIIIIIG images.
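For anyone wanting to do the lat/long-to-tile conversion themselves, the arithmetic for a Google-style spherical-mercator tile pyramid is standard (whether Super Googer’s new encoding matches this exactly I can’t say, so treat this as the general case):

```python
from math import cos, log, pi, radians, tan

def latlon_to_tile(lat, lon, zoom):
    """Convert a WGS84 lat/long (decimal degrees) to x/y tile indices in the
    standard spherical-mercator tile scheme (origin at the top-left)."""
    n = 2 ** zoom                                   # tiles per axis at this zoom
    x = int((lon + 180.0) / 360.0 * n)              # linear in longitude
    y = int((1.0 - log(tan(radians(lat)) + 1.0 / cos(radians(lat))) / pi) / 2.0 * n)
    return x, y
```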
Life as an academic is often described as “publish or perish.” And to a certain extent this is true, with the mantra that you are only as good as your last paper. One of the measures of the “quality” of a paper is how many times it has been cited by others (although clearly you could have a completely rubbish paper that is cited for this very reason!). Working out who cites your paper is a virtually impossible task given the vast quantity of material that is published every year. Hence Thomson Scientific, the people who have a stranglehold on the citation listing market and produce journal stats such as impact factors, archive all journal articles and their references (for the “A-list” of journals they maintain). This then allows them to work out who cites whom. Couple this to a web interface, search for your own article and you get a list of everyone that cites it. What’s even more useful is that, if you are registered, you can set up citation alerts which are emailed out to you, as well as RSS feeds to monitor if you so wish. Of course your university needs a subscription to Web of Knowledge to take advantage of this, but it really is a valuable service.
Well, as good as… ArcGIS 9.3 is finally here, 14 years, 8 months, 27 days, 4 hours, 3 minutes and 32 seconds after the USA. I jest, and the good people at ESRI UK and CHEST have delivered the nice shiny DVDs ready for install. In fact this is a mission in itself. It’s generally recommended that ERDAS Imagine is installed on top of ArcGIS. So, I had to very carefully uninstall in the following order:
And then install the ARC/INFO version of ArcGIS, during which I mistakenly forgot to install the bundled extensions, wasting another 30 minutes on a reinstall. And of course the new Acrobat GeoPDF add-on, which is only 2 MB but still took 15 minutes.
Anyway, it’s there and working and, touch wood, no problems. I’ve only had a quick play, but the tools in ArcToolbox appear noticeably quicker, which is nice. I did try my custom Python script mentioned in an earlier blog and it bombed on the “extent” bug noted last time. The fix in 9.2 was to add in the following line:
gp.extent = "N/A"
This didn’t work in 9.3, and it gave the same “no extent” error when the line was removed. However on this occasion setting the extent to that of the input file seems to have done the trick.
I ran the script and it didn’t appear any faster “in process”, but the speed improvements are apparently on start-up rather than elsewhere. Something to watch out for. There is also a new geoprocessor that you need to instantiate as a 9.3 version. Early days yet though.
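For reference, the workaround amounts to reading the bounding box off the input dataset and handing it back to the geoprocessor before running the tool. A sketch (the Describe property names are from memory of the 9.x API, so treat them as indicative; only the string-formatting helper is exercised here):

```python
def extent_string(xmin, ymin, xmax, ymax):
    """Format a bounding box the way the geoprocessor's extent
    environment setting expects it: 'xmin ymin xmax ymax'."""
    return "%s %s %s %s" % (xmin, ymin, xmax, ymax)

# Then, instead of gp.extent = "N/A", set the extent from the input:
#   desc = gp.Describe(in_raster)
#   e = desc.Extent
#   gp.extent = extent_string(e.XMin, e.YMin, e.XMax, e.YMax)
```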
Yes, it’s finally happened, the student loan is fully and completely paid off. And they were so polite, thanking me for my custom. Only 18 years after I took it out. Now that was a worthwhile investment….
Following on from the previous post, the UK news has recently been covering university initiations, the so-called “coming of age” induction into university clubs. At the harmless end of the scale it generally involves extensive drinking, but as the article points out, things can become more sinister. Indeed, this further article highlights some of the more unpalatable things that go on and how many still believe it is “good for team building”.
There is no way that anyone can condone this level of behaviour and watching the video in the previous post just highlights how seemingly ordinary people can cross the line to perform untenable acts. And again it is the person, the situation and the organisation. All 3 are often brought together during university initiations….
Donald Clark gives a thought-provoking post about a recent talk by psychologist Philip Zimbardo on what makes ordinary people evil. Note that the video contains graphic images through examples from Abu Ghraib prison; however, it does end on an uplifting note.
My local school had a vote last academic year to name each of the buildings on their campus. This was based around “inspiring” people (or even “heroes”) and the following won the vote:
1. Edmund Hillary 2. David Livingstone 3. Ellen MacArthur 4. Christopher Columbus 5. Neil Armstrong
An intriguing list of names for sure and not necessarily what you might expect, however it got me thinking about those that you might call “forgotten heroes.” Similar exploits, but not the fame. So from the list above, I’m building the “alternative” list below and wondered if anyone else had forgotten heroes to add.
1. Jacques Piccard and US Navy Lt. Donald Walsh: first descent of the Mariana Trench, the deepest location on the surface of the Earth’s crust (~11,000 m), January 23 1960. Intriguingly in the same timeframe as the ascent of Everest. 2. 3. 4. 5. Surely Yuri Gagarin was ranked a fair bit higher. First person in space and all. A very big moment.