Just a brief update on my install of the Windows 7 RC. Everything has worked very well on my Sony Vaio TX3XP; the RC drivers did their job, although a couple of device-specific drivers failed. In each case either the XP or the Vista drivers from Sony worked. However I couldn’t get hibernation to work, something I use a lot on a laptop. I initially thought it might be the graphics drivers, and I couldn’t find any on the Intel site for my Media Graphics Accelerator 950 chipset. A quick Google later and the drivers can be found here. That didn’t solve the problem though. Typing the following commands at the command prompt gives further info:
powercfg /a
powercfg /hibernate on
The first tells you about supported sleep states and the second tries to turn hibernation on. This didn’t work and I got an error stating “an internal system component has disabled hibernation.” Not helpful. A bit more digging shows a similar problem with Server 2003 in relation to virtualised servers; hibernation simply isn’t supported. Which leads me to believe that if you install Windows 7 into a virtual disk (as I had done) then hibernation will be disabled.
I finally had an email back from the British Library concerning my earlier thesis request using the new EThOS service for digital thesis delivery. This was requested 17 February 2009 and was delivered today, which by my count is 129 days. So perhaps not the fastest service, but the product itself is not to be sniffed at: well scanned, in colour, delivered to your door. Of course, if a thesis has already been scanned then delivery is instant, and this will improve as the back catalogue increases; for “serious” use, though, requests need to be turned around within the 2-4 week timeframe.
Now this is just too cool not to blog about: a PC that fits inside a 3-pin plug, with a 1.2GHz processor, 512MB of DRAM, 512MB of NAND flash memory, plus Ethernet and a USB port. How cool is that?! It is being touted as an ideal media server, but I can think of all sorts of environmental applications where you could plug a sensor into it and just let it record. Very neat.
As the Research Excellence Framework (REF) begins to take shape it is worth keeping an eye on developments over at HEFCE. As has been widely reported, there is an intention to make much greater use of metrics. Whilst no explicit “formula” has been finalised, there has been much preparatory work on bibliometrics, that is, the use of article citations to measure quality. An interim report has been published on the use of bibliometrics, running some sample models over 46 Units of Assessment at 22 institutions (for both RAE-entered staff and all staff). The models tested were: all publications based upon address, all publications based upon author, and “nominated” publications. HEIs will clearly be concerned that the first two models will not necessarily include all published outputs of their staff; indeed they probably don’t know all the published outputs! That only leaves the “selected” papers model (which is what the RAE used), which possibly unduly impacts upon those staff who are prolific in terms of both quality and quantity. It may also encourage researchers to try to manipulate their citation counts.
The list of misdemeanours could get quite long; some can be mitigated against and others can’t. The report only used journal articles and review papers; conference proceedings were not included (although they may be in the future), and this will no doubt irk scholarly societies that publish fully peer-reviewed articles arising from conferences (but which haven’t been classified as journals).
The number of citations for each paper is then normalised with respect to field (using the average number of citations worldwide by field), year and document type. The interim report discusses some of the problems with defining subject types, and how subject normalisation is applied will be contentious, particularly for interdisciplinary areas.
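The normalisation step amounts to dividing a paper’s citation count by the worldwide average for comparable papers. A minimal sketch, with invented field averages purely for illustration:

```python
# Sketch of field-normalised citation scoring. The "world average"
# figures below are made up for illustration; in practice they would
# come from a citation database, broken down by field, year and
# document type.
world_average = {
    ("geography", 2007, "article"): 4.0,
    ("geography", 2008, "article"): 2.5,
}

def normalised_citations(citations, field, year, doc_type):
    """Citations divided by the worldwide average for papers of the
    same field, publication year and document type."""
    return citations / world_average[(field, year, doc_type)]

# A 2007 article cited 8 times scores 2.0: twice the world average.
print(normalised_citations(8, "geography", 2007, "article"))
```

A score of 1.0 means “exactly average for its field and year”, which is what makes the choice of field boundaries so contentious for interdisciplinary work.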
A further problem is “differential lag.” It takes time for papers to start getting cited and this “lag” will vary depending upon the subject area, journal, article quality etc. And clearly there will be a minimum lag period, probably to be set at ~2 years.
The report has anonymised the results until its full publication in autumn 2009 at which time it will be interesting to see not only how each model performs, but the variations between REF and RAE at each institution and, naturally, what the impact upon funding would have been.
So what are the likely impacts of all this? Well, if HEFCE goes for a nominated-list model (say the six best papers), then it is in an individual researcher’s best interest to maximise the number of citations using some of the methods above. If this can be achieved later in the REF cycle, then the normalisation by year should significantly boost the score (although the report notes the opposite effect). All of which means it is important to: (i) publish in highly cited/visible journals and (ii) be aware of the subject/journal average and target citations above it. Of course one knock-on effect will be a decrease in the number of submissions to journals not on the citation indexes.
“This is being billed as a first of its kind event in North America, bringing together many of the major technology players in 3D and related topics.”
Lidar is really hitting the big time and there are many key players coming together. Whilst we have LAS as a file format and ESRI entering the fray with terrains, some standardisation is sorely needed. Hopefully the OGC can provide this in the same way it has for other areas; the 3D Information Management (3DIM) Working Group seems to be the right starting point.
It was quite some time earlier in June that Microsoft made the release candidate of Windows 7 available for download. Yes, it’s free (for a year anyway) for people to preview the technology. I normally quite like sampling the delights, but in this instance didn’t have a spare machine to dump it on to, so the 2.5GB download sat languishing on my laptop. That was until I discovered that Win7 is starting to talk the talk when it comes to virtual machines. Yes, there is the new XP Mode (at a price!) which runs XP SP3 within Virtual PC, but Win7 can also handle virtual disks, creating them wherever you wish. And this offers the rather neat opportunity to boot Win7 from DVD, start the install, create a virtual disk on your machine, install Win7 into that disk and then automatically have a dual-boot option.
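The virtual-disk step can be done from the install DVD itself: press Shift+F10 at the first setup screen to get a command prompt, then use diskpart. Something along these lines (the path and size here are purely illustrative):

```
diskpart
DISKPART> create vdisk file="C:\win7.vhd" maximum=25000 type=expandable
DISKPART> select vdisk file="C:\win7.vhd"
DISKPART> attach vdisk
DISKPART> exit
```

With the VHD attached, the installer’s drive-selection screen lists it as a target disk like any other.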
So my laptop still has XP installed on the main disk, but can now also boot into Win7. First impressions are that it feels speedier than XP (let alone Vista!) and that the Aero interface is finally starting “to work”. Early days yet, but read the usual reviews to get some comparisons. In the meantime there really is no reason why you can’t trial it.
Perhaps somewhat quietly for European audiences, the Palm Pre was launched last Saturday in the US. Apparently stocks have sold out and it has been largely well received, being described as the only likely rival to the iPhone. Clearly there is an awful long way to go, and in-depth reviews seem to praise webOS highly, with the Pre itself being a satisfactory starting point. A ROM image restorer has had the techies in palpitations, as the 200MB file has allowed some disassembly to see what’s going on. As ever, plenty of gossip over at Palm InfoCenter. However the biggest news has got to be the porting of Doom on to the Pre. A thoroughly worthwhile cause.
Minimap is a great extension for Firefox allowing access to a range of online mapping services through a sidebar. However it is much more than that; the default view is the (broad) location of your IP address. You can drag and drop addresses onto the map to move to that area. Route planning is integrated, along with import/export of KML and support for many web services (such as FireEagle, Flickr etc.). Really takes web mapping to the next level of integration.
Intermap are finally moving into the consumer arena with the release of AccuTerra, an iPhone app that offers terrain data (and other info) for off-road users. This was noted by All Points last week, and they point out that the app’s data is not streamed but downloaded to the user’s phone. You buy a single tile, which is 200-400MB in size, meaning of course that when you are out of range the app carries on working. It’ll be interesting to see how popular it proves.
Twitter, the micro-blogging website, has seen a huge upsurge over the last year through its ease of use and adaptability to a variety of uses (see BakerTweet). However one area where it significantly lags behind is location awareness. The usefulness of Twitter really came to the fore with #uksnow, showing how crowdsourcing can produce some useful results (although not everyone agreed). There are plenty of other examples where this could be useful (such as traffic congestion). But there is no location aspect to the Twitter API, relying instead on users to embed location information (postcode, town etc.) in their tweets. All Points notes an article in Fortune detailing moves in this direction. It’s surprising given the popularity of both BrightKite and FireEagle. I guess there is more to come.
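This embedding of location in plain text is exactly what made #uksnow work: tweets carried a partial postcode and a snow rating out of ten. A hypothetical sketch of how such reports might be scraped from tweet text (the exact format, and this regex, are my own illustration rather than any official convention):

```python
import re

# Match reports of the form "#uksnow SW1 7/10": an outward postcode
# (one or two letters, one or two digits, optional trailing letter)
# followed by a rating out of ten.
UKSNOW = re.compile(
    r"#uksnow\s+([A-Z]{1,2}\d{1,2}[A-Z]?)\s+(\d{1,2})/10", re.IGNORECASE
)

def parse_uksnow(tweet):
    """Return (outward postcode, rating 0-10), or None if the tweet
    carries no recognisable report."""
    m = UKSNOW.search(tweet)
    if not m:
        return None
    return m.group(1).upper(), int(m.group(2))

print(parse_uksnow("Heavy flurries here #uksnow SW1 7/10"))  # ('SW1', 7)
```

The outward postcode then has to be geocoded separately, which is precisely the gap a location-aware Twitter API would close.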
OK, so satnav is the spawn of Satan and should be avoided at all costs. But this little snippet reported by All Points shows a genuinely useful application: using GPS/map data (aka satnav) to pre-cool a hybrid car prior to coming to a standstill at traffic lights, ensuring that the engine stays off for the maximum amount of time. A reported fuel-efficiency gain of 9%, which in my books is pretty good. Roll on more innovative uses.
Just a brief follow-on to yesterday’s post on UKMap; GeoconnexionUK has an interview with Seppe Cassettari outlining the datasets, collection process, potential scope and future development. Worth a read although it’s not yet listed on the Geoconnexion website.
Ed Parsons, quick off the mark as ever, posted a blog entry on the recently announced UKMap. The GeoInformation Group have, for some time, been rumoured to have been collecting small scale mapping data allowing them to compete directly with the Ordnance Survey. They were early into digital imagery, LiDAR and thermal imaging, and small scale mapping is a natural extension to this. Of course, unlike the OS, they don’t have to (or want to!) map the entire country, but rather just those areas that will generate enough income. So expect places like London, Birmingham, Manchester and Glasgow to all be covered in the 24,000 km2 dataset being collected over the next 5 years, with London due for release in September.
Top of the list of benefits for companies will, I suspect, be pricing and licensing (with a particular focus upon derived data). I wouldn’t be surprised if we see large swathes of local government opting out of licensing OS data and, if we wanted to be sensationalist (hence the blog title!), it could even be the beginning of the end for the Mapping Services Agreement (which licenses OS data to local government). Will the OS be forced to change up (I’ve been watching too much of The Wire)? I bet GeoInformation were pleased that there was ostensibly no change to licensing following the Budget, but economics may well come into play. Game on!