WinSCP for FlatPress Backup

The benefit of FlatPress is that it doesn’t use a database… it follows the axiom of “keep it simple”, which makes the rendered site fast, secure, and portable (something I appreciated when I ported from Blosxom to FlatPress). One area where this is particularly evident is backup, as all you have to do is copy the content files off the server. You can run this either server side (putting the files somewhere safe) or client side (pulling them to a local machine). I’ve opted for the latter approach and my tool of choice is WinSCP (and its portable version), an open source FTP client that includes an extensive set of reliable and extensible tools. I’ve found WinSCP better than FileZilla, not least because it has reliably handled large file transfers and preserves the creation dates of any files you transfer.

Of particular importance for automating FlatPress backup are directory synchronisation and scripting. In fact, the WinSCP GUI can generate the script for you based upon existing profile settings. For completeness this is the very simple script that runs:

open ftp://<username>:<password>@ftp.yourserver.com
lcd c:\mywebsitebackup
cd /mywebsite.com/htdocs
synchronize local -mirror
close
exit

This opens a connection to the server, changes the local and remote directories, and then mirrors from the remote to the local. On Windows I can schedule this to run as a daily task.
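If you would rather not store the password in a plain script file, the same backup can be driven from a small wrapper that generates the script on the fly. The sketch below is illustrative Python (the winscp.com path, username, and directories are placeholder assumptions); /script= and /ini=nul are standard WinSCP command-line switches.

```python
import subprocess
from pathlib import Path

# Illustrative install path -- adjust for your machine.
WINSCP = r"C:\Program Files (x86)\WinSCP\winscp.com"

def build_backup_script(user, password, host, local_dir, remote_dir):
    """Return the WinSCP command script as a single string."""
    return "\n".join([
        f"open ftp://{user}:{password}@{host}",
        f"lcd {local_dir}",
        f"cd {remote_dir}",
        "synchronize local -mirror",
        "close",
        "exit",
    ])

def run_backup(script_text, script_path="backup.txt"):
    """Write the script to disk and hand it to winscp.com."""
    Path(script_path).write_text(script_text)
    # /ini=nul keeps WinSCP from touching the stored configuration.
    return subprocess.run([WINSCP, "/ini=nul", f"/script={script_path}"])
```

A scheduled task can then point at the Python wrapper instead of a script containing the credentials.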

A good tool for the arsenal!

FREE EPRINT: Sustainable Development Goals: genuine global change requires genuine measures of efficacy, Journal of Maps

Smith, M.J.
Journal of Maps


We live in tumultuous times - it is a common refrain for each new generation as the challenges of contemporary society impinge upon their worldview. There is always change and there is no change quite like how we experience it in the here and now and the way in which it disrupts our status quo. Malthus was disturbed by population change and how it would implode the society he inhabited. His thesis - the Principle of Population (1798) - espoused what became known as the Malthusian trap whereby growth in the supply of resources led to an increase in population so negating any boost to living standards. The so-called ‘limits to growth’ remain topical both for proponents and opponents. So is the world we inhabit today any different?

Visual Studio 2015 SSRS Solution Files and Upgrade Woes

The arrival of a new Windows PC precipitated an upgrade from Visual Studio 2015 to Visual Studio 2019. All of my development is for SQL-based reports which are then deployed to SSRS. As Visual Studio is Microsoft’s “one-size-fits-all” approach to programming, you need to make sure you pick the right “flavour”. In this instance that means SQL Server Data Tools. For VS2015 and VS2017 that is a standalone installer and you need to make sure you select the “Data storage and processing” option, which then installs SQL Server Data Tools. For VS2019 some of the functionality has been moved out into extensions: for me that meant installing the MS Reporting Services Projects extension.

With that rigmarole out of the way I pointed VS2019 at my Solution file and… I got an error message saying that it couldn’t be upgraded! WTF?! I mean, seriously? Microsoft can’t upgrade from two versions ago? Whilst the RDL report file format hasn’t changed, setting up new Solution files would be a time vampire for no valid reason.

It then struck me that it was worth a punt installing VS2017 to see if the intermediary version could upgrade the VS2015 files, and then move on to VS2019 after that. A 1Gb download and 30 minute install later (seriously!) and VS2017 successfully upgraded the Solution files. I then copied these over to my new machine and VS2019 successfully upgraded those too. It’s one extra step but is then seamless!

Note to self… Permanent Power to a Transcend Wifi SD Card in a Nikon Camera

I had been looking through my box of “spare stuff” to find a slightly ageing Transcend Wifi SD Card which I could use in my Nikon D800. To cut a long story short, I wanted to upload a few selected photos from the card to my smartphone and this seemed like the easiest way. OK, so the card is a little slow, but for a few photos that’s fine. The first task was to upgrade the firmware of the card to the latest version, install the WiFi SD App and then connect to the camera. It didn’t work. In fact, the smartphone couldn’t find the card at all which suggested that the card wasn’t being powered. I tried scanning at the same time as I was taking a photo and the card would briefly appear before disappearing.

Clearly the card is not continually powered by the camera and after some slightly long-winded Googling I found this page. In short, there are two modes where the card is constantly powered:

  • Live View
  • “Auto-Meter Off Delay” switched off

The “Auto-Meter Off Delay” from your Custom Settings is the one to change (and select it as an option on your MyMenu). Once you set this to infinity the camera powers the card and you can then access it via the smartphone app.

If you are using something like Snapseed on your phone to edit, then it is a whole lot quicker to shoot in “RAW+JPEG Basic” (the 36MP resolution means “Basic” is actually pretty detailed!), before uploading just the JPEG.

Copying a Visual Studio SSRS Solution

Designing SSRS reports in Visual Studio is liberating in how easy it is to get them up and running, but every so often you come across a gotcha that you think should be straightforward. One of them is copying a “solution” (VS’s name for a set of project files) to a new location. You might want to do this because you want to back it up, duplicate it for another related project, or just run some tests against a demo version. What’s missing in VS is a “Save As” for the whole solution (you can do it for individual reports). If you copy the folder containing all the files you can create a new version in a new location, however all of the hard-coded file locations will be incorrect and it will then fail to load.

So what is the solution?! Well you could create a new solution, then add in copies of all the existing reports, but then you still have to set it all up again which is just a little self-defeating. Surprisingly, the simplest thing is to copy the solution folder, but keep it within the same directory as the original, just changing the name. You can then open the copied solution from within this folder and all the reports load correctly (as new copies). If you are deploying this to SSRS then you will need to change the name of the solution in the solution properties, but then you are good to go.

Microsoft SQL Server Report Designer Error: An item with the same key has already been added

Whilst designing a report for deployment to SSRS from Visual Studio 2015, I received this error message when entering a SQL query I knew worked into the New Report wizard:

An error occurred while the query design method was being saved. An item with the same key has already been added.

This is a classic Microsoft error message that is both specific and vague at the same time… and also shouldn’t happen. There are scant details online as to where this comes from, but it is the result of Microsoft SQL Server Report Designer requiring unique column names (even though the underlying SQL query can validly return multiple columns with the same name). This is a stupid limitation and, whilst the error message is accurate, it is sufficiently vague to obfuscate what is going on.

The solution - unsurprisingly - is to make sure that there is no repetition in the names of the columns.
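The situation is easy to reproduce outside Report Designer. The sketch below uses Python’s built-in sqlite3 purely for illustration (the table names are made up, and the principle is the same for SQL Server): a join legitimately returns two columns both called id, and aliasing them is the fix.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, name TEXT);
    CREATE TABLE orders    (id INTEGER, customer_id INTEGER);
    INSERT INTO customers VALUES (1, 'Alice');
    INSERT INTO orders    VALUES (10, 1);
""")

# A join like this returns two columns both named 'id' -- perfectly valid
# SQL, but it is this duplication that Report Designer chokes on.
dup = conn.execute(
    "SELECT c.id, o.id FROM customers c JOIN orders o ON o.customer_id = c.id"
)
print([d[0] for d in dup.description])  # ['id', 'id']

# The fix: alias the columns so every result column name is unique.
cur = conn.execute(
    "SELECT c.id AS customer_id, o.id AS order_id "
    "FROM customers c JOIN orders o ON o.customer_id = c.id"
)
print([d[0] for d in cur.description])  # ['customer_id', 'order_id']
```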

Grouping Objects in Visual Studio 2012

Grouping objects should be one of those things that is - well - easy to do! In Microsoft Word you Ctrl-select each object, then right-click and select “Group”. Easy. In Visual Studio 2012, not so. You would have thought that, in Microsoft’s prime programming environment, these simple layout tasks would be easy, but they’re not and it’s not documented anywhere. In my particular instance I was creating a SQL Server Reporting Services report where images in the template were moving depending on the number of rows in the output. The solution was to group the images together.

The grouping concept is sensible and well implemented, it’s just that working out how to do it is difficult! You actually have to insert a new rectangle object and then drag-and-drop the objects you want to group into it. Once you’ve done this, the properties of your contained objects should look something similar to this, where the “Parent” attribute under “Other” shows “Rectangle”. Now if you move the group, they all move. Job done!

rectangle_properties.jpg

ISO 3166-1

ISO 3166-1 just trips off the tongue, however it’s one of those standards that underpins a fair amount of the geospatial traffic undertaken on a daily basis. Yes, I’m talking about country codes, which Wikipedia helpfully defines as:

ISO 3166-1… defines codes for the names of countries, dependent territories, and special areas of geographical interest

This is important because it is used in so much analogue and digital data exchange between countries, although don’t for a moment think the ISO is the only organisation that defines country codes… but that’s a whole other blog post!

What gets included in the list is interesting… the criteria for inclusion include being a member state of the United Nations, a UN specialized agency, or a party to the Statute of the International Court of Justice. Becoming a member state of the UN is clearly helpful, although what makes a country is interesting in itself, as well as highly politicised. Palestine is an obvious example, but just look at the UK. The UK is a country, but should Wales, Scotland, and Northern Ireland also be included? For example, they are included for FIFA. The UN loosely uses Article 1 from the Montevideo Convention which outlines four qualities a state should have: a permanent population, a defined territory, government, and the capacity to enter relations with other states.

Anyway, once you are on the ISO 3166-1 list you get 2 and 3 letter codes, along with a 3 digit numerical code. These are maintained by the ISO 3166 Maintenance Agency and, given the above, change regularly. You can view the current list here and subscribe to official updates.
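To make the three forms concrete, here is a toy lookup in Python. The entries are a hand-picked illustrative subset (GB/GBR/826 and friends), not the authoritative list, which you should always take from the Maintenance Agency or an official update service.

```python
# Illustrative subset of ISO 3166-1: alpha-2 -> (alpha-3, numeric, name).
# The real list has ~250 entries and changes over time.
ISO_3166_1 = {
    "GB": ("GBR", "826", "United Kingdom"),
    "FR": ("FRA", "250", "France"),
    "US": ("USA", "840", "United States of America"),
}

def lookup(alpha2):
    """Return all three ISO 3166-1 code forms for a two-letter code."""
    alpha3, numeric, name = ISO_3166_1[alpha2.upper()]
    return {"alpha2": alpha2.upper(), "alpha3": alpha3,
            "numeric": numeric, "name": name}

print(lookup("gb"))
# {'alpha2': 'GB', 'alpha3': 'GBR', 'numeric': '826', 'name': 'United Kingdom'}
```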

At the RGS we are a membership organisation and take online international payments, so having up-to-date country codes is important. Rather than subscribe to the ISO, we use the UK government Country Register, which includes an update service. It has the ISO two-letter codes, although it isn’t necessarily identical (as it lists the countries the UK recognises).

EGU 2020 Short Course: UAV Data Collection and Analysis: operating procedures, applications and standards

UAV Data Collection and Analysis: operating procedures, applications and standards Conveners: Paolo Paron; Co-conveners: Mike James, Michael Smith

UAVs have reached a tipping point in geoscience research such that they are near-ubiquitous and commonly used in data collection. In this way they are opening new ways to study and understand landforms, sediments, processes and other landscape properties at spatial and temporal scales close to those of the processes that operate. However, this means that non-experts are entering the fields of photography, image interpretation, photogrammetry and 3D modelling, often without a solid grounding in the principles of surveying. This course aims to provide a solid foundation for UAV users in order to avoid simple mistakes that can lead to legal restrictions, UAV loss, operational problems and poor quality data.

We will introduce pre-flight, in-flight, and post-flight procedures that aim at optimizing the collection of high quality imagery for subsequent downstream processing. We will also demonstrate the analyses of data by means of existing state-of-the-art commercial software, such as Pix4D and Metashape for point cloud analysis, and eCognition for object-based image analysis. We will also demonstrate the use of open source/open access software like CloudCompare and Orfeo Toolbox.

Converting from Blosxom to Flatpress

This blog has been offline for a little while as the original Blosxom implementation had been hacked. Blosxom was a wonderful CGI script that was elegant in its simplicity yet eminently extensible through the many plugins which existed and made it moderately feature rich. Best of all, it used plain text files to store all its entries, which makes backup and conversion much simpler than a database. With my implementation of Blosxom decommissioned, I needed to find a replacement. Google “flat file blogging engines” and there are a lot. However many of the projects have been orphaned, like Blosxom, and are no longer in active development. What I wanted to find was an engine that was simple, had some good features and an active community. FlatPress seems to fit the bill with a new maintainer - and active Flatpresser - Arvid Zimmerman.

The next step was to convert my archive of over 1000 blosxom blog entries to Flatpress. Big shout out to James O’Connor who wrote the Python script to convert the files. The process is broadly this:

  • download your Blosxom files, including all the sub-directories for categories, but make sure to maintain the date/time filestamp of individual files - this is used to timestamp the entry for FlatPress. WinSCP does this (FileZilla doesn’t)
  • make sure the categories are only ONE DIRECTORY DEEP. Move any sub-sub-directories up to the top level
  • rename all the directories to numbers. These are used to tag the entries and can then be recreated within FlatPress
  • copy the script.py and template files to the directory the folders are stored in
  • edit the template file to have the header/footer you want. The content, date and categories will be changed for the entries
  • run the script
  • a new fp-content directory will be created with all your entries
  • upload this to your flatpress site and rebuild the index

The script does the following

  • renames the file to entry‹date›-‹time›.txt based upon the file’s modified date
  • copies the file to a new subfolder in FlatPress /content folder based upon year and month
  • deletes the first line from the file (and deletes the first line break)
  • prefixes the file with: VERSION|fp-1.1|SUBJECT||CONTENT|
  • suffixes with: |AUTHOR|miksmith|DATE|<1566926569>|CATEGORIES||
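The steps above can be sketched in Python. This is not James O’Connor’s actual script, just a minimal reconstruction of the steps listed; the entryYYMMDD-HHMMSS.txt filename pattern and the fp-content/content/yy/mm/ layout follow FlatPress conventions as I understand them, and reusing the deleted first line as the SUBJECT is an assumption - treat the details as a starting point.

```python
import time
from pathlib import Path

def convert_entry(src, out_root, author="miksmith", category=""):
    """Convert one Blosxom post file into a FlatPress flat-file entry (sketch)."""
    src = Path(src)
    mtime = src.stat().st_mtime          # the modified date carries the post date
    stamp = time.localtime(mtime)

    # entry<date>-<time>.txt naming, derived from the modified date
    name = time.strftime("entry%y%m%d-%H%M%S.txt", stamp)

    # FlatPress keeps entries in per-year/month subfolders of fp-content/content
    dest_dir = (Path(out_root) / "fp-content" / "content"
                / time.strftime("%y", stamp) / time.strftime("%m", stamp))
    dest_dir.mkdir(parents=True, exist_ok=True)

    # Drop the first line (the Blosxom title) and reuse it as the subject
    lines = src.read_text().splitlines()
    subject = lines[0]
    body = "\n".join(lines[1:]).lstrip("\n")

    entry = (f"VERSION|fp-1.1|SUBJECT|{subject}|CONTENT|{body}"
             f"|AUTHOR|{author}|DATE|{int(mtime)}|CATEGORIES|{category}|")
    dest = dest_dir / name
    dest.write_text(entry)
    return dest
```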

Any updates will be posted over at Flatpress.

FREE EPRINT: Editorial: Perspectives on the contemporary art-geoscience interface, Journal of Maps

Tooth, S., Smith, M.J., Viles, H.A. and Parrott, F.
Journal of Maps


This Special Issue of the Journal of Maps is devoted to highlighting contemporary examples of interdisciplinary collaborations between the arts and the geosciences (e.g. geomorphology, geology, Quaternary studies), with a specific focus upon the exploration of locations using, at least in part, some form of mapping. As previous contributions to the journal have exemplified, mapping is essential for the exploration of locations, particularly by supplying visual representation to help with the characterisation of three core geographical concepts (Matthews & Herbert, 2008): space (e.g. distances, directions), place (e.g. boundaries, territories), and environment (e.g. biophysical characteristics).

FREE EPRINT: Testing and application of a model for snow redistribution (Snow_Blow) in the Ellsworth Mountains, Antarctica, Journal of Glaciology

Mills, S.C., Le Brocq, A.M., Winter, K., Smith, M.J., Hillier, J., Ardakova, E., Boston, C., Sugden, D. and Woodward, J.
Journal of Glaciology


Wind-driven snow redistribution can increase the spatial heterogeneity of snow accumulation on ice caps and ice sheets, and may prove crucial for the initiation and survival of glaciers in areas of marginal glaciation. We present a snowdrift model (Snow_Blow), which extends and improves the model of Purves et al. (1999). The model calculates spatial variations in relative snow accumulation that result from variations in topography, using a digital elevation model (DEM) and wind direction as inputs. Improvements include snow redistribution using a flux routing algorithm, DEM resolution independence and the addition of a slope curvature component. This paper tests Snow_Blow in Antarctica (a modern environment) and reveals its potential for application in palaeo-environmental settings, where input meteorological data are unavailable and difficult to estimate. Specifically, Snow_Blow is applied to the Ellsworth Mountains in West Antarctica where ablation is considered to be predominantly related to wind erosion processes. We find that Snow_Blow is able to replicate well the existing distribution of accumulating snow and snow erosion as recorded in and around Blue Ice Areas. Lastly, a variety of model parameters are tested, including depositional distance and erosion vs wind speed, to provide the most likely input parameters for palaeo-environmental reconstructions.

FREE EPRINT: Quantification of Hydrocarbon Abundance in Soils using Deep Learning with Dropout and Hyperspectral Data, Remote Sensing

Asmau Ahmed, Olga Duran, Yahya Zweiri, Mike Smith
Remote Sensing


Terrestrial hydrocarbon spills have the potential to cause significant soil degradation across large areas. Identification and remedial measures taken at an early stage are therefore important. Reflectance spectroscopy is a rapid remote sensing method that has proven capable of characterizing hydrocarbon-contaminated soils. In this paper, we develop a deep learning approach to estimate the amount of Hydrocarbon (HC) mixed with different soil samples using a three-term backpropagation algorithm with dropout. The dropout was used to avoid overfitting and reduce computational complexity. A HySpex SWIR-384 camera measured the reflectance of the samples obtained by mixing and homogenizing four different soil types with four different HC substances, respectively. The datasets were fed into the proposed deep learning neural network to quantify the amount of HCs in each dataset. Individual validation of all the datasets shows excellent prediction estimation of the HC content with an average mean square error of ~2.2 × 10⁻⁴. The results with remote sensed data captured by an airborne system validate the approach. This demonstrates that a deep learning approach coupled with hyperspectral imaging techniques can be used for rapid identification and estimation of HCs in soils, which could be useful in estimating the quantity of HC spills at an early stage.

FREE EPRINT: Assessment of low altitude UAS flight strategy on DEM accuracy, Earth Science Informatics

Anders, N.S., Smith, M.J., Suomalainen, J., Cammeraat, L.H., and Keesstra, S.D.
Earth Science Informatics


Soil erosion, rapid geomorphological change and vegetation degradation are major threats to the human and natural environment. Unmanned Aerial Systems (UAS) can be used as tools to provide detailed and accurate estimations of landscape change. The effect of flight strategy on the accuracy of UAS image data products, typically a digital surface model (DSM) and orthophoto, is unknown. Herein different flying altitudes (126-235 m) and area coverage orientations (N-S and SW-NE) are assessed in a semi-arid and medium-relief area where terraced and abandoned agricultural fields are heavily damaged by piping and gully erosion. The assessment was with respect to cell size, vertical and horizontal accuracy, absolute difference of DSM, and registration of recognizable landscape features. The results show increasing cell size (5-9 cm) with increasing altitude, and differences between elevation values (10-20 cm) for different flight directions. Vertical accuracy ranged from 4-7 cm but showed no clear relationship with flight strategy, whilst horizontal error was stable (2-4 cm) for the different orthophotos. In all data sets, geomorphological features such as piping channels, rills and gullies and vegetation patches could be labeled by a technician. Finally, the datasets have been released in a public repository.

FREE EPRINT: ‘Reading landscape’: interdisciplinary approaches to understanding, Journal of Maps

Mike J. Smith, Flora Parrott, Anna Monkman, James O’Connor and L. Rousham
Journal of Maps


This paper outlines a collaborative project between a group of Fine Art and Geography students who helped develop and contribute to a conversation about recording ‘place’. Introducing methodologies from both disciplines, the project started from the premise of all environmental ‘recordings’ being ‘inputs’ and so questioned what could be defined as ‘data’ when encountering a location. Brunel’s Grand Entrance to the Thames Tunnel (London) provided the motivation for 10 objective and subjective ‘recordings’ which were subsequently distilled into a smaller subset and then used to produce a short film that was presented at an international conference. Important to the collaborative nature of the project were ongoing opportunities to share equipment, techniques, material and references across disciplines. It was an experiment to measure the potential for ‘mapping’ to capture physical and historical information, as well as embodied experience.

FREE EPRINT: Land inundation and cropping intensity influences on organic carbon in the agricultural soils of Bangladesh, Catena

M.J. Uddin, Peter S. Hooda, A.S.M. Mohiuddin, Mike J. Smith and Martyn Waller
Catena


Land inundation is a common occurrence in Bangladesh, mainly due to the presence of two major river systems - the Brahmaputra and the Ganges. Inundation influences land use and cropping intensity. However, there is little information on the influence that the extent of flooding and cropping intensity have on soil organic carbon (SOC), particularly at the landscape level. To investigate these influences, we collected 268 surface (0-30 cm) soil samples from 4 large sites within the two alluvial deposits (the Brahmaputra river and the Ganges river), on a regular grid (1600 m). The findings show that SOC levels are generally low, reflecting the intensity of agriculture and land management practices. SOC variability was higher across the medium high land (MHL) and medium low land (MLL) sites than in the high land (HL) and low land (LL) sites. The relatively low SOC levels and variability in the HL sites indicate soils here might have reached equilibrium levels due to higher land use intensity. Topographically higher lands (HL and MHL), due to less inundation, had higher cropping intensities and lower SOC than lower lands (MLL and LL), which had lower cropping intensities, as they remain inundated for longer periods of time. The findings clearly demonstrate the intrinsic influence of land inundation in driving cropping intensity, land management practices and SOC levels.

FREE EPRINT: Summary of activities 2018, Journal of Maps

Mike J. Smith (2019)
Journal of Maps


Creativity is one of those tropes that seems to do the rounds regularly in, well, creative circles. Almost by definition, it is levelled at the arts, in part because its base definition is along the lines of the ability to create. Within this context, cartography is well-poised because any map requires the cartographer to create a new, unrealised, graphic product.

OPEN ACCESS EPRINT: Demystifying academics to enhance university-business collaborations in environmental science

John K. Hillier, Geoffrey R. Saville, Mike J. Smith, Alister J. Scott, Emma K. Raven, Jonathon Gascoigne, Louise J. Slater, Nevil Quinn, Andreas Tsanakas, Claire Souch, Gregor C. Leckebusch, Neil Macdonald, Alice M. Milner, Jennifer Loxton, Rebecca Wilebore, Alexandra Collins, Colin MacKechnie, Jaqui Tweddle, Sarah Moller, MacKenzie Dove, Harry Langford, and Jim Craig (2019)
Geoscience Communication


challenge posed by a heavily time-constrained culture; specifically, tension exists between opportunities presented by working with business and non-optional duties (e.g. administration and teaching). Thus, to justify the time to work with business, such work must inspire curiosity and facilitate future novel science in order to mitigate its conflict with the overriding imperative for academics to publish. It must also provide evidence of real-world changes (i.e. impact), and ideally other reportable outcomes (e.g. official status as a business’ advisor), to feed back into the scientist’s performance appraisals. Indicatively, amid 20-50 key duties, typical full-time scientists may be able to free up to 0.5 day per week for work with business. Thus specific, pragmatic actions, including short-term and time-efficient steps, are proposed in a “user guide” to help initiate and nurture a long-term collaboration between an early- to mid-career environmental scientist and a practitioner in the insurance sector. These actions are mapped back to a tailored typology of impact and a newly created representative set of appraisal criteria to explain how they may be effective, mutually beneficial and overcome barriers. Throughout, the focus is on environmental science, with illustrative detail provided through the example of natural hazard risk modelling in the insurance sector. However, a new conceptual model of academics’ behaviour is developed, fusing perspectives from literature on academics’ motivations and performance assessment, which we propose is internationally applicable and transferable between sectors. Sector-specific details (e.g. list of relevant impacts and user guide) may serve as templates for how people may act differently to work more effectively together.

Ricoh GRIII

I’ve waxed lyrical about the Ricoh GR being a great UAV camera before - well the long awaited successor has been announced (but hasn’t yet landed) and summarised over at DPReview. The interesting aspects of the uprated specs are IBIS (in-body image stabilisation), a 24MP sensor and a touchscreen. The resolution boost and IBIS will be of significant interest to UAV users so it will be interesting to see how it performs out in the field.

TED: Academic research is publicly funded - why isn’t it publicly available?

An interesting talk from TEDxMileHighWomen. Worth a watch to get a short 10-min summary of some of the issues involved with publishing academic research - the comments are worth a look too.

As Erica Stone implies, she hasn’t got much experience in academic publishing and it unfortunately shows. There are some points well made, but there is an underlying naivety about the role of publishing, the cost, the requirements of universities and the amount of time academics have. As I noted in my editorial this year:

Academic publishing is a knowledge distribution and academic assessment system, partially funded by universities and research institutes.

To publish you have to cross-subsidise, or go down an author or reader pays route - ironically (and perhaps to the chagrin of the OA camp), OA is currently costing the system more than a subscription model on an annual basis and probably on a pageview basis too. But, let’s keep the debate going!