FOIMan explains why he’s not afraid of the dark age.
In my last post I recounted how pioneers in the UK have contributed to the development of digital preservation solutions over the last 20 years. This was inspired by several news articles at the end of last week reporting on Google Vice-President Vint Cerf’s comments heralding a “digital dark age”. In this piece I want to give my personal reaction to this apocalyptic prediction.
As I indicated in my previous post, the issues raised by Cerf are not new. And indeed he isn’t the first to warn of dire consequences if we fail to act.
But are such visions realistic? My personal view is that they’re not. Let’s consider what happened in the past.
We tend to assume that electronic formats are somehow more fragile than previous media. This isn’t in fact the case. As any archivist or conservator will tell you, failure to keep paper or parchment at the right levels of temperature and humidity can lead to it becoming unreadable. In one job early in my career I found records being stored in damp, dank conditions under the Town Hall steps. They were covered in mould and fungi – to all intents and purposes unreadable. Some of the records were less than 10 years old.
Just as servers can be hacked, intruders or employees with a grievance can access offices and pick up files they shouldn’t have seen. Careless employees can leave files on trains or even in evacuated premises. Fire or flood can destroy whole warehouses of physical records without the insurance of a backup to restore the files.
These risks have always existed. And until more enlightened times, even governments failed to keep their records in suitable storage. Just read Caroline Shenton’s excellent book about the fire that destroyed the Palace of Westminster if you want some illustrations of this.
And yet… Record Offices hold vast quantities of physical records – they complain of lack of space and have significant backlogs requiring cataloguing. Historians will always want more, but the fact is that despite the poor quality of storage in previous eras, the limited literacy of earlier generations, and in some cases the passing of many years, archivists hold vast volumes of evidence on our past.
The problem in our era is not a scarcity of information. It’s a glut of it. And with so much information – whatever format it is originally created in – it is inevitable that a huge proportion of it will survive. Indeed it is the fear that information will live on indefinitely that feeds the current debate over the right to be forgotten.
It will survive because it is popular – the more copies of a file that exist, the more likely it is that some will remain (take, for example, the four copies of Magna Carta recently exhibited together in London). It will survive because people are interested in it. FOI will play its part – copies of government documents will now be found in many personal collections and on websites as well as stored by their creators. A proliferation of information – facilitated in the digital world – will guarantee that vast quantities of it remain accessible to future historians.
The real problem is not whether there will be information that will remain accessible, but which information should do so. As I’ve suggested, lots of it will live on purely through chance. But it is important that organisations (and individuals too) identify the records that have most value – especially long-term value – and take deliberate action to preserve them. This too will happen because there are commercial, governmental or sentimental reasons to retain them. In my last post I explained the need for pharmaceutical companies to retain digital records – so they took steps to ensure that those records would be preserved. Similarly the digital photographs that you look at the most – of your children, your significant experiences – will almost certainly survive because you will regularly look at them and if you have problems accessing them you will do something about it.
Digital records require specific techniques to ensure their preservation (as indeed do records printed on paper or written on parchment). That’s why the work of the pioneers I wrote about in my last post is so important. But in principle at least, preserving digital records is no different to preserving records created in other formats. It requires the organisation or individual to first identify what it needs to keep (a point made by the National Archives’ Chief Executive, Jeff James, on Saturday’s Today programme on BBC Radio 4). How it will keep it is a secondary and technical question, but one that will be answered if the information really does have value.
This is why, short of a nuclear holocaust (in which case I suspect we will have more pressing concerns should we survive), I don’t think a dark age is coming, digital or otherwise.
Trouble is, digital information depends on having electricity, the supply of which is not guaranteed over the historical long term. The indefinite survival of digital information also requires that today’s formats remain readable far into the future.
Contrast the BBC’s 1986 Domesday Project, a modern re-creation of the Domesday survey. By the turn of the century it was unrecoverable due to format obsolescence, and only the fluke discovery of an old video disc player enabled the entire project to be salvaged. Meanwhile the original Domesday Book itself can still be opened and read after more than nine hundred years of not-always-optimal conditions, whereas no one is really sure what the first email said – no original of it survives, and there would be no way of retrieving one even if it did.
I can put a book to one side and pick it up later in the certainty that all the pages will still be there, that it won’t have been surreptitiously edited by anonymous partisans or propagandists while my back was turned, and that the book itself won’t suddenly become invisible because the local library has shut unexpectedly. Nothing analogous can be said of the internet.