The Cultural Heritage of Our ‘Digital Dark Age’ Thrown into the ‘Information Black Hole’: What Can It Teach Us and Our Students in the Near Future and ‘in the Long Run’?
The Digital Humanities (and they are not alone) not only document their research data and results in the form of texts and, more recently, databases using digital tools and formats; they also create new forms of cultural heritage in genuinely digital formats. While the advantages of these tools and formats are obvious, helping researchers to communicate faster and to access more sources, research results and information of every kind faster than ever before, their future usability and availability are far from certain. Over the last c. 30 years, and especially since the rise of the World Wide Web, hundreds of research projects have been started using digital means, and many of them have since disappeared, destroying the results of scholarly work and, with them, working and life time on an unimaginable scale.

For the last 5 years or so, funding institutions have increasingly demanded that researchers present ‘data management plans’ describing how their data will be created, stored and ‘securely’ saved for the future. And by ‘future’, they mean 15–20 years. This demand not only comes some 25 years (too) late, it is also absurd in several respects. First, while research in the humanities usually deals with artifacts that are dozens, hundreds or even thousands of years old, the results of the new digitally enriched research are to be lost after 20, or at most 50, years? For there is NO data format, let alone software, that can be guaranteed to work for more than 50 years. (And even this holds only for simple text formats like TXT, i.e. formats without any enhancement over printed texts.) Moreover, computer science itself has no means to preserve data, software and the hardware needed to use both for more than 20–30 years, and not even an idea of what such means could look like.
Vint(on) Cerf, co-developer of TCP/IP in the early 1970s and thus one of the ‘fathers of the internet’, has warned since 2015 that our times will one day be seen as the «digital dark age». But his proposed solution, a kind of meta-emulation of hard- and software called the «digital vellum», does not work yet and will face grave problems once it does work and becomes available to everyone. Alan Kay, one of the fathers of object-oriented programming and of the graphical user interface, both developed in the late 1960s and early 1970s, has proposed another medium to preserve at least ‘snapshots’ of data: discs containing self-explaining descriptions that rational beings could decipher and read even thousands of years from now. He calls them the digital «cuneiform tablets». But these do not address our immediate problem, and they do not overcome the familiar problems of proprietary software, licences and ‘activation keys’ that have to be exchanged over the internet, or of data and software relying ever more heavily on other data accessible only via networks.
Before we introduce ever more new, shiny digital products, tools and formats into research, documentation and uses such as education, we should step back for a moment and try to answer the question: How long will these be available? Is it really worth investing time and (lots of) money in projects whose results we ourselves may no longer be able to use in 20 years? Against this background, the paper will sketch a proposal for another solution that may look ridiculously expensive at first sight but, as far as I can see, cannot be avoided in the long run. It will only get more expensive the longer we wait, and more research data will be lost in the meantime. We should teach (and learn ourselves) that digital tools and formats carry this immense problem, and that without a solution we are throwing our data into a big «information black hole» (Cerf).