Thursday 9 June 2011

Digital editions and crowdsourcing

Crowdsourcing has been discussed as an element of digital editing projects for some time now. In practice, however, it has come to mean the crowdsourcing of transcriptions, at least in an academic context. There are some strikingly successful examples of this approach. I can hardly fail to mention the Transcribe Bentham project at University College London. At the time of writing it has 1,291 registered users and there have been 11,383 edits. The project seems to have been particularly successful in building a sense of community among those who have contributed, and is careful to acknowledge their efforts (for example in a 'leader board').

Another project which has received publicity recently is the Civil War Diaries Transcription Project at the University of Iowa. The interface is extremely simple, and, unlike Bentham, there is not even a requirement for users to register (an email address is optional). There is also no real editorial guidance provided, so it will be very interesting to see the general quality of submissions (which will be checked before publication).

Both of these projects have developed bespoke interfaces, albeit a very simple one in the latter example. A new tool from the Center for History and New Media at George Mason University may help others to adopt this approach without having to spend time on potentially costly software development. Scripto is described as 'a light-weight, open source, tool that will allow users to contribute transcriptions to online documentary projects' and has enormous potential in a range of humanities disciplines. Crucially, it supports version control and a full, presumably customisable, editorial workflow.
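
To make the combination of version control and editorial review a little more concrete, here is a minimal sketch in Python of the kind of workflow described: every contributed transcription is kept as a revision, and an editor must approve a revision before it becomes the published text. The class and method names are invented for illustration and are not Scripto's actual interface.

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List, Optional


    @dataclass
    class Revision:
        text: str
        contributor: str
        timestamp: datetime = field(default_factory=datetime.utcnow)
        approved: bool = False


    @dataclass
    class TranscriptionPage:
        page_id: str
        revisions: List[Revision] = field(default_factory=list)

        def contribute(self, text: str, contributor: str) -> Revision:
            """Record a new crowdsourced revision; nothing is published yet."""
            rev = Revision(text=text, contributor=contributor)
            self.revisions.append(rev)
            return rev

        def approve(self, revision: Revision) -> None:
            """Editorial step: mark a revision as checked and publishable."""
            revision.approved = True

        @property
        def published_text(self) -> Optional[str]:
            """The most recent approved revision, if any."""
            approved = [r for r in self.revisions if r.approved]
            return approved[-1].text if approved else None


    # Example workflow: a volunteer submits, an editor reviews, the page publishes.
    page = TranscriptionPage("example-folio-1")
    draft = page.contribute("Text of the folio as read by a volunteer...", "volunteer42")
    page.approve(draft)
    print(page.published_text)

The point of keeping every revision, rather than overwriting, is that questionable contributions can be rolled back and contributors' work remains attributable, which matters both for quality control and for acknowledging volunteers.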

It remains to be seen whether the success that has been achieved in the crowdsourcing of transcriptions, or in the correction of OCR, will transfer to other elements of the editorial process. ReScript, for example, is planning to produce a crowdsourced new edition of the Alumni Oxonienses, with contributors identifying individuals, supplying corrections and adding new information. This is a much more subjective process than transcription (although of course there are subjective elements there too), and it is an open question whether we will be able to make it work for a scholarly edition.

1 comment:

  1. Transcription certainly seems to be the 'killer app' of crowdsourcing for academic purposes, and what the Transcribe Bentham Initiative has achieved, in a relatively short time, is phenomenal.

    I've recently instigated a much more lo-fi crowdsourcing example within the electronic version of James Mill's commonplace books, which I currently edit. For each of the 58 chapters that make up the manuscripts, I am asking users who have studied the chapter to contribute a series of tags to better identify it, be they thematic, descriptive or historical in nature. The idea is eventually to use these user-contributed tags to associate chapters that have different titles but share similar material or ideas.

    Anyone who is interested is welcome to contribute at http://www.intellectualhistory.net/mill/. No registration is required.
