Digital Technology and the Practices of Humanities Research
4. The Impact of Digital Resources
1It has now become commonplace to begin articles about the use and impact of digital resources with a bold statement about how much is being spent on their production. And it is a great deal, and seems to rise every year. What is less clear is exactly why we are doing this. We are often told that if it is not digital or digitised, it does not exist, and that this is especially true for our students. The corollary of this is the assumption that if things are digital, they not only exist, but are popular, exciting, well known, and thus well used. University managers, funding councils, and policy makers also appear to assume that doing things with computers is automatically better, faster, cheaper, and more economical in terms of person-time than not doing it, despite the lack of evidence for this.
2It is no wonder, then, that there often seems to be an implied belief that doing humanities in a digital way will render it ‘relevant’, solve any apparent crises in the subject, and bring what has otherwise been obscure and arcane to the notice, and indeed love, of the general public. At the same time, cultural heritage organisations, such as museums, galleries, archives, and libraries, have been investigating ways in which they can use digital methods and social media as a vector for outreach and a way to increase visitor engagement.
3But are these assumptions well founded? Do we render the humanities relevant simply by being digital? Do visitors automatically find it easier to engage with cultural heritage, with galleries, libraries, museums, or archives (GLAM) if the material is digitised? These questions fall into the realm of what has become known as impact assessment, whether it is carried out by government bodies, or by cultural heritage institutions themselves.1 In the following chapter we examine the question of how digital resources might have an impact, and upon whom, in what way, and how it might be measured. We will also examine the necessary conditions for a resource or collection to have an impact, foremost among which is its continued existence — an obvious and necessary condition, but not necessarily one as easily achieved as might be expected.2
Understanding and Measuring Impact
4The process of understanding and measuring impact (impact assessment) has many definitions, depending on the context in which it is used. There are well-established fields of impact assessment, such as environmental, health, economic, and social impact assessment; but these have not normally been associated with humanities research or cultural heritage institutions, particularly with regard to digital content, collections, and resources.3 Recent research into the value and impact of digitised resources and collections has shown clear benefits; but while there is an abundance of anecdotal evidence, systematic data is often lacking.4
5For much of the last two decades the GLAM sector has taken the lead in measuring the impact of both its digital and physical collections. There has been a growing recognition that demonstrating, monitoring, and clearly articulating the impact and value of their existence is necessary in a time of intense pressure on public funding. Since the 1980s, the value and use of GLAM sector collections have been demonstrated through the lens of their ‘impact’, whether economic or social.5
6Over the last fifteen years, a large amount of work has gone into forming and testing appropriate, flexible, and effective methodologies to indicate the impact and value of the GLAM sector. These include measuring attendance and demographics, audience evaluation, generic learning outcomes, and most recently, culture metrics.6 For example, comprehensive monthly quantitative data is collected by all Department for Culture, Media, and Sport (DCMS)-sponsored museums and galleries in an attempt to reflect the quality and effectiveness of their programmes and the impact they have on society.7 This data provides a broad picture of performance with a focus on visitor figures, audience profiles, learning, outreach, visitor satisfaction, and income generation.
7Although the frequency of evaluation is rising, it is still questionable whether such evaluation contributes meaningfully to long-term institutional impact assessment, particularly in relation to digital resources. There is a need to address the ‘use’, ‘value’, and ‘impact’ of digital resources in the context of an expanding mass of cultural heritage digital content, which is believed to have tremendous potential for public engagement.
8Current evaluation models, which are mainly project-driven, lack the consistency and longevity to create meaningful performance indicators and benchmarks. Many impact studies of museum and cultural activities overstate measurable economic values while ignoring the intangible impacts and values that such activities generate. Hasan Bakhshi and David Throsby, writing in 2010, believe that ‘[f]resh thinking is needed on how to articulate and, where possible, measure, the full range of benefits that arise from the work of arts and cultural organisations’.8 However, this will be difficult; cultural impacts are often intangible, are more complex than the purely economic and numerical, and are hard to explain and prove.9 Visitor experience and engagement cannot be measured by instrumental values alone. As more collections are made available via digital technologies, the number of beneficiaries will increase, and the ability of the sector to track and trace the benefits and end uses of visitor engagement with collections will become increasingly challenging.
9The rise of ‘impact’ as an important concept in academic research, and in the use of digital resources created in academia, is more recent. The LAIRAH (Log Analysis of Internet Resources in the Arts and Humanities) study found that very few creators of digital resources knew how their resources were being used, and that most had no contact with their user base.10 Even funding bodies lacked knowledge about this; as Simon Tanner points out, LAIRAH was one of the first studies commissioned by the Arts and Humanities Research Council (AHRC) into the use of its resources.11 However, in the twelve years since this study, changes have been made. Jisc became aware that investment in digital resources might be more strategically targeted, and so mandated user consultation and involvement in its second-phase digitisation projects and commissioned a study, which resulted in the TIDSR (Toolkit for the Impact of Digitised Scholarly Resources).12 It proposed a number of different methods for evaluating the use of a digital resource.13 This was a welcome development, but, at the time, the idea of digital impact was associated only with use, findability, and dissemination: the toolkit involves such methods as web metrics, log analysis, surveys, focus groups, and interviews.
10There is a strong underlying assumption, therefore, that use equals impact. The TIDSR team stresses that this is the reason for including qualitative techniques such as focus groups, because metrics may tell us how many people have landed on a certain page, or how many links are made to it; but they cannot tell us what the user thinks about what they have found, what they like and dislike, what they wanted or did not want, or, crucially, if they found what they were looking for. The toolkit was designed not only to provide evidence of use for the funders and institutions themselves, but also to help designers improve the resources; its utility has been proven in published studies such as those by Lorna M. Hughes et al.14
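To make the quantitative side of such toolkits concrete, the sketch below shows the kind of log analysis on which web-metrics methods rely: counting requests per page in a combined-format server access log. This is a generic illustration in Python rather than part of TIDSR itself, and the log file name is a placeholder; as noted above, such counts reveal how often a page was reached, but nothing about what users made of it.

```python
import re
from collections import Counter

# Generic illustration of log analysis for web metrics (not TIDSR tooling).
# Matches the request portion of a combined-format access log line,
# e.g. "GET /collection/item/42 HTTP/1.1".
REQUEST = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+"')

def count_page_requests(log_path: str) -> Counter:
    """Return a Counter mapping URL paths to request counts."""
    hits = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = REQUEST.search(line)
            if match:
                hits[match.group("path")] += 1
    return hits

if __name__ == "__main__":
    # "access.log" is a placeholder path for a digital resource's server log.
    for path, count in count_page_requests("access.log").most_common(10):
        print(f"{count:8d}  {path}")
```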
11However, a major change in the idea of impact measurement occurred after TIDSR was produced: the UK’s Research Excellence Framework (REF) adopted the idea of impact. The primary purpose of REF 2014 was to assess the quality of research in the UK’s Higher Education Institutions (HEIs). A significant difference between the RAE (Research Assessment Exercise), last carried out in 2008, and REF 2014 was the inclusion of the assessment of impact.15 This was defined as ‘any effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment, or quality of life, beyond academia’.16 Under the terms of the REF, the conflation of impact with use and the simple dissemination of results was no longer acceptable. Academics now had to prove that their work had produced a change in behaviour of, or benefit to, a user community, and assessors were mandated to evaluate the reach and significance of such changes on a four-star scale. However, such effects are not straightforward to measure.
12Impact evaluation is a complex issue, which is not helped by the fact that definitions are still being determined and understood by the sector. While there is an abundance of anecdotal evidence and descriptions of best practice, extensive evidence of impact, gathered systematically, is often lacking. The concept of impact is problematic because it is often entwined with several other key issues inherent in digital resources: discoverability, access, usage, and sustainability.17 Considering the nature of these interwoven issues, is it possible to identify and measure impact in humanities research, particularly focusing on digital resources?
13Sara Selwood suggests there are various ways of ascertaining, if not assessing, overall impact other than by economic value.18 These include: direct consultation to assess public value; self-evaluations, and peer and user reviews; and stakeholder analysis.19 Indeed, an increasing body of work is being developed around such approaches; but, to date, this has largely relied on peer and specialist review, which draws on small, professional networks rather than end-users.
14Tanner has produced a complex model of impact assessment for GLAM institutions, which also defines impact as going beyond use to include benefit and change.20 It takes into account multiple factors, such as the ecosystem of a digital resource, the value drivers, and the key criteria indicators, all applied through five core functional stages: 1) context, 2) analysis and design, 3) implementation, 4) outcomes and results, and 5) review and respond. It is evident that undertaking such an analysis would be a complex, time-consuming, and costly exercise.
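As a rough sketch of the elements just described, and as our own illustration rather than Tanner’s formal model, the structure below records a resource’s ecosystem, value drivers, and key criteria indicators, and tracks progress through the five functional stages; the field and class names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# The five core functional stages summarised above.
STAGES = [
    "context",
    "analysis and design",
    "implementation",
    "outcomes and results",
    "review and respond",
]

@dataclass
class ImpactAssessment:
    """Illustrative record of a Balanced Value Impact-style assessment."""
    resource: str                                             # digital resource under review
    ecosystem: List[str] = field(default_factory=list)        # stakeholders, platforms, partners
    value_drivers: List[str] = field(default_factory=list)
    key_criteria_indicators: List[str] = field(default_factory=list)
    completed_stages: List[str] = field(default_factory=list)

    def next_stage(self) -> Optional[str]:
        """Return the earliest of the five stages not yet completed."""
        for stage in STAGES:
            if stage not in self.completed_stages:
                return stage
        return None
```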
15Nevertheless, we still lack adequate means to assess impact in humanities research due to a dearth of significant evidence beyond the anecdotal.21 Despite the mass of existing evidence, ‘attempts to interpret such evidence often tends (sic) to rely on assumptions about the nature of digital resources, without fully appreciating the actual way in which end users interact with digital content’.22
16It is tempting to draw a distinction, as Nancy Maron et al. do, between digital resources that are created by academics as part of their research, and the digitisation of collections and resources by GLAM institutions.23 We might argue that the process of digitising a collection of papers, images, or museum objects for use by a memory institution differs from an academic, or group of academics, creating a digital resource as part of their research. The former might be regarded as a service provided for the visiting public by the institution. It may be at least partially funded by the institution, and thus amenable to a more centralised, controlled process, and it is likely to be attached to an existing catalogue, or similar finding aid. An academic resource, by contrast, may be a piece of ‘private enterprise’ resulting from the individual’s research interests. It is likely to be externally funded for a limited period, and may be somewhat idiosyncratic in design (the more so the older the resource is). In a large university, there may be numerous different homes for such projects: departments, computing centres, libraries, research units, digital humanities (DH) centres, or a combination of the above. In this way, the digital landscape may look, at least outwardly, more chaotic.
17But this would be to oversimplify things. Many of the most celebrated digital research projects created by academics have resulted in very comprehensive digital resources, often known as archives (the Rossetti Archive,24 the Blake Archive,25 the Whitman Archive,26 to name only a few), or in databases with huge, diverse user communities, such as the Old Bailey Online.27 Yet, they are the product of very complex and intellectually rigorous research, which could have, and in some cases has, resulted in the production of more traditional scholarly outputs such as articles and monographs.28 It would also be a serious under-estimation to imply, in an age of highly skilled ‘alt-ac’ (alternative-academic) DH professionals working in museums, libraries, and archives, that resources created by GLAM institutions are simply about service and not the outcome of research. Tanner’s model is designed for the GLAM sector, but draws explicitly on the definition of impact created for an academically driven exercise — the REF — and the process and model that he describes could easily be applied to an academically generated resource.
18Digital resources may also have academic impact, when a resource influences the work of other academics. In the case of analogue resources, citations are commonly used as evidence of this; however, as Hughes et al. show, this is problematic in the case of digital resources, which are often not cited correctly.29 Even in the case of conventional publications there are still significant problems in the use of metrics to judge academic impact and value: academics may cite papers as a straw man argument or as an example of bad practice, and may cite in very different ways according to discipline — especially in the arts and humanities.30 The gender of the author has also been shown to affect citation practices.31 Thus, the most recent report on the subject, The Metric Tide, concludes that metrics are not subtle enough to judge the quality of any kind of academic output, whether conventional or digital.32
19REF impact was assessed according to its reach and significance, and awarded star ratings from unclassified (little or no evidence of reach or significance) to four-star (outstanding).33 Case studies also had to provide evidence for a link between this impact and the underpinning research, which had to be at least a two-star (internationally recognised) research output.34 The case studies are now available in a database that, despite the caveats discussed above, provides useful evidence for the impact of UK research, whether digital or analogue. In the following section, we present a qualitative analysis of the impact of digital humanities as evidenced by the case study database. A previous quantitative, text-mining-based study of all the REF case studies provides excellent evidence for the diversity of impacts claimed for research carried out in the UK’s universities.35 However, the report itself makes clear that this kind of method has limitations. Using text-mining methods, we can track the kinds of impact discussed: the words used, and the connections between themes and subject areas. This in itself is fascinating, but it provides only partial information. For example, case study authors claimed impact, but the database does not indicate whether this claim was accepted by the panels as being wholly or partially evidenced, nor do we know how effective it was judged to be. Marks are released as a statistical profile across a unit, so we cannot link an individual case study to a star rating, unless all the case studies in that unit, from that university, were marked the same (which is relatively unusual). Nor do we know why the panel made the judgements they made, or how they marked reach and significance.
20We therefore did not use text-mining methods, since this chapter is concerned primarily with exploring the types and quality of impact produced by DH, and the arguments that may be made for it. Instead, we sampled those case studies that were likely to be most relevant to DH methods by means of a phrase search for ‘digital humanities’; this returned forty-one hits. We also searched for ‘digital scholar’ (zero results), ‘digital history’ (two results), ‘digital classics’ (two results), and ‘digital edition’ (eleven results). In a few cases the same project was indexed under two or more terms. Thus, the searches resulted in an initial set of forty-seven case studies. We then read each case study and identified the kinds of impact it presented, and whether there was evidence for them. After this initial reading, it became apparent that, in some cases, the digital resource was either a very minor element of the whole project, or its impact was not claimed or not evidenced. This left us with a set of forty-two studies.
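A minimal sketch of the deduplication step in this sampling procedure is given below; the case-study identifiers are placeholders rather than the real REF records, and the exclusion step stands in for the close reading described above.

```python
# Placeholder identifiers only; the real sample was drawn from the REF
# impact case study database by phrase search, as described above.
hits_by_term = {
    "digital humanities": {"cs-01", "cs-02", "cs-03"},
    "digital scholar": set(),            # the search reported zero hits
    "digital history": {"cs-02", "cs-04"},
    "digital classics": {"cs-05"},
    "digital edition": {"cs-03", "cs-06"},
}

# Union the per-term results so a project indexed under several terms
# is counted only once.
initial_sample = set().union(*hits_by_term.values())

# Drop studies where the digital resource was marginal, or where its
# impact was neither claimed nor evidenced (placeholder exclusions).
excluded_after_reading = {"cs-04"}
final_sample = initial_sample - excluded_after_reading

print(len(initial_sample), "case studies before reading,",
      len(final_sample), "retained")
```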
21Both the panels’ evaluation and our reading of these case studies relied upon qualitative judgement because, while text-mining and statistical methods can show that the word ‘museum’ is present in a certain number of cases, we cannot tell how profound an effect, if any, the claimed impact might have had on that museum or its visitors. Thus, we present our findings in qualitative terms, because we cannot know what judgements the panels themselves made, nor can we be sure that another reader looking at the same case studies would agree with every judgement we make.
22All the case studies provided evidence for the use of their resource, in some cases on a very impressive scale. For example, the Diogenes software,36 used to analyse classical texts, recorded 91,011 downloads, while the Old Bailey Online project37 has had five million visits from 213 countries since 2003. In some instances, use and dissemination were confused with evidence of impact — a widespread issue in humanities subjects.38 Numerous downloads of digital resources do not, of course, prove that users benefitted. However, all but four of the digital case studies did offer evidence of wide-ranging, genuine impact. Compared to the situation on which LAIRAH reported in 2005, where very few resource creators had any evidence of whether and how their resource was being used, this has been a huge step forward. It is also significant given that even in 2013 Maron et al. found that few resource creators had any contact with users, or collected data about use.39
23In some ways, this apparent contradiction is explicable. Entering a research project of any kind, whether digital or not, as an REF impact case study was a highly selective process. The resources universities chose as case studies are likely to have been successful, and managed by dedicated PIs (principal investigators) and research teams who were likely to keep usage statistics. Once identified, case study projects had to collect further evidence of impact proactively. Nevertheless, the inclusion of the impact measurement in the REF appears to have produced an incentive for academics to keep information about how their research is used. Happily for DH, evidence of this is often easier to collect for digital resources than for analogue resources. In this sense at least, the impact measurement is, as Tanner argues, good for DH.40
Commercial Impact
24We found several cases of commercial impact: DH’s history of research in linguistic analysis resulted in the adoption of tools, algorithms, and resources outside academia. The GATE system,41 developed by the University of Sheffield, has had a profound effect on commercial practices in natural language processing, as has the SCOTS corpus42 from Glasgow on lexicography and the preparation of commercial English-language teaching materials. Software functionality for morphological analysis from the Diogenes system of Durham University was also used as part of a commercial publishing product: the Thesaurus Linguae Graecae.43 Two spin-off companies were formed, both of which focused on digital imaging: Oxford Multi Spectral Ltd (University of Oxford) and Scriptura Ltd (http://scriptura.co.uk) (University of Sheffield). This is a relatively common practice in the sciences, but highly unusual for the humanities.
25Several projects gave rise to collaborations with the software industry, such as the University of Leeds’ work on the Cologne Edition of Heinrich Böll, whose technical collaboration with software engineers at Pagina Ltd resulted in new software and platforms for large-scale critical editions. Perhaps the most unusual commercial relationship was the University of Westminster’s collaboration with LEGO via digital community interaction and creativity.44
Media and Performance
26Although broadcast media was most commonly used as a dissemination tool, we found several cases where digital projects had collaborated with the media to produce a genuine impact. Westminster researchers worked with the BBC and S4C to develop a virtual world for children called Adventure Rock. This research helped both companies reconsider their presentation of interactive experiences for children.45 The complex nature of storytelling used by the Re-imagining the Literary Essay for the Digital Age (RILEDA) project at Brunel University (which created the multi-media digital literary essay Kafka’s Wound)46 changed the archiving practices of a media organisation (the London Review of Books)47 and even gave rise to new forms of public performance, both live and recorded.
27Work at the Centre for Robert Burns Studies,48 including a digital edition,49 has made numerous contributions to the Scottish cultural scene. In 2009, the project commissioned a new musical composition by the Scottish composer James MacMillan, which was performed live to mark the 250th anniversary of Burns’ birth. The project also co-organised a successful world record attempt to perform Burns’ Auld Lang Syne simultaneously in forty-one languages, which was recorded on YouTube.50
28The University of Sussex and University of Cambridge’s Newton Project,51 which provides an open access, online scholarly edition of Sir Isaac Newton’s complete writings, inspired the play Let Newton Be!, as well as television and radio programmes including BBC Radio 4’s In Our Time, the BBC 4 series The Beauty of Diagrams, and BBC 2’s Isaac Newton: The Last Magician.52 The University of Sheffield’s Old Bailey Online,53 a database of the records of criminal cases at the Old Bailey between 1674 and 1913, provided material for the BBC series Tales from the Old Bailey54 and Garrow’s Law.55
29Knowledge generated by digital projects and the use of digital linguistic analysis also benefitted theatre companies. The University of Strathclyde’s project on digital linguistic analysis as a rehearsal tool was used at Shakespeare’s Globe Theatre; the University of Birmingham’s Debating Shakespeare in the Olympic Year research worked with the Royal Shakespeare Company; and King’s College London’s research project Out of the Wings: The Research and Practice of Spanish American Theatre in Translation benefited multiple theatres, including the RSC, Silver Lining Theatre Company, CASA Festival, and the Royal Academy of Dramatic Art (RADA).
Cultural Heritage
30Several projects fostered public engagement with cultural resources or the GLAM sector. We have written at greater length elsewhere about our work on the QRator and Social Interpretation (SI) projects, which used digital resources to facilitate engagement with museums and were both the subject of case studies.56 These projects were designed from the outset to capture impact and to evaluate the nature of benefit and change in visitor behaviour as part of the research itself, not for the sake of the REF. This meant that we could provide evidence of impact in a way that few other digital projects were able to do.
31Other innovative methods of engaging the public with cultural resources using digital methods were discussed in the case studies of crowd-sourced transcription projects. These included the ground-breaking Transcribe Bentham project,57 and two projects from Oxford: the Oxyrhynchus Online,58 and Ancient Lives,59 which together made the Oxyrhynchus papyri available to the public using a web interface and crowdsourcing techniques. The capacity to collect detailed evidence of impact was built into the original research design of Transcribe Bentham, and its findings have been published in greater detail than the case study word limit would allow.60
32The London French project, from the University of Westminster, resulted in the creation of a community digital archive in collaboration with the British Library. This benefitted the French community, as well as information professionals, through the sharing of experiences and the dissemination of knowledge, and through the connections made between contemporary and historical lives. As a result of King’s College London’s Strandlines,61 members of the local community were able to interact in a digital public space with local artists, cultural practitioners, and creative industries to explore the meaning of place, discover the histories of their community, and exchange experiences. Research on a digital edition of the medieval Vernon Manuscript (Bodleian Library MS. Eng. poet. a. 1), written in the West Midlands’ dialect, led to several public events in collaboration with some of Birmingham’s libraries and museums.62 This enhanced the understanding of the history and culture of the West Midlands and its contemporary dialect.
33Several projects also benefited school-aged children and their teachers. Digital resources created by the University of Reading’s Ure Museum of Greek Archaeology were used by school children at an animation workshop. The Ulster Poetry Project63 developed an online library that has assisted in the development of teaching and learning materials about Ulster-Scots literature. Research on the eighteenth-century novelist Laurence Sterne at Northumbria University created a digital learning package for teachers to use when primary school children visit local heritage properties.64 The Candide app, from the University of Oxford, is being used by secondary-aged students of Voltaire in French schools.65
34As we have discussed above, projects such as these demonstrate that the impact of digital resources cannot always be categorised as academic-, community-, or GLAM-based. Indeed, such collaboration is vital to the success of many digital projects. We found numerous references to collaboration with the GLAM sector, including museums, galleries and libraries, and heritage sites, such as Norwich Cathedral, whose glass collection was made available digitally by the University of East Anglia’s Norfolk Medieval Stained Glass Project.66
Policy Impact
35Perhaps more surprisingly, DH has also had an impact on public policy. The Clergy of the Church of England database (CCEd) 1540–1835 (University of Kent) resulted in changes in the ministry and practice of the Church of England;67 while analysis of the language of 1641 depositions (a project by the University of Aberdeen) was used to facilitate public debate and political policy discussions about modern sectarianism in Northern Ireland.68 The Freshwater Information Management project at King’s College London has been used in environmental policy making, as well as to provide information to farmers and the public about water quality.69 Material from Google Ancient Places (GAP) — an Open University project using GIS (geographic information systems) technology to map the ancient world — was used as part of the HathiTrust legal case in the USA, during which the right to fair access to digital educational materials was established.70
Limitations of the REF Case Studies
36The REF case studies provide compelling evidence that DH has an impact beyond the predictable areas of the information professions and cultural heritage. However, there are limitations to the use of such material. The most obvious of these is that although REF criteria specify that impact should be judged on geographic reach, the exercise is not intended to benchmark impact in an international context. Although REF panels included members from the user community, digital resources created purely by the GLAM sector or by commercial organisations, without the input of academics, were excluded from the exercise. Information from the REF can be used to extrapolate the impact that resources created outside the UK higher education sector might have, but there is no evidence base to test this in any meaningful way.
37It is also important to remember that the REF case studies were selected by universities and not randomly sampled. There was also no requirement to enter case studies that included digital tools or resources. This means that the case studies represent the strongest examples of the genre that could be found in any given university: cases where the impact of digital projects was difficult to prove were therefore not entered or evaluated. Thus, it is hardly surprising that the impact of such successful and high-profile projects was significant. We cannot, however, extrapolate from this that all digital resources must therefore have an impact: it is possible that most of them do not, or that it is only the most outstanding that do. We cannot ascertain what the ratio of outstanding, impactful projects to the average digital resource might be. The only way to test this would be to select digital resources at random from a list of funded projects, or from those archived in a repository, and then judge their impact accordingly.
38This also leads to another limitation. The case studies were constructed and written by the universities themselves, which were responsible for collecting evidence of change or benefit, and for writing the narrative of the case study. However, such a procedure is naturally open to bias. Universities wanted to present their work in the best possible light and therefore selected evidence accordingly, perhaps disregarding indicators that were not as positive. A more objective method, whereby impact was judged by independent researchers against an agreed set of criteria, might reach different conclusions. However, doing this would be expensive and time-intensive, and there is no evidence of any demand from funders, government, or academics themselves to carry out such an exercise.
39Finally, while we are able to show that digital resources have an impact, so, it seems, does most research. In REF 2014, eighty-four percent of impact submissions were judged to be four-star or three-star (eighty-one percent in Panel D, which covered the arts and humanities, including digital resources). Thus, simply achieving impact for any research cannot be seen as exceptional, or even especially impressive.
Conclusions
40The REF results demonstrate that DH can have an impact on numerous sectors, with some resources benefitting multiple sectors at once. The case studies provide evidence of impact on cultural heritage, theatrical performance, the media, industry, schools, religious organisations, community groups, public policy, and the interested public. This is very welcome indeed. While such results are helpful in terms of advocacy for digital humanities, they are, nonetheless, of limited use to the creators of such resources themselves, if compared, for example, to Tanner’s model. REF panels provided brief, general summaries of each unit of assessment, which sometimes contained comments on especially impressive impact cases. However, no detailed feedback was given; it is therefore difficult for resource creators to know what was judged to be especially effective, or what might be improved. Tanner’s model would probably have provided a more rigorous evaluation of the characteristics of such projects, but the time and funding required to undertake such a procedure may mean that, in an environment where resources are scarce, such protocols are relatively rarely used.
41This recognition of the broad impact of DH is very heartening. The REF may be a positive force in bringing complex questions about the sustainability of digital resources to the fore. REF regulations allow the possibility of research having an impact up to twenty years after publication; and feedback on the 2014 exercise suggests this may still be too short a period, even in science and medicine. If we want digital resources to be able to have an impact for future REFs or other such exercises, they will need to remain accessible and functional for at least that long. This is a significant challenge, given that, at present, the UK’s Arts and Humanities Research Council only requires resource creators to ensure the availability of a resource for three years after the end of its funding period. It means that universities will need to think about how to plan for and fund the life of a digital resource for longer periods after the funding has ended. This entails not only making it available, but also keeping it updated, so that users feel confident in using it. If the functionality degrades, or the interface seems uninviting, and, as a result, use decreases, then evidence for longer-term impact will be harder to collect. This becomes even more complex in cases where a digital resource is a collaboration with, or even hosted by, a cultural heritage organisation, over whose sustainability policies universities do not have any control. But, of course, this only applies to resources hosted in the UK; there are no such levers elsewhere.
42In this environment DH must, therefore, argue strongly for the impact of what it does, so that its resources continue to exist and to have such an impact in future. As Nancy Maron and Sarah Pickle argue, DH is in an ideal position to demonstrate its impact.71 DH resources are attractive and accessible to the public in a way that a scientific dataset simply cannot be. Not only have we built our resources so that they can be shared, but we can demonstrate that the public has been sharing them, and indeed contributing to the content and intellectual endeavour of some digital projects. Impact is, as we have shown, not easy to capture or measure, but the experience of the REF suggests that we can offer evidence for the benefit and change brought about by DH resources in many different sectors.
Bibliography
1641 Depositions Project, http://www.abdn.ac.uk/1641-depositions/
Ancient Lives, www.ancientlives.org
Bailey-Ross, Claire et al., ‘Engaging the Museum Space: Mobilizing Visitor Engagement with Digital Content Creation’, Digital Scholarship in the Humanities, 32.4 (2016), 689–708, https://doi.org/10.1093/llc/fqw041
Bakhshi, Hasan, and David Throsby, Culture of Innovation. An Economic Analysis of Innovation in Arts and Cultural Organizations (London: Nesta, 2010).
The Beauty of Diagrams, BBC 4, November–December 2010, http://www.bbc.co.uk/programmes/b00w5675
Blake Archive, www.blakearchive.org
Carnall, Mark, Jack Ashby, and Claire Ross, ‘Natural History Museums as Provocateurs for Dialogue and Debate’, Museum Management and Curatorship, 28.1 (2013), 37–41, https://doi.org/10.1080/09647775.2012.754630
Causer, Tim, and Valerie Wallace, ‘Building a Volunteer Community: Results and Findings from Transcribe Bentham’, Digital Humanities Quarterly, 6.2 (2012), http://www.digitalhumanities.org/dhq/vol/6/2/000125/000125.html
Centre for Robert Burns Studies, http://www.gla.ac.uk/schools/critical/research/researchcentresandnetworks/robertburnsstudies
Clergy of the Church of England Database, http://theclergydatabase.org.uk/
Corpus Vitrearum Medii Aevi (Medieval Stained Glass in Great Britain), www.cvma.ac.uk
Culture Metrics: A Shared Approach to Measuring Quality, http://www.culturemetricsresearch.com/
Daily Record, ‘Auld Lang Syne Record Set’, Youtube, 1 December 2009, https://www.youtube.com/watch?v=9mb9ZwB_-xY&noredirect=1
Department for Culture, Media, and Sport, Statistical Data Set: Museums and Galleries Monthly Visits (London: DCMS, 2017).
Diogenes, https://community.dur.ac.uk/p.j.heslin/Software/Diogenes
Editing Robert Burns for the 21st Century: An AHRC-Funded Project to Produce a Multi-Volume Edition of the Works of Robert Burns, http://burnsc21.glasgow.ac.uk/
Garrow’s Law, BBC 1, November 2009–February 2012, http://www.bbc.co.uk/programmes/b00w5c2w
Gate: General Architecture for Text Engineering, https://gate.ac.uk
Gauntlett, David, Cultures of Creativity: Nurturing Creative Mindsets Across Cultures, ed. by Bo Stjerne Thomsen (Billund: LEGO Foundation, 2013).
— ‘Enabling and Constraining Creativity and Collaboration: Some Reflections after Adventure Rock’, in Content Cultures: Transformations of User Generated Content in Public Service Broadcasting, ed. by Helen Thornham and Simon Popple (London: I. B. Tauris, 2013), pp. 161–80, https://doi.org/10.5040/9780755694426.ch-009
Gauntlett, David et al., Defining Systematic Creativity in the Digital Realm (Billund: LEGO Foundation, 2010).
HathiTrust Opinion, 2012, 11 CV 6351, http://www.scribd.com/doc/109647049/HathiTrust-Opinion
Hedges, Mark, Mike Haft, and Gareth Knight, ‘FISHNet: Encouraging Data Sharing and Reuse in the Freshwater Science Community’, Journal of Digital Information, 13.1 (2012).
Hellqvist, Björn, ‘Referencing in the Humanities and its Implications for Citation Analysis’, Journal of the American Society for Information Science and Technology, 61.2 (2010), 310–18, https://doi.org/10.1002/asi.21256
Higher Education Funding Council of England (HEFCE), The Nature, Scale, and Beneficiaries of Research Impact: An Initial Analysis of Research Excellence Framework (REF) 2014 Impact Case Studies (London: King’s College London, 2015).
Holden, J., and J. Baltà, The Public Value of Culture: A Literature Review (EENC Paper, Brussels, 2012).
Hooper-Greenhill, Eilean, ‘Measuring Learning Outcomes in Museums, Archives and Libraries: The Learning Impact Research Project (LIRP)’, International Journal of Heritage Studies, 10.2 (2004), 151–74, https://doi.org/10.1080/13527250410001692877
Hughes, Lorna M., et al., ‘Assessing and Measuring Impact of a Digital Collection in the Humanities: An Analysis of the SPHERE (Stormont Parliamentary Hansards: Embedded in Research and Education) Project’, Digital Scholarship in the Humanities, 30.2 (2015), 183–98, https://doi.org/10.1093/llc/fqt054
Jones, Molly Morgan and Jonathan Grant, ‘Making the Grade: Methodologies for Assessing and Evidencing Research Impact’, in 7 Essays on Impact. DESCRIBE Project Report for Jisc, ed. by David Cope et al. (Exeter: University of Exeter, 2013), pp. 25–43.
Keaney, Emily, ‘Public Value and the Arts: Literature Review’, Strategy (2006), 1–49.
Kinghorn, Naomi and Ken Willis, ‘Measuring Museum Visitor Preferences Towards Opportunities for Developing Social Capital: An Application of a Choice Experiment to the Discovery Museum’, International Journal of Heritage Studies, 14.6 (2008), 555–72, https://doi.org/10.1080/13527250802503290
‘The Laws of Motion’, In Our Time, BBC Radio 4, 3 April 2008, http://www.bbc.co.uk/programmes/b009mvj0
Lecture numérique: application ‘Candide, edition enrichie’, http://www.ac-grenoble.fr/mission-tice/Delegation_academique_au_numerique/Lecture_numerique_%3A_%22Candide%22.html
‘Learning Pack’, Dear Sterne, http://dearsterne.blogspot.co.uk/p/learning-pack.html
London Review of Books, www.lrb.co.uk
Maliniak, Daniel, Ryan Powers, and Barbara F. Walter, ‘The Gender Citation Gap in International Relations’, International Organization, 67.4 (2013), 889–922, https://doi.org/10.1017/s0020818313000209
Marcella, Rita, Hayley Lockerbie, and Lyndsay Bloice, ‘Beyond REF 2014: The Impact of Impact Assessment on the Future of Information Research’, Journal of Information Science, 42.3 (2016), 369–85, https://doi.org/10.1177/0165551516636291
Marcella, Rita, ‘The Effects of the Research Excellence Framework Research Impact Agenda on Early- and Mid-Career Researchers in Library and Information Science’, Journal of Information Science, 44.5 (2018), 608–18, https://doi.org/10.1177/0165551517724685
Marchionni, Paola, ‘Why Are Users So Useful? User Engagement and the Experience of the JISC Digitisation Programme’, Ariadne (30 October 2009), http://www.ariadne.ac.uk/issue/61/marchionni/
Maron, Nancy L., Jason Yun, and Sarah Pickle, ‘Sustaining our Digital Future: Institutional Strategies for Digital Content’, Strategic Content Alliance, Ithaka Case Studies in Sustainability (2013), https://sca.jiscinvolve.org/wp/files/2013/01/Sustaining-our-digital-future-FINAL-31.pdf
Maron, Nancy L. and Sarah Pickle, Sustaining the Digital Humanities Host Institution Support beyond the Start-Up Phase (New York: Ithaka S+R, 2014), https://doi.org/10.18665/sr.22548
Matarasso, François, Use or Ornament? The Social Impact of Participation in the Arts (Stroud: Comedia, 1997).
Myerscough, John, The Economic Importance of the Arts in Britain (London: Policy Studies Institute, 1988).
Newton Project, www.newtonproject.sussex.ac.uk
Old Bailey Online, www.oldbaileyonline.org
‘Oxyrhynchus Online’, Papyrology at Oxford, www.papyrology.ox.ac.uk/POxy
REF, ‘Assessment Criteria and Level Definitions’, https://www.ref.ac.uk/2014/panels/assessmentcriteriaandleveldefinitions/
— Assessment Framework and Guidance on Submissions (Bristol: REF UK, 2011), http://www.ref.ac.uk/2014/media/ref/content/pub/assessmentframeworkandguidanceonsubmissions/GOS%20including%20addendum.pdf
— Research Excellence Framework 2014: Overview Report by Main Panel D and Sub-Panels 27 to 36 (London: REF UK, 2015).
Ross, Claire, Melissa Terras, and Carolyn Royston, ‘Visitors, Digital Innovation and a Squander Bug: Reflections on Digital R & D for Audience Engagement and Institutional Impact’, in Museums and the Web 2013, ed. by N. Proctor and R. Cherry (Silver Spring, MD: Museums and the Web, 2013).
Rossetti Archive, www.rossettiarchive.org
Self, Will, Kafka’s Wound, A Digital Essay, https://thespace.lrb.co.uk/
Selwood, Sara, ‘Making a Difference: The Cultural Impact of Museums. An Essay for NMDC’ (2010), https://www.nationalmuseums.org.uk/media/documents/publications/cultural_impact_final.pdf
— ‘What Difference Do Museums Make? Producing Evidence on the Impact of Museums’, Critical Quarterly, 44.4 (2002), 65–81, https://doi.org/10.1111/1467-8705.00457
Scottish Corpus of Texts & Speech, https://scottishcorpus.ac.uk
Showers, Ben, ‘A Strategic Approach to the Understanding and Evaluation of Impact’, in Evaluating and Measuring the Value, Use and Impact of Digital Collections, ed. by Lorna M. Hughes (London: Facet, 2012), pp. 63–72, https://doi.org/10.29085/9781856049085.006
Smithies, James, et al., ‘Managing 100 Digital Humanities Projects: Digital Scholarship & Archiving in King’s Digital Lab’, Digital Humanities Quarterly, 13.1 (2019), http://www.digitalhumanities.org/dhq/vol/13/1/000411/000411.html
Strandlines, https://www.strandlines.london/
Tales from the Old Bailey, BBC 2, March–May 2013, http://www.bbc.co.uk/programmes/b01rdp8t
Tanner, Simon and Marilyn Deegan, Inspiring Research, Inspiring Scholarship. The Value and Benefits of Digitised Resources for Learning, Teaching, Research and Enjoyment (London: JISC, 2011).
Tanner, Simon, Measuring the Impact of Digital Resources: The Balanced Value Impact Model (London: King’s College London, 2012).
— ‘The Value and Impact of Digitized Resources for Learning, Teaching, Research and Enjoyment’, in Evaluating and Measuring the Value, Use and Impact of Digital Collections, ed. by L. M. Hughes (London: Facet, 2012), pp. 103–20, https://doi.org/10.29085/9781856049085.009
— ‘3 Reasons Why REF2014 Was Good for Digital Humanities Scholars’, When the Data Hits the Fan! (2 February 2015), http://simon-tanner.blogspot.co.uk/2015/02/3-reasons-ref2014-was-good-for-digital.html
Thesaurus Linguae Graecae, http://stephanus.tlg.uci.edu/
‘TIDSR: Toolkit for the Impact of Digitised Scholarly Resources’, Oxford Internet Institute, https://www.oii.ox.ac.uk/research/projects/tidsr/
Transcribe Bentham, www.transcribe-bentham.da.ulcc.ac.uk
Travers, Tony, Museums and Galleries in Britain: Economic, Social and Creative Impacts (London: London School of Economics & Political Science, 2006).
Ulster Poetry Project, arts.ulster.ac.uk/ulsterpoetry
Vernon Manuscript Project, www.birmingham.ac.uk/vernonmanuscript
Warwick, Claire, ‘Archive 360: The Walt Whitman Archive’, Archive Journal, 1.1 (2011).
Warwick, Claire, et al., ‘If You Build It Will They Come? The LAIRAH Study: Quantifying the Use of Online Resources in the Arts and Humanities through Statistical Analysis of User Log Data’, Literary and Linguistic Computing, 23.1 (2008), 85–102, https://doi.org/10.1093/llc/fqm045
Wavell, Caroline, et al., Impact Evaluation of Museums, Archives and Libraries: Available Evidence Project (Aberdeen: Robert Gordon University, 2002).
West, Jevin D., et al., ‘The Role of Gender in Scholarly Authorship’, ed. by Lilach Hadany, PLOS ONE, 8.7 (2013), e66212, https://doi.org/10.1371/journal.pone.0066212
Whitman Archive, www.whitmanarchive.org
Wilkinson, Claire, ‘Evidencing Impact: A Case Study of UK Academic Perspectives on Evidencing Research Impact’, Studies in Higher Education, 44.1 (2019), 72–85, https://doi.org/10.1080/03075079.2017.1339028
Wilsdon, James, et al., The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management (London: HEFCE, 2015), https://doi.org/10.13140/RG.2.1.4929.1363
Notes
1 Simon Tanner, Measuring the Impact of Digital Resources: The Balanced Value Impact Model (London: King’s College London, 2012).
2 See, for example, James Smithies et al., ‘Managing 100 Digital Humanities Projects: Digital Scholarship & Archiving in King’s Digital Lab’, Digital Humanities Quarterly, 13.1 (2019), http://www.digitalhumanities.org/dhq/vol/13/1/000411/000411.html
3 Sara Selwood, ‘What Difference Do Museums Make? Producing Evidence on the Impact of Museums’, Critical Quarterly, 44.4 (2002), 65–81, https://doi.org/10.1111/1467-8705.00457; Caroline Wavell et al., Impact Evaluation of Museums, Archives and Libraries: Available Evidence Project (Aberdeen: Robert Gordon University, 2002).
4 Simon Tanner, and Marilyn Deegan, Inspiring Research, Inspiring Scholarship. The Value and Benefits of Digitised Resources for Learning, Teaching, Research and Enjoyment (London: JISC, 2011).
5 John Myerscough, The Economic Importance of the Arts in Britain (London: Policy Studies Institute, 1988); Tony Travers, Museums and Galleries in Britain Economic, Social and Creative Impacts (London: London School of Economics & Political Science, 2006); François Matarasso, Use or Ornament? The Social Impact of Participation in the Arts (Stroud: Comedia, 1997); Naomi Kinghorn and Ken Willis, ‘Measuring Museum Visitor Preferences Towards Opportunities for Developing Social Capital: An Application of a Choice Experiment to the Discovery Museum’, International Journal of Heritage Studies, 14.6 (2008), 555–72, https://doi.org/10.1080/13527250802503290
6 Eilean Hooper-Greenhill, ‘Measuring Learning Outcomes in Museums, Archives and Libraries: The Learning Impact Research Project (LIRP)’, International Journal of Heritage Studies, 10.2 (2004), 151–74, https://doi.org/10.1080/13527250410001692877; Culture Metrics: A Shared Approach to Measuring Quality, http://www.culturemetricsresearch.com/
7 Department for Culture, Media, and Sport, Statistical Data Set: Museums and Galleries Monthly Visits (London, 2017). The Department for Culture, Media, and Sport (DCMS) sponsors sixteen national museums, which provide free entry to their permanent collections. These museums are the British Museum, Geffrye Museum, Horniman Museum, Imperial War Museum, National Gallery, National Maritime Museum, National Museums Liverpool, Science Museum Group, National Portrait Gallery, Natural History Museum, Royal Armouries, Sir John Soane’s Museum, Tate Galleries, Tyne and Wear Museums, Victoria and Albert Museum, and the Wallace Collection. Data collection methods vary between institutions, and each uses a method appropriate to its situation. All data is collected according to the DCMS performance indicator guidelines.
8 Hasan Bakhshi and David Throsby, Culture of Innovation. An Economic Analysis of Innovation in Arts and Cultural Organizations (Nesta, London, 2010), p. 58.
9 Wavell et al., Impact Evaluation.
10 Claire Warwick et al., ‘If You Build It Will They Come? The LAIRAH Study: Quantifying the Use of Online Resources in the Arts and Humanities through Statistical Analysis of User Log Data’, Literary and Linguistic Computing, 23.1 (2008), 85–102, https://doi.org/10.1093/llc/fqm045
11 Tanner, Measuring the Impact of Digital Resources.
12 ‘TIDSR: Toolkit for the Impact of Digitised Scholarly Resources’, Oxford Internet Institute, https://www.oii.ox.ac.uk/research/projects/tidsr/
13 Paola Marchionni, ‘Why Are Users So Useful? User Engagement and the Experience of the JISC Digitisation Programme’, Ariadne (30 October 2009), http://www.ariadne.ac.uk/issue/61/marchionni/
14 Lorna M. Hughes et al., ‘Assessing and Measuring Impact of a Digital Collection in the Humanities: An Analysis of the SPHERE (Stormont Parliamentary Hansards: Embedded in Research and Education) Project’, Digital Scholarship in the Humanities, 30.2 (2015), 183–98, https://doi.org/10.1093/llc/fqt054
15 Molly Morgan Jones and Jonathan Grant, ‘Making the Grade: Methodologies for Assessing and Evidencing Research Impact’, in 7 Essays on Impact. DESCRIBE Project Report for Jisc, ed. by David Cope et al. (Exeter: University of Exeter, 2013), pp. 25–43; Higher Education Funding Council of England (HEFCE), The Nature, Scale, and Beneficiaries of Research Impact: An Initial Analysis of Research Excellence Framework (REF) 2014 Impact Case Studies (London: King’s College London, 2015).
16 REF, Assessment Framework and Guidance on Submissions (Bristol: REF UK, 2011), http://www.ref.ac.uk/2014/media/ref/content/pub/assessmentframeworkandguidanceonsubmissions/GOS%20including%20addendum.pdf
17 Ben Showers, ‘A Strategic Approach to the Understanding and Evaluation of Impact’, in Evaluating and Measuring the Value, Use and Impact of Digital Collections, ed. by Lorna M. Hughes (London: Facet, 2012), pp. 63–72, https://doi.org/10.29085/9781856049085.006
18 Sara Selwood, ‘Making a Difference: The Cultural Impact of Museums. An Essay for NMDC’ (2010), https://www.nationalmuseums.org.uk/media/documents/publications/cultural_impact_final.pdf
19 Emily Keaney, ‘Public Value and the Arts: Literature Review’, Strategy (2006), 1–49 (p. 41); J. Holden and J. Baltà, The Public Value of Culture: A Literature Review (EENC Paper, Brussels, 2012).
20 Simon Tanner, ‘The Value and Impact of Digitized Resources for Learning, Teaching, Research and Enjoyment’, in Evaluating and Measuring the Value, Use and Impact of Digital Collections, ed. by Lorna M. Hughes (London: Facet, 2012), pp. 103–20, https://doi.org/10.29085/9781856049085.009
21 Ibid.
22 Tanner, Measuring the Impact, p. 23.
23 Nancy L. Maron, Jason Yun, and Sarah Pickle, ‘Sustaining our Digital Future: Institutional Strategies for Digital Content’, Strategic Content Alliance, Ithaka Case Studies in Sustainability (2013), https://sca.jiscinvolve.org/wp/files/2013/01/Sustaining-our-digital-future-FINAL-31.pdf
24 Rossetti Archive, www.rossettiarchive.org
25 Blake Archive, www.blakearchive.org
26 Whitman Archive, www.whitmanarchive.org
27 Old Bailey Online, www.oldbaileyonline.org
28 Claire Warwick, ‘Archive 360: The Walt Whitman Archive’, Archive Journal, 1.1 (2011).
29 Hughes et al., ‘Assessing and Measuring Impact’.
30 Björn Hellqvist, ‘Referencing in the Humanities and its Implications for Citation Analysis’, Journal of the American Society for Information Science and Technology, 61.2 (2010), 310–18, https://doi.org/10.1002/asi.21256
31 Daniel Maliniak, Ryan Powers, and Barbara F. Walter, ‘The Gender Citation Gap in International Relations’, International Organization, 67.4 (2013), 889–922, https://doi.org/10.1017/s0020818313000209; Jevin D. West et al., ‘The Role of Gender in Scholarly Authorship’, PLOS ONE, 8.7 (2013), e66212, https://doi.org/10.1371/journal.pone.0066212
32 James Wilsdon et al., The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management (London: HEFCE, 2015), https://doi.org/10.13140/RG.2.1.4929.1363
33 REF, ‘Assessment Criteria and Level Definitions’, https://www.ref.ac.uk/2014/panels/assessmentcriteriaandleveldefinitions/
34 For further details on REF and impact see: Rita Marcella, Hayley Lockerbie, and Lyndsay Bloice, ‘Beyond REF 2014: The Impact of Impact Assessment on the Future of Information Research’, Journal of Information Science, 42.3 (2016), 369–85, https://doi.org/10.1177/0165551516636291; Rita Marcella et al., ‘The Effects of the Research Excellence Framework Research Impact Agenda on Early- and Mid-Career Researchers in Library and Information Science’, Journal of Information Science, 44.5 (2018), 608–18, https://doi.org/10.1177/0165551517724685; Clare Wilkinson, ‘Evidencing Impact: A Case Study of UK Academic Perspectives on Evidencing Research Impact’, Studies in Higher Education, 44.1 (2019), 72–85, https://doi.org/10.1080/03075079.2017.1339028
35 HEFCE, Nature, Scale, and Beneficiaries.
36 Diogenes, https://community.dur.ac.uk/p.j.heslin/Software/Diogenes
37 Old Bailey Online, www.oldbaileyonline.org
38 REF, Research Excellence Framework 2014: Overview Report by Main Panel D and Sub-Panels 27 to 36 (London: REF UK, 2015).
39 Maron, Yun, and Pickle, ‘Sustaining our Digital Future’.
40 Simon Tanner, ‘3 Reasons Why REF2014 Was Good for Digital Humanities Scholars’, When the Data Hits the Fan! (2 February 2015), http://simon-tanner.blogspot.co.uk/2015/02/3-reasons-ref2014-was-good-for-digital.html
41 Gate: General Architecture for Text Engineering, https://gate.ac.uk
42 Scottish Corpus of Texts & Speech, https://scottishcorpus.ac.uk
43 Thesaurus Linguae Graecae, http://stephanus.tlg.uci.edu/
44 David Gauntlett, Cultures of Creativity: Nurturing Creative Mindsets Across Cultures, ed. by Bo Stjerne Thomsen (Billund: LEGO Foundation, 2013); David Gauntlett et al., Defining Systematic Creativity in the Digital Realm (Billund: LEGO Foundation, 2010).
45 David Gauntlett, ‘Enabling and Constraining Creativity and Collaboration: Some Reflections after Adventure Rock’, in Content Cultures: Transformations of User Generated Content in Public Service Broadcasting, ed. by Helen Thornham and Simon Popple (London: I. B. Tauris, 2013), pp. 161–80, https://doi.org/10.5040/9780755694426.ch-009
46 Will Self, Kafka’s Wound, a digital essay, https://thespace.lrb.co.uk/
47 London Review of Books, www.lrb.co.uk
48 Centre for Robert Burns Studies, http://www.gla.ac.uk/schools/critical/research/researchcentresandnetworks/robertburnsstudies
49 Editing Robert Burns for the 21st Century: An AHRC-Funded Project to Produce a Multi-Volume Edition of the Works of Robert Burns, http://burnsc21.glasgow.ac.uk/
50 Daily Record, ‘Auld Lang Syne Record Set’, YouTube, 1 December 2009, https://www.youtube.com/watch?v=9mb9ZwB_-xY&noredirect=1
51 Newton Project, www.newtonproject.sussex.ac.uk. The project team is currently based at the Faculty of History, University of Oxford.
52 ‘The Laws of Motion’, In Our Time, BBC Radio 4, 3 April 2008, http://www.bbc.co.uk/programmes/b009mvj0; The Beauty of Diagrams, BBC 4, November–December 2010, http://www.bbc.co.uk/programmes/b00w5675
53 Old Bailey Online, www.oldbaileyonline.org
54 Tales from the Old Bailey, BBC 2, March–May 2013, http://www.bbc.co.uk/programmes/b01rdp8t
55 Garrow’s Law, BBC 1, November 2009–February 2012, http://www.bbc.co.uk/programmes/b00w5c2w
56 Claire Bailey-Ross et al., ‘Engaging the Museum Space: Mobilizing Visitor Engagement with Digital Content Creation’, Digital Scholarship in the Humanities, 32.4 (2016), 689–708, https://doi.org/10.1093/llc/fqw041; Claire Ross, Melissa Terras, and Carolyn Royston, ‘Visitors, Digital Innovation and a Squander Bug: Reflections on Digital R & D for Audience Engagement and Institutional Impact’, in Museums and the Web 2013, ed. by N. Proctor and R. Cherry (Silver Spring, MD: Museums and the Web, 2013); Mark Carnall, Jack Ashby, and Claire Ross, ‘Natural History Museums as Provocateurs for Dialogue and Debate’, Museum Management and Curatorship, 28.1 (2013), 37–41, https://doi.org/10.1080/09647775.2012.754630
57 Transcribe Bentham, www.transcribe-bentham.da.ulcc.ac.uk
58 ‘Oxyrhynchus Online’, Papyrology at Oxford, www.papyrology.ox.ac.uk/POxy
59 Ancient Lives, www.ancientlives.org
60 Tim Causer and Valerie Wallace, ‘Building a Volunteer Community: Results and Findings from Transcribe Bentham’, Digital Humanities Quarterly, 6.2 (2012), http://www.digitalhumanities.org/dhq/vol/6/2/000125/000125.html
61 Strandlines, https://www.strandlines.london/
62 Vernon Manuscript Project, www.birmingham.ac.uk/vernonmanuscript
63 Ulster Poetry Project, arts.ulster.ac.uk/ulsterpoetry
64 ‘Learning Pack’, Dear Sterne, http://dearsterne.blogspot.co.uk/p/learning-pack.html
65 Lecture numérique: application ‘Candide, édition enrichie’ [Digital reading: the ‘Candide, enriched edition’ app], http://www.ac-grenoble.fr/mission-tice/Delegation_academique_au_numerique/Lecture_numerique_%3A_%22Candide%22.html
66 Corpus Vitrearum Medii Aevi (Medieval Stained Glass in Great Britain), www.cvma.ac.uk
67 Clergy of the Church of England Database, http://theclergydatabase.org.uk/
68 1641 Depositions Project, http://www.abdn.ac.uk/1641-depositions/
69 Mark Hedges, Mike Haft, and Gareth Knight, ‘FISHNet: Encouraging Data Sharing and Reuse in the Freshwater Science Community’, Journal of Digital Information, 13.1 (2012).
70 HathiTrust Opinion, 2012, 11 CV 6351, p. 13, http://www.scribd.com/doc/109647049/HathiTrust-Opinion
71 Nancy L. Maron and Sarah Pickle, Sustaining the Digital Humanities: Host Institution Support beyond the Start-Up Phase (New York: Ithaka S+R, 2014), pp. 15–16, https://doi.org/10.18665/sr.22548