I’ve been grappling with value and impact in library evidence-based practice over the last few years. It’s often been a source of dissonance (though one I keep returning to).
It’s not that we lack frameworks, standards, and scholarship on library value and impact to refer to.
For me, the problem lies in how we communicate about value and impact. While we typically (and understandably) focus on communication with stakeholders and library users, 1 I’m thinking specifically about communication with library professionals asked to engage in evidence-based practice and impact assessment.
The purpose of impact
Despite the relationship between evidence and impact, conversations about value and impact can sometimes feel out of step with the library practices and users that evidence draws from.
When purpose is articulated poorly, value and impact work risks being reduced to reporting, and with it, evidence-based practice too. Rather than communicating the purpose and potential of collecting evidence of impact, we risk communicating evidence-based practice solely as an administrative burden. I see this reinforced when a single approach is routinely relied on (e.g., dashboards/statistics) rather than engaging with the multiplicity of methods needed to navigate impact assessment in complex environments. 2
The need to report may define our approaches or how we communicate evidence, but the purpose of impact lies in change and difference, whether tangible or intangible.
Without a purpose for impact assessment, evidence collected on library services with a relational and connected quality (especially in teaching and learning) risks becoming transactional. The questions that drive the process of evidence-based practice start becoming more extractive, fixated on what evidence we can get (ahem, collect), rather than the community this evidence and impact benefits.
We focus on assessing change but miss celebrating the connection that created it.
Purpose-driven impact assessment
I’ve found conversations on value and impact can feel isolated from the communities intended to experience impact (or from practitioners partnering to create it). I suspect this is when we become more concerned with the ‘what’ and the ‘how’ of evidence than the ‘why’.
These conversations get stuck on achieving the ‘right’ definition of impact or turn to lament that we cannot easily demonstrate impact in the ‘right’ way. Since impact has different contexts (e.g., social or economic impact) and value types differ, these conversations should help us to determine our approaches (the ‘how’).
Yet, any nuances of impact and the complexities of our environment seem to drive us back to the ‘safety’ of reporting, neatly separated from the exploration, partnership, and creativity I see in practice.
Social contexts
Drawing from critical assessment scholarship, I’ve encouraged evidence-based practice that explicitly considers power, privilege, and positionality. 3 Such considerations support engagement with the social context of our work rather than seeing impact further entrenched as “a collection of technical data gathering approaches.” 4
I’ve seen open education communities excel at exploring this social context. Last year, Andersen, Stagg, and I discussed values, trust, and evidence.
In the open educational practice space:
“Constructing a culture in which open education is a viable and supported learning and teaching approach requires deliberate and purposeful community-building predicated on shared values and trust.”
This carries over to approaches in the evidence-based practice space, where:
“The evidence we collect is recognised as existing in partnership with underlying values and a wider social context which helps it to contribute positively to the communities that open education builds and partners with.”
The deliberate and purposeful approach to community-building in open education is something evidence-based practice would benefit from in engaging practitioners.
Reporting to prove
Often, the conversations I see become stuck are overly focused on reporting to prove. They can feel detached from the work that creates impact and the communities this work should centre. 5
Value is typically linked to user and stakeholder perceptions of library service quality or importance. Library practitioners can experience this value intuitively through partnerships and, over time, can bring an understanding of user information behaviour and needs. 6
Limiting explanations of evidence-based practice to reporting (however much this is or is not needed) misses the breadth of ways library professionals intuitively learn to experience the process of evidence in practice.
Reporting does not immediately convey the potential for evidence-based practice to be a “mutually empowering experience for both librarians and their clients.” 7 The potential impact of impact is lost in communication. Evidence-based practice becomes distanced from everyday practices and the decisions of library practitioners who are often involved in, at minimum, collecting evidence.
Narratives of evidence and impact assessment
A paper I’m fond of, “Moving from Critical Assessment to Assessment as Care,” captures (what I would call) the disconnect that can come with impact work in library assessment. Douglas describes this as an assessment culture of fear and “the narrative of assessment we’ve accepted,” writing:
“I am sympathetic to attempts to demonstrate that our work in libraries is important, but that is the action of advocacy and reporting, not assessment. If students are indeed at the center of our teaching in higher education and libraries, then learning, and by extension, assessment, should be an inherently relational act (Schwartz, 2017, p. 6). Assessment practice has the potential to be a site of connection and care, an exchange of ideas and feelings, and a place where we can truly engage in bell hooks’s (1994) idea of engaged pedagogy, where everyone involved in education is empowered. Assessment can enrich our students and ourselves as educators, librarians, and people, but it requires an approach that prioritizes care over justification, connection over reporting, and people over products.” 8
Yet, we also see the significance of evidence in demonstrating impact and how it contributes to a broader social and institutional conversation. Brettle (2014) offers the most succinct explanation I’ve come across:
“Evidence based practice in library and information work is all about using evidence in our decision making, and is usually associated with effectiveness or whether something works. On the other hand, impact is something that as library practitioners we are increasingly asked to demonstrate to our stakeholders; and is about whether our services make a difference. However just as we can use or locate evidence to help us in our own decision making to find out whether an aspect of our service works, we can also use evidence to help us understand whether our service makes a difference. Not only can we use this evidence of impact for ourselves in our own decision making, we can use evidence to demonstrate the impact of our practice or service to our stakeholders and help them make decisions regarding our services. Thus measuring impact has an important place within evidence based practice.” 9
Perceptions of impact
These varied perspectives in scholarship on measuring impact reflect the different viewpoints I’ve also encountered. They highlight that we need to not only align our language with user communities but also ensure the purpose of impact assessment resonates meaningfully with the day-to-day work of LIS professionals. We need to be able to critique our accepted methods, assumptions, and communications, listening to where points of disconnect lie.
While library assessment scholarship has offered critical perspectives on value and impact, we’re yet to see this to the same extent in evidence-based practice scholarship. This isn’t to forgo or diminish impact assessment but to challenge our understanding and communication of the role it plays in our evidence, practice, and administration.
Ideally, our capacity and capabilities for evidence-based practice should emphasise the relational aspects of impact. This is especially true in complex and changing environments where “… traditional usage statistics alone cannot determine the impact of the library and the value of new learning and research partnerships.” 10
Demonstrating how value and impact assessment connects to everyday practices and local community contexts, rather than leaving it abstracted in a reporting context, is important.
The language we use
If library value and impact overly rely on reporting strategies and performance language, evidence-based practice may not resonate with where library practitioners find meaning.
Urquhart picks up on this matter of language and translation when describing how “the word narrative probably sounds very comforting to many librarians” 11 for communicating value and impact (compared to traditional marketing language and strategies).
Expanding our language repertoire to describe what evidence-based practice can be may help advance evidence-based decision-making at an organisational level. It provides space to translate a more holistic understanding of what evidence can be beyond and alongside reporting.
Empowering connection and impact
Research on how academic librarians experience evidence-based practice sheds light on language for communicating the ‘why’ of impact assessment.
Miller et al. (2017) explore the ways that academic librarians experience evidence-based practice. One way it was experienced was in terms of “being impactful” or “having a visible impact” for stakeholders. Notably (but I suspect sometimes forgotten), this experience of impact sits alongside other experiences that include empowering, intuiting, affirming, connecting, and noticing. 12 This means that evidence-based practice is also experienced as “being connected” and we see “mutual interactions between librarians with clients … regarded as essential evidence” for decision-making. 13
These findings highlight the potential for evidence to be both “mutually empowering” and contribute to “value-adding decisions.” 14 Since value and impact are connected to decision-making, there is the potential for this work to empower through improved practice and services.
As scholarship continues to explore the role of libraries as social agents of change, with the “capacity to act” 15 (albeit within structural constraints), I’m interested in how value and impact work in libraries will arrive at more relational and co-created qualities.
Experiences of professionals
While studies do explore how LIS professionals perceive evidence-based practice, we lack the same exploration of how value and impact work is perceived (particularly between different sectors and services).
Existing research on value and impact predominantly focuses on approaches (methods, standards, frameworks, and benchmarks), and more recently, co-creating social value and engagement with service users (sometimes connected with marketing). 16
We do, though, see that librarians embrace “opportunities for new partnerships” to translate value and impact in complex environments, 17 prompting questions such as: “How can academic librarians give expression to the value that the library provides?” 18
For evidence-based practice, this requires building capacity in a way that empowers a meaningful understanding of impact assessment 19 and demonstrates a connection to everyday practice.
Changing the narrative
I’d love to see library impact assessment take on a different narrative, embracing change beyond reporting. Ensuring that ‘being impactful’ is tied to ‘being connected’ is a good place to start. This elevates partnership and connection, presenting a more meaningful construct for value and impact work to exist within.
1. Murray & Ireland, 2018; Yamaguchi & Richardson, 2018
2. Tenopir, 2013; Salisbury & Peasley, 2018
3. Bell, 2023, p. 125
4. Wall et al., 2014, p. 13
5. For this reason, I’m also quick to include service improvement (which is another rabbit hole of thoughts) and decision-making in the discussion. I do not think the ‘prove’ (of value and impact) and the ‘improve’ (of our practice) are separate. This also gives us something to act on if impact findings are not what we might hope for or expect.
6. Miller et al., 2017, p. 13
7. Ibid., p. 10
8. Douglas, 2020, p. 47
9. Brettle, 2014
10. Salisbury & Peasley, 2018, p. 109
11. Urquhart, 2015, p. 99
12. Miller et al., 2017, p. 21
13. Ibid.
14. Ibid.
15. Braun et al., 2019, p. 788
16. Bawden et al., 2010; Turner, 2016; Urquhart, 2018
17. Salisbury & Peasley, 2018, p. 109
18. Ibid.
19. Ibid., p. 117