I’ve recently had several conversations on libraries, the Sustainable Development Goals (SDGs), and soft power. A few of these discussions turned to reporting and impact, including mapping activities against the SDGs. In particular, they looked at applying the SDGs as a framework to outputs or outcomes after they have occurred.
There’s nothing inherently wrong with this approach. I’ve done something similar on several occasions, retrospectively applying frameworks to activities and insights.
It can start conversations, prompt reflection, act as a baseline to guide future initiatives, and demonstrate value. It can also make the implicit explicit, which helps to draw out value and new connections that may otherwise go unrealised (and uncommunicated).
Yet I frequently observe that these approaches or initiatives are not pursued further within organisations. Follow-up action or continuous improvement that would meaningfully align with other objectives and user communities is neglected. Value is articulated retrospectively, as legacy work, rather than being oriented towards the future.
The result is a hit-and-miss approach to impact: a hope that impact will reveal itself despite a strategy devoid of ongoing evaluation or objectives.
Mapping this approach to Thorpe and Howlett’s evidence-based practice capability maturity model for libraries, I suspect it would leave many organisations (and not just libraries) stuck at Justifying (Tier 2). That is:
“Evidence based practice activities are used to justify actions taken and to demonstrate busyness across the organization … EBLIP is acknowledged but only in the context of collecting statistics for reporting against metrics or to justify decisions already made.”
Thorpe and Howlett (2022)
Broadly, I think of this as a retrospective strategy.
Retrospective strategy
By a retrospective strategy, I mean fitting evidence of past outputs and outcomes into a strategy or framework determined only when reporting needs to occur. Objectives are either:
- Determined by outcomes after being achieved, 1 or
- Set at the time of reporting, with outputs and outcomes then mapped to them. 2
In either instance, priorities and objectives are defined after activities occur. Then, their outputs and outcomes are mapped retrospectively to strategy.
I think of these as ‘mock priorities.’ They serve a purpose, though usually outside of actively generating impact or striving for continuous improvement. Such priorities may still appear tied to impact, but this is to ‘justify’ rather than to ‘empower’ or ‘transform.’ 3
These ‘mock priorities’ might arise in response to reporting requests that did not previously exist. They may be an attempt to raise morale or engagement. Or they may have soft power qualities, aiming to create appeal and symbolic capital (elevating reputation).
None of these is immediately a problem, especially if evidence is on hand and is easily mapped or reported.
Challenges do arise, however, when this is done under the guise of continuous improvement or impact without any forward-looking inquiry or evaluation.
This risk arises when the SDGs are used in reporting.
Sustainable Development Goals (SDGs)
In the Australian LIS profession, the SDGs are often treated as a global framework applied locally. Within such frameworks, libraries can be positioned as the change agents they are, highlighting their role as catalysts for positive community impact.
When the SDGs are used as part of a retrospective strategy (applied after activities have happened), we would ideally then establish objectives around future impact or continuous improvement. The SDGs are not without contestation and critical perspectives, and we would do well to ensure our engagement with them is considered and authentic.
One instance where this may be challenging is when we observe soft power behaviour underlying the SDGs. In an institutional context, organisations may (positively) encourage others to ‘act favourably’ and adopt a sustainable development agenda. They shape the interests of other organisations by leveraging the appeal and attraction associated with the SDGs. It may, however, see the SDGs co-opted to contribute solely to image and reputation. This risks the SDGs being used as a vanity project, as Gadd describes. 4
Gadd’s recent article on using the SDGs as an assessment framework highlights where such issues may arise. Point three asks whether universities are ‘Appearing to, or actually contributing to the SDGs?’, suggesting we may miss the mark on societal impact.
“The SDGs aren’t an opportunity to look good, they are an opportunity to do good. … If universities want to assess their contribution to the SDGs, they should seek to understand the actual SDG targets and indicators, and then weigh up how their missions and investments align.”
Gadd (2023)
A line in Gadd’s article about “real ambitions” also stood out. Both impact and continuous improvement rely on a degree of ambition to achieve change. In libraries, this may require stepping beyond ad hoc impact assessment to cyclic evidence-based action.
Determining specific targets or objectives provides the potential to go beyond retrospective strategy. Targets can be tied to strategy from the outset and provide accountability measures. These can be used as a baseline to drive future action and progress.
Conversely, assessments that apply the SDGs retrospectively (as a standalone activity) are not ambitious in terms of a forward-thinking strategy. The SDGs may provide an incentive or motivation for ambitious responses, but this requires that they be set as genuine priorities.
Impact assessment activities
Library assessment, undertaken to evaluate and improve library services, often requires that we use evidence retrospectively. Yet this still needs to prompt action or change when needed.
Assessment is not a feel-good or checkbox activity for individuals or organisations. Its purpose is improvement that drives impactful outcomes.
When impact assessment is used purely for reputational purposes, we risk diminishing local community impact. We lose focus on using evidence for genuine impact and shift to the image of impact.
We see similar challenges in cultural institutions and higher education. For example, scorecard diplomacy, rankings, and league tables can be a “mechanism of producing status.” 5 These have competitive and soft power undertones that reduce our focus to reputational concerns.
To quote Gadd: “It drives competition over collaboration.”
Sustainable impact
Collecting, assessing, and using evidence well takes time. The process deserves to be used purposefully and applied meaningfully.
The risk of consistently using a retrospective strategy to generate impact stories is that such stories have fixed-term value, falling short of sustainable, long-term positive change. They are single (ad hoc) instances of impact assessment work.
This is not a call for the end of retrospective assessment methods (or for more resource-intensive ones). Forward-thinking approaches to evidence and impact, however, should be part of these methods.
Broadly, this might look like being explicit about purpose, framing meaningful questions that align with a user community, and looking forwards (not just backwards) when considering value and impact.
Without this, we risk one-shot approaches to evidence and hit-and-miss approaches to impact.
1. Inductively.
2. Deductively.
3. Thorpe & Howlett (2020, p. 96).
4. Soft power is not a normative concept – that is, something that ought or ought not to be done. It is, however, also not a neutral concept.
5. Lo (2011).