Over the past week I’ve been trialling a remote mic device and hearing aids in professional and social settings. Eventually, I’ll have to make decisions for the longer term.
It made me think about the different types of evidence I’ll rely on for that decision.
A colleague recently suggested that examples of applying evidence to decisions in everyday contexts (beyond libraries) could help in discussions on how we engage with evidence based library and information practice (EBLIP).
So, I started to draw some connections to what this looks like. This little account is not directly analogous, but my current decision-making process did make me think about the complexities woven into it.
There have been elements of trust, as well as limitations, to acknowledge. The uncertainty, reflection, and challenges in interpreting evidence are all there.
So, what have I learnt so far in trialling a remote mic and hearing aids?
- Being trusted to observe, through my own experience, what does and doesn’t work is incredibly valuable. Based on professional advice, I thought the remote mic would be more helpful than the hearing aids alone. I found the opposite was true (for me).
- There was never the expectation that I should immediately or intrinsically know what assistive technology would be beneficial or in what settings. I found that hearing the chirping of birds amplified during an online meeting wasn’t a problem, whereas the clattering and excited shrieking that gets picked up in a cafe with a remote mic is a problem. Different contexts require different approaches and setups.
- Assistive tech does not replace other inclusive practices. I always appreciate when people chair meetings in accessible ways that enable contribution. This hasn’t changed. The way people approach chairing meetings still has a greater impact (for me) on accessibility than the hearing aids. Individual assistive technology does not remove all professional barriers.
If you noticed the “(for me)” comments in the bullet points above, they are significant. In libraries, we discuss ‘local context’ and the importance of including ‘local evidence.’ In this instance, I am the local context. Ultimately, this was about whether I would experience any benefit and could report a positive impact.
Here, my decision-making process is being guided by someone else’s professional expertise and clinical experience (evidence), was initiated by (and continues to be shaped by) my lived experience and observations (evidence), and presents me with options that exist because of research (evidence).
I trusted a professional to make recommendations based on their professional expertise, drawing on research evidence and clinical experience. In turn, my experience is trusted as I pay attention to what does and does not help in my local context.
Across this, I’ve picked up on gaps in the research literature, had outcomes and events that I didn’t expect, and have been creative in finding ways to determine if there’s any difference – even when it may be subtle. It’s not a perfect process, but I’m drawing on and interpreting evidence to guide my decision.
When I talk about EBLIP, like others, I acknowledge the messiness of it (and that it’s also different to a clinical health sciences context). I believe it’s helpful to understand that building capabilities for evidence based practice in libraries is not a perfect or linear process. I’d argue that it’s a disservice to pretend it is.
From my perspective, what is valuable in navigating complexity can also contribute to it. This might be:
- being prepared to recognise gaps and limitations,
- understanding who is (and will be) impacted,
- showing a willingness to listen,
- appreciating where local context is significant,
- reflecting on what types of evidence should contribute,
- interpreting and communicating across any conflicting evidence, and
- being reflexive in understanding what we bring to a decision.
Our approach to each of the above develops over time. There will be changing perspectives. Decisions will sit across different and evolving contexts. Interpretation is necessary. And this is what makes evidence based practice equally important and challenging, and why ongoing dialogue is necessary.
There’s an element of trust woven across this. That is, trust in how people navigate data, translate it as evidence in context, and use it in practice. There is also trust involved when people share lived experience or experience evidence based practice as a relational process.
It’s because of complexity that discernment and reflexivity need to be part of EBLIP done well. While we’re typically inclined to start with defining a problem, we have the potential to evoke much more – something that goes beyond a “fact-checking paradigm.” 1
Discussing trust in ethnography, Pugh and Mosseri write:
“When we move beyond a problem-solution binary, we find ourselves able to hear a bit more from a scene: the sometimes many contradictory voices, the irony, the complexity, the multiple layers, aspects which deepen and enrich our experience and understanding.”
Pugh & Mosseri (2023, p. 3)
Contradictions between different methods, evidence sources, and claims may appear as a challenge to data quality and verification. Yet, they open up new complexity and meaning:
“Contradictory facts demonstrate not just informants’ acts but the meaning they make from them; they allow us to see the pressures people feel from the colliding demands of their social world, and how they manage that collision.”
Pugh & Mosseri (2023, p. 8)
The complexity and surrounding context are easily dismissed if we rely on only one source of evidence (e.g., systems data) or neglect to ask questions about our process. While there are constraints 2 to what we can explore or respond to in our practice – and perhaps new layers and curiosities can seem inconvenient – that complexity adds depth to understanding.
Complexity feels as though it demands we address notions of trust, transparency, authenticity, and reflexivity in practice.
I’m thankful to experience mutual trust in my current decision-making process. I can reflect on how I experience accessibility in professional and social spaces and how this sits alongside other types of evidence. It has [re]challenged the ways I think about accessibility and forced me to think about perceptions and assumptions – even in writing this.
My experience is one that’s gone beyond solving a problem.
Evidence provides opportunities to problem-solve and explore. And that exploration can feel creative in practice as pathways of ideas, information, problems, solutions, experiences, evidence, people, and contexts meet. These change, converge, coil, knot, and unravel but never quite settle.
This means that social contexts, perceptions, and experiences have been entwined with evidence and decision-making.
For now, and “(for me)”, I’m embracing that complexity and appreciating the impact that can come with it.
1. Pugh & Mosseri (2023, p. 2)
2. Time, resources, etc. The usual.