Article: Public involvement in English research

This recent article contains a wealth of interesting information on public involvement in health research, so I thought it would be worth discussing some of its findings here and adding my thoughts on what they may mean for methodological research.

The full title of the article is "How embedded is public involvement in mainstream health research in England a decade after policy implementation? A realist evaluation" and it can be found here. As a side-note, the authors use the term 'patient and public involvement' (PPI), whereas I generally refer to 'public involvement'. I do so because, first, I take the term 'public' to include 'patients', and second, because my work (being methodological in nature) will apply to the public in general, not to specific patient groups. However, I appreciate that either term may be appropriate, depending on the context.

Arguments for public involvement.

The authors identify two main arguments for public involvement. The first is the moral argument: "research conducted on people without their input is unethical". I think that this can also be extended to say that wherever public monies are used to fund research, the public should have the opportunity to be involved in that work. The second is the methodological argument, as involvement "will improve recruitment, impact and outcomes".

I agree with both arguments. Unfortunately, as the authors note, a third argument is sometimes used, which could be called the coercion argument: "Many researchers had engaged with PPI because they ‘had to’ to obtain funding." If this is the sole reason for engaging with the public - and hence there is no senior engagement - then public involvement is likely to become tokenistic. The authors note that the coercion argument sometimes evolved into the methodological argument during a project, but ideally the coercion argument would never be used. As such, I believe that it is important to increase awareness of the moral and methodological arguments for public involvement, and that education is likely to be the best way of doing this.



Salient actions for enabling public involvement.

These actions (which are based on 'context, mechanism and outcome configurations') were identified via an analysis of 22 research projects. The analysis used the framework of Normalization Process Theory, which helps to identify how new processes are implemented.

Six salient actions were identified. Below I describe these, and add my thoughts for each.

1. A clear purpose, role and structure for PPI are ensured
This includes having an identified individual who is responsible for public involvement, as well as providing skills and continued support. The authors also note that “For research teams without dedicated PPI support, access to an external pre-existing PPI group was an enabler”.

An interesting comparison is also drawn between 'out-sourced' PPI (which extended diversity) and 'in-house' PPI (which extended the breadth of PPI activities).

My thoughts
I would agree with all of the above, particularly with regards to the usefulness of existing public involvement groups. The comparison of out-sourced PPI (which was usually in the form of topic-specific panels) against working 'in-house' (usually with an existing local group) was also interesting. My experience so far is with 'in-house' public involvement. However, I have been wondering whether it would be useful to have a panel to aid with methodological work. The scope of this panel would be very important - maybe start with one for cost-effectiveness research and see how that goes.
Also, it should be possible to get the best of both worlds - to 'in-source' from panels.

2. Active recruitment of public contributors who reflect the diversity of a study population
The authors note that this “posed a significant challenge”, and there was a danger of blurring the lines between public involvement and public participation. Embedding study teams within local communities helped with recruitment. 

My thoughts
For methodological research such as mine, the study population is, in effect, the general population. However, ensuring diversity and inclusiveness in public involvement is challenging in general, and is an active area of research (see also p51 of this report by the RAND Corporation). For example, by definition the public who are involved in research will reflect a more engaged section of the general public (compared to those who are not involved).
As such, I'm unsure how best to tackle this problem for methodological studies.

3. Whole team engagement with PPI
This moves beyond having an individual coordinator (see salient action 1) to ensuring that there is also engagement from senior researchers, so that there is a “shared narrative”. This helps to avoid tokenism in public involvement (see also the previous section on arguments for public involvement).

My thoughts
Having whole-team engagement in PPI is not an issue for my fellowship, as I am both the PPI coordinator and the principal investigator. However, given that there are currently few examples of public involvement in methodological research, it is unclear how much of an issue senior ‘buy-in’ will be for this type of work. Increasing awareness of the moral and methodological arguments for public involvement is likely to be important here.

4. Mutual understanding and trust
To help foster this understanding and trust, four key elements were identified:

  • Having a shared understanding of what ‘involvement’ means – not confusing it with ‘engagement’ or ‘participation’. Training for researchers on public involvement can be particularly useful in this regard.
  • Making sure that public contributors felt valued. Feedback was particularly important in this regard.
  • Fair reimbursement of people’s time.
  • Sustained relationships (between researchers and the public) across projects.

My thoughts
All of these points are very important. In particular, providing feedback to public contributors is something that I repeatedly hear/read about as being highly valued. It is telling that this is one of the public involvement indicators (4.3) that support the National Standards for Public Involvement. For reimbursement, INVOLVE has published useful guidance (although applying this guidance within the University of Sheffield has been far from easy! Things are slowly improving, though). Maybe something similar is required for delivering feedback, to ensure that a consistent and useful approach is taken.

I fully agree with the importance of training researchers. However, when I was starting out with public involvement, I was unable to identify any formal training courses on it. For example, the majority of courses provided by the University of Sheffield are on the topic of public engagement, not public involvement. If anyone is aware of good introductory courses on public involvement, please let me know.
Finally, the idea of having sustained relationships suggests a potential role for an external panel, which would provide opportunities for embedding public involvement in specific projects (see the discussion of the first salient action).

5. Opportunities for PPI throughout the research process
The authors found that for “applied health research”, involvement throughout the research process was fairly common, although there were still instances of public involvement dying out as the research progressed. This was a particular threat if there were insufficient resources for PPI.

However, for “basic science studies” questions were raised about whether non-scientists could be engaged throughout the entire research project, due to “a lack of technical understanding”. This has implications for the feasibility of including members of the public as co-applicants on research bids.

My thoughts
I think that the authors are mainly talking about “laboratory-based scientists” when they refer to “basic science studies”. However, this type of work also includes methodological research. For applied health research, there is a very useful diagram of public involvement throughout the research cycle. I believe that it would be very useful to have something similar for methodological research (although I’m not sure whether there would be a single diagram for methodological research as a whole, or a diagram for each specific type of research, such as statistical research and cost-effectiveness research). It is interesting to note that the authors provided two examples of PPI in basic science studies: one relating to funding, and one for prioritising studies. This is similar to my fellowship - I had good public input when developing my research, and my original plan was to have members of the public help prioritise the methodological questions that I would consider (along with other stakeholders). This didn’t happen (for the reasons discussed here), but I feel that both are potentially valuable roles for public involvement in methodological research.

6. Reflection, appraisal and evaluation of PPI
The authors emphasised the importance of ongoing evaluation, whilst noting that many studies did not perform this. The main reasons for a lack of evaluation were:

  • A lack of evaluation tools.
  • A lack of clarity about what to evaluate (study outcomes, research outcomes, or public contributor outcomes).

My thoughts

Based on my (admittedly limited) experience with evaluation, I agree that a lack of resources and a lack of clarity about what to evaluate can both be barriers. However, there are some very useful resources out there. For my current work, I am basing evaluations on the approach suggested by the Cancer Research UK PPI toolkit for researchers. I have found this to be a very useful resource for the majority of my public involvement activities - it may be hosted by a cancer charity, but it is applicable to public involvement at quite a general level. I am also currently undertaking online training on “Public Engagement Evaluation”. It is early days, but this course also appears to be very useful. For example, it describes the limitations of standard evaluation questions (such as “Have you learnt something new today?”) and the importance of before-and-after questionnaires for evaluating learning.

What are your thoughts on any of the above? Comment below or continue the discussion on Padlet!

Comments

  1. Jacqui Gath (Lay Patient Advocate), 12 November 2019 at 08:30

    Very interesting blog Ben, which gave me lots of valuable insights and confirmed many more, not to mention the links to resources. Is it possible that research in methodology needs rather more training of the public taking part, than otherwise? I have a background in geeky stuff (data, diagrams, and databases etc) but still had difficulties in my first foray into this type of study. Once the researchers had found time to explain the different tools used, the why, the how and where, life suddenly got easier. The trouble was finding the time in a very hectic schedule of research. This is something perhaps which needs to be planned in terms of more time and reading for the public contributor.

  2. Dear Jacqui,
    Thank you for your useful feedback. Yes, based on my experience in health technology assessment, I believe that extra training is needed. I think that there is important work to be done on what this training should look like, and on whether a shared set of training materials could be used - to stop researchers re-inventing material, and members of the public having inconsistent training experiences.
    Thanks again, Ben.

