Quality in Primary Care

Research Paper - (2012) Volume 20, Issue 3

Interdisciplinary health research: perspectives from a process evaluation research team

David Clarke1*, Rebecca Hawkins2, Euan Sadler3, Geoffrey Harding4, Anne Forster5, Christopher McKevitt6, Mary Godfrey7, Josie Monaghan8 and Amanda Farrin9

1Senior Research Fellow/Lecturer, Bradford Teaching Hospitals NHS Trust and University of Leeds, UK

2Research Fellow, Leeds Institute of Health Sciences, University of Leeds, UK

3Stroke Association Post-Doctoral Fellow, Department of Primary Care and Public Health Sciences, King’s College London, UK

4Hon. Senior Research Fellow, Peninsula College of Medicine and Dentistry, Exeter, UK

5Professor of Stroke Rehabilitation and Clinical Lead, Yorkshire Stroke Research Network, UK

6Reader in Social Science and Health, Department of Primary Care and Public Health Sciences, King’s College London, UK

7Reader in Health and Social Care, Leeds Institute of Health Sciences, University of Leeds, UK

8Clinical Trials Manager, Bradford Teaching Hospitals NHS Trust and University of Leeds, UK

9Principal Statistician, Director Health Sciences Division, Clinical Trials Research Unit, University of Leeds, UK

Corresponding Author:
Dr David J Clarke
Academic Unit of Elderly Care and Rehabilitation
Bradford Institute for Health Research
Temple Bank House, Bradford Royal Infirmary
Duckworth Lane, Bradford BD9 6RJ, UK
Tel: +44 (0)1274 383441
Email: d.j.clarke@leeds.ac.uk

Received date: 9 January 2012; Accepted date: 26 February 2012


Keywords

health services research, interdisciplinary collaboration, stroke

How does this fit in with quality in primary care?

What do we know?

An interdisciplinary approach to health research is normally encouraged and sometimes mandated by funding councils worldwide. This position is based on the belief that complex and persistent health and social care problems can be more effectively addressed by integrating research expertise and scholarship from a diverse range of disciplines.

What does this paper add?

The process evaluation study reported in this paper highlights policy drivers and the strategic and practical challenges inherent in setting up and conducting interdisciplinary health research. It examines how working with and through differing disciplinary and research perspectives can directly influence the ways in which health research is carried out. This includes influencing how data are generated and analysed and, most importantly, how our understanding and explanation of specific interventions or experiences can be enhanced by integrated interdisciplinary research practice.

Introduction

Interdisciplinary health research (IDHR) is commonly encouraged and can be a requirement for specific research grants provided by health research funding councils, including those in Canada (Canadian Institutes of Health Research, CIHR[1]), the USA (National Institutes of Health, NIH[2]) and Australia (National Health and Medical Research Council, NHMRC[3]). In the UK, the National Institute for Health Research (NIHR)[4] anticipates that research teams applying for programme grants will be interdisciplinary and will also embrace clinical and methodological diversity. The NIHR expressly encourages collaboration between researchers from several institutions and with service users through Patient and Public Involvement structures led by Involve.[4] There is a common-sense rationale for these directives: it is widely believed that research expertise and scholarship from a diverse range of disciplines are necessary to examine questions relating to complex health and social concerns for which single disciplinary approaches have been found to be inadequate.[1,5,6] However, realising such potential in real-world interdisciplinary research programmes can be challenging. To be successful, IDHR depends on high levels of collaboration, shared values, recognition of and respect for different epistemological positions, and trust between principal investigators (PIs), project teams and researchers in the field.[7] Despite worldwide support for IDHR, some researchers argue that evidence for improved patient or public health outcomes resulting from investment in IDHR remains limited.[6,8] As a result, some experienced researchers may engage in IDHR as a strategic move to access large-scale funding without sharing the belief that IDHR, with its implications for a different kind of research practice, brings additional benefits, thereby limiting what might be achieved by IDHR teams.[6]

There is a continuing concern that few researchers have been prepared to work collaboratively with scholars from other academic disciplines.[1,9] Pincus et al[10] noted that until recently there were few IDHR training opportunities for PhD students and early career researchers. Potential barriers to promotion and reward may follow from participation in IDHR projects for early career researchers who traditionally seek to publish and establish a scholarly reputation in a distinct disciplinary area.[9,11,12] Although some universities continue to be structured along traditional disciplinary lines, there is increasing recognition of the need for researchers to be able to cross disciplinary boundaries in order to attract the research funding necessary to address research questions relevant to public health, to communities and to industry.[13,14]

Planning for and conducting IDHR demands attention to both macro and micro levels of health service research programmes. At the macro level this includes determining the host institution and identifying the infrastructure and finance to support IDHR projects, bringing together collaborators with the required expertise from within or across different institutions and disciplines, designing the research project and justifying the proposed methodologies. At the micro level, issues common to all projects are evident but come with the additional challenges of working with researchers with epistemological, theoretical and methodological backgrounds which may differ considerably. For PIs, IDHR requires management of more diverse research teams, reaching agreement on approaches to collecting, analysing and interpreting data and deciding whether to report findings in integrated or separate publications.[5,15,16] This paper examines how some of the micro-level issues identified above impacted on and were managed by researchers during the course of a large-scale process evaluation study linked to a pragmatic cluster randomised controlled trial (RCT). The first section of the paper briefly examines the terminology in common use in the IDHR literature. The process evaluation is then drawn upon to explore some of the practical, theoretical and research practice issues faced by researchers in this IDHR project. Comments from research team members are used to highlight individual and collective perspectives on the experience of this IDHR study.

Interdisciplinary health research: defining terms

In a seminal paper exploring drivers for and progress in developing IDHR in Canada, Hall et al[12] argued that defining concepts in regular use was an essential precursor to IDHR becoming a common approach among health scientists. As in the wider teamworking literature, there is inconsistency in the use of the terms multidisciplinary, interdisciplinary and transdisciplinary in the context of IDHR. These terms are often used interchangeably despite differences in their meaning. Multidisciplinary can be differentiated from interdisciplinary research on the basis that the former normally involves researchers working in parallel or sequentially on linked studies or parts of a study. These typically draw on different research methods but are focused on a common problem.[16,17] For example, a nationwide quantitative survey could be conducted of all ambulance trusts and accident and emergency (A&E) departments to establish average times for an individual with a suspected stroke to get to a hospital and be seen by a health professional. The survey data would be analysed by a statistician and interpreted and published by the PI and statistician. In a parallel study run by the same PI, qualitative researchers may conduct in-depth interviews with a purposive sample of stroke survivors, paramedics and A&E health professionals to determine factors considered to affect the time taken to contact emergency services and get to A&E when a stroke is suspected. The findings of the qualitative study would be analysed and may be reported with reference to the survey data; more commonly these are reported separately. In a multidisciplinary team, researchers’ methodological knowledge and expertise are recognised but contact between researchers is normally confined to agreeing on the need for and planning studies; expertise is not expressly shared with others working on a study. As Lynch[13] pointed out, having a collection of disciplines and skills in place does not guarantee team members will articulate their knowledge for mutual benefit. Knowledge synthesis and exploration of new insights generated by interaction between researchers from different disciplines is not the purpose of the multidisciplinary research team. The primary concern is often to ensure the right individuals are in place to conduct high-quality but separate studies on time and on budget.

By contrast, IDHR teams normally seek to maximise the benefits of collaboration between researchers by actively drawing on and integrating their established disciplinary skills and knowledge. These are brought to bear alongside those from different disciplines in order that the product of the interaction or exchange between researchers is a shared understanding and innovative ways of investigating and responding to complex and persistent health and social problems.[1,12] Using the above example of understanding factors affecting the time taken to get to A&E when stroke is suspected, an IDHR approach would recognise the expertise of survey researchers but would involve all researchers participating in the study in commenting on or challenging the structure and content of the survey prior to its use. An interdisciplinary approach seeks to open out the kinds of questions to be addressed, not just to employ different methods. Thus, in reviewing the proposed questionnaire, a sociologist might pose the question as to how the person is identified as having a probable stroke, how the decision is made that hospital admission is required and how resources are mobilised to make it happen. Interpretation of the survey findings and proposed follow-up qualitative interviews would again involve all participating researchers and not be the sole responsibility of the PI and statistician. This explicit exploration and synthesis of expertise and disciplinary knowledge is claimed to be the added value which can be gained by adopting an IDHR approach.[6,12] There is increasing pressure to adopt IDHR approaches to address intractable health issues such as obesity, coronary heart disease, stroke and diabetes, the interlinked problems and health consequences of long-term unemployment and poverty, or the needs of those with long-term mental health problems.[8,12] IDHR requires that researchers working together on complex health problems step aside from traditional, often hierarchical valuing of basic science methods or clinically driven problems and expertise that in the past have assumed priority over social science methods. It seeks respect not only for disciplinary knowledge and methodological expertise, but also a willingness to engage with different epistemological and ontological perspectives.[8,18] Thus, instead of simply asking whether an intervention is effective, an IDHR team is more likely to question how an intervention works, for whom, in what circumstances and delivered in what ways.

Lynch[13] noted that the term transdisciplinary research was not a major element of the discourse surrounding health research in the UK, but the term is increasingly evident in the literature. Stokols et al[17] claim that transdisciplinary health research (TDHR) goes beyond the level of collaboration in IDHR. Through the development of shared conceptual understanding, together with explicit synthesis of discipline-specific theoretical perspectives, TDHR can achieve ‘novel and integrated conceptual models’ which can be applied to clinically and socially relevant programmes for change (Stokols et al,[17] p. 204). Canning et al[8] argue that whilst TDHR potentially offers positive solutions to enduring and complex public health issues, in practice, few transdisciplinary research teams expressly address the epistemological assumptions from which different disciplines operate. This may be a direct consequence of single-discipline research training or an ingrained distrust of social or biomedical science research methods. However, it perhaps represents a missed opportunity to be open to alternative and novel ways of understanding particular phenomena through the theoretical lens of other disciplines. This does not mean researchers have to leave behind disciplinary knowledge and expertise, but rather be willing to examine alternative explanations against existing ones. Such teams bring diverse expertise to develop a broader understanding of complex phenomena. For more detailed examination of these debates see Slatin et al,[5] Lingard et al[19] and Canning et al.[8]

This paper examines an IDHR project in inpatient stroke services. Hall et al[12] defined IDHR as involving ‘a team of researchers solidly grounded in their respective disciplines, that come together around an important and challenging health issue, the research question for which is determined by a shared understanding in an interactive and iterative process’ (p. 764). It was this kind of interdisciplinary approach which underpinned the process evaluation referred to in this paper.

Rationale for the Training Caregivers After Stroke (TRACS) trial and process evaluation

Stroke is the third largest cause of death and the main cause of adult disability in the UK; 115 000 new strokes occur annually and cost the National Health Service (NHS) and the economy about £7bn a year.[20] Although outcomes have improved, 30% of stroke survivors will have some residual disability and require longer-term assistance with activities of daily living including washing and dressing, eating, drinking and walking.[21] As with most long-term conditions, there is an increasing expectation that family members will provide continuing support for stroke survivors after their discharge from hospital.[22,23] In a single-centre study, a structured caregiver training programme, the London Stroke Carer Training Course (LSCTC), proved effective in decreasing burden, anxiety and depression for the caregiver, in improving psychological outcomes for patients and in reducing overall costs.[24] To test the efficacy of the LSCTC in a wider range of units and with a more diverse population, a pragmatic cluster RCT introduced a modified version into 18 stroke units. A further 18 units providing care based on the National Clinical Guidelines for Stroke[25] represented the control arm of the trial.

Methods

Medical Research Council (MRC)[26] guidance on developing and evaluating complex interventions (such as the LSCTC) identified process evaluations as an important and necessary part of researchers’ methodological toolkit alongside RCTs; the primary purpose of process evaluations is:

to explain discrepancies between expected and observed outcomes, to understand how context influences outcomes, and to provide insights to aid implementation. (p. 4)

The TRACS process evaluation was funded separately and conducted by researchers not involved in the RCT, although the same PI was responsible for both studies. Delays in securing funding and in obtaining ethical and research governance approval meant that the process evaluation commenced 9 months after the RCT began recruitment. Implementation is not a single event but rather a process that occurs in stages. We focused attention on what occurred over time and on what mechanisms facilitated or hindered implementing and embedding the LSCTC in routine care.[27,28] Interventions are introduced into organisations with different histories, cultures, learning climates and readiness for change, so many factors can affect outcomes.[29] We adopted an ethnographic approach to develop an in-depth understanding of a sample of stroke units participating in the trial and the practices, interactions and perspectives of professionals, caregivers and stroke survivors in those units. Four researchers undertook fieldwork in eight units in four English regions over 6 months, and for 3 months in two additional intervention units. Data generation and analysis drew on grounded theory methods.[30,31] Full details of the process evaluation methods will be published elsewhere.

The interdisciplinary TRACS process evaluation team

The CIHR[1] interim evaluation of the IDHR Team Programme found that two of the five principal reasons for interdisciplinary teams forming to compete for grants were the ‘dependence of the research on complimentary skills and/or knowledge’ and ‘personal relationships which existed prior to the grant amongst key participants’ (p. 3). These factors were significant in determining the collaborators on the grant application, the composition of the Process Evaluation Steering Group and the researchers who managed the study on a day-to-day basis. The PI was an experienced stroke [trials] researcher with a background in physiotherapy; co-applicants were drawn from stroke medicine, a clinical trials unit, and health and social care research units. Co-applicants were active in stroke research or had worked with or were known to the PI. The process evaluation, however, depended on combining the skills and knowledge of experienced stroke researchers who were closely involved in providing and evaluating stroke services, with four mostly unknown researchers. These were experienced in qualitative methods but had backgrounds in medical sociology, social and medical anthropology, gerontology, nursing and physiotherapy. Drawing from existing networks can mean that IDHR teams begin from a position of shared knowledge and understanding, in this case, of stroke and the needs of stroke survivors and caregivers. However, this shared understanding may not extend to knowledge of or confidence in the methodologies adopted by researchers from non-medical and qualitative backgrounds.[6,16] Such differences can be a space for creative dialogue or can be the basis for distrust and disharmony in IDHR teams.[7,15,19] In the process evaluation, two qualitative researchers had no prior experience of stroke care and so came to the study with a naïve understanding of stroke service provision which was to prove beneficial as the study progressed. One team member noted:

‘Having a common qualitative research background but also different disciplines and career paths (i.e. nursing, physiotherapy, research in medical sociology and anthropology but not stroke care) worked well. Sharing a background in qualitative research meant that we spoke broadly the ‘same language’ and understood the kinds of questions we were asking of the data. Being from different disciplines helped us develop the observation schedule and interview topic guide, it also prompted reflection on our fieldwork (observations), and led us to reflect on, question and develop our interpretations of the data.’ (Research Fellow 1)

From protocol to fieldwork: debate and decisions

Cheek[7] noted that two practical problems faced in collaborating with researchers from different disciplinary backgrounds involve deciding what disciplines (required skills and knowledge) to invite to work together on a project and then how (process) those disciplines should work as a team. O’Cathain et al,[16] writing in the context of teamworking in mixed-methods studies, highlighted the central role of the PI in shaping and facilitating IDHR teams. In our study, a steering group was formed in advance of fieldwork. This consisted of the PI, the TRACS trial manager, a co-applicant who was an experienced stroke researcher (background in social anthropology) and a senior qualitative researcher with a background in implementation and evaluation research related to older people’s services. Team selection was both a pragmatic decision and a methodological one; established colleagues respected for their methodological skills became part of the steering group. The need for senior qualitative research expertise to oversee the operation of the study was seen as a necessary quality mechanism, both in terms of managing a large, novel and complex study and in terms of ensuring rigour in data generation and analysis. Research with mixed-methods research teams has also highlighted the importance placed on having ‘senior expertise’ available; this was understood to be a component part of maximising the gains which could be made from interdisciplinary and mixed-methods collaborations.[16]

In our study, differences in methodological knowledge and skills were soon apparent when the nature and focus for observations in stroke units were discussed by the group. This was an early manifestation of epistemological differences across the team, with trial researchers primarily concerned with preventing contamination of the RCT and qualitative researchers giving primacy to gaining access to participants’ day-to-day work and experiences. Canning et al[8] suggest that such differences are commonly acknowledged to be inherent in IDHR and TDHR teams but are rarely addressed. The process evaluation protocol called for ‘non-participant’ observations to be carried out discreetly by researchers occupying spaces where the interaction of staff, caregivers and stroke survivors could be observed without the researcher influencing these interactions. This suggested that researchers would be ‘apart’ observers collecting social facts objectively, and was contested by the qualitative researchers. They argued that non-participant observation was a misnomer, that some degree of interaction and thus participation with participants in the stroke units was inevitable.[32-34] The senior qualitative researchers on the steering group concurred with this position but these discussions, occurring in advance of fieldwork, largely remained at the theoretical level and did not result in the kinds of conflict, tension and identity work reported in some IDHR studies.[15,19]

The need for the observational element of fieldwork to avoid activity which could be seen to alter practice (for example, influencing delivery of the caregiver training programme in intervention units or highlighting the need for caregiver training in the control units) was not contested. At the same time, the qualitative researchers maintained that fieldwork relations were complex and whilst general principles could be readily agreed, the nature and content of observations would in part be determined by the physical structure of the stroke units and the day-to-day flow of work in these units. Morse[35] suggested that qualitative researchers in IDHR teams may be pressurised to compromise on their proposed methods and that this may serve the needs of the wider team at the cost of diluting the contribution of in-depth qualitative enquiry. This issue is often more evident in mixed-methods studies, where quantitative research methods can be considered to be more important than qualitative elements.[5,7,16] Similar issues have also been reported in predominantly qualitative research teams.[18] Whilst the process evaluation was not a mixed-methods study per se, the conduct of fieldwork in the same units as the RCT meant that methodological decisions had to be cognisant of the aims and operation of both studies. For example, in seeking to observe caregiver training we did not wish to draw unnecessary attention to these aspects of rehabilitation and prompt multidisciplinary team (MDT) members to alter their activity in working with caregivers.

Team processes

In group development terms, prior to ethical approval and fieldwork, the process evaluation team was effectively in the ‘forming’ stage.[36] Relations between team members, the majority of whom had worked together in some way previously, were cordial and there was excitement about conducting the first large-scale process evaluation study alongside an RCT in stroke care. Power relations were not being challenged and respect was apparent for different methodological positions, although discomfort at the differences in understanding the premises of observational research was already evident. Michel[15] records a similar period of early group cohesion in an IDHR team studying the health of migrants, where epistemological differences were minimised as the initial focus was on working together on a complex research problem. In Michel’s[15] team, this cohesion rapidly disappeared as group development progressed through the ‘storming’ phase, in which divergent opinions became apparent and individual team members’ epistemological claims and proposed methods were prioritised or criticised. Such challenges are frequently encountered in IDHR and ideally should be anticipated by PIs.[16,19] IDHR teams either manage to address disharmony and conflict and become more cohesive and task focused, or such teams may fragment. This can lead to an uneasy compromise regarding research objectives and methods, with the result that an integrated interdisciplinary study does not ensue, some team members withdraw and others may contribute little.[5,15,19]

Focusing on delivering the study

The requirement to secure ethical approval moved our discussion from abstract theoretical debate about observations to addressing the reality of how the agreed research aims would be met in practice. However, this did not lead to insistence on conducting non-participant observations as originally indicated in the protocol, nor to destructive criticism or defence of particular perspectives. Instead, what developed was the opening of a lively critical dialogue between steering group members and researchers who would undertake fieldwork. This initially focused on theoretical perspectives underpinning observational fieldwork and addressed concerns about researchers contaminating the intervention. That this exploration and dialogue about different perspectives did not exhibit some of the typical vying for position and conflicts common to groups in the storming phase was due in large part to the willingness of the PI and TRACS trial manager (representing the larger trial team) to listen, question, challenge, and over time, to trust and endorse the proposals made by the qualitative researchers. The involvement of two trusted senior qualitative researchers on the steering group undoubtedly contributed to confidence in the fieldwork approach agreed.

The process evaluation team’s interaction with the Research Ethics Committee (REC) highlighted how differences in knowledge and research expertise external to the team can impact directly on IDHR studies. REC members reviewed the outline of proposed observations in stroke units and required that a detailed observational schedule be submitted for approval. Despite robust debate with the researchers, it was clear that REC members anticipated that a quantitative observational schedule would be developed to capture the frequency of particular actions on stroke units. An observational schedule was developed and debated by the steering group in person and by email. This reflected a core structure identifying areas of focus for observations linked to the aims of the process evaluation; expectations for recording and reviewing observational data were also included. The production and discussion of the observational schedule effectively addressed some of the earlier methodological concerns which arose from team members’ differing perspectives and clarified the approach to and focus for observations. As a result, there was renewed consensus on the focus of the research. The potential disruption often evident in the ‘storming’ stage of IDHR team development did not materialise. This early exploration of theoretical and practical perspectives related to conducting observations established an approach towards debating methodological issues which we used throughout the study.

Developing effective communication in IDHR teams

In large IDHR projects where researchers are working in different parts of the country or in different countries, effective systems for communication are a key part of ensuring that the study can achieve its aims and the added value inherent in IDHR can be realised.[18] Dialogue about fieldwork, initial findings and ongoing data analysis formed the basis of much of the team’s interaction during the course of the project. A secure online group discussion area hosted by one university was set up early in the study, but researchers and steering group members, who were not all employed in the host institution, experienced delays in access to an unfamiliar system which was sometimes unavailable. Overall, we did not find this communication medium useful and ceased using it after 3 months of fieldwork. Instead, team dialogue took place in scheduled face-to-face meetings, but equally commonly by telephone and email as the need for decision making about fieldwork or data analysis arose. Regular communications were particularly important as fieldwork took place in four different English regions, with face-to-face contact between researchers occurring only every 6–8 weeks and with the steering group every 3–4 months. The process evaluation team benefitted from the fact that the TRACS trial was already underway and the PI and trial manager were able to provide, often at short notice, information which contributed to decision making: for example, in relation to patient and carer recruitment to the trial, the focus of documentary analysis, timing of interviews with staff, and as the study progressed, analysis of the theory underpinning the intervention and its implementation. However, closer contact with the full trials team may also have proved useful:

‘Access to and open discussion with the trial team (trial manager/trial lead) was really important for: (1) our process evaluation ‘‘fitting’’ with the trial and (2) our understanding of the intervention and the implementation process. For future process evaluations it would be better if dialogue between the wider trial team and the process evaluation team took place earlier (i.e. as the protocols were being developed) and then continued throughout the studies.’ (Research Fellow 1)

Meetings between these two larger groups might have run the risk that epistemological and related methodological differences would dominate discussion and reduce the effectiveness of the interdisciplinary collaboration. However, we would recommend that pre-study, mid-point and end-of-study meetings between such teams be considered.

Different disciplinary backgrounds can be an advantage in IDHR

In IDHR, exchange of disciplinary knowledge and perspectives should lead to insights and explanations which result in high-quality applied research with findings which have direct relevance to practice settings and which can be the basis for change and service development.[12,13] In the process evaluation study, two issues arising as a direct result of differences in our disciplinary backgrounds and research training impacted on the analysis and interpretation of our data and then on decisions related to the development and focus of publications. These are both identified as areas of potential division or conflict in IDHR teams, particularly where mixed methods are employed.[12,16] We argue that disciplinary diversity within our team, even among the predominantly qualitative researchers, was an advantage. Although the team could be considered to have reached Tuckman’s[36] ‘norming’ stage, i.e. we were an established group who understood each other and our work, it took more direct engagement with study data before we appreciated each other’s research skills and knowledge. In the regular research fellows’ meetings it quickly became apparent that we took our research training for granted, even though this influenced the way we saw activity in the stroke units and how we recorded data and interpreted them. For example:

‘It was mainly an advantage to have four research fellows who had different professional backgrounds and theoretical ‘‘lenses’’ with regards to the data collection and analysis. In terms of one disadvantage, the two research fellows who did not have a clinical background may have found some of the technical language used in stroke difficult to understand during the observations.’ (Research Fellow 2)

In practice, we also benefitted from the lack of a clinical background of these two researchers. In an early meeting focused on the contexts and nature of routine practice in the stroke units in which we were observing, researchers’ fieldnotes were reviewed. Researchers with clinical backgrounds tended to focus more narrowly on the practice of specific staff groups and their interactions with each other and with caregivers. By contrast, non-clinical colleagues also highlighted contextual features such as the influence of the built environment, which groups of staff congregated in which spaces, and the power relations which were impacting on the organisation of routine work. One researcher, who had experience of a number of IDHR projects and felt that the disciplinary and skill mix in those projects had not focused sufficiently on the social context of behaviours, noted:

‘It was refreshing for me to work with other than psychologists – allowing us to give due consideration to the broader contextual backcloth via the observational work we undertook. This, combined with the interviews makes for a far greater appreciation of the issues around rehabilitation than might otherwise have been unearthed.’ (Research Fellow 3)

The project lead also engaged in data generation and analysis as well as day-to-day project management. Building on the open dialogue approach to decisions on observations, the project lead acted as a ‘knowledge broker’, actively seeking divergent perspectives, challenging claims and summarising progress or gaps in understanding. For our group, regular dialogue in face-to-face meetings interspersed with frequent email contact contributed to openness between researchers and a respect for others’ fieldwork and interpretations. Thoughts, suggestions and differing interpretations were listened to and considered during meetings; research fellows’ and steering group meetings were supportive spaces to discuss and get feedback on the developing interpretations.

These also guarded against developing too cherished a theoretical interpretation of what was being observed and avoided entrenched theoretical dogma. This was illustrated when researchers were trying to explain and conceptualise the shock and distress observed in patients and caregivers in the days and weeks following stroke.[37-39] Biographical disruption, a concept frequently employed in accounts of coming to terms with chronic illness,[40,41] was initially explored and favoured. The experienced qualitative researchers on the steering group challenged the utility of this concept in the context of the sudden onset of stroke and the complex interaction between the shock experienced by the patient and that experienced by the caregiver. This challenge led us to explore the significance of pre-illness relationships, prior illness and caregiving experiences, wider social expectations of carers,[22] and also stroke-specific research including that on caregiving.[42-44] By this stage, the team can be said to have been operating at the ‘performing’ stage,[36] in that challenge was not seen as a threat or a lack of respect for specific disciplinary theories; there was evidence of interdependence and flexibility in thinking among team members. This theoretical challenge led to a frank exchange of views and sharing of alternative ways of seeing, which resulted in enhanced understanding and increased rigour in our analysis: outcomes which are often claimed for IDHR.[1,6] The same kinds of debate surrounding analysis and interpretation of the data also led to wider examination of the implementation literature[27,29,45] and the post hoc employment of normalisation process theory (NPT)[28] as a synthesising framework for the process evaluation data. NPT was not familiar to all team members, but its initial application in an analytical summary exploring factors which could explain positive or negative trial outcomes confirmed its relevance for exploring and reporting our findings. We drew on this theoretical framework to review our analysis; this prompted greater attention to examination of features of the intervention itself, the preparation of staff to deliver the intervention and the extent to which they engaged with the intervention in the short and medium term. As a result of sharing and challenging disciplinary knowledge, the process evaluation team was more focused analytically and less likely to close down lines of enquiry prematurely.

Publication output from IDHR studies

Even in effective IDHR projects, publication outputs can reflect narrow disciplinary interests rather than harnessing and reporting on the insights which may have arisen from interdisciplinary collaboration.[5,16] This can be compounded by difficulties in finding journals which seek or accept papers reporting on IDHR projects.[1,12] However, Pincus et al[10] reported that IDHR training projects conducted by the RAND/Hartford Foundation actually resulted in a substantial number of IDHR publications being achieved. Moreover, the CIHR[1] reported that approximately half of their focus group respondents (54%) ‘saw no difference in their rate of publication as sole or first/senior author since the beginning of the grant’ (p. 8). We also note that growth in open access publishing, for example through BioMed Central journals and similar, has increased the number of journals to which IDHR reports can be submitted. In the process evaluation, diversity in theoretical and disciplinary perspectives directly influenced planned publication output. We developed a publications plan and agreed on a protocol for first authorship, involvement of the remainder of the team in writing or review, and target journals. We believe that this is essential to address the concerns of researchers that combined author outputs may affect departmental promotion prospects and also research quality assessments, as in the forthcoming UK Research Excellence Framework submissions due in 2014. The first publication emerging from the process evaluation team was a review paper analysing the construct of caregiver in the context of increasing policy and social pressures for lay caregivers to provide support for those with long-term conditions. This paper arose through dialogue about caregiver experiences and expectations in the TRACS process evaluation, but does not report directly on findings from that study. The paper was developed by team members with backgrounds in social and medical anthropology, but was reviewed and commented on by the entire team. Two subsequent papers, one addressing LSCTC implementation processes drawing on the NPT framework and the other examining the influence of professional power in the production of the ‘ideal caregiver’ in stroke rehabilitation from a Foucauldian perspective, have benefited similarly from the interdisciplinary forum in which they evolved and have been reviewed. A research fellow commented:

‘When writing papers it has been helpful that the disciplinary expertise of the qualitative researchers on the steering group challenged or added to the disciplinary expertise of the researchers working on the project. It has been really helpful to have insights from an experienced researcher with a background in anthropology and familiar with the theoretical framework we are using in our paper. At the same time it was equally important that others in the team have commented on the paper. The wider consultation on publications has undoubtedly slowed down progression to submission for publication but has broadened the scope and theoretical perspectives addressed in the papers.’ (Research Fellow 2)

Discussion

Tangible evidence of health service or public health benefits from recent investment in IDHR is not specifically reported in the literature, with the exception of the CIHR[1] report. Research with clinician and biomedical scientists (n = 61), also in Canada, indicated continued scepticism, particularly among biomedical scientists, that IDHR resulted in added value and warranted the investment currently being ring-fenced for IDHR projects in Canada and other countries. However, it is increasingly the case that IDHR approaches are being encouraged or required by funding agencies commissioning or supporting research targeted at complex and continuing health problems.[2,4] There is now a quite substantial literature examining the challenges of IDHR projects.[5,7,11,12,15] The issue of training researchers to work in and lead IDHR teams is also a recurring concern.[1,9,12] We concur with the findings of Pincus et al[10] and also Poole et al[14] from evaluations of their IDHR training programmes that formal and informal training in IDHR can occur as an integral part of funded collaborative projects and that experienced researchers as well as doctoral and postdoctoral researchers can benefit from active engagement in IDHR. We also note some evidence of a shift to interdisciplinary research training as part of ESRC doctoral training centres (DTCs) and in some university collaborations such as the White Rose DTC in Social Science in Yorkshire[46] and the King’s Interdisciplinary Social Sciences (KISS) DTC[47] at King’s College London. Recent publications are beginning to focus on the ways in which some IDHR teams have addressed the challenges inherent in IDHR and to report on the benefits which can accrue from it.[10,14,18,19] The TRACS process evaluation project reported here demonstrates some of the micro-level practical challenges which arose as part of a large qualitative process evaluation and discusses how we responded to these. The research team believe that over time we developed trust in team members’ knowledge and confidence in the rigour of the methods employed. We benefitted from the depth of disciplinary knowledge and skills brought to our discussions; the synthesis of this knowledge added depth to this interdisciplinary research project. More macro-level issues impacting on IDHR remain; these include ensuring that grant applications incorporate the anticipated additional costs associated with an increased number of face-to-face meetings or video conferences between researchers operating in different research sites or institutions. Establishing interdisciplinary health research, even in universities with a traditional focus on disciplinary excellence, is now perhaps less problematic as there is recognition at all levels of the need to work across disciplinary boundaries, to train new health researchers and to secure research funding to address complex health and social problems.

Acknowledgements

The Process Evaluation team thanks the many people who assisted in the set-up and operation of the study referred to in this paper. All sites will be fully acknowledged in subsequent papers. We are extremely grateful to all the patients, caregivers and staff who participated in the study. This paper refers to and draws on independent research commissioned by the National Institute for Health Research (NIHR) under its Research for Patient Benefit (RfPB) Programme (Grant Reference Number PB-PG-0407–13308). The views expressed are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health.

Author Contributions

AF, CM, JM, DJC and MG designed the process evaluation study. DJC, RH, ES and GH collected and analysed the data. DJC developed the manuscript. AF, MG, CM, RH, ES and GH reviewed the manuscript and made recommendations for changes and additions.

Funding

The Process Evaluation study referred to in this paper was funded by an NIHR Research for Patient Benefit Grant (PB-PG-0407–13308).

Ethical Approval

The study referred to in the paper was approved by the Leeds (West) Research Ethics Committee.

Peer Review

Commissioned; not externally peer reviewed.

Conflicts of Interest

None.

References