Social Work & Social Sciences Review 16(2) pp.20-36. DOI: 10.1921/903160201

Beyond evidence-based policy and practice: Reshaping the relationship between research and practice

Mike Fisher1

Abstract: This paper concerns the impact of social work research, particularly on practice and practitioners. It explores the politics of research and how this affects practice, the way that university-based research understands practice, and some recent developments in establishing practice research as an integral and permanent part of the research landscape. While focusing on implications for the UK, it draws on developments in research across Europe, North America and Australasia to explore how we can improve the relationship between research and practice.

Keywords: evidence-based practice; practice research; social work; international

1. Professor of Evidence-based Social Work

Address for correspondence: The Tilda Goldberg Centre, University of Bedfordshire, University Square, Luton LU1 3JU. mike.fisher@beds.ac.uk

Introduction

The relationship between research and practice has been a constant theme in the development of social work. As early as 1908, Mary Richmond was teaching social statistics and research in the Philadelphia Training School for Social Work, and in 1917 she called for researchers to work with practitioners:

We should recognize… the evident desire of social workers to abandon claim to respect based upon good intentions alone; we should meet halfway their… endeavors to subject the processes of their task to critical analysis; and should encourage them to measure their work … (Richmond, 1917, p.25)

As social work developed in North America, this relationship has been explored through the call for practitioners to become 'personal scientists' (Blythe, 1988), through practitioner-researcher partnerships (Hess & Mullen, 1995), joint practice research projects such as Homebuilders (see Blythe, 1988), partnerships between agencies and universities (e.g. http://www.bassc.net/), and through the increasing emphasis on teaching research at qualifying and post-qualifying level (Howard et al., 2003).

In the UK, the development of research was gradual, and it was not until 1970 that Tilda Goldberg's randomised controlled trial of services for older people, delivered by trained versus untrained social workers, emerged (Goldberg, 1970). The relationship between research and practice in the UK was not easy. In typically sharp tones, Goldberg took social workers to task for not engaging sufficiently with research:

If social workers were prepared to define their middle range goals in more precise operational terms, and where appropriate to categorise and measure their social work activities, to tolerate independent assessments and to heed their clients' evaluations, then we could take a great leap ahead in any sphere of social work we cared to study. (1970, p.200)

Throughout developed welfare states, the evidence-based practice (EBP) movement has produced a similar tension. As researchers produced greater precision in their science, the distance between research and practice grew, as did the disdain that some researchers showed for practice. Researchers adopting the systematic review approach of the Cochrane and Campbell Collaborations narrowed their questions and their criteria for relevant research, and failed to address practice concerns.
For example, a now famous review of supported housing for people with severe mental disorder identified 139 studies, but decided that none had relevance to the question of effectiveness (Chilvers et al., 2006). Those outlining a broader approach to what counts as research-based knowledge were dismissed as failing to provide the evidence to support public services (Macdonald, 1999), neatly ignoring the fact that the systematic reviews were not providing appropriate evidence either. Indeed, Anttila noted that the findings from systematic reviews were often 'ambiguous' and fell short of the clarity required to guide practice (2006). Meanwhile, some proponents of EBP continued to lambast practitioners for not being able to cite references for the knowledge they used in practice, and for not being able to define basic research and statistical terms (Sheldon & Chilvers, 2001).

This unhappy relationship was perhaps exacerbated in the UK context, where the separation of research from practice constrained mutual influence and where the rise of new public management, feeding on the promise of certainty from the EBP movement, sought to restrict professional discretion in favour of management systems that promised consistent and higher standards (Dunleavy & Hood, 1994). In the UK too, there was evidence that the priorities of researchers (as evidenced in funded work) did not reflect the priorities of practitioners, at least in child welfare (Stevens et al., 2009). It is also clear that there is no national policy for the development of social work research as a distinctive discipline, and that research to support social services lacks investment in comparison with research support for health (see Fisher & Marsh, 2003; Marsh & Fisher, 2005). This has meant that it is difficult to identify the kind of critical mass of social work research required to provide a trustworthy foundation for national policy, which in turn undermines the case for investment.

Whatever the contributory causes, EBP in the UK became known in the mid 90s and early years of the new century for its distance from practice issues, its tendency to blame practitioners for not being able to engage with research and to regard practitioners as the passive recipients of knowledge (see Shaw, 1999, p.3), its adherence to methods that excluded large areas of relevant evidence, and its inability to provide effective guidance for practice.

The rise of relevance

As is often the case with the history of social movements, just as we were reaching 'peak EBP' in the mid 90s and early years of the new century, theorists were developing other approaches to the relationship between science and society. The seminal analysis by Gibbons and colleagues, distinguishing Mode 1 from Mode 2 'ways of knowing' (Gibbons et al., 1994), combined with a growing emphasis on practice research, served to establish the foundations of a broader approach to evidence-based policy and practice.

The relevance criterion

The authority of evidence-based policy and practice rested on achieving greater scientific precision, often by emulating aspects of health care research, particularly in narrowing the research question and pre-determining the kinds of evidence that should be used. This laudable goal ran two key risks, however.
The first was that there existed no practice-competent research community to ensure that the questions remained relevant to services: there was no equivalent of clinician-researchers, and the separation of research in universities from practice meant that there were too few people with practice concerns and research credibility to influence the agenda. The second risk was that the critical mass of evidence required for reliable findings was not present. Controlled trials (of any kind, let alone randomised) were rare, and secondary analysis could not be undertaken without a body of research that could be statistically synthesised. Social work was at risk of adopting the philosophy and technology of effectiveness research and synthesis without a body of effectiveness studies. In effect, an emerging discipline of EBP had the methods, but nowhere near enough data, to support its version of evidence-based practice.

This gap between the kind of evidence available and that required to support evidence-based practice is one of the key themes in the analysis of the role of science in modern societies. In their book on this topic, Gibbons and colleagues describe two 'ways of knowing', termed Mode 1 and Mode 2 (Gibbons et al., 1994). Knowledge primarily consisting of propositional knowledge designed to advance a particular academic discipline, developed in universities and led by researchers, is conceptualised as Mode 1. In contrast, knowledge that is intended from the outset to be applied, developed through collaboration between a wide range of stakeholders in a variety of settings, drawing on whichever disciplines can usefully contribute, is termed Mode 2.

This framework is an analytic device and is not intended as a precise description of any particular movement or discipline. However, it focuses attention on the area of practice to which the research is intended to be relevant as a first step, not as one that can be added at a later stage. It argues that people who use or benefit from research-based knowledge should be central to its production, and not an audience to be addressed at the end of the research. And it argues that the relevance of academic disciplines is determined by their contribution to addressing the problem at hand, rather than by their scientific standing in universities. In short, the analysis provides a useful framework for considering how knowledge could be produced in social work if we wish to address practice. Sommerfeld, for example, shows how the approach helps to bridge the gap between the academic discipline and the practice of social work (2005). Fisher contrasts the kind of systematic review produced within the Mode 1 framework with the kind of engaged approach inherent in Mode 2, where the inclusion of broader forms of evidence and user participation allows a greater understanding of the explanatory mechanisms behind intervention effects (2005a). Gredig shows how the approach offers a greater ability to use research to improve practice (2005).

In developing evidence-based practice, therefore, the early part of the new century witnessed the rise of the 'relevance' criterion as a quality marker for knowledge. This was not a replacement for rigour, but rather placed rigour within the context of usefulness, or, to put it in the words of a statistician, 'perfect information on the wrong topics is not useful' (Brackstone, 2001, p.2).
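The earlier point about statistical synthesis requiring a critical mass of comparable studies can be made concrete. The sketch below is a minimal illustration in Python of standard fixed-effect (inverse-variance) pooling; the effect sizes and standard errors are entirely hypothetical and are not drawn from any study discussed in this paper. It simply shows why one or two trials cannot support synthesis: the pooled confidence interval narrows only as comparable studies accumulate.

# Minimal, illustrative sketch of fixed-effect (inverse-variance) pooling.
# The effect sizes and standard errors below are hypothetical, not taken from
# any study cited in this paper; they illustrate why a body of comparable
# trials is needed before synthesis yields usable precision.
import math

def pooled_effect(effects, std_errors):
    """Return the fixed-effect pooled estimate, its standard error and 95% CI."""
    weights = [1.0 / se ** 2 for se in std_errors]          # inverse-variance weights
    pooled = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    return pooled, pooled_se, ci

# Hypothetical standardised effect sizes and their standard errors.
effects = [0.30, 0.45, 0.20, 0.35, 0.25]
std_errors = [0.20, 0.25, 0.18, 0.22, 0.15]

# Pool the first k studies, for k = 1..5, to show how precision accumulates.
for k in range(1, len(effects) + 1):
    est, se, (lo, hi) = pooled_effect(effects[:k], std_errors[:k])
    print(f"{k} studies: pooled effect {est:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")

With only the first hypothetical study the interval spans roughly -0.09 to 0.69, consistent with anything from no effect to a substantial one; with all five it narrows to roughly 0.12 to 0.45. The arithmetic is elementary, but it underlines why a discipline with few controlled trials cannot simply borrow the machinery of synthesis and expect guidance precise enough for practice.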
Research on practice knowledge

As we developed the methods of EBP in social work and social care, some expensive lessons were learned that had major implications for the relationship between research and practice. Anttila's (2006) comment about the ambiguity of findings from systematic reviews was replicated in the experience of the major UK source of these reviews in social care, the Social Care Institute for Excellence (SCIE - www.scie.org.uk). Without sufficient high quality evidence to make a definitive assessment of the effectiveness of interventions, recommendations for practice had to be hedged with qualifications. It was rare that subgroup analysis could be undertaken, so the conclusions were difficult to apply to specific groups of the population; research was often two years out of date and practice had changed substantially; and there was little data on cost-effectiveness (Francis et al., 2011; Francis, 2011). SCIE ceased commissioning full reviews in 2008, and although knowledge from practice was always a key component in SCIE's approach to reviews (Fisher, 2005b), SCIE took steps to strengthen evidence from practice. A good practice initiative offered opportunities for providers to report examples, to be assessed against agreed criteria before being incorporated into a database made freely accessible to the social care community (www.scie.org.uk/goodpractice/browse/default.aspx). SCIE also used practice enquiries as a key way of systematically gathering data on practice, and these sometimes became a primary source of evidence where research evidence was unavailable or dated (Rutter, 2009). The outcome has been to renew interest in ways of gathering evidence directly from current practice, which in turn has refocused attention on the potential for practice-led research.

Practice-led research

In 2008, an international group of researchers and service providers developed a statement on practice research, designed to pave the way for increased attention to evidence that is more directly relevant to practice – the Salisbury Statement (Salisbury Forum Group, 2011). While much of the thinking behind the statement reflected international developments in the field over some years, the Statement provided a renewed opportunity to develop the broader approach to EBP in order to impact on practice. The Statement emphasised the need for an equal dialogue between research and practice, and between both and service users; the need for practice to lead research priorities; the involvement of practitioners in conducting research; and an applied orientation, or the creation of problem-solving knowledge that could be directly employed by practitioners.

The Statement did not attempt a conclusive definition of practice research. Participants preferred instead to identify some of the key components:

Practice research involves curiosity about practice. It is about identifying good and promising ways in which to help people; and it is about challenging troubling practice through the critical examination of practice and the development of new ideas in the light of experience. It recognises that this is best done by practitioners in partnership with researchers, where the latter have as much, if not more, to learn from practitioners as practitioners have to learn from researchers.
It is an inclusive approach to professional knowledge that is concerned with understanding the complexity of practice alongside the commitment to empower, and to realise social justice, through practice. Practice research involves the generation of knowledge of direct relevance to professional practice and therefore will normally involve knowledge generated directly from practice itself in a grounded way. (Salisbury Forum Group, 2011, p.5)

This approach to definition helps to ensure continuing dialogue, but a paper that explores the impact of practice research requires (slightly) tighter boundaries. For the purposes of this paper, practice research is defined as research that

• originates in the concerns of practice and develops practice-based solutions; and
• is based on a collaborative, developmental approach that respects the knowledge held by practitioners, and engages practitioners in the research process.

This definition is intended to ensure, for example, that practice research does not dissolve into researchers merely talking to practitioners (as would be the case in any research on practice). In practice research, the topic must originate in practice concerns. Similarly, practice research is not conceptualised as aiming solely at understanding, but must incorporate some attention to what practitioners should do as a result of that understanding ('practice-based solutions'). What is left completely open is the method of engagement: in this definition, practice research may be led by any professional group as long as it is based on engagement, and on respect for practice knowledge. As we shall see later, some practice research projects insist on practitioner leadership, so this definition leaves some scope for a range of models (which is appropriate for a developing field). The UK audience may identify a gap concerning user involvement. It is just becoming possible to detect some approaches that imply that, if user involvement is a key feature of social work, practice research should reflect this, but this is not universal. While some projects take user involvement very seriously, it can be argued that this should be a feature of all research in social work and does not serve to distinguish practice research.

The Salisbury Statement also provided an opportunity to incorporate a much broader conceptualisation of research use in practice. For example, practice knowledge could be conceptualised as a form of knowledge in its own right (see Kondrat, 1992), and understood on its own terms, rather than as a 'methodologically depleted' form of knowledge (Furlong & Oancea, 2005, p.9). Osmond and her colleagues showed how practitioners' accounts revealed research knowledge through the use of stories, anecdotes and metaphors (Osmond & O'Connor, 2004; Osmond, 2006), a very different form of reporting knowledge from the attempts to get practitioners to provide accurate citations or definitions of research. As early as 1995, work by Rosen and colleagues on how practitioners decide what to do demonstrated the articulate selection of solutions, involving a rational process in which interventions are carefully selected to maximise outcome attainment (Rosen et al., 1995). Later work focused, like Osmond's, on surfacing the tacit knowledge used by practitioners but not necessarily expressed in accessible form (Zeira & Rosen, 2000). What was emerging was a picture of rational practice that had been obscured by practitioners expressing it in ways that researchers could not readily recognise.
If this revealed that practice knowledge was indeed rational, rather than based simply on experience or routine, and that practitioners used research, albeit adopted in informal ways, what about research by practitioners? For the first time, studies of practice research were emerging that offered empirical accounts of the kind of knowledge that practitioners were engaged in creating and the kind of supports that were required. Shaw and colleagues (2005, 2006) studied 42 examples of practitioner research in South Wales, and later 15 examples in Scotland (2011). These studies emphasised the potential of practice research to transform both practice and research, but also the need to improve its quality (2005, p.1245) and to counter the view that it is in some way a rudimentary form of academic research (2011, p.1561).

Several long term projects to develop practice research were conducted in the Nordic countries. In Norway, the HUSK project explored collaboration between universities and social services (see Fook et al., 2011), while a similar project – Knowledge-based Social Services – took place in Sweden (led by the National Board for Health and Welfare, a government-funded central agency, rather than by universities – see Hansson, 2003). In Denmark, Ramian and colleagues developed an approach to embedding research in practice, part of which involved practitioners spending 80% of their time on research (see analyses by Uggerhøj, 2011a and 2011b). In Finland, providers and academics had campaigned and experimented since the 1970s to find the right structure for an initiative in practice research. Finally, in 2001, legislation provided for centres of competence (or expertise) in social work (Kananoja, 2009, p.17), and two centres were established in the Heikki Waris and Mathilde Wrede Institutes. University researchers were based here alongside practitioner-researchers to develop research led by and conducted by practitioners. An important development has been the partnership with people who use services, who are also enrolled in research. As with the schemes studied by Shaw and colleagues, this provides an opportunity to undertake empirical studies of how practice research operates.

Recent analysis by Julkunen has identified four models of practice research: the practitioner-oriented, the method-oriented, the democratic and the generative model (Julkunen, 2011). As well as providing a conceptual framework to understand practice research, Julkunen's analysis can be used to identify the support required for the different kinds of activity. For example, the democratic model involves 'practice reference groups, including users, practitioners and leaders' and it 'engages larger systems seeking for broader debates and at the same time empowering participants to create their own knowing-in-action in collaboration with other actors' (2011, p.69): this kind of broad engagement calls for administrative and management resources which will be quite different from those required for a practice research project focused on intervention methods with individuals.

Outside the Nordic context, the question of how research can be made more useful in a service context and what supports practice research has been a key theme of Austin's work with the Bay Area Social Services Consortium, a partnership between universities, particularly the University of California at Berkeley, and San Francisco Bay Area service providers (see www.bassc.net and http://www.mackcenter.org/index.html).
This work has led Austin to identify curiosity as a key to motivating practitioners to engage in research and to outline the organisational support required to foster this motivation (Austin et al., 2012). In Singapore, Sim has developed a practice research project as a partnership between the University and services for ex-offenders, and has demonstrated the potential both for involving people who use services in the research and for generating research that is directly useful to providers and policymakers (Sim, 2011). Driessens describes community-based research and practice collaboration to address poverty in Belgium, while Fargion illustrates how practice research can be embedded in qualifying education in Italy (Driessens et al., 2011).

Evidence-based practice and practice research

These developments have important implications for EBP and for the relationship between research and practice.

Practice research is part of EBP

First, it is important to be clear that the rise of relevance and the increasing emphasis on practice research is not a counter-movement to EBP. Certainly, some versions of EBP focus on effectiveness to the point that other relevant evidence is dismissed. However, we should not disparage EBP for this emphasis on effectiveness: without such evidence we simply do not know whether we are serving people as well as we can, and controlled studies offer the most secure route to discovering this. But effectiveness is not the only question. We need to know why interventions are effective or not, how they relate to existing practice knowledge, whether they can be implemented in daily practice, and whether the intervention is acceptable and accessible to people who use services.

An example from public health, research on smoke alarms, demonstrates the risks of limiting the focus to effectiveness (Roberts et al., 2004; Arai et al., 2007). Children from poor families are more frequently injured in fires than children from better off families, and the installation of smoke alarms could be an effective preventive measure. However, outcome studies showed that the installation of smoke alarms had a much smaller effect than expected. Puzzled by this finding, Roberts and her colleagues gathered information on the context of the studies (sometimes called implementation data – see Popay et al., 2006) and found extensive barriers to the adoption of smoke alarms, such as being unable to install them appropriately or to afford the batteries. Their investigation of implementation data not only helped to explain the poor results but also helped to make future interventions of this kind more effective. This work has been repeated with non-intentional injury (Roen et al., 2006). The point is that implementation data uses knowledge about daily practice, precisely the kind of knowledge already available to practitioners, and accessible through practice research. There is therefore no contradiction between underlining the value of controlled studies in identifying effectiveness – the core EBP methodology – and an emphasis on the role of practice research in accessing evidence about implementation. One kind of evidence complements the other.

This is now beginning to be recognised in wider debates about the kind of knowledge required to improve public services.
For example, Kelly and Moore, writing about the evidence requirements of the National Institute for Clinical Excellence (NICE) in the UK, emphasise the need to understand the 'causal pathway' that delivers better outcomes and call for randomised controlled trials to be supplemented with process data in order to achieve this (Kelly & Moore, 2011). NICE's counterpart in social care, the Social Care Institute for Excellence, has developed an inclusive approach to what counts as knowledge, including different sources of knowledge such as that held by people who use services and by practitioners (Fisher, 2005b).

In addition to these examples from secondary analysis, good examples exist in primary, empirical research. For example, the Families and Schools Together project (FAST - an after-school multi-family group programme: see http://www.familiesandschools.org/) shows how it is possible to combine controlled studies with implementation data both to explain results and to refine the intervention. This is how they tackled retention, for example. Parenting programmes of this kind suffer from high withdrawal rates, typically 40% and, in African-Caribbean communities, as high as 90%. McDonald and her colleagues have shown that it is possible to reduce this to below 20% by studying the processes of participation and refining the strategies adopted by practitioners to engage families (McDonald et al., 2012).

Practice research has important implications for 'knowledge transfer'

The field of knowledge transfer is growing as we discover more about the lack of research impact. Perhaps typically, researchers have invented a great number of terms, including knowledge exchange, knowledge transfer and even implementation science (for which there is now a journal, Implementation Science - http://www.implementationscience.com/). An ungenerous observer might view this with some scepticism: having failed to engage with practice, researchers invent a new field of study into why this happened and how it can be corrected. This perception might be confirmed by the realisation that, almost without exception, this field is characterised by studies of how practitioners and services can be helped to adopt research. The focus of Implementation Science, for example, is 'research relevant to the scientific study of methods to promote the uptake of research findings into routine healthcare in clinical, organisational or policy contexts' (http://www.implementationscience.com/). What is lacking from this field currently is a genuine attempt to view the issues through the lens of practice, and to ask whether we need a different kind of research or different kinds of researchers.

One of the key purposes of practice research is therefore to question the assumption that the problem in knowledge transfer is getting practitioners to adopt research. The approach allows us to surface the differences between the preoccupations of researchers and those of practitioners, to ask whether researchers have the right training, skills and interests to undertake research that has practice relevance, and to raise questions about where practice concerns figure in priorities for national research. The analysis has already highlighted some of the different priorities between researchers and practitioners in the field of child welfare (Stevens et al., 2009).
In adult care, an earlier analysis showed how key research lacked practice applicability: research excluded awkward clients, reported outcomes based on group means without considering how findings applied at the level of the individual client, took little account of the views of people who use services about the outcomes, or was unconcerned with implications for practice (Fisher, 1997). This difference in priorities is not confined to Western social science. In the new welfare states such as China, similar problems are beginning to arise. For example, Au and colleagues show how a sense of self-efficacy helps to protect people from depression arising from caregiving for relatives with dementia, but do not consider what social work practices increase self-efficacy among caregivers (Au et al., 2009). Lai has shown how the symptoms and behaviour associated with depression vary significantly between different groups of Chinese elders, but does not go on to consider the implications for social work assessment (Lai, 2009). Chou shows how willingness to use institutional elder care is mediated by views on the role of children in caring for elders and family harmony, but does not consider the implications for social work assessment (Chou, 2010). China and other emerging welfare states are likely to provide a major source of research-based evidence in the future, particularly on elder care. In their own terms, the examples cited are high quality studies, but they suggest a risk that research from the newly emerging welfare states shares some of the shortcomings of Western research, in that researchers tend to be primarily concerned with understanding, while practice application is not seen as a key concern.

If knowledge 'transfer' is typically seen as improving the adoption of knowledge by practice, that is not what these examples suggest is required. Instead of implementation science directed at examining why service providers do not use this research, it would be much more useful to examine why research does not centrally address practice application. Instead of investigating the research literacy of practitioners, we might usefully turn our attention to what might be termed the 'practice literacy' of researchers (Fisher, 2011) or to the development of 'practice-research-mindedness' (Karvinen-Niinikoski, 2005).

Practice research requires respect and a collaborative approach

If we take seriously the claim that researchers need practice literacy, this would not of course obviate the need for research-minded practitioners. There is little doubt that research literacy amongst practitioners could be improved and that this would be likely to generate a more critical approach to the use of evidence. (It is still an open question whether practice itself would improve, especially given the gravitational pull of the academic community on practitioner researchers – see Shaw, 2005.) However, the genuine engagement of researchers with practice is the hallmark of the practice research movement.

Developments in Finland provide many examples. Great value is placed on systematic reflexivity by practitioners as a source of knowledge, rather than assigning primary importance to the analytic skills of researchers (Satka & Karvinen, 1999). The experience of practice is therefore the starting point for critical analysis, which then must be contextualised in wider understanding of social processes.
Symbolically and practically, researchers are placed in a practice environment in the Mathilde Wrede or Heikki Waris Institutes (see Kananoja, 2009). Researchers may still be employed by a university, but practice is their place of work, just like social workers. Capitalising on their personal interests and experience, practitioners select topics for investigation, which are then negotiated to take account of the needs of the service and focused in the light of research reviews that identify related work (Julkunen, 2011). There is no gap to be bridged between research and practice, and the specific topic is situated in the wider social science knowledge base. Julkunen also describes how the particular value placed in social work practice on the knowledge and experiences of people who use services is reflected in research processes that are in themselves participatory and that have the potential to be empowering. Saurama describes how 'the researcher can participate in direct social work and in that way implicitly get in touch with the service users' viewpoint' (in Driessens et al., 2011, p.79). The emphasis on the relevance of knowledge derived from practice opens a door to a more respectful relationship:

Social-work-practice research knowledge is tied to the need to develop practice. It promotes interaction and equal discussion among different actors in order to enable change. (Julkunen, 2011, p.64)

In exploring the lessons from the Finnish model, there is of course a risk of overlooking the distance still to be travelled. As its proponents recognised in the title of the 2012 Helsinki conference (Practice Research in Social Work – Producing Robust Knowledge), there is an inherent risk that knowledge based on practitioner experience and local services lacks generalisability. There remains the risk that the status of universities distorts knowledge production and that practitioner-researchers will withdraw to study practice, rather than researching practice through remaining engaged. In addition, the lessons from user involvement in other countries suggest there will be a tension arising from the logical progression of user involvement towards user control, rather than involvement at the invitation of practitioners. These issues remain to be addressed in the future.

Conclusion

The problematic relationship between research and practice is therefore nothing new. It has been a constant theme throughout the history of social work, and the evidence-based policy and practice movement has been one of the most powerful ways of defining the relationship. As a social movement, EBP has suffered from over-expectations and under-delivery. As those providing public services sought better knowledge about effectiveness, there was not the evidence base to support the kind of certainty expected by policymakers and politicians. The more limited and cautious formulations in health care disciplines, coupled with their greater critical mass, allowed incremental and sustained programmes of work, based on sophisticated measurement tools, and this meant that social work (and social care more broadly) lost out in any comparison. Nor has EBP in social work and social care provided a means of addressing any of the underlying problems: the lack of a national policy, the lack of skills in systematic reviews and in quantitative and economic analysis, and the lack of investment in the intervention studies that provide the building blocks for EBP.
It may be unfair to lay all this at the door of the EBP movement, but its proponents have certainly added other reasons to question its current formulation. Its emphasis on effectiveness at the expense of relevance, its failure to engage with practice priorities and often to respect practitioners, and its lack of focus on developing problem-solving interventions, are problems that arise from the way that EBP has been formulated. This means that we need to go beyond the limitations of current versions of EBP.

This paper suggests that a practice research perspective provides one key way to reframe the relationship between research and practice. Its emphasis on practice issues as the topics for research helps to maximise investment in studies that service providers want (and helps to avoid having to find additional investment to overcome their resistance to transferring findings into practice). It provides a way of regenerating a focus on problem-solving, developing practices that address the daily issues that services encounter, in ways that can be implemented in existing services, with attention to what is affordable as well as effective. Because it is integral to (the best) practice, the involvement of people who use services, and their carers, is integral to practice research. User and carer involvement provides the primary source of research data on acceptability and accessibility. Indeed it can be argued that the quality of research must increasingly be defined in terms of whether it incorporates attention to user-defined outcomes, and attention to whether services are viewed as accessible and acceptable.

A practice research perspective reduces the gap between researchers and practitioners, as practice becomes the principal environment for knowledge production. The knowledge held by practitioners about implementing services becomes a key research resource, and the skills and knowledge possessed by practitioners offer an extensive research workforce, far exceeding the number of university-based researchers in social work and social care. A practice research perspective thus offers a way of redefining evidence-based policy and practice and achieving a greater impact of research on practice.

References

Anttila, S. (2006) Pursuit of Evidence in Social Work Practice: A Critical Thinking Approach to Evaluation. The Inter-Centre Network for the Evaluation of Social Work Practice Workshop, 2006, Copenhagen
Arai, L., Britten, N., Popay, J., Roberts, H., Petticrew, M., Rodgers, M., and Sowden, A. (2007) Testing methodological developments in the conduct of narrative synthesis: A demonstration review of research on the implementation of smoke alarm interventions. Evidence & Policy, 3, 361-383
Au, A., Lai, M.-K., Lau, K.-M., Pan, P.-C., Lam, L., Thompson, L., and Gallagher-Thompson, D. (2009) Social support and well-being in dementia family caregivers: The mediating role of self-efficacy. Aging & Mental Health, 13, 5, 761-768
Austin, M., Dal Santo, T., and Lee, C. (2012) Building organizational supports for research-minded practitioners. Journal of Evidence-Based Social Work, 9, 1, 1-39
Blythe, B. (1988) Applying practice research methods in intensive family preservation services. in J. Whittaker, J. Kinney, E. Tracy, and C. Booth (Eds.) Improving Practice Technology for Work with High Risk Families: Lessons from the Homebuilders Social Work Education Project. Monograph No. 6. Seattle: University of Washington, School of Social Work, Center for Social Welfare Research (pp.147-164)
Brackstone, G. (2001) How Important is Accuracy? Ottawa: Proceedings of Statistics Canada Symposium: Achieving Data Quality in a Statistical Agency: A Methodological Perspective
Chilvers, R., Macdonald, G.M., and Hayes, A.A. (2006) Supported housing for people with severe mental disorders. Cochrane Database of Systematic Reviews, 4
Chou, R. (2010) Willingness to live in eldercare institutions among older adults in urban and rural China: A nationwide study. Ageing & Society, 30, 4, 583-608
Driessens, K., Saurama, E., and Fargion, S. (2011) Research with social workers to improve their social interventions. European Journal of Social Work, 14, 1, 71-88
Fisher, M. (1998) Research, knowledge and practice in community care. Issues in Social Work Education, 17, 2, 17-30
Fisher, M. (2005a) Knowledge production for social welfare: Enhancing the evidence base. in P. Sommerfeld (Ed.) Evidence-Based Social Work: Towards a new professionalism? Bern: Peter Lang (pp.127-147)
Fisher, M. (2005b) The Social Care Institute for Excellence: The role of a national institute in developing knowledge and practice in social care. in A. Bilson (Ed.) Evidence-based Practice in Social Work. London: Whiting and Birch (pp.141-175)
Fisher, M. (2011) Practice literate research: Turning the tables. Social Work & Society, 9, 1, 20-28
Fisher, M. and Marsh, P. (2003) Social work research and the 2001 research assessment exercise: An initial overview. Social Work Education, 22, 1, 71-80
Fook, J., Johannessen, A., and Psoinos, M. (2011) Partnership in practice research: A Norwegian experience. Social Work & Society, 9, 1, 29-44
Francis, J. (2011) SCIE's Approach to Economic Evaluation in Social Care. London: Social Care Institute for Excellence
Francis, J., Fisher, M., and Rutter, D. (2011) Reablement: A cost-effective route to better outcomes. Research Briefing 36. London: Social Care Institute for Excellence
Furlong, J. and Oancea, A. (2005) Assessing Quality in Applied and Practice-Based Educational Research. Oxford: Oxford University Department of Education Studies
Gibbons, M., Limoges, C., Nowotny, H., Schwartzmann, S., Scott, P., and Trow, M. (1994) The New Production of Knowledge: The dynamics of science and research in contemporary societies. London: Sage
Gredig, D. (2005) The co-evolution of knowledge production and transfer: Evidence-based intervention development as an approach to improve the impact of evidence on social work practice. in P. Sommerfeld (Ed.) Evidence-Based Social Work: Towards a new professionalism? Bern: Peter Lang (pp.173-198)
Kananoja, A. (2009) Practice Research in Social Work in Finland: Background factors and developments. Helsinki: Helsinki Social Services
Karvinen-Niinikoski, S. (2005) Research orientation and expertise in social work: Challenges for social work education. European Journal of Social Work, 8, 3, 259-271
Kondrat, M. (1992) Reclaiming the practical: Formal and substantive rationality in social work practice. Social Service Review, June, 237-255
Hansson, J-H. (2003) Promoting Evidence-Based Practice in Social Services and Health Care. The 11th Annual European Social Services Conference, Venice
Hess, P.M. and Mullen, E. (1995) Practitioner-Researcher Partnerships: Building knowledge from, in and for practice. Washington, DC: National Association of Social Workers
Howard, M., McMillen, C., and Pollio, D. (2003) Teaching evidence-based practice: Toward a new paradigm for social work education. Research on Social Work Practice, 13, 2, 234-259
Kelly, M. and Moore, T. (2011) Methodological, theoretical, infrastructural, and design issues in conducting good outcome studies. Research on Social Work Practice, 21, 6, 644-653
Julkunen, I. (2011) Knowledge-production processes in practice research: Outcomes and critical elements. Social Work & Society, 9, 1, 60-75
Lai, D.W.L. (2009) Depressive symptoms of elderly Chinese in Guangzhou, Hong Kong, and Taipei. Aging & Mental Health, 13, 5, 725-735
McDonald, L., FitzRoy, S., Fuchs, I., Fooken, I., and Klasen, H. (2012) Strategies for high retention rates of low-income families in FAST (Families and Schools Together): An evidence-based parenting programme in the USA, UK, Holland and Germany. European Journal of Developmental Psychology, 9, 1, 75-88
Marsh, P. and Fisher, M. (2008) The development of problem-solving knowledge for social care practice. British Journal of Social Work, 38, 5, 971-987
Osmond, J. (2006) Knowledge use in social work practice. Journal of Social Work, 6, 3, 221-237
Osmond, J. and O'Connor, I. (2004) Formalizing the unformalized: Practitioners' communication of knowledge in practice. British Journal of Social Work, 34, 5, 677-692
Popay, J., Roberts, H., Sowden, A., Petticrew, M., Arai, L., Rodgers, M., and Britten, N. (2006) Guidance on the Conduct of Narrative Synthesis in Systematic Reviews. Lancaster: Lancaster University
Roberts, H., Curtis, K., Liabo, K., Rowland, D., DiGuiseppi, C., and Roberts, I. (2004) Putting public health evidence into practice: Increasing the prevalence of working smoke alarms in disadvantaged inner city housing. Journal of Epidemiology and Community Health, 58, 4, 280-28
Roen, K., Arai, L., Roberts, H., and Popay, J. (2006) Extending systematic reviews to include evidence on implementation: Methodological work on a review of community-based initiatives to prevent injuries. Social Science & Medicine, 63, 10-10-71
Rosen, A., Proctor, E., Morrow-Howell, N., and Staudt, M. (1995) Rationales for practice decisions: Variations in knowledge use by decision task and social work service. Research on Social Work Practice, 5, 4, 501-523
Rutter, D. (2009) The Conduct of SCIE Practice Enquiries. London: Social Care Institute for Excellence
Salisbury Forum Group, The (2011) The Salisbury Statement. Social Work & Society, 9, 1, 4-9
Satka, M. and Karvinen, S. (1999) The contemporary reconstruction of Finnish social work expertise. European Journal of Social Work, 2, 2, 119-129
Saurama, E. and Julkunen, I. (2011) Approaching practice research in theory and practice. Social Work & Social Sciences Review, 15, 2, 57-75
Shaw, I. (1999) Qualitative Evaluation. London: Sage
Shaw, I. (2005) Practitioner research: Evidence or critique? The British Journal of Social Work, 35, 8, 1231-1248
Shaw, I. and Faulkner, A. (2006) Practitioner evaluation at work. American Journal of Evaluation, 27, 1, 44-63
Sheldon, B. and Chilvers, R. (2001) Evidence-based Social Care: Problems and prospects. Lyme Regis: Russell House
Sim, T. (2011) Collaborating or colluding: A practice research project with ex-offenders and their families in Singapore. Social Work & Society, 9, 1, 76-88
Social Care Institute for Excellence (2004) Practice Guide 3: Fostering. London: Social Care Institute for Excellence. Available from http://www.scie.org.uk/publications/practiceguides/fostering/index.asp
Sommerfeld, P. (2000) Forschung und Entwicklung als Schnittstelle zwischen Disziplin und Profession. Neue Formen der Wissensproduktion und des Wissenstransfers. in H. Homfeldt and J. Schulze-Krüdener (Eds.) Wissen und Nichtwissen: Herausforderungen für Soziale Arbeit in der Wissensgesellschaft. Weinheim and München: Juventa (pp.221-236)
Southampton Practice Research Initiative Network Group (2008) The Salisbury Statement on Practice Research. Available from http://www.socsci.soton.ac.uk/spring/salisbury/
Stevens, M., Liabo, K., Witherspoon, S., and Roberts, H. (2009) What do practitioners want from research, what do funders fund and what needs to be done to know more about what works in the new world of children's services? Evidence & Policy, 5, 3, 281-294
Uggerhøj, L. (2011a) What is practice research in social work: Definitions, barriers and possibilities. Social Work & Society, 9, 1, 45-59
Uggerhøj, L. (2011b) Theorizing practice research in social work. Social Work & Social Sciences Review, 15, 1, 49-73
Zeira, A. and Rosen, A. (2000) Unraveling 'tacit knowledge': What social workers do and why they do it. Social Service Review, 74, 103-123

Web resources

Bay Area Social Services Consortium http://www.bassc.net/ (accessed 21 June 2012)
Families and Schools Together (FAST) http://www.familiesandschools.org/ (accessed 21 June 2012)
Implementation Science (journal) http://www.implementationscience.com/ (accessed 21 June 2012)
The Mack Center http://www.mackcenter.org/index.html (accessed 21 June 2012)
The Salisbury Statement http://www.socsci.soton.ac.uk/spring/salisbury/ (accessed 21 June 2012)
Social Care Institute for Excellence http://www.scie.org.uk and www.scie.org.uk/goodpractice/browse/default.aspx (accessed 21 June 2012)