Open Access
Research article
First Online: 08 December 2018
Abstract
Background
Evidence-based practice (EBP) is a clinical decision-making framework that supports quality improvement in healthcare. While osteopaths are key providers of musculoskeletal healthcare, the extent to which osteopaths engage in EBP is unclear. Thus, the aim of this cross-sectional study was to investigate UK osteopaths’ attitudes, skills and use of EBP, and perceived barriers and facilitators of EBP uptake.
Methods
UK-registered osteopaths were invited to complete the Evidence-Based Practice Attitude and Utilisation Survey (EBASE) online.
Results
Of the 5200 registered osteopaths in the UK, 9.9% (517/5200) responded to the invitation and 7.2% (375/5200) completed the EBASE (i.e. left fewer than 20% of items unanswered). The demographic characteristics of the survey sample were largely similar to those of the UK osteopathy workforce. The osteopaths reported overall positive attitudes towards EBP, with most agreeing that EBP improves the quality of patient care (69.3%) and is necessary for osteopathy practice (76.5%). The majority reported moderate-level skills in EBP, and most (80.8%) were interested in improving these skills. Participating osteopaths typically engaged in EBP activities 1–5 times in the month preceding the survey. Barriers to EBP uptake included a lack of time and a lack of clinical evidence in osteopathy. The main facilitators of EBP included access to online databases, internet at work, full-text articles, and EBP education materials.
Conclusions
UK osteopaths were generally supportive of evidence-based practice, had moderate-level skills in EBP and engaged in EBP activities infrequently. The development of effective interventions that improve osteopaths’ skills and the incorporation of EBP into clinical practice should be the focus of future research.
Keywords
Evidence-based practice; Osteopathy; Cross-sectional survey
Background
Recent decades have witnessed a gradual movement towards clinical care informed by research evidence [1]. This concept of evidence-based medicine - more inclusively termed evidence-based practice (EBP) - is described as the integration of the best available research evidence with clinical expertise and patient values [2], and is now considered a common-sense approach to modern healthcare provision [3]. Despite widespread support for EBP, its integration into healthcare policy and practice has been ad hoc across professions and jurisdictions [4, 5, 6, 7, 8, 9, 10, 11]. This may be partly due to criticisms of EBP, including its perceived reliance on algorithm-driven decision-making; challenges in translating research evidence into patient-centred care and difficulties in applying EBP to complex clinical presentations may present additional barriers to EBP uptake [12, 13]. These concerns have been highlighted by various disciplines, including healthcare philosophy [14, 15], medicine [12, 16, 17, 18, 19], physiotherapy [20, 21, 22] and, more recently, osteopathy [23, 24, 25].
Osteopathy is a health profession originating in the United States in the late 1800s, where its providers, osteopathic physicians, are licensed to practice in all areas of medicine; yet, although the majority (56%) practice in primary care specialities [26], few (1–2%) specialize in osteopathic manipulative treatment [27]. This is in contrast to osteopathic practitioners, or osteopaths, trained outside the US, for whom manual therapy may be considered the main scope of osteopathic practice [28]. In the UK, osteopaths are autonomous primary care practitioners primarily trained to manage musculoskeletal conditions, of which spinal pain is the most common, and who typically provide manual therapy treatment, exercise and self-management recommendations [29, 30].
Osteopathic practice and clinical decision-making are embedded within traditional concepts and principles [31, 32], many of which are drawn from interpretations and observations made by prominent individuals early in the history of the profession. Contemporary research has led some authors to question the validity and usefulness of such models [33, 34, 35]. Certainly, there is insufficient research evidence to support all aspects of osteopathy practice, and the need for a broader research agenda has been proposed [36, 37]. Although the role of research evidence in osteopathy has been debated, there is agreement that EBP needs to be integrated into the osteopathic approach [23, 38, 39, 40, 41]. Nonetheless, potential barriers to the acceptance of EBP by UK osteopaths remain; some, for example, are concerned that implementing EBP may fail to preserve traditional osteopathic principles [13, 42]. Research in physiotherapy points to other factors that may act as barriers to implementing EBP, such as financial and time constraints, and the possible conflict of evidence with patients’ treatment preferences and expectations [43, 44]. Whether these barriers also apply to osteopathy is largely unknown given the relative paucity of EBP research in osteopathy. The aim of the study presented in this paper was to investigate UK osteopaths’ attitudes, skills and utilisation of research evidence in practice, their training in EBP, and the barriers to, and facilitators of, EBP adoption.
Methods
Design and ethics
The study was a national cross-sectional survey conducted in the UK. The Research Ethics Committee of the University College of Osteopathy (London, UK) approved the study. No identifying data were collected and results were reported only as aggregate data, thus maintaining participant anonymity. Study participation was voluntary, with the option to withdraw at any time.
Setting and participants
All osteopaths registered with the General Osteopathic Council in the UK by June 2017 were eligible and sampled.
Description of questionnaire and variables
The Evidence-Based Practice Attitude and Utilisation Survey (EBASE) is an 84-item instrument evaluating the attitudes, perceived skills and use of EBP amongst healthcare providers [47]. The questionnaire has previously been administered to different health provider groups [4, 5, 8, 9, 47, 48], and psychometric evaluation shows good internal consistency, content validity, construct validity and acceptable test-retest reliability [47, 49].
EBASE is divided into seven parts: attitude (Part A), skill (Part B), education and training (Part C), use (Part D), barriers to EBP (Part E), and enablers of EBP (Part F). The final section, Part G, gathers information on participant demographics. Parts A, B and D of EBASE generate three subscores: an attitude subscore ranging from 8 (predominantly strongly disagree) to 40 (predominantly strongly agree); a skill subscore ranging from 13 (primarily low-level skill) to 65 (primarily high-level skill); and a use subscore ranging from 0 (mainly infrequent use) to 24 (mainly frequent use). For this study, several survey items were modified for the target population (e.g. the term ‘osteopathy’ was substituted for ‘complementary and alternative medicine (CAM)’). Several response options in Part G (demographics) were also changed to ensure suitability for a UK audience. These changes did not alter item meaning and thus did not affect the validity or reliability of the instrument. The survey was administered electronically (hosted by SurveyMonkey™), with all questions made compulsory to minimise missing data.
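To illustrate how the three subscores are derived, the following is a minimal scoring sketch in Python. The item counts and response codings (eight attitude items and 13 skill items coded 1–5, six use items coded 0–4) are inferred from the subscore ranges stated above rather than taken from the EBASE manual, so they should be treated as assumptions.

```python
# Minimal sketch of EBASE subscore calculation.
# Item counts and codings are assumptions inferred from the stated subscore
# ranges (attitude 8-40, skill 13-65, use 0-24); not an official scoring script.

def ebase_subscores(attitude_items, skill_items, use_items):
    """Sum item responses into the three EBASE subscores."""
    assert len(attitude_items) == 8 and all(1 <= x <= 5 for x in attitude_items)
    assert len(skill_items) == 13 and all(1 <= x <= 5 for x in skill_items)
    assert len(use_items) == 6 and all(0 <= x <= 4 for x in use_items)
    return {
        "attitude": sum(attitude_items),  # 8 (strongly disagree) to 40 (strongly agree)
        "skill": sum(skill_items),        # 13 (low-level skill) to 65 (high-level skill)
        "use": sum(use_items),            # 0 (infrequent use) to 24 (frequent use)
    }

# Hypothetical respondent: mostly 'agree', moderate skill, infrequent use
print(ebase_subscores([4] * 8, [3] * 13, [1] * 6))
# {'attitude': 32, 'skill': 39, 'use': 6}
```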
Recruitment and data collection procedures
A pilot study was performed with a convenience sample of five osteopaths associated with the University College of Osteopathy (London, UK), with varying levels of clinical and academic experience. The purpose of the pilot was to ensure that the survey items were clear and appropriate for osteopaths. Some minor terminological changes were made as a result, primarily to enhance clarity. The estimated completion time was 10–15 min.
For the subsequent full study, potential participants were invited to take part in the survey through emails sent by the General Osteopathic Council (GOsC), the Institute of Osteopathy (iO) and the National Council for Osteopathic Research (NCOR) in the UK. These agencies also promoted the survey via electronic and paper media. The emails and posts contained a link to EBASE and a participant information sheet, which explained the study, its relevance to the osteopathic profession and what participation would involve. Interested participants provided informed consent by responding to a screening question at the beginning of the questionnaire hosted by SurveyMonkey™. A follow-up email was sent two weeks after the initial invitations as a reminder to participate. Data collection was undertaken between June and August 2017. The online survey tool used per-device cookies to restrict attempts to respond to the survey more than once.
Statistical methods
The sample size calculation, based on a target population of 5200 osteopaths (March 31, 2017) [45] and a response distribution of 50%, indicated that at least 358 osteopaths would need to complete the EBASE questionnaire to attain a 5% margin of error with a 95% confidence level for each survey item [46]. Survey responses were exported from SurveyMonkey™ into SPSS (v.24.0) for coding and statistical analysis. Partially completed surveys (due to drop-out) were excluded from the analysis if more than 20% of all items were incomplete [10]. Any missing data were reported as missing values. Categorical data were described using frequency distributions and percentages. Measures of central tendency and variability were used for normally distributed descriptive data (including continuous [i.e. EBASE subscore] and categorical [i.e. Likert scale] data), while medians and interquartile ranges were used to describe non-normally distributed data. Associations between ordinal-level variables were examined using Kendall’s tau correlation coefficient (τ), and relationships between nominal-level variables were assessed using Cramer’s V, with coefficients of 0.10–0.29 representing a weak association, 0.30–0.49 a moderate association, and 0.50 and above a strong association. The tests of association were informed by previous research using EBASE [4, 5, 6, 8, 9, 10]. The level of significance was set at p < 0.05.
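As a worked check of the reported minimum sample size, the sketch below reproduces the figure of 358 using the standard single-proportion sample size formula with a finite population correction (as described by Naing et al. [46]). The confidence level (z = 1.96), 50% response distribution, 5% margin of error and population of 5200 are taken from the text; the code itself is only an illustrative sketch, not part of the study’s analysis.

```python
import math

# Sample size for estimating a proportion, with finite population correction [46].
z = 1.96   # z-value for a 95% confidence level
p = 0.50   # assumed response distribution
e = 0.05   # margin of error
N = 5200   # target population of registered UK osteopaths

n_unadjusted = (z ** 2) * p * (1 - p) / e ** 2          # ~384.2 for an unlimited population
n_required = n_unadjusted * N / (n_unadjusted + N - 1)  # finite population correction

print(math.ceil(n_required))  # 358, matching the minimum sample size reported
```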
Results
A total of 517 (9.9%) of the 5200 UK-registered osteopaths [45] responded to invitations to participate. After excluding 142 responses that were more than 20% incomplete, the final response rate was 7.2% (375/5200), which exceeded the minimum required sample size.
Description of the sample
Participants were largely ≥40 years of age (53.9%) and held an honours degree or higher (54.2%) (Table 1). Participants were evenly split by gender, most had practiced osteopathy for at least 6 years (65.8%), and half worked in southern UK (50.4%). Further demographic data are presented in Table 1.
Table 1
Demographic characteristics of sample (n = 375)
Attitudes toward EBP
Participants reported generally positive attitudes toward EBP, with a median subscore of 30 (IQR 26–33; range 11–40; a median score between 24.1 and 31.9 is defined as a predominantly neutral to agree response). The majority agreed that professional literature and research findings are useful for practice (82.6%), that EBP assists in clinical decision-making (80.8%), and that EBP is necessary in the practice of osteopathy (76.5%) (Table 2). Most (80.8%) also reported an interest in learning or improving the skills necessary to incorporate EBP into practice.
Table 2
Participant attitudes toward evidence-based practice (n = 375)
There was no significant association between the attitude subscore and most demographic characteristics. There was a weak association between attitude and gender (with higher attitude scores reported by males; V = 0.294, p < 0.001) and geographical region (with higher attitude scores reported among osteopaths working in the city and inner city suburbs; V = 0.246, p < 0.001). There was also a weak negative correlation between attitude and years since receiving the highest qualification (τ = −0.130, p = 0.01), and a weak positive correlation between attitude and hours per week participating in research (τ = 0.242, p < 0.001).
Skills in EBP
Participants reported moderate levels of perceived skill in EBP, with a median subscore of 39 (IQR 32–45; range 13–65; a median score between 26.1 and 39.0 is defined as a predominantly low-moderate to moderate skill level). The highest levels of perceived skill in EBP were reported for items relating to clinical problem identification (Table 3). The lowest levels of perceived skill were reported for the conduct of systematic reviews (74.6%) and clinical research (80.5%).
Table 3
Participant self-reported skills in evidence-based practice (n = 375)
There was a weak positive correlation between the skill subscore (categorised by quartiles) and highest qualification (τ = 0.240, p < 0.001) and hours per week teaching in the higher education sector (τ = 0.212, p < 0.001), and a weak negative correlation between the skill subscore and years since receiving the highest qualification (τ = −0.204, p < 0.001). A moderate positive correlation between the skill subscore and hours per week participating in research (τ = 0.382, p < 0.001) was also observed.
Use of EBP
Participants engaged in EBP activities at a moderately low level in the month preceding the survey, with a median subscore of 12 (IQR 11–15; range 6–30; a median score between 6.1 and 12.0 is defined as a predominantly moderately low level of use). Most (> 65%) participants engaged in each of the first five EBP-related activities listed in Table 4 no more than five times in the preceding month. A similar level of activity was reported for consulting a colleague/industry expert (77.1%) or using the lay literature (80.6%) to assist clinical decision-making. The only exception was the use of online search engines to search for practice-related literature or research, which 68% of participants performed between 1 and 10 times in the preceding month.
Table 4
Participant use of evidence-based practice (i.e. number of times each activity was performed over the last month) (n = 375)
There was a moderate positive correlation between the use subscore (categorised by quartiles) and hours per week participating in research (τ = 0.300, p < 0.001). There was a weak positive correlation between the use subscore and highest qualification (τ = 0.226, p < 0.001), hours per week teaching in the higher education sector (τ = 0.120, p = 0.032) and geographical region (i.e. higher use scores amongst osteopaths working in the city and inner city suburbs; V = 0.174, p = 0.005). A weak negative correlation was observed between the use subscore and years since receiving the highest qualification (τ = −0.235, p < 0.001) and years practicing osteopathy (τ = −0.133, p = 0.013).
Almost one-third (30.7%) of participants reported that no more than 25% of their clinical practice was informed by clinical trial evidence; 20.5% each reported that clinical evidence informed 26–50% and 51–75% of their practice, and 6.4% indicated that clinical evidence informed 76–99% of their practice. The information source most frequently used by participants to inform their clinical decision-making was traditional knowledge (median rank 3; IQR 1–5), followed by clinical practice guidelines (median rank 4; IQR 2–7) and personal intuition (median rank 5; IQR 2–7) (Table 5).
Table 5
Sources of information used to inform clinical decision-making (ranked from most to least frequently used source) (n = 375)
Training in EBP
Most participants reported some level of training in evidence-based practice/osteopathy (81.3%), evidence application (71.5%), critical thinking/analysis (72.3%), and clinical research (57.1%). Participants mainly received this training as a component of a study program (38.7–46.7%), and to a lesser extent, via a seminar or short course (9.6–25.1%). Over half (52.3%) of respondents had received no training in the conduct of systematic reviews and meta-analyses.
Barriers to and enablers of EBP uptake
The only factors perceived by most participants as moderate to major barriers to EBP uptake were a lack of clinical evidence in osteopathy (69.1%) and a lack of time (56.6%). Most participants perceived the other factors to be minor barriers or no barrier to EBP uptake.
Most participants reported that internet in the workplace (70.5%), ability to download full-text articles (60.1%), access to free online databases (55.9%) and online EBP education materials (55.9%) were ‘very useful’ enablers of EBP implementation. Factors considered ‘moderately to very useful’ included access to critical reviews (80.1%), databases requiring licence fees (73.3%), and critically appraised topics relating to osteopathy (67.5%). Very few participants rated the listed enablers of EBP uptake as not useful.
Discussion
Key results
Response rate and sample characteristics
The response rate for the study was 7.2%. While low, it was within the range (4–8%) of previous national surveys examining EBP use among complementary therapists [5, 8, 10]. However, the response rate was considerably lower than the 30% reported in the UK nationwide Osteopaths’ Opinions Survey in 2012 [50], possibly due to the relatively longer time required to complete the EBASE. Nonetheless, the demographic characteristics of our sample were similar to those of the UK osteopathy profession in terms of gender, age and geographical distribution, years of experience, typical practice setting and types of treatment provided [50, 51]. This suggests that our survey sample was broadly representative of osteopaths in the UK.
EBP attitudes
The positive attitudes of UK osteopaths towards EBP support the results of previous surveys of CAM professions, including US and Canadian chiropractors [4, 5, 8], Western medical herbalists [9], yoga therapists [10], and Australian naturopaths and TCM/acupuncture practitioners [6]. Similarly, positive attitudes toward EBP have been reported among physiotherapists [52] and occupational therapists [53].
Thirty-seven per cent of respondents disagreed that EBP takes patients’ treatment preferences into account, compared with 21–24% of chiropractors [5, 8] and 22% of yoga therapists [10] in North America. This finding suggests that some UK osteopaths have a limited understanding of EBP and patient-centred care, particularly of the role of the patient’s perspective as one of the three main elements of EBP decision-making, alongside the best available evidence and the clinician’s expertise [1, 54]. Notwithstanding, most participants agreed or strongly agreed that EBP takes clinical expertise into account in clinical decision-making; a view shared by other health professions, especially chiropractors [4, 5, 55]. The implications of these findings for clinical practice require further investigation.
EBP skills
Overall, participants reported moderate levels of perceived skill in EBP, with the highest levels reported for items relating to problem identification and evidence acquisition. This suggests that osteopaths perceive themselves as sufficiently skilled in the first two stages of the EBP process, i.e. asking a searchable question and acquiring the right evidence [56]. The findings also indicate that UK osteopaths are moderately skilful in critically appraising, synthesising and applying research evidence to clinical practice, i.e. the final three stages of the EBP process [56]. An implication of these findings may be that clinician training in EBP should focus more on developing skills related to the appraisal and application stages of the EBP process. Indeed, an Australian longitudinal study showed research training for student nurses improved perceived skill level in the latter stages of the EBP process [55].
Participants reported lower levels of skill in using findings from systematic reviews than in using findings from other types of clinical research, which is consistent with previous surveys of chiropractors [4]. Using and interpreting systematic reviews can be challenging across the health professions [57], and given that systematic reviews and meta-analyses represent the highest level of evidence, it is essential that osteopaths gain sufficient skills to utilise their results.
EBP use, barriers and facilitators
Although most participants engaged in various EBP-related activities in the month preceding the survey, albeit infrequently, almost one-quarter of respondents reported never engaging in EBP activities in that month. This moderately low level of engagement in EBP-related activities may be attributed to several factors; participants cited a lack of time, for example. Participants also reported a perceived lack of clinical evidence in osteopathy as a barrier to EBP uptake; a possible demotivator for practicing osteopaths [58, 59]. Participants indicated that access to the internet at work, online medical databases, full-text journal articles and online education materials would be very helpful in facilitating EBP uptake. It is therefore likely that inadequate access to these resources would significantly affect osteopaths’ ability to engage in EBP [13].
Further, the nature of osteopathic practice in the UK may not demand high-level engagement in EBP-related activities. Perhaps patients seeking care from UK osteopaths present with such a consistent range of symptoms and disorders that osteopaths do not feel the need to engage in EBP activities at a high level. Such a hypothesis may also help explain similar levels of EBP activity within the chiropractic profession [4, 5]. However, the level of engagement in EBP activities among US physical therapists has been reported as higher than that of our cohort, with one study showing 66% of respondents consulting research material and 52% using a medical database 4 to 10 times weekly to make clinical practice decisions [60]. Future research should therefore determine ideal levels of EBP activity for practicing osteopaths, whether these might vary between different clinical settings and scopes of practice (e.g. Europe vs. the US), and whether different levels of EBP activity translate into poorer or improved patient outcomes.
EBP education
While osteopathic degree courses in the UK provide at least 1000 h of clinical training [51], the extent to which EBP training is embedded within these courses is less clear. The majority (57–81%) of responding UK osteopaths reported some level of training in EBP-related areas such as critical thinking/analysis. Most reported that they had undertaken this training as a component of a study program, rather than through seminars or short courses.
Almost one-third of participants had been in clinical practice for over 16 years. Given that Sackett’s EBP model [1] only started gaining recognition among healthcare professionals in the early 2000s, it is probable that some respondents received little to no EBP training. Indeed, our analyses showed that the more years in practice, the lower the osteopaths’ EBP attitude and use scores. Similarly, lower EBP use and skill scores correlated with a greater number of years since respondents received their highest qualification. Given the current push for EBP in healthcare, and the increasing scrutiny of osteopathic education institutions to prepare students to practise osteopathy in a safe and effective manner, effective EBP training and continuing education programs are needed. Notably, our results, which concur with survey findings among yoga therapists [10], suggest that providing opportunities and incentives for clinicians to engage in teaching and research may help to improve EBP uptake.
Limitations
The low response rate (7.2%) was surprising given that pilot testing of EBASE did not identify the survey as too long or time-consuming. However, post-hoc exploratory analysis of the response rates points to possible response fatigue, as several respondents dropped out before completing the survey. Difficulty completing more complex questions, such as ranking multiple items, is another possible explanation, suggested by most drop-outs occurring from question 38 onwards. Other possible explanations for our response rate may be that clinicians were too time-poor to complete the survey (noting that lack of time was a moderate to major barrier to EBP uptake), were generally uninterested in EBP (noting that most osteopaths engaged in EBP activities at a moderately low level), or were not incentivised to participate. Additional study limitations intrinsic to the survey design include recall bias and self-selection bias. Nonetheless, this was the first national survey to comprehensively explore EBP among osteopaths in the UK; the findings may be used to inform future research and EBP educational activities for this group of healthcare providers. While it may have been possible to answer the survey multiple times using different devices (despite the per-device cookie restriction described above), screening of duplicate ISP entries did not reveal any matching demographic data to suggest that any participant completed the survey more than once [61].
Conclusions
Our study contributes towards an increased understanding of UK osteopaths’ attitudes, skills and use of EBP. The findings suggest that responding UK osteopaths have a generally positive attitude toward EBP, self-report moderate-level skills in EBP, and typically engage in EBP-related activities at a moderately-low level. Encouragingly, most respondents wanted to improve their skills to facilitate the uptake of EBP into osteopathic practice. Additionally, the findings highlight the need for further research; in particular, the need to (i) investigate the meaning that osteopaths ascribe to EBP, (ii) establish the skill level of osteopaths in implementing EBP in clinical practice, (iii) determine a reasonable and clinically feasible level of EBP-related activity for osteopaths, and (iv) develop suitable interventions and strategies that support osteopaths to effectively improve the uptake of EBP.
Acknowledgements
The authors would like to thank Dr. Michael Ford for his valuable assistance with survey administration, and the University College of Osteopathy (formerly the British School of Osteopathy), General Osteopathic Council, Institute of Osteopathy and National Council for Osteopathic Research, for promoting the survey and distributing invitations to their respective members.
Ethical approval and consent to participate
The Research Ethics Committee of the University College of Osteopathy (London, UK) approved the study. No identifying data were collected and results were only reported as aggregate data, thus maintaining participant anonymity. Study participation was voluntary with the option to withdraw/drop-out at any time.
Funding
The study did not receive any financial support from any sponsor.
Availability of data and materials
The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.
Authors’ contributions
TS drafted the manuscript in collaboration with OT and ML. ML conducted the statistical analysis and drafted the results. TS, OT, ML, GF, PA and JA critically interpreted the data, edited the manuscript and reviewed and approved the final manuscript.
Consent for publication
Not applicable.
Competing interests
The authors declare that they have no competing interests.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1. Sackett DL, et al. Evidence based medicine: what it is and what it isn't. BMJ. 1996;312(7023):71–2.
- 2. Sackett D, et al. How to practice and teach EBM. London: Churchill Livingstone; 2000.
- 3. Porta M. Is there life after evidence-based medicine? J Eval Clin Pract. 2004;10(2):147–52.
- 4. Alcantara J, Leach MJ. Chiropractic attitudes and utilization of evidence-based practice: the use of the EBASE questionnaire. EXPLORE: J Sci Heal. 2015;11(5):367–76.
- 5. Bussières AE, et al. Self-reported attitudes, skills and use of evidence-based practice among Canadian doctors of chiropractic: a national survey. J Can Chiropr Assoc. 2015;59(4):332–48.
- 6. Leach MJ, Gillham D. Are complementary medicine practitioners implementing evidence based practice? Complement Ther Med. 2011;19(3):128–36.
- 7. Mota da Silva T, et al. What do physical therapists think about evidence-based practice? A systematic review. Man Ther. 2015;20(3):388–401.
- 8. Schneider MJ, et al. US chiropractors' attitudes, skills and use of evidence-based practice: a cross-sectional national survey. Chiropr Man Therap. 2015;23:16.
- 9. Snow JE, Leach MJ, Clare BA. Attitudes, skill and use of evidence-based practice among US Western herbal medicine providers: a national survey. J Complement Integr Med. 2017;14(1).
- 10. Sullivan M, et al. Understanding North American yoga therapists' attitudes, skills and use of evidence-based practice: a cross-national survey. Complement Ther Med. 2017;32:11–8.
- 11. Upton D, Upton P. Knowledge and use of evidence-based practice of GPs and hospital doctors. J Eval Clin Pract. 2006;12(3):376–84.
- 12. Greenhalgh T, Howick J, Maskrey N. Evidence based medicine: a movement in crisis? BMJ. 2014;348.
- 13. Veziari Y, Leach MJ, Kumar S. Barriers to the conduct and application of research in complementary and alternative medicine: a systematic review. BMC Complement Altern Med. 2017;17(1):166.
- 14. Loughlin M. Reason, reality and objectivity - shared dogmas and distortions in the way both 'scientistic' and 'postmodern' commentators frame the EBM debate. J Eval Clin Pract. 2008;14(5):665–71.
- 15. Miles A, Loughlin M, Polychronis A. Evidence-based healthcare, clinical knowledge and the rise of personalised medicine. J Eval Clin Pract. 2008;14(5):621–49.
- 16. Hancock HC, Easen PR. Evidence based practice - an incomplete model of the relationship between theory and professional work. J Eval Clin Pract. 2004;10(2):187–96.
- 17. Mykhalovskiy E, Weir L. The problem of evidence-based medicine: directions for social science. Soc Sci Med. 2004;59(5):1059–69.
- 18. Rosenfeld JA. The view of evidence-based medicine from the trenches: liberating or authoritarian? J Eval Clin Pract. 2004;10(2):153–5.
- 19. Tonelli MR. Integrating evidence into clinical practice: an alternative to evidence-based approaches. J Eval Clin Pract. 2006;12(3):248–56.
- 20. Bithell C. Evidence-based physiotherapy: some thoughts on 'best evidence'. Physiotherapy. 2000;86(2):58–9.
- 21. Herbert RD, et al. Evidence-based practice - imperfect but necessary. Physiother Theory Pract. 2001;17(3):201–11.
- 22. Shaw JA, Connelly DM, Zecevic AA. Pragmatism in practice: mixed methods research for physiotherapy. Physiother Theory Pract. 2010:1–9.
- 23. Fryer G. Teaching critical thinking in osteopathy - integrating craft knowledge and evidence-informed approaches. Int J Osteopath Med. 2008;11(2):56–61.
- 24. Thomson OP, Petty NJ, Moore AP. Clinical reasoning in osteopathy - more than just principles? Int J Osteopath Med. 2011;14(2):71–6.
- 25. Fawkes C, Ward E, Carnes D. What evidence is good evidence? A masterclass in critical appraisal. Int J Osteopath Med. 2015;18(2):116–29.
- 26. American Osteopathic Association. 2017 osteopathic medical profession report; 2017.
- 27. American Osteopathic Association. Osteopathic medical profession reports; 2007–2014.
- 28. Osteopathic International Alliance. Osteopathy and osteopathic medicine: a global view of practice, patients, education and the contribution to healthcare delivery; 2013.
- 29. Fawkes C, et al. Profiling osteopathic practice in the UK using standardised data collection. Int J Osteopath Med. 2013;16(1):e10.
- 30. Fawkes CA, et al. A profile of osteopathic care in private practices in the United Kingdom: a national pilot using standardised data collection. Man Ther. 2014;19(2):125–30.
- 31. Cotton A. Osteopathic principles in the modern world. Int J Osteopath Med. 2013;16(1):17–24.
- 32. Paulus S. The core principles of osteopathic philosophy. Int J Osteopath Med. 2013;16(1):11–6.
- 33. Hartman SE. Cranial osteopathy: its fate seems clear. Chiropr Osteopat. 2006;14(1):10.
- 34. McGrath MC. A global view of osteopathic practice - mirror or echo chamber? Int J Osteopath Med. 2015;18(2):130–40.
- 35. Vogel S. Evidence, theory and variability in osteopathic practice. Int J Osteopath Med. 2015;18(1):1–4.
- 36. Steel A, et al. The role of osteopathy in clinical care: broadening the evidence-base. Int J Osteopath Med. 2017;24:32–6.
- 37. Steel A, et al. Osteopathic manipulative treatment: a systematic review and critical appraisal of comparative effectiveness and health economics research. Musculoskelet Sci Pract. 2017;27:165–75.
- 38. Vogel S. Research - the future? Why bother? The British Osteopathic Journal. 1994;XIV:6–10.
- 39. Green J. Evidence-based medicine or evidence-informed osteopathy? Osteopathy Today. 2000;April:21–2.
- 40. Leach J. Towards an osteopathic understanding of evidence. Int J Osteopath Med. 2008;11(1):3–6.
- 41. Licciardone JC. Educating osteopaths to be researchers - what role should research methods and statistics have in an undergraduate curriculum? Int J Osteopath Med. 2008;11(2):62–8.
- 42. Humpage C. Opinions on research and evidence based medicine within the UK osteopathic profession: a thematic analysis of public documents 2003–2009. Int J Osteopath Med. 2011;14(2):48–56.
- 43. Poitras S, et al. Guidelines on low back pain disability: interprofessional comparison of use between general practitioners, occupational therapists, and physiotherapists. Spine. 2012;37(14):1252–9.
- 44. Parr S, May S. Do musculoskeletal physiotherapists believe the NICE guidelines for the management of non-specific LBP are practical and relevant to their practice? A cross sectional survey. Physiotherapy. 2014;100(3):235–41.
- 45. General Osteopathic Council. Annual report and accounts 2016–2017; 2017.
- 46. Naing L, Winn T, Rusli B. Practical issues in calculating the sample size for prevalence studies. Medical Statistics. 2006;1:9–14.
- 47. Leach MJ, Gillham D. Evaluation of the evidence-based practice attitude and utilization survey for complementary and alternative medicine practitioners. J Eval Clin Pract. 2008;14(5):792–8.
- 48. Leach MJ. Does 'traditional' evidence have a place in contemporary complementary and alternative medicine practice? A case against the value of such evidence. Focus Altern Complement Ther. 2016;21(3–4):147–9.
- 49. Terhorst L, et al. Evaluating the psychometric properties of the evidence-based practice attitude and utilization survey. J Altern Complement Med. 2016;22(4):328–35.
- 50. General Osteopathic Council. Osteopaths' Opinion Survey 2012: findings; 2012.
- 51. General Osteopathic Council. Training courses; 2018.
- 52. Scurlock-Evans L, Upton P, Upton D. Evidence-based practice in physiotherapy: a systematic review of barriers, enablers and interventions. Physiotherapy. 2014;100(3):208–19.
- 53. Upton D, et al. Occupational therapists' attitudes, knowledge, and implementation of evidence-based practice: a systematic review of published research. Br J Occup Ther. 2014;77(1):24–38.
- 54. Leach MJ. Evidence-based practice: a framework for clinical practice and research design. Int J Nurs Pract. 2006;12(5):248–51.
- 55. Leach MJ, Hofmeyer A, Bobridge A. The impact of research education on student nurse attitude, skill and uptake of evidence-based practice: a descriptive longitudinal survey. J Clin Nurs. 2015;25:194–203.
- 56. de Groot M, et al. Evidence-based practice for individuals or groups: let's make a difference. Perspect Med Educ. 2013;2(4):216–21.
- 57. Murthy L, et al. Interventions to improve the use of systematic reviews in decision-making by health system managers, policy makers and clinicians; 2012.
- 58. Leach MJ, Tucker B. Current understandings of the research–practice gap from the viewpoint of complementary medicine academics: a mixed-method investigation. EXPLORE: J Sci Heal. 2017;13(1):53–61.
- 59. Wardle J, Adams J. Are the CAM professions engaging in high-level health and medical research? Trends in publicly funded complementary medicine research grants in Australia. Complement Ther Med. 2013;21(6):746–9.
- 60. Fell DW, Burnham JF, Dockery JM. Determining where physical therapists get information to support clinical practice decisions. Health Info Libr J. 2013;30(1):35–48.
- 61. Konstan JA, Simon Rosser BR, Ross MW, Stanton J, Edwards WM. The story of subject naught: a cautionary but optimistic tale of internet survey research. J Comput Mediat Commun. 2005;10(2).
Copyright information
© The Author(s). 2018
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.