A comparative case study of two models of a clinical informaticist service
BMJ 2002;324 doi: https://doi.org/10.1136/bmj.324.7336.524 (Published 02 March 2002). Cite this as: BMJ 2002;324:524

Rapid responses
An Australian project(1) to test the feasibility of providing
evidence-based clinical literature search services in two centres, using
rigorous but pragmatic methods to allow for rapid answers to clinical
questions, demonstrated that such services can contribute to best practice
in primary care.
The service dimension highlighted by Greenhalgh et al can be
enhanced by giving broad access to the information gathered by such a
service, while also modelling how to formulate questions and answer
them with the best available evidence.
Questions and answers from both Australian centres are available on
their websites (2,3) and a regular series of articles featuring questions
handled by the services appears in the Medical Journal of Australia as
"Evidence in Action" (eg this week's article on oral contraception and
migraine, 4) to facilitate access to the best available evidence relevant
to general practice.
Cost effectiveness is one aspect of the service dimension yet to be
evaluated.
1.Del Mar CB, Silagy CA, Glasziou PP, Weller D, Spinks AB, Bernath V,
Anderson JN, Hilton DJ, Sanders SL. Feasibility of an evidence-based
literature search service for general practitioners. Med J Aust
2001;175:134-137.
2. http://www.spmed.uq.edu.au/quest/
3. http://www-miph.med.monash.edu.au/CCE_GPQuestion/cgi-bin/start.asp
4. Bernath VF, Clavisi O, Anderson JN. Risk of taking oral
contraceptives in patients with a history of migraine with neurological
signs. Med J Aust 2002;176: 237-238.
Competing interests: No competing interests
Informaticist services for policy-makers: different again?
Editor,
We welcome the paper by Greenhalgh et al (1) and were pleased to see
formal evaluation of two quite different informaticist services. Our own
experience in facilitating evidence-supported policy-making suggests that
some of the points raised apply equally to policy as to practice, though
there are additional facets to consider. Our developing approach has some
commonality with both projects, but is aimed at policy-making bodies,
comprising a variety of health professionals and lay representatives,
rather than individuals or groups of practitioners.
We formed the view that a 'laboratory test service' approach would
not be appropriate in policy-making, although as an academic unit we found
it intuitively attractive, at least initially. We were obliged to
recognise that health policy issues rarely resolve into a 'three-stage
answerable question.' Often, several questions need to be tackled
simultaneously since policy makers also need to consider local processes,
structures, time-scales and concerns.(2,3) The outcome then is a set of
options rather than an answer. Over time we have increasingly favoured an
approach closer to the Basildon model. We aim to arrive at 'best accepted
wisdom' by accessing authoritative sources of pre-appraised guidance (4)
particularly those whose methodological approach to evidence appraisal is
both explicit and systematic. We have worked closely with local policy-
making bodies and have found it helpful to design dissemination tailored
to each issue. The on-going close relationship between our academic unit
and service departments of the local NHS has made this possible.
By way of example, during recent months we have addressed diverse
questions such as the health risks from household waste incinerators,
the cost-effectiveness of photodynamic therapy for macular degeneration,
and the organisation of antenatal screening programmes.
Our experience suggests that facilitation of evidence-based policy-
making should be underpinned by an acknowledgement of the complexity of
policy. We therefore must accept that we may sometimes need to compromise
the 'technical quality of the answers'. For example, if a decision is
needed quickly it may be appropriate to base decisions on easily
accessible authoritative guidance rather than a comprehensive review of
all available evidence including appraisal of primary research. We feel
that the adoption of a pragmatic approach such as we have described can
achieve a more systematic use of evidence than is the norm in the NHS.(5)
The question of whether our approach, which focuses on the service
dimension, produces good policy decisions needs evaluating. The approach is
now mature enough for evaluation and we believe that the two dimensions of
a health informatics service as described by Greenhalgh et al will prove
to be an important component of the methodology.
We also agree that such an evaluation requires a developmental rather
than a conventional approach to research.
Helen Thornton-Jones, Senior Lecturer in Health Services Research,
The University of Hull.
Susan Hampshaw, Research Associate. The University of Hull
Andrew Taylor, Health Economist. Effectiveness Facilitation Unit,
East Riding and Hull Health Authority.
References
1. Greenhalgh T, Hughes J, Humphrey C, et al. A comparative case study
of two models of a clinical informaticist service. BMJ 2002;324:524-9.
2. Dopson S, Locock L, Chambers D, Gabbay J. Implementation of evidence-
based medicine: evaluation of the Promoting Action on Clinical
Effectiveness Programme. Journal of Health Services Research and Policy
2001;6(1):23-31.
3. Lomas J. Connecting research and policy. Canadian Journal of Policy
Research 2000;1:140-144.
4. Thornton-Jones H, Hampshaw S, Soltani H, Madhok R. Reviewing local
screening programmes: a worthwhile exercise. Journal of Clinical
Governance 2002 [in press].
5. Milbank Memorial Fund. Better Information, Better Outcomes. The Use of
Health Technology Assessment and Cost Effectiveness Data in Health Care
Purchasing Decisions in the United Kingdom and the United States. July
2000. Available from: http://www.milbank.org/000726purchasing.html
[accessed September 2000].
Competing interests: None declared.