Is content analysis either practical or desirable for research evaluation?
This note responds to comments by Doyle (Omega, 1999;27:403-405) and Jones (Omega, 1999;27:397-401) on my contribution (Omega, 1997;25:599-603) to the ongoing debate on judging the quality of research at business schools (a debate initiated by the same two authors and their co-authors). Both contributors have critically examined the use of Reisman and Kirschnick's work on the content analysis of MS/OR articles, each from a different perspective. Doyle sets out the analytical steps that would be required and argues that there are few, if any, gains to be made from the additional work involved in content analysis. Jones argues that, even though content analysis has yet to be tried, peer review of journals and citation index studies are to be preferred because they appear relatively more valid, reliable and practicable. In response I restate the case for analysing content, consider the specific arguments of Doyle and Jones, air other concerns, and conclude that content analysis should remain on the agenda despite the obvious difficulties. An analysis of the 1994 volume of the Journal of the Operational Research Society is described to illustrate how content analysis can provide insight.
Year of publication: 2000
Authors: Ormerod, R. J.
Published in: Omega. - Elsevier, ISSN 0305-0483. - Vol. 28.2000, 2, p. 241-245
Publisher: Elsevier
Subject: Quality of research; Research strategies
Similar items by person
-
The transformation competence perspective
Ormerod, R. J., (2008)
-
Articulate intervention revisited
Ormerod, R. J., (2010)
-
The effect of the physical properties of coal reserves on deep mine productivity in the UK
Gregory, K., (1979)