Construction and evaluation of qualitative articles

In an editorial letter written by the researcher António Pedro Costa for the journal of the School of Nursing of the University of São Paulo (2016), a summary and some models that can be used for the evaluation and construction of qualitative articles are presented. We transcribe parts of that text here; the full letter can be consulted through the reference presented at the end of this article. “Peer review processes are a symbol of credibility and reliability for scientific journals. These publications depend on different types of evaluation: single-blind, double-blind, open, cascading, and pre-publication and post-publication peer review, so as to ensure the quality of their publications (Wijesinha-Bettoni, Shankar, Marusic, Grimaldo, & Seeber, 2016). In the social sciences, double-blind peer review is most commonly used, in which the reviewers do not know who the authors are, and vice versa. However, do the paths defined vary in the review of qualitative articles? Articles that are grounded in qualitative data analysis use non-numerical and unstructured data (texts, videos, images and audio). There are no standards as to how results must be presented. Nevertheless, “qualitative” articles are characterized by an explanation of the analysis process, in which the authors describe how the data were organized, whether the dimensions, categories, and subcategories were defined deductively or inductively, the definitions that reflect the theoretical framework, the inferences drawn from the data, and the grounding and evidence for the articles. Essentially, they differ from quantitative articles in their methodological strand, considering that, in many points, their frontier is a very fine line or simply does not exist.” (Costa, 2016, p. 892)

The same author continues his explanation by presenting a list of tools/checklists, such as: the Consolidated Criteria for Reporting Qualitative Research (COREQ) (Tong, Sainsbury, & Craig, 2007); the Standards for Reporting Qualitative Research (SRQR) and Enhancing Transparency in Reporting the Synthesis of Qualitative Research (ENTREQ) (Tong, Flemming, McInnes, Oliver, & Craig, 2012); and the Critical Appraisal Skills Programme (CASP), which features several checklists, two of which stand out: 1) the CASP Systematic Review Checklist and 2) the CASP Qualitative Checklist (Healthcare, 2013).

“The objective of these tools is to improve the transparency of the aspects of qualitative research, providing clear models for describing studies. These models help both authors in the preparation of their manuscripts, and editors and reviewers in the evaluation of potential articles for publication, providing readers with critical, applied and synthesized analysis of study results. These tools can also show the ability of researchers to write articles based on qualitative data (texts, audios, videos and images).” (Costa, 2016, p. 893)

Along with the use of checklists, doesn’t the integration of ICT (information and communication technologies) throughout the research process also improve the quality of a qualitative article?

References

Costa, A. P. (2016). Processo de construção e avaliação de artigos de índole qualitativa: possíveis caminhos? (Carta Editorial). Revista da Escola de Enfermagem da USP, 50(6), 890–891. https://doi.org/10.1590/s0080-623420160000700002

Healthcare, B. V. (2013). Critical Appraisal Skills Programme (CASP). Retrieved November 25, 2016, from https://www.casp-uk.net/casp-tools-checklists

Tong, A., Flemming, K., McInnes, E., Oliver, S., & Craig, J. (2012). Enhancing transparency in reporting the synthesis of qualitative research: ENTREQ. BMC Medical Research Methodology, 12(1), 181. https://doi.org/10.1186/1471-2288-12-181

Tong, A., Sainsbury, P., & Craig, J. (2007). Consolidated criteria for reporting qualitative research (COREQ): A 32-item checklist for interviews and focus groups. International Journal for Quality in Health Care, 19(6), 349–357. https://doi.org/10.1093/intqhc/mzm042

Wijesinha-Bettoni, R., Shankar, K., Marusic, A., Grimaldo, F., & Seeber, M. (2016). Reviewing the review process: New frontiers of peer review. Xjenza Online, 82–85. https://doi.org/10.7423/XJENZA.2016.1.11
