How to critically analyse a methodology

An evaluation of a methodology usually involves a critical analysis of its main sections: design; sampling (participants); measurement tools and materials; and procedure. The table below provides criteria for judging the strengths and weaknesses of a methodology:

- the design tests the hypotheses or research questions
- the method is valid and reliable
- potential bias, measurement error and confounding variables are addressed
- the method allows results to be generalized
- sampling of the cohort and phenomena is representative, with a sufficient response rate
- the measurement tools are valid and reliable
- the procedure is valid and reliable
- the method is clear and detailed enough to allow replication

Evaluating a Methodology

| Strengths | Weaknesses |
| --- | --- |
| The research design tests the hypotheses or research questions. | The research design is inappropriate for the hypotheses or research questions. |
| The method is valid and reliable. | The method is of dubious or questionable validity. |
| The method addresses potential sources of bias or measurement error; confounding variables were identified. | The method is insufficiently rigorous; measurement error produces questionable or unreliable results; confounding variables were not identified or addressed. |
| The method (sample, measurement tools, procedure) allows results to be generalized or transferred: sampling of the cohort was representative; sampling of the phenomena under investigation was sufficiently wide and representative; the sampling response rate was sufficiently high. | Generalizability of the results is limited by an unrepresentative sample: a small sample size or a limited sample range of the cohort or phenomena under investigation; the sampling response rate was too low. |
| The measurement tool(s)/instrument(s) are appropriate, reliable and valid; measurements were accurate. | The measurement tools are inappropriate; scale items are incomplete, ambiguous, unclear or contradictory; measurement was inaccurate; reliability statistics for the measurement tool from previous research were not reported. |
| The procedure is reliable and valid. | Measurement error arises from the administration of the measurement tool(s). |
| The method was clearly explained and sufficiently detailed to allow replication. | The explanation of the methodology (or parts of it, for example the procedure) is unclear, confused, imprecise, ambiguous, inconsistent or contradictory. |

Critical analysis examples of a methodology

- The unrepresentativeness of the sample makes these results misleading.
- The presence of unmeasured variables in this study limits the interpretation of the results.
- Other, unmeasured confounding variables may be influencing this association.
- The interpretation of the data requires caution because the effect of confounding variables was not taken into account.
- The insufficient control of several response biases in this study means the results are likely to be unreliable.
- Although this correlational study shows an association between the variables, it does not establish a causal relationship.
- Taken together, the methodological shortcomings of this study suggest the need for serious caution in the meaningful interpretation of the study's results.
