Drozd B, Couvillon E, Suarez A. Medical YouTube Videos and Methods of Evaluation: Literature Review. JMIR Medical Education. 2018 Jan;4(1).
Do your patients check YouTube videos for health information? The answer is most likely "Yes". Not only do they find videos directly on YouTube, their friends forward them YouTube links too. How many of them think about the authenticity or quality of these videos? And if they did, how many would know how to evaluate a video before deciding to watch it?
Here is a review article in which the authors identified 37 studies that assessed YouTube videos.
The most common method authors used to find videos and determine their quality followed these steps:
1. Search for videos on a topic, using all relevant search terms
2. Decide inclusion criteria for which videos they would assess
3. Decide on the parameters for assessing the videos
4. Review the videos individually (each author of a study would review)
5. Collectively decide on the final results
6. Analyze the results and report details of usefulness/quality
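The workflow above can be sketched as a small script. This is only an illustrative sketch: the inclusion criteria, scoring parameters, and threshold below are hypothetical placeholders, not taken from the reviewed studies.

```python
# Hypothetical sketch of a YouTube-video review workflow:
# filter by inclusion criteria, have each reviewer score agreed
# parameters, then combine scores into a usefulness verdict.
from statistics import mean

# Step 2: inclusion criteria (assumed: English-language, on-topic videos)
def include(video):
    return video["language"] == "en" and video["on_topic"]

# Step 3: assessment parameters, each rated 0-2 (assumed names)
PARAMETERS = ["accuracy", "source_credibility", "clarity"]

# Step 4: one reviewer's total across all parameters
def reviewer_score(ratings):
    return sum(ratings[p] for p in PARAMETERS)

# Steps 5-6: average the reviewers' totals and classify usefulness
def classify(reviewer_ratings, threshold=4):
    avg = mean(reviewer_score(r) for r in reviewer_ratings)
    return "useful" if avg >= threshold else "not useful"

videos = [
    {"language": "en", "on_topic": True,
     "ratings": [{"accuracy": 2, "source_credibility": 2, "clarity": 1},
                 {"accuracy": 2, "source_credibility": 1, "clarity": 2}]},
    {"language": "en", "on_topic": False, "ratings": []},
]

included = [v for v in videos if include(v)]   # step 2: one video passes
for v in included:
    print(classify(v["ratings"]))              # prints: useful
```

The point of the sketch is the shape of the process, not the specific numbers: each study in the review chose its own criteria, parameters, and thresholds, which is exactly the inconsistency the authors highlight.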
While this process was reasonably consistent across the 37 studies, the authors found that the actual assessment methods varied widely among them. The review describes several methods that different authors used to evaluate videos, and most were comprehensive. One interesting mention was that "personal experience videos" were also evaluated, because most patients tend to identify with such videos and relate to them. Accuracy is an especially important element in these, yet it may be compromised.
A great idea that emerged from this study: the young and the educated have probably learned, over the years, to recognize some markers of accuracy, while the rest of the population may not be so sure. So if video parameters could be linked to some measure of "predictability of accuracy", patients might feel more confident navigating this pool of easily accessible medical knowledge.
Overall, a very interesting study. By compiling all the criteria used for evaluation, and then identifying some more, one wonders whether experts could arrive at a standard, comprehensive method of evaluation. If that happens, researchers could re-evaluate videos on various topics against the standard criteria, and then such a study (maybe even a systematic review) could be done on all those papers. This could lead to YouTube videos on medical topics being indexed in several ways that help patients find reliable videos, and help healthcare professionals recommend reliable ones to their patients! I wonder if there would be takers.
Note: Thanks to Swapnali Patil for identifying this article for review and providing inputs.