Online physician ratings may not be accurate

The physician ratings that patients see online may be inaccurate when compared to ratings compiled from surveys conducted internally by health care organizations, according to new research published today (Sept. 17, 2019) in the Journal of General Internal Medicine.

“We know that patients often rely on online ratings when choosing doctors,” said Kanu Okike, MD, MPH, an orthopedic surgeon with Kaiser Permanente Hawaii. “Yet those online ratings are often based on a small number of patient reviews – perhaps 3 or 4. When we compared the online ratings with the ratings compiled by Kaiser Permanente Southern California, which are typically based on more than 100 patient reviews, we found that there was little correlation.”

It was only after a physician had accumulated 15 or more online reviews that their rating began to resemble the internal Kaiser Permanente ratings.

Dr. Okike began his research after noticing the rise of online physician ratings. He suspected that from a statistical standpoint they were unlikely to be accurate. At first, however, he wasn’t sure how to research the issue.

Assessing online physician ratings

The study began to take shape after he spoke with Tad Funahashi, MD, and Michael Kanter, MD. Dr. Funahashi is an orthopedic surgeon who heads the Health Innovations and Transformation Department for Kaiser Permanente Southern California. Dr. Kanter is the former medical director of Quality and Clinical Analysis for the Southern California Permanente Medical Group, and now chair and professor for the Department of Clinical Sciences, Kaiser Permanente School of Medicine.

The Southern California region of Kaiser Permanente turned out to be the prime location to examine the issue because its internal patient surveys measure the same concept as the online rating services. That apples-to-apples comparison made the research possible.

Researchers scoured the internet for online ratings

In 2016, 3 research assistants did internet searches for nearly 6,000 physicians in Kaiser Permanente Southern California, then tallied the number of raters and their star ratings. The online ratings were then compared with internal ratings from Kaiser Permanente. The study found:

  • There was low correlation between the online ratings, which were based on an average of 3.5 patient reviews, and the internal ratings, which were based on an average of 119 reviews and were from verified patients.
  • The correlation between the online and internal ratings increased with the number of reviews used to formulate each online rating. For the small percentage of online ratings that were based on 15 or more patient reviews, the online ratings began to correlate with the internal ratings.
  • Older physicians had systematically worse online ratings than their younger counterparts, although the internal ratings showed no difference. Researchers believe this may be because people who tend to rate physicians online are younger.

“The research should caution patients against using these online rating systems to choose a physician,” Dr. Kanter said. “We hope this study will improve patient understanding of online ratings and their limitations.”

Better ways to find your physician

To choose a physician, patients can often find information online on health care organizations' websites, which list the schools a physician attended, the physician's philosophy of medicine, and other relevant information, he said. Referrals are another resource.

Dr. Kanter also noted that patients can be better assured when they choose a doctor “from a good medical group where quality and professionalism are highly prized.”

Researchers hope the study will help patients assess physicians with the understanding that online ratings may not accurately reflect their doctor's quality, and that it will encourage appropriate improvements to online rating systems.

Dr. Okike noted: “Online ratings usually don’t include a representative sample of patients and rarely tell the whole story.”

In addition to Dr. Okike, Dr. Funahashi, and Dr. Kanter, others who worked on the research included Natalie Uhr, of Harvard University in Cambridge, Mass.; Sherry Shin, of UCLA in Los Angeles; Kristal Xie, of Boston University in Boston; and Chong Y. Kim, PhD, of the Southern California Permanente Medical Group, Pasadena, Calif.