Healthcare data and the known unknowns

Hospital safety failures occur despite reporting and governance procedures that could and should offer protection


Data collection riddled with inconsistencies and dead ends is behind the “depressingly frequent” run of safety scandals in Australian hospitals, a report from the Grattan Institute says.

The report says doctors, patients and managers are poorly served by data about hospitals. As a result, patients cannot assess risks and make informed choices about providers, and the people they rely on might also be at a loss to interpret what data is available.

“For this reason, it is important that general practitioners also have access to information about which hospital is likely to be the safest for their patients, and that journalists have enough information to report on high-level trends,” the report says.

The report, by Dr Stephen Duckett and Associate Professor Christine Jorm, an honorary fellow of the institute, said hospital boards were often “blissfully ignorant” of the disparities.

In a Victorian survey, more than 70% of hospital board members reported that the quality of care in their hospital was above average; none thought they were below average, and only 3% admitted they did not know how their health service compared with others.

“For nearly 20 years, significant resources have gone into producing graphs and maps showing large disparities in treatment and procedure rates across Australia. Yet over that period nothing has changed,” the report said.

“Rarely is there evidence of what an appropriate rate might be in any population.

“It is not possible to tell from the data what might be causing the variations: differences in severity of a condition; patient preferences; doctor choices; or other factors.”

The report said it was a tragedy that safety failures occurred despite the existence of reporting and governance procedures that might protect patients if they worked properly.

“Data registries need to share information more widely, they need to capture a greater proportion of the care given, and they need to get data back to clinicians more quickly,” it said.

“States and private hospitals need to give more information to clinicians, including routine data and patient-experience data. The data needs to be clear and detailed, so clinical teams can see how they are performing compared with their peers and how they can improve.”

Clarity was especially important because of shortfalls in doctors’ understanding of quantitative health information.

“There are gaps in some doctors’ knowledge on more basic concepts such as incidence, prevalence, and relative or absolute risk,” the report said.

“A surprisingly high number of doctors are not capable of, or confident in, communicating numerical information to patients. This can result in patients being confused or misled.”
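The distinction the report highlights matters because relative changes can sound much larger than the absolute change a patient actually experiences. The sketch below is a minimal illustration using invented figures (none are from the report): a complication rate falling from 2 in 1,000 to 1 in 1,000 is a 50% relative risk reduction, but only a 0.1 percentage-point absolute reduction.

```python
# Illustrative only: invented figures, not taken from the Grattan report.
# Shows why quoting a relative risk reduction alone can mislead patients.

baseline_risk = 2 / 1000   # complication rate without the intervention
treated_risk = 1 / 1000    # complication rate with the intervention

absolute_reduction = baseline_risk - treated_risk        # 0.001, i.e. 0.1 percentage points
relative_reduction = absolute_reduction / baseline_risk  # 0.5, i.e. "a 50% reduction"
number_needed_to_treat = 1 / absolute_reduction          # 1,000 patients per complication avoided

print(f"Absolute risk reduction: {absolute_reduction:.3%}")      # 0.100%
print(f"Relative risk reduction: {relative_reduction:.0%}")      # 50%
print(f"Number needed to treat:  {number_needed_to_treat:.0f}")  # 1000
```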

The Grattan report, titled Strengthening Safety Statistics, noted there was no publicly available, exhaustive list of Australian clinical registries.

The researchers identified 74 registries, but found only 37 were established clinical quality-monitoring registries (as defined by the Australian Commission on Safety and Quality in Health Care) of at least two years’ standing.

Of the 37, however, only four scored highly in all areas of cohort coverage, nature of data, public reporting and feedback, the report said.

The authors said public funding should be limited to clinical registries that enrolled at least 90% of relevant patients – as is the case in Denmark – or provided evidence of a valid sampling process.

Registries should also be required to extend their reporting to all relevant clinicians, managers, funders and accreditors, and use data aids for more transparent reporting.

The report was scathing about how little information about patients’ experiences and safety incidents was shared across the hospital system, despite the huge resources the states devoted to incident reporting.

“In Victoria, at least, this investment of time and money in a state-wide, centralised system leads nowhere: no reports back to hospitals, no recommendations for change, and no action to improve care,” it said.

“The collection of these reports is not an adequate system for monitoring a hospital’s overall safety.”

It called for better data linkages to align routine data, registry data and patient-reported experience and outcome measures.

While hospital and death data was routinely linked in all states, public reporting was the norm only in NSW.

As a first step, PBS data and data on deaths of patients after discharge should be linked to routine data and included in hospital morbidity data sets. This would help with analysis and reporting of 30- and 90-day mortality rates and the care of people with chronic disease.

Governments should link state collections of routine data with PBS and Medicare data every six months, and with death registrations every month, it said.
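As a rough sense of what such a linkage involves, the sketch below joins a hypothetical hospital separations extract with a hypothetical death-registration extract to flag deaths within 30 days of discharge. The file names, column names and the simple join on a patient identifier are all illustrative assumptions; real linkage programs rely on governed linkage keys and probabilistic matching.

```python
# Minimal sketch of linking routine hospital data with death registrations
# to flag deaths within 30 days of discharge. File names, column names and
# the plain join on a patient identifier are hypothetical.
import pandas as pd

separations = pd.read_csv("separations.csv", parse_dates=["discharge_date"])
deaths = pd.read_csv("death_registrations.csv", parse_dates=["death_date"])

linked = separations.merge(
    deaths[["patient_id", "death_date"]], on="patient_id", how="left"
)

days_to_death = (linked["death_date"] - linked["discharge_date"]).dt.days
linked["died_within_30_days"] = days_to_death.between(0, 30)

# 30-day post-discharge mortality rate by hospital
mortality_by_hospital = (
    linked.groupby("hospital_id")["died_within_30_days"]
    .mean()
    .sort_values(ascending=False)
)
print(mortality_by_hospital)
```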

To illustrate how available data might affect care, the report looked specifically at knee replacement surgery, a relatively well-documented procedure.

From routine data, it said, patients could obtain a NSW report for 2012-15 that showed a 60-day readmission rate of 12% for patients at public hospitals.

Of those readmissions, 43% were for orthopaedic complications and 25% were for a condition potentially related to hospital care. The data, available by hospital, showed one Sydney hospital had a higher readmission rate than expected and two had lower rates than expected.

“This information can help GPs give recommendations to their patients, help patients make informed choices, and identify units with best practice from which others could learn,” the report said.

However, the information did not include private hospitals, was not updated annually, and did not extend to other states, it said.
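To put those percentages in concrete terms, the short sketch below reproduces the arithmetic for an illustrative cohort of 1,000 public-hospital knee replacements (the cohort size is assumed, not from the report): roughly 120 readmissions within 60 days, of which about 52 involve orthopaedic complications and about 30 involve a condition potentially related to hospital care.

```python
# Back-of-envelope arithmetic using the NSW figures quoted above.
# The cohort of 1,000 procedures is assumed for illustration.
procedures = 1000
readmission_rate = 0.12     # 60-day readmission rate, public hospitals, 2012-15
orthopaedic_share = 0.43    # share of readmissions for orthopaedic complications
care_related_share = 0.25   # share potentially related to hospital care

readmissions = procedures * readmission_rate      # 120
orthopaedic = readmissions * orthopaedic_share    # about 52
care_related = readmissions * care_related_share  # 30

print(f"Readmissions within 60 days: {readmissions:.0f}")
print(f"  for orthopaedic complications: {orthopaedic:.0f}")
print(f"  potentially related to hospital care: {care_related:.0f}")
```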
