
Thursday, October 28, 2010

Getting to the heart of medical imaging standards

With OGF’s discussions about the importance of standards and interoperability still fresh in my mind from earlier this week, I found a presentation from the Virtual Pathological Heart Project (VPH2), part of the Virtual Physiological Human initiative, very timely. In the last 10 years in the US there has been a fourfold increase in the number of medical images used to diagnose patients with heart problems. This is partly because more techniques are available, but also because clinicians want to be absolutely sure nothing is missed, not least to avoid lawsuits. So much so that in ‘western’ countries, medical imaging could now be a person’s main source of exposure to ionising radiation.

Processing these images usually involves a software framework that takes in the standardised DICOM (Digital Imaging and Communications in Medicine) images, works through the high-resolution data and outputs the results. The problem is that this output is often just text, low-resolution images or even PDFs. At the moment there is no standard output format, so patients can’t take their results with them from hospital to hospital. Everything has to be done all over again.
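
To see what ‘standardised input’ means in practice, here is a minimal sketch of my own (not from the talk) using the open-source pydicom library to read a DICOM file. The file name is made up, but the metadata fields are standard DICOM tags that any compliant tool can read – which is exactly what is missing on the output side.

```python
# Minimal sketch (not from the talk): reading a standardised DICOM file
# with the open-source pydicom library. The file name is hypothetical.
import pydicom

ds = pydicom.dcmread("cardiac_mri_slice.dcm")

print(ds.PatientID)   # standard tag (0010,0020)
print(ds.Modality)    # e.g. 'MR' or 'CT'
print(ds.StudyDate)   # when the scan was taken

pixels = ds.pixel_array  # the high-resolution image data (requires NumPy)
print(pixels.shape)
```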

One solution from VPH2 is to use a single imaging session to capture everything, so there is less exposure and inconvenience for the patient. The data is then imported into the VPH2 tool, which can map the extent and thickness of damaged areas in the patient’s heart after a heart attack. The results can then be exported in an XML format that is easy to move from place to place and to interpret. Very helpful – but it shows that more standard output formats are needed in medical imaging.
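
I don’t know VPH2’s actual schema, but as a rough illustration of why an XML export travels better than a PDF, here is a hypothetical sketch using Python’s standard library. The element names and numbers are invented for illustration only.

```python
# Hypothetical sketch of exporting derived cardiac measurements as XML.
# Element names and values are invented; this is not VPH2's real schema.
import xml.etree.ElementTree as ET

root = ET.Element("cardiac_analysis", patient_id="ANON-0001")
scar = ET.SubElement(root, "scar")
ET.SubElement(scar, "extent_percent").text = "12.5"        # made-up value
ET.SubElement(scar, "min_wall_thickness_mm").text = "4.2"  # made-up value

# Structured, machine-readable output that another hospital's software
# could parse, unlike a report locked inside a PDF.
ET.ElementTree(root).write("analysis.xml", encoding="utf-8", xml_declaration=True)
```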
