
Latest Research
What the doctor prescribes: Customized medical-image databases
Aug 2, 2010 - 4:00:00 AM

Digital archives of biomedical images could someday put critical information at doctors' fingertips within seconds, illustrating how computers can improve the way medicine is practiced. The current reality, however, isn't quite up to speed, with databases virtually overwhelmed by the explosion of medical imaging.

Rochester Institute of Technology professor Anne Haake recently won grants from the National Science Foundation and the National Institutes of Health to address this problem. Haake envisions an image database built on input from the intended end-users and designed from the beginning with flexible user interfaces. Haake and her interdisciplinary team will develop a prototype using input from dermatologists to refine the search mechanism for images of various skin conditions.

"We need to involve users from the very beginning," says Haake, professor of information sciences and technologies at the B. Thomas Golisano College of Computing and Information Sciences. "This is especially true in the biomedical area, where there is so much domain knowledge that it will be specific to each particular specialty."

Haake understands the real need to make biomedical images useful. She began her career as a developmental biologist before pursuing computing and biomedical informatics. This project combines her two strengths and was inspired by research she conducted while on sabbatical at the NIH National Library of Medicine.

Dr. Cara Calvelli, a dermatologist and a professor in the Physician Assistant program in RIT's College of Science, has recruited dermatologists, residents and PA students for the project. She is also helping to properly describe the sample images, some of which come from her own collection. "The best way to learn is to see patients again and again with various disorders," Calvelli says. "When you can't get the patients themselves, getting good pictures and learning how to describe them is second best."

The funding Haake won from the NSF will support visual-perception research using eye tracking and the design of a content-based image retrieval system accessible through touch, gaze, voice and gesture; the NIH portion of the project will be used to fuse image understanding with medical knowledge.

"Bridging the semantic gap is the challenge facing researchers working in content-based image retrieval," Haake says. Search functions can go awry when computer-engineered algorithms trip on nuances and fail to distinguish between disparate objects that look similar at the pixel level, such as a whale and a ship. Building a system based on the end-user's knowledge can prevent these semantic hiccups.
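To make the semantic gap concrete: a retrieval system that compares images only by low-level features such as color histograms can rank a whale photo and a ship photo as near-duplicates. The sketch below is a minimal, hypothetical illustration of such a pixel-level search and of how expert-supplied labels could re-rank the results; it is not the team's actual system, and all names in it are invented for the example.

```python
import numpy as np

def color_histogram(image, bins=8):
    """Low-level feature: a joint RGB histogram, flattened to a unit-sum vector."""
    hist, _ = np.histogramdd(image.reshape(-1, 3),
                             bins=(bins, bins, bins), range=[(0, 256)] * 3)
    vec = hist.flatten()
    return vec / vec.sum()

def similarity(a, b):
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve(query_img, database, expert_terms=None):
    """Rank database images by low-level similarity to the query.

    `database` maps image ids to (image_array, expert_label) pairs.
    On color histograms alone, a whale and a ship photographed against the
    same blue water can score almost identically -- the semantic gap.
    Boosting images whose expert label matches the query vocabulary is one
    way a user-driven system could close that gap.
    """
    q = color_histogram(query_img)
    scores = {}
    for img_id, (img, label) in database.items():
        score = similarity(q, color_histogram(img))
        if expert_terms and label in expert_terms:
            score += 1.0   # expert knowledge outweighs pixel-level similarity
        scores[img_id] = score
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical usage with random stand-in images.
rng = np.random.default_rng(0)
db = {
    "whale_photo": (rng.integers(0, 256, (64, 64, 3)), "whale"),
    "ship_photo": (rng.integers(0, 256, (64, 64, 3)), "ship"),
}
query = rng.integers(0, 256, (64, 64, 3))
print(retrieve(query, db, expert_terms={"whale"}))  # the expert label promotes the whale image
```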

Pengcheng Shi, director for Graduate Studies and Research in the Golisano College, is providing his expertise in image understanding. "For many years, computing/technical people have said we can write algorithms such that it will work," he says. "But people are starting to realize that machines are not all that powerful. At the end of the day, we need to put the human back into it. What are the physicians looking at, and how are they looking at it, in order to make their decisions?"

A novel aspect of the project explores the use of eye tracking to find out what an expert considers important. Watching where physicians look when making a diagnosis from a picture can reveal the key regions of an image more reliably than asking the same physicians to recall, after the fact, where they concentrated in reaching their conclusions.

"Where people look is not really where people say they look, because we're just not aware of our visual strategies," Haake says. "Eye tracking is a way to identify the perceptually important areas, what people pay attention to and where they are looking."

The eye tracking effort is taking place in RIT's Multidisciplinary Vision Research Laboratory in the Chester F. Carlson Center for Imaging Science under the supervision of co-director Jeff Pelz. "People tend not to pay attention to where they look. People move their eyes 150,000 times a day, but you don't spend time thinking about where you will move your eyes next, and you don't waste any memory remembering where your eyes have been," says Pelz, whose lab is part of the College of Science. "You just move your eyes to the next place you need information, and a fraction of a second later you move them again."
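As a rough illustration of how recorded gaze data can surface those perceptually important regions, the sketch below aggregates fixation points into an importance map over an image. The data format, the weighting by fixation duration and the Gaussian smoothing are assumptions made for the example, not the lab's actual pipeline.

```python
import numpy as np

def fixation_heatmap(fixations, image_shape, sigma=30):
    """Turn a list of (x, y, duration_seconds) fixations into an importance map.

    Longer fixations contribute more weight; a Gaussian blur spreads each
    point so that the neighbourhood around a fixation also counts.
    """
    h, w = image_shape
    heat = np.zeros((h, w))
    for x, y, duration in fixations:
        if 0 <= int(y) < h and 0 <= int(x) < w:
            heat[int(y), int(x)] += duration

    # Separable Gaussian blur implemented with plain NumPy convolutions.
    radius = 3 * sigma
    t = np.arange(-radius, radius + 1)
    kernel = np.exp(-t**2 / (2 * sigma**2))
    kernel /= kernel.sum()
    heat = np.apply_along_axis(lambda row: np.convolve(row, kernel, mode="same"), 1, heat)
    heat = np.apply_along_axis(lambda col: np.convolve(col, kernel, mode="same"), 0, heat)
    return heat / heat.max() if heat.max() > 0 else heat

# Example: two long fixations near a lesion and one brief glance elsewhere.
fixes = [(220, 140, 0.9), (230, 150, 0.7), (40, 300, 0.1)]
importance = fixation_heatmap(fixes, image_shape=(480, 640))
```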

The study asks 16 pairs of dermatologists and PA students to view skin conditions in 50 different images displayed on a monitor. The pairing creates a master-apprentice dynamic.

"If you record the interaction between the master and apprentice while the master is explaining to the apprentice how to do something, it is an excellent way to learn domain knowledge from an expert," Pelz says. "You get something different and better than if you just listen to two doctors talking to each other or a doctor talking to a layperson."

A tracking device attached to the monitor records the physicians' eye movements as they linger on the critical regions of each image. At the same time, vocabulary mined from audio recordings of the physicians' explanations will form the common search words in the database.
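One simple way such transcripts could be turned into search vocabulary is to count the words experts use across their descriptions and keep the recurring clinical terms while filtering out everyday words. The sketch below uses made-up narration snippets, not the project's data or method.

```python
from collections import Counter
import re

STOP_WORDS = {"the", "a", "an", "and", "of", "is", "this", "that",
              "here", "you", "see", "it", "on", "in", "with", "another", "well"}

def mine_vocabulary(transcripts, min_count=2):
    """Extract candidate search terms from expert narration transcripts.

    Keeps lower-cased words that appear at least `min_count` times across
    all transcripts and are not common stop words.
    """
    counts = Counter()
    for text in transcripts:
        words = re.findall(r"[a-z]+", text.lower())
        counts.update(w for w in words if w not in STOP_WORDS)
    return [term for term, n in counts.most_common() if n >= min_count]

# Hypothetical snippets of narration for two skin-lesion images.
transcripts = [
    "Here you see an erythematous plaque with silvery scale on the elbow",
    "Another erythematous plaque, well demarcated, with scale -- classic psoriasis",
]
print(mine_vocabulary(transcripts))   # e.g. ['erythematous', 'plaque', 'scale']
```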

All rights reserved by RxPG Medical Solutions Private Limited ( www.rxpgnews.com )