Students Use Virtual Reality To View Tissue

Virtual reality can take you to Mars, the North Pole, even back in time. But how about inside the kidney of a mouse? Students in Medical Device Design & Innovation (MENG/BENG 404) have developed a virtual reality platform that allows for a closer examination of 3D images from kidney biopsies.

The project became part of the course when co-teachers Dr. Joseph Zinter and Dr. Alyssa Siefert spoke with Dr. Richard Torres, part of a team of researchers who developed a 3D-imaging technique that allowed them to get a better look at the tissue of patients with kidney disease. The technique, known as multiphoton microscopy, produces high-resolution 3D images of the kidney and provides a deep view of the organ's internal structures. Conventional methods provided only superficial images of the organ.

“It has quite a bit of advantages over current techniques,” said Torres, a hematopathologist at Yale University School of Medicine. “There’s less technical labor, it’s faster to process, and you’re able to visualize an entire piece of tissue without consuming any of it.” But Torres and his fellow researchers wondered whether the new images would be better served by a new technology for viewing them. That’s how the idea of developing a virtual reality platform came up. Four students in the class were charged with the task of designing one.

“A lot of 3D visualization programs that exist aren’t tuned well to the uses of pathology,” said team member Alex Ringlein ‘18. “We wanted to make it usable for pathologists.”

Trying to figure out exactly what the pathologists needed was one of the team’s challenges. The four-member team – which also included Acshi Haggenmiller '17, Henry Li '17, and Sachith Gullapalli '17 – developed the technology in the Klingenstein Design Lab of the Center for Engineering Innovation & Design (CEID). They initially thought that allowing users to “scoop out” sections of the 3D data would be the best approach. But the rendering method they used to do this distorted the data.

“After talking to Dr. Torres, it was clear that more important was showing the more orthogonal views in the highest resolution possible,” Ringlein said. “So we made a system where you could scroll through, slice by slice.” 

Other challenges involved interface design: figuring out which control mechanisms and interactions with the data would prove most useful. With motion controllers from the HTC Vive (one of the more popular pieces of VR hardware), the team's platform allows the user to rotate and translate the rendering using multi-touch-like controls. In addition to scrolling through the slices of the block of data, users can modify the transparency, contrast, and coloration of the data within the program.
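The slice-by-slice scrolling and display adjustments described above can be sketched in outline. This is a hypothetical illustration in Python with NumPy, not the team's actual VR code; the function name and parameters are assumptions for the sake of the example:

```python
import numpy as np

def view_slice(volume, index, contrast=1.0, alpha=1.0):
    """Return one slice of a 3D intensity volume with simple
    contrast and transparency adjustments applied.

    volume   -- 3D array of intensities in [0, 1]
    index    -- which slice along the first axis to display
    contrast -- multiplicative contrast factor
    alpha    -- opacity in [0, 1], stored as a fourth channel
    """
    # Scale intensities and keep them in the displayable range.
    slc = np.clip(volume[index] * contrast, 0.0, 1.0)
    # Attach an alpha channel so a renderer can blend the slice
    # with whatever sits behind it in the 3D scene.
    rgba = np.stack(
        [slc, slc, slc, np.full_like(slc, alpha)], axis=-1
    )
    return rgba

# Scrolling "slice by slice" then amounts to stepping the index.
volume = np.random.rand(64, 128, 128)
frame = view_slice(volume, index=10, contrast=1.5, alpha=0.8)
```

In a real viewer the controller input would drive `index`, `contrast`, and `alpha` each frame; the sketch only shows how those adjustments map onto the image data.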

When the team, VR Pathology, gave their final presentation in December, Torres was excited by the device’s potential.

“It’s excellent – it’s fun to use – that’s the first reaction you have, but I think what they’ve been able to develop is something that has the makings of a practical tool,” Torres said. “It has very smooth motion, it’s easy to navigate and I can go through large amounts of visual data, and I get perspectives that I don’t get from 2D slides.”

Ringlein said he’d like the team to pursue the project further. In particular, he wants to do more scientific testing, as well as optimize how the renderings are displayed when working with larger data sets.