Google Glass Application for Physician Assistants
Earlier this year, Rhode Island Hospital in Providence, RI, became the first hospital to test Google Glass in a dermatology setting. Now, other areas of practice are seeing the technology’s usefulness. One telemedicine company, Remedy, is focusing on developing Google Glass applications (apps) for health care providers, specifically physician assistants (PAs). Recently, Remedy launched a 30-day pilot study with three Harvard hospitals in which it provided Google Glass to PAs covering night shifts, allowing them to send videos from their own perspective to remotely located supervising physicians. Researchers said they hope the app will give generalists greater autonomy over their work and more ownership of their cases.
Remedy developed the app for Google Glass to assist health care providers in managing workflow, and the company’s co-founder, Gina Siddiqui, noted that the point-of-view feature of the device in particular could be useful in a field where many health care professionals draw on each other’s expertise to get work done. Siddiqui states, “When I learned about Google Glass, the idea that someone can literally see through your perspective, exactly what you’re seeing, and that they can continue doing their job, with their hands, with their eye contact with the patient, I realized that healthcare is probably the most important place for that technology to be adopted.”
For the study, Remedy disabled most of Google Glass’s off-the-shelf features, set up its own secure server, and authenticated log-ins so that the device would be secure enough to use in a health care setting.
The app is designed not only to make capturing information effortless for the user but also to let him or her immediately send it to the consulting physician with a complete picture of the patient’s current condition: “With a vast majority of [those] appointments, it’s just recapitulating what you said to somebody else. Just getting somebody up to speed with what’s already happened,” Siddiqui says. “What we realized is we have to implement a tool that makes it effortless. Wearable devices, Google Glass, the implementation that we used, that’s the closest to if they look at something, they can capture it. As soon as they capture it we can route it with our intelligent systems to the right expert who needs to see it and give them real time feedback.”
Image: Ted Eytan