Augmented Reality Applications in Surgery
‘Meet your surgeon! Well, technically, he’s half a world away, but he can get you through this.’
Imagine that’s what doctors will be able to say in the near future. With the current breakthroughs in technology, doctors in the US could remotely guide surgical operations in, say, Afghanistan, from the comfort of their homes or offices. This is the miracle modern medicine has been waiting for since its inception. And it’s really amazing.
And yet, there’s one other thing that’s not so amazing: the technology needed to perform remote surgery via augmented reality has been readily available for quite some time, and only now are people starting to use it for medicine. I, for one, refuse to believe that there weren’t enough bright minds out there who could adapt AR for this purpose. So, ultimately, it all comes down to one key element: money.
If you look at the start-up video below and then compare it with, say, that video from Magic Leap, this one will indeed seem amateurish and clunky. I’m going to let you guess what budgets these two projects have. Suffice it to say that Magic Leap is one of the most expensive projects in the AR market today; if you don’t believe me, read our article here.
And yet, this is the great thing about the ambitiously named STAR Project. STAR stands for System for Telementoring with Augmented Reality. Such a project doesn’t need $592 million to get its message through. A simple video and a research paper are enough, and the realization that this new tech can indeed revolutionize medicine and surgery will spread easily.
The STAR project is the child of two renowned universities: engineers and researchers from Indiana University’s School of Medicine and Purdue University worked together to develop it. For this, they used the already existing concept of a telestrator – a device that lets people scribble over video. The researchers adapted this tech so that it could be used through a simple app on a tablet.
Let’s review the scenario presented by the video accompanying the research:
- A long distance reconnaissance unit of the US Army is inspecting what looks like a deserted town in Afghanistan.
- An explosive device goes off (it could be a mine, or maybe they were attacked). Two of the three men are hit. The one who was outside the blast radius hurries back.
- One of the soldiers is badly injured. The remaining soldier needs to operate in order to save the leg of his fallen brother-in-arms, but he has no surgical training.
- The nurse says that they can use remote guidance via a concept much like what STAR proposes.
- The soldier calls a doctor back at base or back in the US. Both set up their tablets. Now the surgeon can see exactly what the situation looks like, and he makes a few markings on the tablet.
- The inexperienced soldier sees the doctor’s markings via augmented reality, directly over the spot on the leg where he needs to operate. He then proceeds to make the incisions.
- The soldier’s leg is saved. After approximately one hour of simple guidance, the surgery – a typical and uncomplicated procedure for an experienced doctor – is successfully completed.
Here is the video that the STAR project has made – don’t mind the poor quality of the animation:
However simple and straightforward this may seem, there are a few issues that can arise. First of all, the tablet and its camera need to be held perfectly in place, right between the patient and the person performing the operation. This means that a robotic arm is needed to keep it steady. As you may already know, surgery is an extremely delicate procedure; even the slightest deviation from the original instruction could have catastrophic consequences.
Second of all, for the same reasons, those performing the operation must not divert their attention from the instructions that appear on the tablet. The tablet also needs to keep the annotations in perfect alignment, which means devising an algorithm that recognizes the scene around each annotation and keeps the annotation locked to it.
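To make that alignment problem concrete, here is a minimal, purely illustrative sketch – not the STAR team’s actual algorithm, and every name in it is hypothetical. The idea: track a handful of landmark points around the operating site from one camera frame to the next, estimate how the scene moved, and shift each annotation by that motion so it stays anchored to the anatomy rather than to the screen. A real system would get the landmark positions from a feature tracker such as optical flow; here they are simply given.

```python
# Hypothetical sketch of annotation anchoring: shift annotations by the
# average motion of tracked landmark points between two camera frames.

def estimate_shift(prev_pts, curr_pts):
    """Average (dx, dy) displacement of the tracked landmarks."""
    n = len(prev_pts)
    dx = sum(c[0] - p[0] for p, c in zip(prev_pts, curr_pts)) / n
    dy = sum(c[1] - p[1] for p, c in zip(prev_pts, curr_pts)) / n
    return dx, dy

def realign_annotation(annotation, prev_pts, curr_pts):
    """Move every vertex of an annotation by the estimated scene motion."""
    dx, dy = estimate_shift(prev_pts, curr_pts)
    return [(x + dx, y + dy) for x, y in annotation]

# Landmarks around the incision site, as seen in two consecutive frames
# (the camera drifted by 3 pixels right and 2 pixels down):
prev_pts = [(100, 100), (140, 100), (120, 160)]
curr_pts = [(103, 102), (143, 102), (123, 162)]

incision_line = [(110, 120), (130, 120)]
print(realign_annotation(incision_line, prev_pts, curr_pts))
# → [(113.0, 122.0), (133.0, 122.0)]
```

A pure translation like this is the simplest possible model; keeping the annotation aligned under rotation, perspective change, and moving hands is exactly what makes the real problem hard.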
Voicu Popescu, the computer science professor from Purdue who helped with STAR, says that the project is still very far from completion. The first thing they desperately need is a way to integrate depth of field into the equation, as current tablet cameras (like the one on a Samsung Galaxy Tab) are really quite poor for the job.
The researchers have also tested this with Google Glass, with similar results.
One other problem stems from the fact that, since the hands need to operate around the tablet, the person performing the operation has less control over what he is doing. His hands may also end up completely covering the camera – which would indeed be counterproductive.
Still, Popescu is hopeful. He and his team have taken an important first step in the right direction, and he says that the data they have so far looks extremely promising. They gathered these results through a series of test operations performed on dummies and on small animals. The ultimate applicability of this system will be in field hospitals, or in rural hospitals that don’t have surgeons skilled enough to complete the more complicated operations.
The question remains: when will we see Google, or any other giant tech company for that matter, buying a start-up like this? The current project was funded by the US Department of Defense – which, I’m sure, did not put too much money into something like this. The fundamental issue with modern technology remains the same: utility is sacrificed for the sake of entertainment.
The precision displayed in the video by Magic Leap is indeed astonishing, and the depth of field is more than premium. Just imagine what a start-up like the STAR Project could do if it had the technological prowess of Google’s recent big investment. Unfortunately, until that day, we can only dream of augmented reality applications in surgery and in medicine in general.