
The Dawning of the Age of the Metaverse

As I slid the headset’s clear lenses down over my eyes, the plastic skull on the table in front of me jumped into technicolor. Several inches beneath its smooth, white cranium, I could see vibrantly colored neurological structures waiting to be punctured by the medical instrument I held in my quivering hand.

Lining up the long, thin catheter with a small hole in the back of the skull, I slowly began pushing its tip downward. Numbers floating off to my right told me exactly how far the instrument was from reaching its goal. My target was highlighted on the right ventricle—one of two large, fluid-filled structures deep inside the brain that keep it buoyant and cushioned. As I carefully adjusted my angle of attack, a lightsaber-like beam extended along the catheter's current trajectory, burning red when its path was off-kilter and glowing green when it was on target.

It’s a good thing I’m not a neurosurgeon and that this was only an augmented reality demonstration, or the patient’s brain would have been stirred like a bowl of ramen. Even the slightest twitch or tremble turned the lightsaber to the dark side, indicating it was off track. 
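Under the hood, that red-or-green judgment comes down to simple geometry: compare the direction the catheter points with the straight line from its tip to the target, and flip the beam's color when the angle between them exceeds some tolerance. Here is a minimal sketch of that alignment test, with a hypothetical beam_feedback helper and a made-up two-degree tolerance, neither of which comes from NeuroLens itself:

```python
import numpy as np

# Illustrative sketch only -- nothing here is taken from NeuroLens.
# It shows the kind of geometry behind a red/green guidance beam.

def beam_feedback(tip, axis, target, tolerance_deg=2.0):
    """Return distance to target, angular error, and a beam color.

    tip    -- 3D position of the catheter tip (mm)
    axis   -- direction the catheter currently points
    target -- 3D position of the highlighted spot on the ventricle
    """
    to_target = target - tip
    distance_mm = np.linalg.norm(to_target)
    # Angle between where the catheter points and where it should point
    cos_angle = np.dot(axis, to_target) / (np.linalg.norm(axis) * distance_mm)
    deviation_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    color = "green" if deviation_deg <= tolerance_deg else "red"
    return distance_mm, deviation_deg, color

# A tip 40 mm above the target, tilted about a degree off axis: still green.
print(beam_feedback(np.array([0.0, 0.0, 40.0]),
                    np.array([0.02, 0.0, -1.0]),
                    np.array([0.0, 0.0, 0.0])))
```

At a tolerance that tight, a hand tremor that tilts the axis by even a couple of degrees is enough to flip the beam red, which matches how unforgiving the demo felt.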

“We had a group of medical students in here a few weeks ago who were making a game of it,” said Maria Gorlatova, the Nortel Networks Assistant Professor of Electrical and Computer Engineering at Duke. “We 3D printed some molds to create a Jell-O–like filling to make it feel more realistic. And even for them it’s a real challenge to get the catheter placed correctly.” 

The system, called NeuroLens, is the brainchild of PhD student Sarah Eom and one of many AR projects underway in Gorlatova's lab. She hopes the tool will one day help neurosurgeons with a relatively common but delicate procedure: placing catheters into the brain's ventricles to drain excess fluid. And not just in practice, but with real patients.

Surgical procedures and other medical uses are among the applications furthest along the development track for augmented reality. While the tasks can be intricate and delicate, they have the advantage of taking place on a stationary object, where a group of cameras can surround the table and keep incredibly close watch on every minuscule movement while relaying the data via a steady computer connection.

But these advantages evaporate even in the relatively controlled environment of one's own home, to say nothing of busy city streets. That keeps the full potential of the Metaverse confined to Hollywood depictions, for the time being. We can imagine hungry people walking down a street, seeing the names, menus and reviews of restaurants hovering over buildings as they pass. Directions are overlaid on the sidewalks, while hazards like fast-moving bikes and cars are highlighted in red. People come home to lights automatically optimized to the task at hand, while cabinets, appliances and shelves provide useful information about their current contents.

These visions of the future may currently be relegated to shows like Black Mirror, but they're not as far away as one might think. "The graphic displays and processing needed for this level of augmented reality interaction are actually already pretty decent. That's not the bottleneck," Gorlatova said. "We just need to develop the supporting infrastructure in terms of mobile algorithms, local processing power and more efficient AI."

Enabling the visionary future of the Metaverse requires research at many levels, ranging from AI-accelerating hardware to localized computing networks that can tackle complex software in the blink of an eye. To discover how researchers are approaching these challenges, and perhaps even to get a glimpse of tomorrow's Metaverse, one need look no further than Duke ECE.

Image: Undergraduate engineering student Seijung Kim is guided through a tricky brain catheterization procedure with the help of an AR experience designed by PhD student Sarah Eom.

Image: VR setup, with cameras detecting movement around the subject.