Just because you can doesn't mean you should. These are wise words that feel incredibly relevant to the current pursuit of hyper-realistic VR experiences—specifically the rise of smellovision. While developers strive for total immersion, we have to ask: would you really want accurate odours piped through your headset?

A group of researchers is now answering the question of whether this is even possible. They have developed a new approach that uses ultrasound to stimulate the scent-processing part of the brain. Remarkably, they claim this technology could function not just for scent, but as a highly efficient method for inputting data directly into the human brain.

The Science Behind Ultrasound Smellovision

The researchers achieved these results by pointing an ultrasound probe at the scent-processing region of the brain to trigger different sensations. According to their findings, different focal spots corresponded to specific smells. They have already successfully replicated this on two individuals and validated it through a blind trial.

To achieve this level of precision, the team had to navigate some anatomical challenges:

  • Targeting the olfactory bulb: The team placed a transducer on the forehead, aiming the ultrasound downward toward the olfactory bulb.
  • Overcoming sinus interference: While the frontal sinuses can weaken the signal, careful positioning above the sinuses still allows them to reach the target region.
  • Precision tuning: The process involves using an MRI scan of the subject's skull to optimize the placement of the ultrasound pad.
  • Signal parameters: Researchers adjust frequency, focal depth, and pulsed output to refine the experience.
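To give a feel for why millimetre-scale targeting is plausible at all, the steps above can be sketched as a small parameter bundle. The field and value names below are illustrative placeholders, not figures from the study; the one real physical relation used is that wavelength = speed of sound / frequency, which sets a lower bound on how tightly a focal spot can be squeezed:

```python
from dataclasses import dataclass

SPEED_OF_SOUND_TISSUE_M_S = 1540.0  # typical value for soft tissue


@dataclass
class UltrasoundConfig:
    """Hypothetical parameter bundle for one focused-ultrasound session."""
    frequency_hz: float       # carrier frequency of the transducer
    focal_depth_mm: float     # distance from transducer face to target
    pulse_duration_ms: float  # length of each pulsed burst
    duty_cycle: float         # fraction of time the output is on

    @property
    def wavelength_mm(self) -> float:
        # Wavelength bounds the achievable focal-spot size, which is
        # what makes millimetre-scale targeting physically plausible.
        return SPEED_OF_SOUND_TISSUE_M_S / self.frequency_hz * 1000.0


# Example values chosen for illustration only.
cfg = UltrasoundConfig(frequency_hz=500_000, focal_depth_mm=60.0,
                       pulse_duration_ms=30.0, duty_cycle=0.3)
print(f"wavelength: {cfg.wavelength_mm:.2f} mm")  # ~3.08 mm at 500 kHz
```

At a 500 kHz carrier the wavelength in tissue is about 3 mm, which is the same order as the spacing the researchers report between adjacent scent regions.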

So far, the replicated smells achieved through this smellovision method include "fresh air, with a lot of oxygen," "the smell of garbage, like few-day-old fruit peels," and "a campfire smell of burning wood." The researchers noted that the olfactory bulb is incredibly densely packed: the distance between "freshness" and "burning" was only about 3.5 mm.

Writing Data to the Brain via Scent

Beyond just smelling a campfire, the researchers make a more radical claim: smellovision could serve as a neural interface. Because the nose has roughly 400 distinct receptor types, they suggest, these could serve as channels for writing information directly into the brain through non-invasive neuromodulation.

The scale of this potential data transfer is staggering. Because we have two nostrils, the olfactory system could potentially allow for writing up to 800 dimensions of data into the brain. This is comparable to the dimensionality of the latent spaces used by large language models (LLMs), meaning that, in principle, a single paragraph could be encoded as a 400-dimensional vector.

The researchers suggest that if a user learns to associate specific input patterns with meanings, they could eventually "directly smell the latent space." While the idea of using scent to understand complex semantics is currently speculative, it touches on the concept of induced synesthesia.
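As a purely illustrative sketch of the idea, the pipeline would look something like "embed text into a vector, then squash each coordinate into an activation level for one receptor channel." Everything below is hypothetical: `toy_embedding` is a deterministic stand-in for a real embedding model, and no actual mapping from vector coordinates to olfactory receptors is known:

```python
import hashlib
import math

N_RECEPTOR_TYPES = 400  # per nostril, per the researchers' figure


def toy_embedding(text: str, dim: int = N_RECEPTOR_TYPES) -> list[float]:
    """Deterministic stand-in for a real text-embedding model:
    hashes the text into `dim` pseudo-random values in [-1, 1]."""
    values = []
    for i in range(dim):
        digest = hashlib.sha256(f"{i}:{text}".encode()).digest()
        # Map the first 4 bytes of the digest to a float in [-1, 1].
        values.append(int.from_bytes(digest[:4], "big") / 2**31 - 1.0)
    return values


def to_receptor_intensities(vec: list[float]) -> list[float]:
    """Squash each embedding coordinate through a sigmoid into a
    [0, 1] 'activation' for one hypothetical receptor channel."""
    return [1.0 / (1.0 + math.exp(-v)) for v in vec]


intensities = to_receptor_intensities(toy_embedding("a campfire smell"))
print(len(intensities))  # 400 channels, one per receptor type
```

Learning to "directly smell the latent space" would then amount to a user internalising which activation patterns correspond to which meanings, much as synesthetes associate sounds with colours.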

The Internet's Visceral Reaction

Naturally, the mere suggestion of this technology has caused some observers to recoil. The internet's reaction highlights the nightmare potential of bringing real-world smells into digital spaces. One commenter on X described a terrifyingly plausible scenario:

"Walking into a VR chat lobby and immediately projectile vomiting as your headset psychically projects the scent of yeast-infected rotting fish from the pregnant fox OC on the other side of the room while it tanks your framerate."

This sense of dread isn't limited to smell. This technology sits alongside other questionable pursuits in VR fidelity, such as an electronic tongue designed to simulate tastes like fish soup, and devices that use electrical pulses on the neck to induce sensations of motion. Whether these advancements represent the future of gaming or a recipe for sensory overload is up to you to decide.