Engineering Sensory Landscapes in Synthetic Space

The use of virtual environments has opened up a world of human sensory engineering and enhancement. Immersive experiences captured the public's imagination long before they became feasible; now, with sensory input and output devices widely available, new realities can be constructed almost without limit. It seems that any imaginable concept can be brought to life and experienced in one form or another.

The concept is rooted in the ability of a computer to generate a 3D visual representation from numerical information. This data may come from a mathematical model of a real-world scenario, such as geometric objects positioned in a specific area, but it can equally be used to visualize abstract multivariate data, such as a ten-dimensional vector series tracking an economy over time. Using visual and auditory output devices, the human operator can experience the environment as if it were part of the real world. Because input devices sense the operator's reactions and motions, the operator can modify the synthetic environment, creating the illusion of interacting with it and being immersed within it. Although early simulated environments lacked immersion, newer and more accessible technologies are driving advances on multiple levels, and we can expect eventual wide adoption.
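To make the "abstract multivariate data" case concrete, here is a minimal sketch of how a ten-dimensional series might be reduced to three coordinates for rendering as a path in a synthetic space. The projection method (principal components via SVD) and the random "economic indicator" series are illustrative assumptions, not part of any particular system described in this article.

```python
import numpy as np

def project_to_3d(series: np.ndarray) -> np.ndarray:
    """Project an (n_samples, n_dims) multivariate series onto its
    three principal components so it can be rendered as a 3D path."""
    centered = series - series.mean(axis=0)
    # SVD of the centered data: rows of vt are the principal axes.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:3].T  # coordinates along the top 3 axes

# A hypothetical 10-dimensional "indicator" series over 200 time steps.
rng = np.random.default_rng(0)
series = rng.normal(size=(200, 10)).cumsum(axis=0)
path = project_to_3d(series)
print(path.shape)  # (200, 3)
```

The resulting `path` array can be handed to any 3D renderer; the point is only that arbitrary numerical data, not just geometric models, can drive the visual representation.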

Gaps remain to be filled, and the enabling technologies must still be created by the science of synthetic environments. One crucial field is "sensory engineering", a collection of disciplines whose objective is to help us understand what makes a synthetic environment compelling. This article discusses various models and schemes of sensory engineering and the enhancement of sensory landscapes in synthetic space.


SENSORY ENGINEERING

Sensory engineering is the science of synthetic environments. It uses computer technology to create 3D worlds from a combination of real and mathematical data. This allows a human operator to experience an immersive environment; when the illusion is complete, the user's senses are guided into perceiving the simulated world as reality. To achieve this level of immersion, all sensory inputs must be convincing despite being computer-generated, and the computer must detect any behavioral responses from the user and feed them back into the synthetic environment.

Synthetic space refers to an artificial, digital, or virtual environment created using technology. These spaces combine software, hardware, and sensory inputs to create immersive, interactive experiences that can simulate real-world environments or generate entirely new, imaginative landscapes. Synthetic spaces are typically experienced through virtual reality (VR), augmented reality (AR), and mixed reality (MR) headsets, as well as through computer screens and other displays.

Sensory engineering is used in synthetic spaces to create immersive, interactive environments that stimulate the human senses across several modalities:

  1. Visual augmentation:

    • Virtual reality (VR): Creating immersive visual environments that can simulate real or imaginary worlds.

    • Augmented reality (AR): Overlaying virtual objects onto the real world to enhance visual experiences.

    • Holography: Creating three-dimensional images in synthetic spaces.

  2. Auditory augmentation:

    • 3D Audio: Creating spatial audio environments that simulate real-world soundscapes.

    • Audio synthesis: Generating synthetic sounds and music that enhance auditory experiences.

  3. Haptic augmentation:

    • Haptic feedback: Providing tactile feedback through devices like gloves or suits to simulate the sense of touch.

    • Vibrotactile feedback: Using vibration to convey information or enhance sensory experiences.

  4. Olfactory augmentation:

    • Digital scent technology: Emitting synthetic scents to create olfactory experiences in synthetic spaces.

  5. Gustatory augmentation:

    • Digital taste technology: Simulating tastes using electrical or chemical stimulation.

  6. Proprioceptive augmentation:

    • Motion capture and feedback: Using sensors to capture the user's movements and feed them back, enhancing the sense of position and movement in synthetic spaces.

  7. Multisensory integration:

    • Combining multiple sensory modalities to create rich and immersive experiences that engage all the human senses.
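As a small, concrete illustration of the auditory modality above, the simplest ingredient of 3D audio is placing a sound on the left-right axis via interaural level differences. The sketch below is a toy constant-power panner, assuming a 48 kHz sample rate; a full 3D audio system would also add interaural time delays and HRTF filtering.

```python
import numpy as np

def pan_stereo(mono: np.ndarray, azimuth_deg: float) -> np.ndarray:
    """Constant-power pan: place a mono signal on the left-right axis.

    azimuth_deg: -90 (hard left) .. +90 (hard right). This models only
    the interaural level difference, one cue among several used by
    real spatial-audio engines.
    """
    theta = (azimuth_deg + 90.0) / 180.0 * (np.pi / 2)  # map to 0..pi/2
    left, right = np.cos(theta), np.sin(theta)          # left^2+right^2 == 1
    return np.stack([mono * left, mono * right], axis=1)

# 0.1 s of a 440 Hz tone, panned toward the listener's right.
tone = np.sin(2 * np.pi * 440 * np.arange(4800) / 48000)
stereo = pan_stereo(tone, azimuth_deg=45)
```

Because the pan is constant-power, the perceived loudness stays roughly the same as the source moves across the stereo field.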

Sensory engineering encompasses varying levels of immersion. At the low end, data can be visualized on a computer screen while the operator (viewer) interacts with, modifies, and reorients the graphics, which offers a partially immersive experience.

 

Figure 1: Sensory systems process environmental data such as light, sound, heat, mechanical forces, and chemical signals, generating measurable observations of the surroundings. Our own biological neural networks convert these observations into internal perceptions of the environment. These perceptions are further processed into information that aids in decision-making. Certain decisions may lead to changes in behavior or reactions to modify the environment. [source]

 

This model can be modified by inserting computational devices between the human observer and the environmental data. The computer mediates every interaction between the human and the environment, forming a layer called the sensory interface, which generates signals for the biological sensors that mimic the natural environment (Fig. 1). Devices such as head- and eyeball-tracking tools, position and orientation detectors, and data gloves register the actions of the human observer so that the environment can be modified, changing the relations depicted in Fig. 2.

The synthetic environment is created from physical measurements and from models that generate artificial or mathematical data about a geometric setting. This data is then converted into information used to generate biologically sensed signals. Likewise, the behavioral data collected from the human observer can either be transformed to modify the synthetic environment or used to control servo-mechanisms that act on the real environment. Fig. 2 does not depict the internal workings of the human observer, but how cognitive processing influences biologically sensed data is the subject of active research and is also an important aspect of sensory engineering.

Figure 2: Connections between the human participant and the simulated environment. [source]

Ideally, the observer would not be able to discern any difference between the physical world shown in Fig. 1 and the synthetic setting shown in Fig. 2. Component technology is approaching the level at which a simulated environment could pass a form of Turing test for perceptual authenticity, but simplified data transmission (for example, limiting sensory output to light and sound, a kind of "reduced realism") can also suffice for some applications.

This type of scheme is used to build Virtual Reality as a system that replaces the physical environment with computer-generated elements. In this setup, the human observer has no interaction with the physical environment's modules, sensors, or transducers. The goal of true virtual reality is complete immersion, where the experience is indistinguishable from a real-world one; if the virtual environment violates perceived natural laws, however, confusion arises. This model serves as the basis for data-visualization applications, though it offers varying levels of immersion.
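The closed loop described above (tracker senses the observer, the sensory interface re-renders the world) can be sketched in a few lines. Everything here is illustrative: the `HeadPose` type, the single yaw angle, and the one-landmark "world" stand in for a real tracker and renderer.

```python
import math
from dataclasses import dataclass

@dataclass
class HeadPose:
    yaw: float  # radians, as reported by a hypothetical head tracker

def render_view(world_bearing: float, pose: HeadPose) -> float:
    """Return the apparent bearing of a landmark relative to the head.

    This stands in for the sensory-interface layer: the same world
    datum is re-presented every frame according to the observer's pose.
    """
    return world_bearing - pose.yaw

# One trip around the loop: the observer turns, the display compensates,
# so the landmark appears fixed in the world rather than on the screen.
landmark = math.pi / 4                          # fixed world bearing
frames = [HeadPose(yaw=0.0), HeadPose(yaw=math.pi / 8)]
apparent = [render_view(landmark, p) for p in frames]
```

This compensation, run fast enough, is what makes the observer perceive a stable world rather than a moving picture; when the loop lags or violates expectations, the confusion mentioned above results.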


Sensory Enhancement

Because sensory engineering is used to create human sensory experiences, it also applies to enhancing or augmenting them. This is done by manipulating and augmenting sensory input, which improves the quality of human experience in a variety of contexts. Here are some ways sensory engineering can be used for sensory enhancement:

  1. Wearable devices — Devices like smart glasses or augmented reality (AR) headsets can enhance visual and auditory sensory input, providing additional information or enhancing what is seen or heard in the natural environment. Haptic feedback suits or gloves can simulate touch sensations to enhance the feeling of presence in virtual environments.

  2. Sensory substitution — Sensory substitution involves using one sensory modality to replace or augment another. For example, a device might convert visual information into auditory signals for visually impaired individuals.

  3. Neurostimulation — Transcranial magnetic stimulation (TMS) or transcranial direct current stimulation (tDCS) can be used to stimulate specific areas of the brain to enhance cognitive functions, such as memory or attention. Deep brain stimulation (DBS) can be used to treat various neurological disorders and potentially enhance certain sensory functions.

  4. Biofeedback — Biofeedback involves the use of sensors and feedback to help individuals gain control over certain physiological functions, such as heart rate or muscle tension. This can be used to enhance sensory awareness and control.

  5. Prosthetics and implants — Advanced prosthetic devices can provide sensory feedback to users, enhancing their ability to interact with their environment. Cochlear implants can restore hearing in individuals with profound deafness.

  6. Sensory augmentation for learning and training — Sensory augmentation can be used in education and training to enhance learning experiences. For example, virtual reality (VR) can provide immersive learning environments that stimulate multiple senses simultaneously.

  7. Sensory enhancement in entertainment — Sensory engineering can be used to create more immersive and interactive entertainment experiences, such as 4D movie theaters that include sensory effects like scent and touch.

  8. Medical applications — Sensory engineering can be used to develop new therapies and interventions for individuals with sensory impairments or disorders, such as sensory processing disorders or autism.
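The sensory-substitution idea in item 2 can be made concrete with a toy sonifier in the spirit of vision-to-audio systems such as The vOICe: each image row maps to a pitch and each pixel's brightness to loudness. The frequency range, duration, and sample rate below are arbitrary choices for illustration.

```python
import numpy as np

def image_column_to_tone(column: np.ndarray, duration_s: float = 0.05,
                         sr: int = 16000) -> np.ndarray:
    """Sonify one image column: row index -> pitch, brightness -> loudness.

    Top rows map to higher frequencies, so a bright pixel near the top
    of the column is heard as a single high tone.
    """
    t = np.arange(int(sr * duration_s)) / sr
    n_rows = len(column)
    # Spread rows over 200 Hz .. 3200 Hz, top row = highest pitch.
    freqs = np.geomspace(200, 3200, n_rows)[::-1]
    tone = sum(b * np.sin(2 * np.pi * f * t) for b, f in zip(column, freqs))
    return tone / max(n_rows, 1)  # keep the amplitude bounded

# A column with one bright pixel near the top -> one high-pitched tone.
col = np.zeros(8)
col[1] = 1.0
audio = image_column_to_tone(col)
```

Sweeping such columns across an image left to right yields a soundscape a trained listener can learn to interpret as a crude visual scene.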

TYPES OF ENHANCEMENT

The technique of sensory enhancement applies algorithms to sensory data from the real world. It incorporates no artificial data and no observer behavior that alters the environment; instead, signal processors manipulate the signals from physical sensors through filtering and enhancement (Figure 3B), with the synthetic data sources and transducer modules removed. In one such system, a visually impaired individual wears a device equipped with television cameras that sense visible light. The camera image is processed with methods such as edge detection and contrast enhancement, pre-compensating for the wearer's visual degradation: when the observer's impaired vision acts on the enhanced image, the result approximates a normal visual experience.
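A minimal numpy sketch of the camera-side processing just described might look as follows. The specific steps (full-range contrast stretch plus a finite-difference edge overlay) are one plausible pipeline, assumed for illustration; a real device would tune the enhancement to the wearer's particular deficit.

```python
import numpy as np

def enhance(image: np.ndarray) -> np.ndarray:
    """Contrast-stretch an image, then boost its edges."""
    # 1. Contrast stretch to the full 0..1 range.
    lo, hi = image.min(), image.max()
    stretched = (image - lo) / (hi - lo) if hi > lo else image * 0.0
    # 2. Simple gradient-magnitude edge map (finite differences).
    gy, gx = np.gradient(stretched)
    edges = np.hypot(gx, gy)
    # 3. Overlay the edges to sharpen boundaries, clip back to 0..1.
    return np.clip(stretched + edges, 0.0, 1.0)

# A low-contrast step becomes a full-range image with a crisp boundary.
img = np.full((16, 16), 0.4)
img[:, 8:] = 0.6
out = enhance(img)
```

Run per frame between the camera and the display, this is exactly the "signal processor" box of Figure 3B: physical sensor data in, enhanced sensory signal out, with no synthetic data added.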

The physical sensors module can also incorporate a polarization sensor, which translates polarization data into forms that human biological signal processors can detect, such as the color and intensity of visible light. This allows the observer to perceive polarized light, which is not accessible through normal sensation. The same approach suggests other uses, such as converting visible light into auditory information, potentially giving blind individuals a form of virtual sight. While this kind of sensory enhancement does not digitally manipulate the physical environment, the system must still detect user actions (such as head and eye movement) in order to adjust the input signals accordingly.
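One plausible (and entirely illustrative, not standard) encoding for the polarization example: map the polarization angle to hue and the degree of polarization to saturation, so that unpolarized light renders as white and strongly polarized light as a vivid color.

```python
import colorsys

def polarization_to_rgb(angle_deg: float, dop: float) -> tuple:
    """Map a polarization measurement to a displayable color.

    angle_deg: polarization angle (periodic in 180 degrees);
    dop: degree of polarization, 0 (unpolarized) .. 1 (fully polarized).
    Angle drives hue, degree of polarization drives saturation.
    """
    hue = (angle_deg % 180.0) / 180.0
    sat = max(0.0, min(dop, 1.0))
    r, g, b = colorsys.hsv_to_rgb(hue, sat, 1.0)
    return (r, g, b)

white = polarization_to_rgb(37.0, 0.0)   # unpolarized -> white
vivid = polarization_to_rgb(90.0, 1.0)   # fully polarized -> saturated hue
```

Applied per pixel to a polarization camera's output, this kind of mapping gives the observer a channel of information that unaided human vision simply does not carry.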

Figure 3: Types of sensory enhancement in schemes:

A. Virtual reality

B. Sensory enhancement

C. Environmental overlays

D. Virtual workstation

[source]

Conclusion

The development and implementation of sensory engineering in synthetic spaces opens up a world of possibilities for augmenting and enriching our experiences across many domains. By leveraging virtual and augmented reality, haptic feedback, 3D audio, and other sensory augmentation technologies, we can enhance human sensory capabilities and even compensate for sensory deficits. The integration of sensory engineering in synthetic spaces is transforming how we interact with our environment, learn new information, and engage with reality, or rather, realities. As this interdisciplinary field evolves, we can expect further advancements that will redefine our perception of reality and unlock new interfaces for human perception.


References:

Schematic diagrams

Title image credits: Samer Fouad