Multimodal Interactive VR Simulations

by emosul26 in Design > Software


Our VR simulation system integrates multiple input devices to create an intuitive and immersive user experience, centered primarily around tangible controllers. These controllers, positioned on a table, are designed to be picked up and manipulated by the user to interact directly with the simulation. Each device is equipped with an accelerometer to capture motion data, which is transmitted via Bluetooth to a central computer. This data is then processed in real time to drive interactions within the virtual environment.

The controllers themselves are still in the design phase, but they will feature custom 3D-printed enclosures to securely house internal components. To enhance durability and usability, additional protective materials such as foam may be incorporated, improving both grip and resistance to impact.

Beyond physical controllers, the system also incorporates vision- and voice-based inputs. A camera will capture user movements, with supporting software translating hand gestures into spatial commands for navigation and interaction. In parallel, a microphone will enable voice commands, allowing users to adjust environment variables and issue direct instructions within the simulation.

All inputs are unified within a Unity-based framework, which renders the virtual environment and delivers the final output to a VR headset, providing a seamless and interactive experience.

Supplies

  1. Bluetooth Receiver/Transmitter Microcontrollers (x2)
     - Model: Arduino Nano ESP32
     - Link: https://store.arduino.cc/products/nano-esp32
     - I/O Voltage: 3.3V
     - Pins: 8 analog, 11 digital
  2. Accelerometers (x2)
     - Model: Adafruit LSM303
     - Link: https://www.adafruit.com/product/1120
     - Input: 5V
     - Connections: VCC, Ground, SCL, SDA
     - Required libraries: Adafruit_LSM303_Accel, Adafruit_LSM303DLH_Mag, Adafruit BusIO, Adafruit Unified Sensor
  3. Microphone
     - Model: Blue Yeti (owned)
     - Link: https://www.sweetwater.com/store/detail/YetiBlk--blue-microphones-yeti-multi-pattern-usb-condenser-microphone-blackout
     - Input: 5V DC (150 mA)
     - Connection: USB
  4. USB Cable (microcontroller)
     - Used for programming and power
  5. Camera
     - Model: Intel RealSense D415
     - Link: https://www.realsenseai.com/products/stereo-depth-camera-d415/
     - Input: 5V (700 mA)
     - Connection: USB-C
  6. USB-C Cable (camera)
     - Used for data and power
  7. VR Headset
     - Model: HTC Vive
     - Link: https://www.vive.com/us/
     - Output: 12V (7.5 W)
  8. Haptics (optional / future work)
     - Output components for tactile feedback (TBD)

Build the Circuit


Step 1: Gather Components

  1. Arduino Nano ESP32
  2. Adafruit LSM303 (x2)
  3. Breadboard
  4. Jumper wires (male–male)
  5. USB cable (for power + programming)

Step 2: Place the Microcontroller

  1. Insert the Arduino Nano ESP32 into the breadboard so it straddles the center channel and each row of pins sits in its own set of breadboard rows.
  2. Ensure the board is centered and stable.

Step 3: Connect Power Rails

  1. Connect the 3.3V pin from the Nano ESP32 to the positive rail of the breadboard.
  2. Connect a GND pin to the negative rail of the breadboard.

This will distribute power to all components.

Step 4: Wire First LSM303 Accelerometer

  1. Place the first Adafruit LSM303 on the breadboard.
  2. Make the following connections:
     - VCC → 3.3V rail
     - GND → GND rail
     - SCL → SCL pin on Nano ESP32
     - SDA → SDA pin on Nano ESP32

Note: The LSM303 communicates with the Nano ESP32 over I²C. On the Arduino Nano ESP32 the default I²C pins are:

  1. SDA → A4
  2. SCL → A5

(GPIO21/GPIO22 are the I²C defaults on classic ESP32 boards, not on the Nano ESP32.)
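If the sensor does not show up in the next step, a standard I²C scanner sketch will confirm the wiring: it reports every device that responds on the bus, and the LSM303's accelerometer typically answers at address 0x19 with the magnetometer at 0x1E. This is a generic diagnostic sketch, not part of the project repository.

```cpp
// I2C bus scanner: prints the address of every responding device.
// With the LSM303 wired correctly you should see 0x19 (accelerometer)
// and 0x1E (magnetometer).
#include <Wire.h>

void setup() {
  Serial.begin(115200);
  Wire.begin();  // uses the Nano ESP32's default I2C pins (A4/A5)
  for (byte addr = 1; addr < 127; addr++) {
    Wire.beginTransmission(addr);
    if (Wire.endTransmission() == 0) {
      Serial.print("Found device at 0x");
      Serial.println(addr, HEX);
    }
  }
  Serial.println("Scan complete.");
}

void loop() {}
```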

Step 5: Test First Accelerometer

  1. Upload the test code from the project repository (arduinoAccel.ino).
  2. Open the Serial Monitor.
  3. Verify that acceleration and magnetometer data are being printed.


Repeat for the second accelerometer.
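For reference, here is a minimal test sketch in the style of the Adafruit library example. The repository's arduinoAccel.ino may differ (for instance, it likely also reads the magnetometer and transmits over Bluetooth).

```cpp
// Minimal LSM303 accelerometer test (based on the Adafruit library example).
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_LSM303_Accel.h>

Adafruit_LSM303_Accel_Unified accel = Adafruit_LSM303_Accel_Unified(54321);

void setup() {
  Serial.begin(115200);
  if (!accel.begin()) {  // defaults to the board's I2C pins (A4/A5 on the Nano ESP32)
    Serial.println("No LSM303 detected - check wiring!");
    while (1) delay(10);
  }
}

void loop() {
  sensors_event_t event;
  accel.getEvent(&event);  // accelerations in m/s^2
  Serial.print("X: ");  Serial.print(event.acceleration.x);
  Serial.print("  Y: "); Serial.print(event.acceleration.y);
  Serial.print("  Z: "); Serial.println(event.acceleration.z);
  delay(100);
}
```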

Soldering


Step 1: Gather Tools and Materials

  1. Arduino Nano ESP32
  2. Adafruit LSM303 (x2)
  3. Soldering iron (fine tip recommended)
  4. Solder (rosin-core, ~0.6–0.8 mm)
  5. Header pins (male or female, depending on design)
  6. Helping hands or PCB holder
  7. Wire cutters / strippers
  8. Safety glasses

Step 2: Decide Your Layout

Before soldering anything:

  1. Decide whether you want:
     - Removable components → use female headers
     - Permanent wiring → solder wires directly

For most projects, using male headers on the boards + jumper wires gives flexibility.

Step 3: Solder Header Pins to the LSM303 Boards

  1. Insert header pins into the breadboard (to hold them straight).
  2. Place the LSM303 board on top of the pins.
  3. Heat one pin and apply a small amount of solder until it flows into the joint.
  4. Repeat for all pins (VCC, GND, SDA, SCL).

Tip:

  1. Heat the pad and pin together, then apply solder
  2. Don’t blob solder—aim for a small cone shape

Step 4: Solder Header Pins to the Nano ESP32

  1. Insert header pins into the breadboard.
  2. Place the Nano ESP32 on top.
  3. Solder each pin carefully.

Check alignment before soldering all pins—if it's crooked, reheat and adjust.

Step 5: Prepare Wires (If Not Using Breadboard)

If you’re moving off a breadboard:

  1. Cut wires to appropriate lengths.
  2. Strip ~3–5 mm of insulation from each end.
  3. Tin the wires (apply a small amount of solder to the exposed wire).

Step 6: Solder Wires Between Components

For each LSM303:

  1. VCC → 3.3V (Nano)
  2. GND → GND (Nano)
  3. SDA → SDA (Nano)
  4. SCL → SCL (Nano)

Process:

  1. Heat the pad/pin on the board.
  2. Insert the tinned wire.
  3. Apply a small amount of solder.
  4. Hold still for ~1–2 seconds while it cools.

Keep wires short but not strained.

Step 7: Inspect Your Work

Check for:

  1. Cold joints (dull or cracked solder)
  2. Solder bridges (accidental connections between pins)
  3. Loose wires

If needed:

  1. Reheat and fix joints
  2. Use desoldering braid for cleanup

Step 8: Power and Test

  1. Plug in the Nano via USB.
  2. Upload your test code (same as breadboard stage).
  3. Confirm both accelerometers are detected.

Step 9: Optional Reinforcement

For durability:

  1. Add heat shrink tubing over connections
  2. Use hot glue or epoxy to secure wires
  3. Mount inside a 3D-printed enclosure


3D Model

Print the Shell

  1. Load filament into your 3D printer.
  2. Start the print and monitor the first few layers to ensure proper adhesion.
  3. Once complete, remove the parts and clean up any supports or rough edges.

Assemble the Electronics

  1. Place your soldered microcontroller and accelerometer setup into the lower compartment of the shell.
  2. Ensure the board sits securely and that wires are not pinched.
  3. Attach the top cover (using screws, clips, or adhesive depending on your design).

Final Check

  1. Confirm all ports are accessible
  2. Shake lightly to ensure nothing moves inside
  3. Power on and verify everything still works


Setup Unity Project

# Unity integration


Copy the Unity scripts from the repository into a Unity project, then:


1. Create an empty `GameObject` named `VrInteractionBridge` and add `VrInteractionBridge`.

2. Create another empty `GameObject` named `TestBallSimulation` and add `TestBallSimulationBehaviour`.

3. Create a Quad (or world-space panel) named `CameraPanel` and add `CameraFrameDisplay`.

4. Assign `VrInteractionBridge` and the panel `Renderer` in `CameraFrameDisplay`.

5. Position `CameraPanel` in front of the XR camera rig and scale it to taste.

6. Press Play in Unity.

7. Run the .NET host app from this repository. By default it streams UDP packets to `127.0.0.1:7777`.

8. If you want live camera imagery on the panel, set `VRI_CAMERA_SEND_FRAMES=true`. Gesture-only mode is the default.


## Three-body lab setup


1. Create an empty `GameObject` named `ThreeBodySimulation` and add `ThreeBodySimulationBehaviour`.

2. Assign `VrInteractionBridge` in the component.

3. Optional: assign custom body materials for default and selected body highlighting.

4. Start the host. Note that `fluid-lab` is the default simulation, so select the three-body lab explicitly via the `VRI_SIMULATION` environment variable.

5. Set two controller names with `VRI_BLE_DEVICE_NAMES=ESP32-Sensor-L,ESP32-Sensor-R`.

6. Keep `VRI_ENABLE_BLE_INPUT=true` and `VRI_ENABLE_SPEECH_INPUT=true`.

7. Enable camera gestures with `VRI_ENABLE_CAMERA_INPUT=true` and `VRI_ENABLE_GESTURE_CONTROL=true`.

8. Provide MediaPipe hand assets under `models/mediapipe/mediapipe/modules/...` or set `VRI_MEDIAPIPE_ASSET_ROOT` to the folder that contains the `mediapipe` directory.

9. To fetch them automatically on Windows PowerShell, run `./scripts/setup-mediapipe-assets.ps1` from the repository root.

10. The default host settings expect these files for the selected model complexity:

- `mediapipe/modules/palm_detection/palm_detection_full.tflite` or `palm_detection_lite.tflite`

- `mediapipe/modules/hand_landmark/hand_landmark_full.tflite` or `hand_landmark_lite.tflite`

- `mediapipe/modules/hand_landmark/handedness.txt`

11. Optional: add `-Variant lite` or `-Variant both` to the script if you want the lighter model or both variants available.

12. Optional: if you also want the compressed camera image stream, set `VRI_CAMERA_SEND_FRAMES=true`.

13. Optional speech quality defaults:

- `VRI_WHISPER_MODEL=SmallEn`

- `VRI_WHISPER_QUANTIZATION=Q5_1`

- `VRI_WHISPER_LANGUAGE=en`


`targetCamera` is optional. In an XR-only scene, leave it empty and assign `cameraRigRoot` to the XR rig object if you want the script to find the child headset camera and disable gravity on the rig.

`ThreeBodySimulationBehaviour` avoids moving the XR headset camera when VR is active or when the assigned/found camera is part of an XR-tracked rig. If you want the orbit camera in desktop mode, leave `controlSceneCamera` enabled and assign a separate non-XR spectator camera instead of the headset camera.

If your XR rig has a `Rigidbody`, the script disables `useGravity` on rigidbodies found on the rig root and its children by default so the headset rig does not fall under physics.


The simulation will spawn 3 bodies by default, visualize both controllers as force markers, render a floating status label for gravity / speed / scale / freeze state, and respond to voice commands.

Gesture controls: pinch to select and move a body, and open-hand pan to orbit the scene camera.


## Fluid lab setup


1. Create an empty `GameObject` named `FluidSimulation` and add `FluidSimulationBehaviour`.

2. Assign `VrInteractionBridge` in the component.

3. Optional: assign custom water and boat materials.

4. Start the host. `fluid-lab` is the default simulation, so no extra configuration is needed; you can also set `VRI_SIMULATION=fluid-lab` explicitly.

5. Keep `VRI_ENABLE_BLE_INPUT=true` and `VRI_ENABLE_SPEECH_INPUT=true` for the full interactive experience.

6. Keep `VRI_ENABLE_CAMERA_INPUT=true` and `VRI_ENABLE_GESTURE_CONTROL=true` if you want live gesture input.

7. The floating status label shows boat count, wave amplitude, wave speed, and stir intensity.

8. Supported speech commands include:

- `add boat`, `spawn boat`

- `remove boat`, `remove boat 2`

- `generate waves`, `make waves`, `calm water`

- `stir water`, `shake water`

- `increase wave speed`, `decrease wave speed`

- `set wave amplitude to 0.5`

- `pause`, `resume`, `reset`

9. Boat objects render as floating rectangles on a dynamic water plane.

10. Shake the controller to stir the water and drive stronger wave motion.


## Black hole lab setup


1. Create an empty `GameObject` named `BlackHoleSimulation` and add `BlackHoleSimulationBehaviour`.

2. Assign `VrInteractionBridge` in the component.

3. Optional: assign custom body materials for default and selected body highlighting.

4. Start the .NET host with `VRI_SIMULATION=black-hole-lab` to select the Black Hole Orbit Lab.

5. Keep `VRI_ENABLE_BLE_INPUT=true` and `VRI_ENABLE_SPEECH_INPUT=true` for the full interactive experience.

6. If you want gesture-driven camera control and hand input, also keep `VRI_ENABLE_CAMERA_INPUT=true` and `VRI_ENABLE_GESTURE_CONTROL=true`.

7. The floating status label shows black hole mass, spin, event horizon radius, ergosphere radius, and metric values.

8. Supported speech commands include:

- `pause`, `resume`, `reset`

- `add orbiting body`, `spawn orbit`

- `remove ball`, `remove ball 2`

- `freeze`, `unfreeze`

- `show event horizon`, `hide event horizon`

- `increase gravity`, `decrease gravity`, `set gravity to 2`

- `increase speed`, `decrease speed`, `set speed to 1.5`

- `set black hole mass to 8`, `increase black hole spin`, `set black hole spin to 0.5`

9. The simulation renders a central black hole with orbiting bodies, motion trails, and an event horizon sphere when enabled.

10. If using XR, leave `targetCamera` empty to preserve headset tracking. For desktop mode, assign a separate spectator camera instead.


## Running on two different computers (same Wi-Fi)


1. On the Unity computer, keep `listenPort` at `7777` (or choose your own port).

2. Find the Unity computer's Wi-Fi IPv4 address (for example `10.x.x.x` or `192.168.x.x`).

3. On the .NET input computer, set the following (see the example after this list):

- `VRI_UNITY_HOST=<UnityComputerIPv4>`

- `VRI_UNITY_PORT=7777`

4. Start Unity Play mode first, then start the .NET host.

5. Confirm connectivity in Unity Console by checking for:

- `VR Interaction bridge listening on UDP 7777`

- `VR Interaction packet source: <sender-ip>:<sender-port>`
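For example, on the .NET input computer in PowerShell; the IP address below is a placeholder, so substitute the Unity computer's actual IPv4:

```powershell
# Placeholder address: replace with the Unity computer's IPv4.
$env:VRI_UNITY_HOST = "192.168.1.42"
$env:VRI_UNITY_PORT = "7777"
# Then start the .NET host from the repository (exact command may vary by project layout):
dotnet run
```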


If packets do not arrive:


- Allow inbound UDP on the Unity machine for the Unity Editor (or build) and chosen port.

- Verify both devices are on the same SSID/subnet.

- Some campus networks isolate clients on Wi-Fi. If so, ask IT to disable client isolation for your lab network.


## What the bridge does


- Receives controller state and simulation state over UDP.

- Keeps the Unity side independent from a specific controller implementation.

- Lets you add more simulation behaviours by subscribing to the same bridge events.

- Spawns balls from the camera when acceleration crosses a threshold, launched in controller pointing direction.

- Draws a live line from the camera showing current pointing direction.
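As a rough sketch of the receiving side, a minimal Unity UDP listener looks like the script below. This is not the repository's actual `VrInteractionBridge`, which defines its own packet format and events; it only illustrates the pattern of listening on `listenPort` and handling datagrams.

```csharp
using System;
using System.Net;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// Minimal UDP receiver in the spirit of the bridge: listens on a port and
// logs each datagram. The real VrInteractionBridge parses controller and
// simulation state and raises events for subscribers.
public class MinimalUdpBridge : MonoBehaviour
{
    public int listenPort = 7777;
    private UdpClient client;

    void OnEnable()
    {
        client = new UdpClient(listenPort);
        client.BeginReceive(OnReceive, null);
        Debug.Log($"Listening on UDP {listenPort}");
    }

    void OnReceive(IAsyncResult result)
    {
        try
        {
            var remote = new IPEndPoint(IPAddress.Any, 0);
            byte[] data = client.EndReceive(result, ref remote);
            // Runs on a thread-pool thread: Debug.Log is safe here, but most
            // other Unity APIs must be marshaled back to the main thread.
            Debug.Log($"Packet from {remote}: {Encoding.UTF8.GetString(data)}");
            client.BeginReceive(OnReceive, null); // keep listening
        }
        catch (ObjectDisposedException) { /* socket closed on disable */ }
    }

    void OnDisable() => client?.Close();
}
```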


## Runtime calibration


- In Play mode, point the controller where you want "straight ahead".

- Press `C` to capture calibration samples (`calibrationSampleKey`).

- After `calibrationSamplesRequired` captures, averaged offsets are applied.

- Press `V` to save current calibration and `X` to clear saved calibration.


## Guided axis calibration


- Press `B` to start guided calibration mode.

- Red lines are the target orientation, green lines are current orientation.

- Left click (or press `C`) to capture each target when aligned.

- The sequence solves independent pitch/yaw/roll multipliers + offsets, then saves automatically.

- Press `N` to skip a target, `V` to save manually, `X` to clear saved calibration.
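Conceptually, each axis solve is a line fit from measured angles to target angles. A schematic version is below; the repository's actual math may differ.

```csharp
// Schematic of the per-axis solve: fit target ≈ m * measured + b for one
// rotation axis by ordinary least squares over the captured samples.
public static class AxisFit
{
    public static (float m, float b) Solve(float[] measured, float[] target)
    {
        int n = measured.Length;
        float sx = 0f, sy = 0f, sxx = 0f, sxy = 0f;
        for (int i = 0; i < n; i++)
        {
            sx  += measured[i];
            sy  += target[i];
            sxx += measured[i] * measured[i];
            sxy += measured[i] * target[i];
        }
        float m = (n * sxy - sx * sy) / (n * sxx - sx * sx);
        float b = (sy - m * sx) / n;
        return (m, b);  // multiplier and offset for this axis
    }
}
```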


## Extending it


- Add a new `MonoBehaviour` that subscribes to `VrInteractionBridge.SimulationStateReceived`.

- Add more input-driven variables in the .NET host by registering new `VariableBinding` entries.

- Replace the mock input provider with your BLE provider or future serial / network providers.


Connect Other Inputs and Run Code

Part A: Set Up VR with Meta Horizon + Unity

  1. Install Required Software
     - Install Unity (use a version compatible with Meta XR tools).
     - Install Unity Hub to manage versions and projects.
  2. Install Meta XR / Horizon Tools
     - Follow the official Meta tutorial: https://developers.meta.com/horizon/documentation/unity/unity-tutorial-hello-vr/
     - This will walk you through:
       - Installing the Meta XR SDK
       - Configuring XR Plug-in Management
       - Enabling VR support in Unity
  3. Set Up Your Headset
     - Use a Meta headset (e.g., Meta Quest 2 or similar).
     - Enable Developer Mode in the Meta Horizon mobile app.
     - Connect the headset to your computer via USB.
     - Allow USB debugging inside the headset when prompted.
  4. Test VR in Unity
     - Create or open a sample scene from the tutorial.
     - Press Play or build to the headset to confirm VR rendering works.

Part B: Connect the Microphone

  1. Plug in your Blue Yeti via USB.
  2. On your computer:
     - Set it as the default input device (Sound Settings).
  3. In Unity:
     - Ensure microphone permissions are enabled.
     - Use Unity's audio input APIs to confirm it's detected.

Optional test: Record short audio input to verify functionality.
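A minimal detection script using Unity's built-in Microphone API looks like the sketch below; the class name `MicCheck` is illustrative, and you attach it to any GameObject in a test scene.

```csharp
using UnityEngine;

// Lists available input devices and records a short clip from the default
// one, verifying the Blue Yeti is visible to Unity.
public class MicCheck : MonoBehaviour
{
    void Start()
    {
        foreach (string device in Microphone.devices)
            Debug.Log($"Input device: {device}");

        // null = default device (set the Blue Yeti as default in Sound Settings)
        AudioClip clip = Microphone.Start(null, false, 5, 44100);
        Debug.Log(clip != null ? "Recording started." : "Microphone failed to start.");
    }
}
```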

Part C: Connect the Camera

  1. Plug in the Intel RealSense D415 via USB-C.
  2. Install the required drivers and SDK from Intel's website.
  3. In Unity:
     - Import the RealSense SDK package.
     - Test the camera feed (depth or RGB) inside a simple scene.

Confirm the camera is capturing motion data correctly.

Part D: Run the VR Interaction Code

  1. Clone or download the repository: https://github.com/Syd4r/VR_Interaction.git
  2. Open the project in Unity: File → Open Project → select the cloned folder.
  3. Configure the project:
     - Ensure XR Plug-in Management is enabled.
     - Verify input systems (controllers, camera, mic) are recognized.
  4. Build and Run:
     - Connect your headset.
     - Click Build and Run in Unity.
     - The simulation should launch directly in VR.
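Equivalently, from a terminal (then open the folder with Unity Hub):

```bash
git clone https://github.com/Syd4r/VR_Interaction.git
```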


[8] R. A. Bolt, “Put-That-There: Voice and Gesture at the Graphics Interface,” ACM SIGGRAPH, 1980.