# Python VR Workshop
## Workshop Details
| | |
|---|---|
| Format | In-person (bring your own laptop) |
| Date | April 2026 |
| Duration | 09:00–15:15 (full day with lunch break) |
| Instructors | Chunyu Qu, Artyom Zinchenko & Zhuanghua Shi |
| Institution | LMU Munich |
| Repository | github.com/msenselab/vr-tutorial |
| Capstone project | MazeWalker-Py |
| Funded by | Erasmus+ KA210-VET · xr4vet.eu |
## What You Will Learn
- Build interactive 3D environments with the Ursina game engine (Python wrapper for Panda3D)
- Handle input devices (keyboard, mouse, gamepad) via pygame
- Design experiments with trial sequencing, state machines, CSV data logging, and EEG trigger codes
- Load and animate external 3D models (GLB format)
- Understand a production experiment codebase (MazeWalker-Py)
- See the path from desktop 3D to VR headset (OpenXR)
## Who Is This For?
The workshop serves two roles that exist in every research lab:
- Builders — researchers who design and code experiments. You will create 3D scenes, add interaction, and structure a full experiment with trials, conditions, and data logging.
- Runners — research assistants and technicians who set up and operate experiments. You will learn how the software works, how to configure parameters, and how to troubleshoot.
Prerequisites: Basic Python skills (variables, functions, loops). No 3D or VR experience required.
## Prerequisites
Participants must complete the following before the workshop day:
- Python 3.11 — Download from python.org (not 3.12+ — the 3D engine requires 3.11).
- uv — Fast Python package manager. Install via `curl -LsSf https://astral.sh/uv/install.sh | sh` (macOS/Linux) or `powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"` (Windows).
- Workshop materials — Download the ZIP or `git clone https://github.com/msenselab/vr-tutorial.git`.
- Virtual environment — `uv venv && source .venv/bin/activate && uv pip install ursina pygame`.
Setup guide: Follow the Laptop Setup Guide step by step.
## Schedule
| Time | Duration | Module |
|---|---|---|
| 09:00–09:20 | 20 min | M1: Welcome & Setup Check |
| 09:20–10:05 | 45 min | M2: Ursina Fundamentals |
| 10:05–10:25 | 20 min | Break |
| 10:25–11:05 | 40 min | M3: Interaction & Input |
| 11:05–11:50 | 45 min | M4: Experiment Paradigm Design |
| 11:50–12:00 | 10 min | Q&A & Hands-on |
| 12:00–13:00 | 60 min | Lunch |
| 13:00–13:45 | 45 min | M5: Capstone — MazeWalker-Py |
| 13:45–14:10 | 25 min | M6: VR Roadmap |
| 14:10–14:30 | 20 min | Break |
| 14:30–15:00 | 30 min | M7: Beyond Primitives (3D Models) |
| 15:00–15:15 | 15 min | Wrap-up & Homework |
Teaching time (incl. Q&A): 4 h 35 min | Breaks + Lunch: 1 h 40 min | Total: 6 h 15 min
## Exercises
Six progressive exercises, each building on the previous:
| # | Exercise | What You Learn | Duration |
|---|---|---|---|
| 1 | Hello Ursina | Verify setup; first 3D scene with a cube | 10 min |
| 2 | Build a Room | Construct a room from primitives; position, rotate, scale | 25 min |
| 3 | Make It Real | Add colliders, textures, lighting, and a skybox | 20 min |
| 4 | Pick Up the Star | Proximity detection, HUD display, gamepad input | 25 min |
| 5 | Mini Experiment | State machine, trial sequencing, CSV logging, EEG triggers | 35 min |
| 6 | Load 3D Models | Import GLB models, animate, proximity highlight | 15 min |
Each exercise folder contains a `template.py` (starting point with TODOs) and a `solution.py` (reference implementation).
## Session Details
### M1: Welcome & Setup Check (20 min)
Verify everyone’s environment works. Run `hello_cube.py` — if you see an orange cube, you’re ready.
### M2: Ursina Fundamentals (45 min)
Ursina coordinate system, `Entity` properties (model, position, scale, rotation, color, texture), `EditorCamera` vs `FirstPersonController`. Exercises 1–2: build a walkable room from quads and cubes.
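The fundamentals covered in M2 fit in a few lines. Below is a minimal sketch of an Ursina scene of the kind the module builds toward — the entity names and values are illustrative, not the workshop's own exercise code, and running it opens a window, so it needs a display:

```python
# Minimal Ursina scene sketch (illustrative; the workshop's exercise
# files may differ). Opens a window, so it requires a display.
from ursina import Ursina, Entity, EditorCamera, color

app = Ursina()

# Core Entity properties in action: model, position, scale, color.
ground = Entity(model='plane', scale=(10, 1, 10), color=color.gray)
cube = Entity(model='cube', position=(0, 0.5, 0), color=color.orange)

EditorCamera()  # free-fly camera; swap for FirstPersonController to walk
app.run()
```

`EditorCamera` lets you orbit and inspect the scene with the mouse; exercises 1–2 replace it with a `FirstPersonController` so the room becomes walkable.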
### M3: Interaction & Input (40 min)
Colliders, textures, `Sky`, `DirectionalLight`, proximity detection, gamepad input via pygame. Exercises 3–4: add realism to the room and make collectible stars.
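Proximity detection boils down to a distance test each frame. A minimal, engine-free sketch (the function name and radius are illustrative; in Ursina you would pass `player.position` and `star.position`):

```python
import math

def within_reach(player_pos, item_pos, radius=1.5):
    """True when the player is within `radius` units of the item.

    Positions are (x, y, z) tuples. Name and default radius are
    illustrative, not taken from the workshop code.
    """
    return math.dist(player_pos, item_pos) <= radius

# A star 1 unit away is collectible; one 3 units away is not.
print(within_reach((0, 0, 0), (1, 0, 0)))  # True
print(within_reach((0, 0, 0), (3, 0, 0)))  # False
```

In exercise 4 this kind of check runs inside `update()`, toggling a HUD message and removing the star once collected.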
### M4: Experiment Paradigm Design (45 min)
State machine pattern (INSTRUCTION → FIXATION → TASK → FEEDBACK), trial sequencing with conditions and repeats, CSV data logging, mock EEG trigger codes, `invoke()` for timed transitions. Exercise 5: build a complete mini-experiment.
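The three core M4 patterns — trial sequencing, the state machine, and CSV logging with mock triggers — can be sketched with the standard library alone. Everything here (condition names, trigger values, column layout) is illustrative, not the workshop's actual exercise code:

```python
import csv, io, itertools, random

# --- Trial sequencing: full crossing of conditions, repeated, shuffled ---
def make_trials(colors, sides, repeats=2, seed=0):
    trials = list(itertools.product(colors, sides)) * repeats
    random.Random(seed).shuffle(trials)  # seeded for a reproducible sketch
    return trials

# --- State machine: the per-trial phase sequence from M4 ---
NEXT_STATE = {"INSTRUCTION": "FIXATION", "FIXATION": "TASK",
              "TASK": "FEEDBACK", "FEEDBACK": None}

def run_trial(state="INSTRUCTION"):
    path = []
    while state is not None:
        path.append(state)
        state = NEXT_STATE[state]
    return path

# --- CSV logging with a mock EEG trigger code per trial ---
def log_block(trials, out):
    writer = csv.writer(out)
    writer.writerow(["trial", "color", "side", "trigger"])
    for i, (color, side) in enumerate(trials):
        writer.writerow([i, color, side, 10 + i])  # mock trigger code

trials = make_trials(["red", "blue"], ["left", "right"])
buf = io.StringIO()
log_block(trials, buf)
print(run_trial())   # ['INSTRUCTION', 'FIXATION', 'TASK', 'FEEDBACK']
print(len(trials))   # 8 trials: 2 colors x 2 sides x 2 repeats
```

In the real exercise the state transitions are driven by `invoke()` timers and key presses rather than a plain loop, and the CSV is written to disk per participant.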
### M5: Capstone — MazeWalker-Py (45 min)
Architecture walkthrough of MazeWalker-Py, a production maze-navigation experiment for EEG research. Mapping tutorial patterns to real-world code: how `experiment.py` extends the same state machine, how `.maz` files define mazes, how real trigger hardware replaces mock triggers.
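The `.maz` format itself is not documented here, so as a loudly hypothetical illustration of the general idea — a text file defining a maze that gets parsed into wall coordinates — here is a toy grid parser. The character conventions and structure are invented for this sketch; the real MazeWalker-Py format may look entirely different:

```python
# Hypothetical grid format: '#' = wall, '.' = floor, 'S' = start, 'G' = goal.
# Invented for illustration; NOT the real MazeWalker-Py .maz format.
def parse_maze(text):
    walls, start, goal = [], None, None
    for z, row in enumerate(text.strip().splitlines()):
        for x, ch in enumerate(row):
            if ch == '#':
                walls.append((x, z))   # each wall becomes one cube entity
            elif ch == 'S':
                start = (x, z)
            elif ch == 'G':
                goal = (x, z)
    return walls, start, goal

maze = """
#####
#S.G#
#####
"""
walls, start, goal = parse_maze(maze)
print(start, goal, len(walls))  # (1, 1) (3, 1) 12
```

The point of the walkthrough is the mapping: each parsed wall coordinate becomes a collidable `Entity` in the scene, exactly as in exercise 2, just data-driven.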
### M6: VR Roadmap (25 min)
What changes from desktop to VR headset (~10% of the code). OpenXR integration via `panda3d-openxr`. Supported headsets (Meta Quest, HTC Vive, Valve Index, Pimax). What stays the same: scene building, state machine, data logging.
### M7: Beyond Primitives — 3D Models (30 min)
Loading external `.glb` models with `load_model()`. Handling PBR materials on macOS (the `load_glb()` helper). Sine-wave animation. Proximity-based interaction with imported models. Exercise 6: load Angel and Swing models into the room.
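Sine-wave animation is just a per-frame height offset. A minimal sketch of the math (constants are illustrative; in Ursina you would call this from `update()` with an accumulated `t += time.dt` and assign the result to `model.y`):

```python
import math

# Sine-wave hover: the y-offset an imported model gets each frame.
# base_y, amplitude, and frequency are illustrative values.
def hover_y(t, base_y=1.0, amplitude=0.25, frequency=0.5):
    return base_y + amplitude * math.sin(2 * math.pi * frequency * t)

print(hover_y(0.0))  # 1.0  (rest height at t = 0)
print(hover_y(0.5))  # 1.25 (peak: at 0.5 Hz, t = 0.5 s is a quarter period)
```

Combined with the proximity check from M3, this is enough to make an imported model bob in place and light up when the player walks near it.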
### Wrap-up & Homework (15 min)
Recap. Pointers to resources. Adapt the exercises for your own research question.
## Technology Stack
| Tool | Purpose |
|---|---|
| Python 3.11 | Programming language |
| Ursina | 3D game engine (wraps Panda3D) |
| Panda3D | Underlying 3D rendering engine |
| pygame | Gamepad/joystick input |
| uv | Python package manager |
## Resources
- Repository: github.com/msenselab/vr-tutorial — all exercises, slides, and documentation
- Capstone: github.com/msenselab/MazeWalker-Py — production experiment codebase
- Ursina docs: ursinaengine.org
- Panda3D docs: panda3d.org
- VR plugins: panda3d-openxr · panda3d-openvr