I am a Ph.D. student with Patrick Baudisch at the Hasso Plattner Institute, University of Potsdam.

I have interned at Microsoft Research with Andy Wilson.

Research Agenda: An Operating System for Virtual Reality

The goal of my research is to allow virtual reality experiences to be run in arbitrary tracking volumes and with arbitrary physical objects.

VR experiences today are designed with a specific tracking volume and specific objects in mind, such as “a square 5 × 5 m space with a rubber sword”. This prevents an experience from running with different objects or in a smaller or differently shaped tracking volume, making it impossible to share experiences, especially with home users.

I address this by creating an abstraction between VR applications and the space and physical objects they use. Instead of accessing space and physical objects directly, applications in my system express their needs in an abstract way, which my system then maps to the actually available physical space and objects. This allows VR applications to run on a wide range of installations.
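To make this concrete, here is a minimal sketch of the idea; all names are hypothetical and this is not the actual API of my system. An application states abstract requirements, and a runtime checks whether a given installation can satisfy them.

```python
from dataclasses import dataclass

@dataclass
class SpaceRequest:
    """Abstract spatial need: any walkable area of at least this size."""
    min_area_m2: float

@dataclass
class PropRequest:
    """Abstract haptic need: any graspable object with these traits."""
    graspable: bool = True
    min_length_m: float = 0.0

@dataclass
class Installation:
    """What a concrete living room actually offers."""
    area_m2: float
    props: list  # e.g. [{"graspable": True, "length_m": 0.8}]

def can_run(space: SpaceRequest, prop: PropRequest, inst: Installation) -> bool:
    """Map abstract needs onto the available installation."""
    if inst.area_m2 < space.min_area_m2:
        return False
    return any(p["graspable"] == prop.graspable and
               p["length_m"] >= prop.min_length_m
               for p in inst.props)

# A sword-fighting app asks for 4 m² and a sword-like prop;
# a 9 m² living room with an umbrella satisfies both.
app_space = SpaceRequest(min_area_m2=4.0)
app_prop = PropRequest(graspable=True, min_length_m=0.5)
living_room = Installation(area_m2=9.0,
                           props=[{"graspable": True, "length_m": 0.8}])
print(can_run(app_space, app_prop, living_room))  # True
```

The point of the indirection is that the application never names a concrete room or object, only constraints, so the same binary runs in any installation that meets them.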

Solving this problem would have substantial commercial impact, as the proliferation of real-walking VR is currently hindered by developers’ reluctance to require users to provide dedicated space and physical objects.

My work is inspired by operating systems research. Before operating systems, application programs were written for a specific machine. Operating systems allow applications to run on arbitrary computers and architectures by creating an abstraction of the physical hardware, an API, which keeps applications from having to access the hardware directly.

Full Papers at ACM CHI and ACM UIST

7. Mise-Unseen: Using Eye-Tracking to Hide Virtual Reality Scene Changes in Plain Sight

Sebastian Marwecki, Andrew D. Wilson, Eyal Ofek, Mar Gonzalez-Franco, and Christian Holz. Full Paper and Demo at UIST '19

We present Mise-Unseen, a software system that applies covert scene changes inside the user’s field of view. Mise-Unseen leverages gaze tracking to create models of user attention, intention, and spatial memory to determine if and when to inject a change. We present seven applications of Mise-Unseen (i) to hide that task difficulty is adapted to the user, (ii) to adapt the experience to the user’s preferences, (iii) to time the use of low fidelity effects, (iv) to detect user choice for passive haptics even when lacking physical props, (v) to sustain physical locomotion despite a lack of physical space, (vi) to reduce motion sickness during virtual locomotion, and (vii) to verify user understanding during story progression.

6. Scenograph: Fitting Real-Walking VR Experiences into Various Tracking Volumes

Sebastian Marwecki and Patrick Baudisch. Full Paper and Demo at UIST '18

When developing a real-walking virtual reality experience, creators generally design virtual locations to fit a specific tracking volume. Unfortunately, this prevents the resulting experience from running on a smaller or differently shaped tracking volume. To address this, we present a software system called Scenograph. The core of Scenograph is a tracking volume-independent representation of real-walking experiences. Scenograph instantiates the experience to a tracking volume of given size and shape by splitting the locations into smaller ones while maintaining narrative structure.
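As an illustration of the core idea, here is a hedged sketch; the function and location names are hypothetical, not Scenograph's actual representation. Locations larger than the tracking volume are split into equal sub-locations until each piece fits, while the narrative order is preserved.

```python
import math

def fit(locations, tracking_area_m2):
    """Instantiate a list of (name, area) locations to a given
    tracking area by splitting oversized locations into equal
    sub-locations, keeping narrative order intact."""
    out = []
    for name, area in locations:
        if area <= tracking_area_m2:
            out.append((name, area))
        else:
            parts = math.ceil(area / tracking_area_m2)
            out.extend((f"{name}.{k + 1}", area / parts)
                       for k in range(parts))
    return out

# A story visiting three locations, run in a 16 m² tracking volume:
# the 30 m² hall splits into two 15 m² sub-locations.
story = [("cellar", 9.0), ("hall", 30.0), ("garden", 12.0)]
print(fit(story, 16.0))
```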

5. VirtualSpace - Overloading Physical Space with Multiple VR Users

Sebastian Marwecki, Maximilian Brehm, Lukas Wagner, Lung-Pan Cheng, Florian 'Floyd' Mueller, and Patrick Baudisch. Full Paper at CHI '18

Although virtual reality hardware is now widely available, the uptake of real walking is hindered by the fact that it requires often impractically large amounts of physical space. To address this, we present VirtualSpace, a novel system that allows overloading multiple users immersed in different VR experiences into the same physical space. VirtualSpace accomplishes this by containing each user in a subset of the physical space at all times, which we call tiles; app-invoked maneuvers then shuffle tiles and users across the entire physical space. This allows apps to move their users to where their narrative requires them to be while hiding from users that they are confined to a tile. We show how this enables VirtualSpace to pack four users into 16 m².
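A minimal sketch of the tiling concept follows; the names and layout are assumptions for illustration, not the paper's implementation. A 4 × 4 m volume is divided into four 2 × 2 m tiles, one per user, and a swap maneuver exchanges two users' tiles.

```python
# Illustrative tiling: a 4 x 4 m space split into four 2 x 2 m tiles.
TILE = 2.0  # tile edge length in meters

tiles = {  # tile id -> (x, y) origin within the 4 x 4 m volume
    0: (0.0, 0.0), 1: (2.0, 0.0), 2: (0.0, 2.0), 3: (2.0, 2.0),
}
assignment = {"u1": 0, "u2": 1, "u3": 2, "u4": 3}  # user -> tile id

def swap(assignment, a, b):
    """App-invoked maneuver: exchange two users' tiles."""
    assignment[a], assignment[b] = assignment[b], assignment[a]

swap(assignment, "u1", "u3")
print(assignment)  # u1 and u3 have traded tiles
```

Each user only ever walks within their current tile; maneuvers like the swap above are what let an app relocate users across the full volume without the users noticing the confinement.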

4. iTurk: Turning Passive Haptics into Active Haptics by Making Users Reconfigure Props in Virtual Reality

Lung-Pan Cheng, Li Chang, Sebastian Marwecki, and Patrick Baudisch. Full Paper and Demo at CHI '18

We present a system that complements virtual reality experiences with passive props, yet still allows modifying the virtual world at runtime. The main contribution of our system is that it does not require any actuators; instead, our system employs the user to reconfigure and actuate otherwise passive props. We demonstrate a foldable prop that users reconfigure to represent a suitcase, a fuse cabinet, a railing, and a seat. A second prop, suspended from a long pendulum, not only stands in for inanimate objects, but also for objects that move and demonstrate proactive behavior, such as a group of flying droids that physically attack the user. Our approach conveys a sense of a living, animate world, when in reality the user is the only animate entity present in the system, complemented with only one or two physical props.

3. DualPanto: A Haptic Device that Enables Blind Users to Continuously Interact with Virtual Worlds.

Oliver Schneider, Jotaro Shigeyama, Robert Kovacs, Thijs Jan Roumen, Sebastian Marwecki, Nico Boeckhoff, and Patrick Baudisch. Full Paper and Demo at UIST '18

We present a new haptic device that enables blind users to continuously track the absolute position of moving objects in spatial virtual environments, as is the case in sports or shooter games. Users interact with DualPanto by operating the “me” handle with one hand and by holding on to the “it” handle with the other hand. Each handle is connected to a pantograph haptic input/output device. The key feature is that the two handles are spatially registered with respect to each other. When guiding their avatar through a virtual world using the “me” handle, spatial registration enables users to track moving objects by having the device guide the output hand. This allows blind players of a 1-on-1 soccer game to race for the ball or evade an opponent; it allows blind players of a shooter game to aim at an opponent and dodge shots.

2. Mutual Human Actuation

Lung-Pan Cheng, Sebastian Marwecki, and Patrick Baudisch. Full Paper and Best Demo Award at UIST '17

We introduce mutual human actuation, a version of human actuation that works without dedicated human actuators. The key idea is to run pairs of users at the same time and have them provide human actuation to each other. Our system, Mutual Turk, achieves this by (1) offering shared props through which users can exchange forces while obscuring the fact that there is a human on the other side, and (2) synchronizing the two users’ timelines such that their way of manipulating the shared props is consistent across both virtual worlds.

1. Providing Haptics to Walls and Heavy Objects in Virtual Reality by Means of Electrical Muscle Stimulation

Pedro Lopes, Sijing You, Lung-Pan Cheng, Sebastian Marwecki, and Patrick Baudisch. Full Paper and Demo at CHI '17

In this project, we explored how to add haptics to walls and other heavy objects in virtual reality. Our main idea is to prevent the user’s hands from penetrating virtual objects by means of electrical muscle stimulation (EMS). Figure 1a shows an example. As the shown user lifts a virtual cube, our system lets the user feel the weight and resistance of the cube. The heavier the cube and the harder the user presses the cube, the stronger a counterforce the system generates. Figure 1b illustrates how our system implements the physicality of the cube, i.e., by actuating the user’s opposing muscles with EMS.

Other Publications

Master's thesis published as a journal paper

Sebastian Marwecki, Belén Ballester, Esther Duarte, and Paul Verschure. Journal of Disability and Rehabilitation '17.

For my master's thesis at the Universitat Pompeu Fabra in Barcelona, Spain, I focused on virtual reality applications for motor relearning, which benefit stroke therapy. In a study, participants performed reaching tasks in a gamified context while receiving feedback oriented toward self-comparative mastery goals and other-comparative ego goals. This work is published in the Journal of Disability and Rehabilitation. My master's studies were funded by the German Academic Scholarship Foundation.



Bachelor's thesis published as a poster at CHI '13

Sebastian Marwecki, Roman Rädle, and Harald Reiterer. WIP at CHI '13 EA

For my bachelor's thesis at the University of Konstanz, Germany, I developed "Invasion of the Wrong Planet", a game played on a hybrid interactive surface for cognitive-behavioral therapy for children on the autism spectrum. This project received the Karl-Steinbuch Stipendium of the MFG Stiftung Baden-Württemberg and the VEUK award of the University of Konstanz. I presented this work at CHI '13 and '14 as a workshop contribution and a work-in-progress poster.



Other Projects

You can find my other projects HERE (board games, apps, etc.).

© Sebastian Marwecki