Hi, I'm a software engineer and co-founder of 2Sync.
We’re developing an auto-adaptive Mixed Reality SDK that transforms any space into an immersive,
interactive environment—without complex setup or expensive infrastructure.
I believe that VR should respond to the real world as seamlessly as a
website adapts to the size of your screen. In the same way,
the physical world should be the canvas for MR experiences.
With the 2Sync SDK, we're taking the first step: creators can build spatially
aware content that adapts in real time to the user’s surroundings—unlocking true
room-scale experiences without manual calibration or custom mapping.
2Sync is the responsive design tool for Extended Reality:
Our SDK enables immersive, free-roam MR apps that auto-adapt to any physical environment instantly.
By mapping and integrating the real world into the virtual experience,
the 2Sync SDK allows true spatial immersion, combining the benefits of free movement,
passive haptics, and reduced motion sickness.
Software Engineer in the field of virtual engineering for R&D projects. Working on in-car AR/VR experiences.
Additional information about live sports games (like soccer or rugby) is already ubiquitous on TV. Together with the Department of Computer Science at the University of Otago, we are working on an AR project that brings this additional information (player stats, heat maps, the offside line, ...) directly to the user's phone in the stadium.
Current passive-haptics VR experiences run only with a predetermined set of physical props, preventing them from running anywhere else. For my master's thesis, I developed, with a small team at the Hasso Plattner Institute in Potsdam, a software system that allows passive-haptics experiences to run on different sets of props, such as physical objects found in the home. The system accomplishes this by letting experience designers define the virtual objects in their experience in a generic format. This format allows the system to run the experience in a wide range of locations by procedurally modelling virtual object sets to match the available props.
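The portfolio doesn't include code, but the core idea of matching a generically described virtual object set to whatever props a room offers can be sketched in a few lines. Everything below (`PropSpec`, `match_props`, the dimension-distance heuristic) is my own illustration, not the thesis system's actual format or API:

```python
from dataclasses import dataclass

@dataclass
class PropSpec:
    """Bounding-box dimensions of an object in metres (illustrative format)."""
    name: str
    width: float
    height: float
    depth: float

def match_props(virtual: list[PropSpec], available: list[PropSpec]) -> dict[str, str]:
    """Greedily assign each virtual object the available physical prop whose
    dimensions are closest (sum of absolute per-axis differences)."""
    assignment = {}
    remaining = list(available)
    for v in virtual:
        best = min(remaining, key=lambda p: abs(p.width - v.width)
                   + abs(p.height - v.height) + abs(p.depth - v.depth))
        assignment[v.name] = best.name
        remaining.remove(best)
    return assignment

# Example: a crate and a lever matched against household furniture.
virtual_set = [PropSpec("crate", 0.5, 0.5, 0.5), PropSpec("lever", 0.1, 1.0, 0.1)]
home_props = [PropSpec("stool", 0.45, 0.45, 0.45), PropSpec("floor lamp", 0.15, 1.1, 0.15)]
print(match_props(virtual_set, home_props))  # → {'crate': 'stool', 'lever': 'floor lamp'}
```

A real system would of course consider affordances (graspable, sittable, ...) and placement in the room, not just size; this sketch only shows the matching step.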
During and beyond my compulsory internship at the small start-up 'You-VR', I worked as a software engineer. We developed a multisensory, location-based multiplayer VR experience in Berlin. At VR NOW CON, our experience was nominated for the Interactive VR Experience Award (see: https://www.vrnowcon.io/awards/).
Implementing visualizations and tools for an autonomous driving project.
Directly after my Bachelor's degree, I worked as a research assistant at the Keio-NUS CUTE Center in Singapore. I started by supporting their Virtual Interactive Human Anatomy (VIHA) project, and eventually I was given my own project, in which I was responsible for creating a Unity3D-based plugin that makes it easy to add hand (Leap Motion) or controller (VIVE) interactions to any project without coding skills. Users could choose from a predefined set of triggers (grab, pinch, recorded gestures, ...) and a predefined set of actions (attach, throw, slice, explode, ...) to enhance their VR environment with interactions.
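The plugin's trigger/action pairing can be sketched as a simple binding model: a designer picks one predefined trigger and one predefined action, and the action runs whenever a matching input event arrives. This is a hypothetical Python sketch of that idea; the names (`InteractionBinding`, `TRIGGERS`, `ACTIONS`) and behaviours are my own illustration, not the actual Unity plugin's API:

```python
# Predefined vocabulary the designer chooses from (illustrative subset).
TRIGGERS = {"grab", "pinch", "gesture"}
ACTIONS = {
    "attach": lambda obj: f"{obj} attached to hand",
    "throw":  lambda obj: f"{obj} thrown",
    "slice":  lambda obj: f"{obj} sliced in half",
}

class InteractionBinding:
    """Pairs one predefined trigger with one predefined action for a target object."""
    def __init__(self, trigger, action, target):
        if trigger not in TRIGGERS or action not in ACTIONS:
            raise ValueError("unknown trigger or action")
        self.trigger, self.action, self.target = trigger, action, target

    def fire(self, event):
        # Run the action only when the incoming input event matches the trigger.
        if event == self.trigger:
            return ACTIONS[self.action](self.target)
        return None

binding = InteractionBinding("grab", "attach", "scalpel")
print(binding.fire("grab"))   # → scalpel attached to hand
print(binding.fire("pinch"))  # no match → None
```

The design point is that both sets are closed and predefined, so composing them needs configuration rather than code, which is what made the plugin usable without programming skills.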
For my bachelor's thesis, I implemented and evaluated a software system to increase the robustness of hand and finger tracking for multi-touch tabletops. Compared to similar systems such as Microsoft's PixelSense tabletop and open-source projects like Community Core Vision 1.5 and reacTIVision, my system produces more robust results. Additionally, it can identify hands and assign them to different users.
Between October 2014 and July 2015, I was a student assistant at the VR Laboratory of Bauhaus University in Weimar.
Turn Your Home into an Elven Watchtower and Defend the Realm!
Greetings, archer, and welcome to Elven Arrows, an auto-adaptive
and immersive Mixed Reality game that transforms your entire room into a free-roam VR watchtower.
TURN YOUR HOME INTO THE LAST SPACESHIP
Welcome to The Last Galaxy, an auto-adaptive and fully immersive Mixed Reality game
that transforms your entire environment into interactive spaceship stations using your Teleporter.
Welcome to Mixed Snow Worlds, an innovative auto-adaptive Mixed Reality game that transforms your home into a magical icy realm! Your entire environment, with all its objects, becomes part of an icy, fully immersive free-roam MR and VR experience.
House Defender is the first publicly available application that uses 2Sync's SDK. It is the first location-based VR shooter at home, fully generated with 2Sync. With House Defender, you turn your entire surroundings into a VR playspace and defend yourself against the onrushing monsters, alone or together with your friends.
With 2Sync, we want to take the VR world to the next level and enable unique VR experiences. With our Software Development Kit for Unity, developers can describe their own room-independent experiences, providing house-scale VR and immensely enhancing immersion for their users.
Invention and evaluation of a computer-vision-based system that uses lightweight robots to measure car gap sizes fully automatically during assembly-line operations.
AR application for live sports games at Otago University in Dunedin, New Zealand. The goal of the project is to bring additional information to the user's phone (or HoloLens) during live games.
A system to automatically generate passive-haptics experiences in arbitrary environments.
Development of a location-based VR experience at You-VR that was nominated for the Interactive VR Experience Award 2018.
VIHA
Supported the development of the Virtual Interactive Human Anatomy (VIHA) project of the Keio-NUS CUTE Center in Singapore.
VRaut
Implemented a Unity3D-based plugin that makes it easy to add hand (Leap Motion) interactions to any project without coding skills.
A method to optimize hand and finger tracking for multi-touch tabletops.
Simultaneous Localization and Mapping for Unmanned Aerial Systems.