The blending of the real and virtual worlds opens exciting opportunities for innovation in teaching. This collaboration between the immersive team and academics and education developers from both Science and Engineering explores ways to bring the lecturer into virtual worlds, to manipulate 3D digital objects placed virtually in the real world, and to support related use cases.
Project Details
The two facets of the project overlap in their intent to merge real and virtual elements in real time for their courses. In Engineering, David Kellermann (Associate Professor, Mechanical Engineering) and John-Paul Posada (Education Technologist) are looking to bring 3D digital models into live lectures, where the teacher can stand in, move around and interact with the virtual image at the front of the class, leveraging the sensors and video from the Microsoft Kinect. In Science, Dr Siobhan Wills (Senior Lecturer, Chemistry) and Stephen Parker (Education Technologist) are looking to bring the live lecturer into a virtual 3D environment for the students, leveraging Meta Quest VR headsets and secondary cameras.
Technology & Development
Design and development of the project continue in active collaboration with staff from Engineering and Science. Staff and work-integrated-learning student developers are using the Unity game engine to bring together all the elements, such as the Meta Quest 3 headsets and Microsoft Kinect sensors.
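One step implied by this setup is aligning the sensor's coordinate frame with the virtual scene: a point tracked by the Kinect has to be mapped into the scene's own frame before a 3D model (or a virtual lecturer) can be placed correctly. The sketch below is purely illustrative and is not the project's code; the project itself works in Unity, and the function name, yaw-only rotation, and translation offset are all simplifying assumptions.

```python
import math

def sensor_to_scene(point, yaw_deg, translation):
    """Illustrative rigid transform: rotate a sensor-space point about the
    vertical (y) axis by yaw_deg, then translate it into scene space.

    point       -- (x, y, z) in the sensor's coordinate frame
    yaw_deg     -- rotation about the y axis, in degrees (assumed calibration)
    translation -- (tx, ty, tz) offset of the sensor within the scene
    """
    x, y, z = point
    t = math.radians(yaw_deg)
    # Rotation about the y axis leaves the height (y) unchanged.
    xr = x * math.cos(t) + z * math.sin(t)
    zr = -x * math.sin(t) + z * math.cos(t)
    tx, ty, tz = translation
    return (xr + tx, y + ty, zr + tz)
```

In practice a full calibration would use a complete rotation matrix (or quaternion) and would be handled by the engine's own transform hierarchy, but the idea is the same: one rigid transform per sensor, applied to every tracked point each frame.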
Get Involved
There are internal Teams channels, Miro boards and working groups covering the development collaboration with Science and Engineering academics. To find out more, to join the team, to get a demo, or to have your class or promotional use case included, just get in touch.
- Email: [email protected]