The world’s smallest VR headset now has eye tracking!

This is a major step forward for accessible VR innovation!
The virtual reality community is buzzing with excitement as modders expand the capabilities of the lightweight Bigscreen Beyond headset. The latest development adds eye tracking using open-source software and hardware designs, at the low price of $50. This enhancement signals a shift towards more immersive VR experiences and highlights the strength of the growing VR modding community.
Recently unveiled, EyetrackVR’s Eye Tracking mod introduces a new level of interaction to the Bigscreen Beyond headset. This community-developed mod marks a key milestone by enabling more immersive social experiences in VR apps like VRChat.
The potential impact of eye tracking in virtual reality is profound. It goes beyond simply following a user’s gaze: it changes how the virtual environment itself can be rendered and experienced.
Eye tracking also improves immersion for other people in VR. Avatars can reflect what a user is actually looking at, strengthening the sense of connection between users.
By identifying where a viewer focuses within a 3D scene, rendering engines can dynamically prioritise visual resources, enhancing quality precisely where the user’s attention is directed, a technique known as foveated rendering. This promises a more realistic and immersive virtual environment at a lower rendering cost.
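To make this concrete, here is a minimal sketch of the calculation at the heart of foveated rendering: render quality falls off with distance from the gaze point. The function name, radii, and quality values are illustrative assumptions, not taken from any real renderer.

```python
import math

def quality_for_pixel(pixel, gaze, fovea_radius=60.0, periphery_radius=300.0):
    """Return a render-quality factor in [0.25, 1.0] for one screen position.

    pixel, gaze: (x, y) screen coordinates in pixels.
    The radii and the quality floor are illustrative values only.
    """
    dist = math.dist(pixel, gaze)
    if dist <= fovea_radius:
        return 1.0   # full resolution where the user is looking
    if dist >= periphery_radius:
        return 0.25  # coarse shading in the far periphery
    # Linear falloff between the fovea and the periphery.
    t = (dist - fovea_radius) / (periphery_radius - fovea_radius)
    return 1.0 - 0.75 * t

# Example: gaze at the centre of a 1920x1080 eye buffer.
gaze = (960, 540)
for pixel in [(960, 540), (1200, 540), (1800, 100)]:
    print(pixel, round(quality_for_pixel(pixel, gaze), 2))
```

A real engine applies this per region or tile rather than per pixel, but the principle is the same: spend GPU time where the eyes actually are.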
As a pioneer in virtual reality, Bigscreen has continually pushed the limits of comfort and immersion, redefining what’s possible in virtual experiences. With over 6 million users across headsets such as the Meta Quest and Valve Index, Bigscreen continues to innovate.
Why are we excited about this? At Scenegraph, we are expanding our XR-Lab for research and development. Combining the latest in VR visual quality and comfort with eye tracking opens up many possibilities for VR research projects:
  • understanding where users are looking within a 3D scene, which yields better insights than relying on questionnaires alone.
  • eye-contact simulations for users who struggle with eye contact, helping to reduce social anxiety.
  • real-time feedback within a 3D scene. For example, if a user focuses on an object, the environment can respond to where they are looking, something that cannot be achieved accurately without eye tracking (see the sketch after this list).
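As a sketch of that last point, the snippet below shows one simple way a scene could react to gaze: cast a ray from the eye along the gaze direction, find the object it hits, and respond once the user has dwelt on it long enough. The `GazeTarget` class, the object names, and the dwell threshold are all hypothetical, and the gaze direction is assumed to be a unit vector.

```python
from dataclasses import dataclass

@dataclass
class GazeTarget:
    name: str
    centre: tuple   # (x, y, z) position in metres
    radius: float   # bounding-sphere radius in metres

def first_hit(origin, direction, targets):
    """Return the nearest target whose bounding sphere the gaze ray intersects."""
    best, best_t = None, float("inf")
    for target in targets:
        oc = [c - o for c, o in zip(target.centre, origin)]
        proj = sum(a * b for a, b in zip(oc, direction))  # distance along the ray
        if proj < 0:
            continue  # target is behind the viewer
        closest_sq = sum(a * a for a in oc) - proj * proj
        if closest_sq <= target.radius ** 2 and proj < best_t:
            best, best_t = target, proj
    return best

DWELL_SECONDS = 0.5  # illustrative threshold before the scene responds

def update_gaze(state, hit, dt):
    """Accumulate dwell time on one object; fire a response once it is exceeded."""
    if hit is not None and hit.name == state["name"]:
        state["time"] += dt
        if state["time"] >= DWELL_SECONDS and not state["fired"]:
            state["fired"] = True
            print(f"Scene responds: user is focusing on the {hit.name}")
    else:
        state = {"name": hit.name if hit else None, "time": 0.0, "fired": False}
    return state

# Example: the user stares at a lamp 2 m ahead for ~0.7 s of simulated frames.
targets = [GazeTarget("lamp", (0.0, 1.5, -2.0), 0.3),
           GazeTarget("door", (1.0, 1.0, -3.0), 0.5)]
state = {"name": None, "time": 0.0, "fired": False}
for _ in range(10):  # 10 frames at 70 ms each
    hit = first_hit((0.0, 1.5, 0.0), (0.0, 0.0, -1.0), targets)
    state = update_gaze(state, hit, dt=0.07)
```

A production system would use the engine's own physics raycast against mesh colliders rather than bounding spheres, but the dwell-then-respond pattern is the same.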

Scoping Session: How it works

On the scoping call, we will quickly establish your needs and plan a path to fixing the problem you are facing.

To scope your project as quickly as possible, we will send you a short form to fill out so we understand your needs before the call.

Book in with Dr David Tully by clicking the date and time you prefer.