This short post introduces our latest demo, ‘Subway’, which shows Enlighten running in a full level with dynamic lighting. It exercises many features of the Enlighten SDK, including the ability to dynamically destroy parts of the level so that light passes through while the indirect lighting stays consistent.
Since the dawn of 3D graphics – and even as far back as text-based adventure games – immersion has been at the forefront of game development. How can we create more immersive entertainment experiences, ones that grab the attention of consumers and keep them coming back for more?
2016 saw a major step forward in this regard, with the arrival of genuine virtual reality. There had been several abortive attempts before, but with technology finally able to deliver a comfortable VR headset – running at a frame rate and with graphical fidelity that won’t cause instant nausea for the user – VR is truly here to stay.
However, this is not the end of the VR journey; in fact, we’re really just getting started. The hardware demands of virtual reality come at a premium, and game development has a long history of cutting corners by rendering only what the user will see in order to maximise performance. Together these bring a whole new set of challenges: developers can’t hide anymore, not when the user can explore entire scenes in VR and look under every table.
Lighting has long been the linchpin of immersion. While effective lighting can draw a user into a scene or game, poor lighting is all too often the first thing that causes the illusion to fall apart. Within this difficult development context, VR Intelligence sought out three of the shining lights working at the cutting edge of VR development. Representing Oculus Story Studio, ARM Enlighten and game studio nDreams, these industry thought leaders deliver key insights that cut through the mysticism of effective lighting in VR in this free white paper.
Click here to read the full white paper.