Monday July 25th, 2016

Global Illumination in Mobile Games with Enlighten

By Eleanor Brash, Marketing Manager – Enlighten, ARM

The Mobile Gaming Phenomenon

In 2016, for the first time, mobile gaming will take a larger share of worldwide game revenue than PC with 27% of the total $99.6Bn market[1]. The market has exploded, fuelled by the increased penetration of mobile devices, the increased connectivity of consumers and the increased performance of smartphones.

There are over 2Bn smartphone users in the world[2]; in the UK 76%[3] of adults have a smartphone and they use it for nearly two hours every day to browse the internet, access social media, or shop online[4].  Between 2009 and 2015 the capabilities of mobile devices improved to offer 100 times more compute and 300 times greater graphics performance[5]. The latest product announcement from ARM, whose IP is in over 95% of all smartphones, states that the new ARM Mali-G71 will deliver 1.5x higher performance in-device compared to the previous year’s GPU, the Mali-T880.

What does that extra performance mean for mobile games? The latest Mali-G71 GPU can compute 675M more triangles per second and 109 more pixels per second than the Mali-T880, the GPU within the Exynos 8 Octa chipset of the Samsung Galaxy S7. This means more geometry can be processed per frame, richer textures can be drawn, and more elaborate shading and post-processing effects can add further polish to the final render. It means that complex 3D scenes, the kind that make a computer-generated world seem real, are coming to gamers' pockets.

The most popular mobile games across all regions remain casual games[6]. The drop-in, drop-out, low-commitment nature of these games reflects the usage habits of consumers who mainly play on the move, and their ease of use opens up the addressable market to non-traditional gamers.

However, there are a couple of interesting trends that could dent the omnipresence of casual games in the top downloaded mobile application lists.

Virtual reality is just starting to learn, through trial and error, what will and will not work on the platform, much as mobile gaming did when it first took off. Thanks to the ubiquity and capability of mobile devices, it will be mobile, rather than PC or console, that takes virtual reality mainstream. Yet for the technology to be a success, mobile VR needs to offer an equivalently high-quality, immersive experience from the outset. While casual experiences will be part of this, an increased demand for complex 3D graphics on mobile can be expected as virtual reality takes off.[7]

The second trend is that of increased competition, coupled with increased assurance from game companies that mobile is the battle they want to fight. Let's take China as the most extreme example. China alone accounts for 25% of global game revenues[8]. Core mobile titles are starting to cannibalize PC game spending, and the mobile segment in China is expected to reach $10Bn this year, up 41% on 2015[9]. Over 100 new mobile titles are released each day on its various app stores. For this region in particular, game studios are increasing the budgets of their mobile games and aiming to create a differentiated experience, whether through the gameplay, the audio or the graphics[10].

Mobile Global Illumination

Given the above, it is likely that mobile consumers of the future will demand more complex 3D graphics from their devices, whether for gaming, or virtual reality, or other multimedia experiences such as architectural visualization.

This complexity could come in a number of forms. Physical correctness is one: how photorealistic can mobile graphics be made? Interaction is another: how much of the scene can the user move or change while a believable graphical result appears within milliseconds?

As physical correctness becomes more expected of these experiences, global illumination will be a key contributing factor. Global illumination is the effect that computes how light interacts with the materials in a scene as it is bounced or absorbed from one object to another. Just as light in the natural world changes colour and intensity depending on the material it interacts with, so does this computer graphics technique. This very fact, that it replicates the way light works in the physical world, is what makes the effect such a large contributor to the realism and correctness of computer graphics.

For believable interaction, global illumination needs to be computed in real time. If the user moves objects, turns off a light, or changes the properties of a surface, then the light in the scene needs to respond accurately and quickly or the sense of immersion will be broken. These interactions are present everywhere in games: imagine exploring a dungeon with a torch and how the light affects the overall illumination of such an enclosed space. It also matters for other multimedia experiences, such as the architectural visualization mentioned above. Imagine showing a client an interior design in a meeting: ideally the experience would be portable and editable on the fly, so that the client can iterate on the design with the developer wherever they are meeting and see a precise representation of the room whatever design decisions are made.

There are great benefits to including real-time global illumination in an application. Yet the rendering equation, which defines what happens to light leaving a point on a surface and has been the basis of virtually all realistic rendering techniques of the last thirty years, is far too expensive to compute on the fly, especially on mobile platforms.

$$L_o(x,\omega_o) \;=\; L_e(x,\omega_o) \;+\; \int_{\Omega} f_r(x,\omega_i,\omega_o)\, L_i(x,\omega_i)\,(\omega_i \cdot n)\,\mathrm{d}\omega_i$$

And so the industry develops workarounds.

Baked global illumination

This is the most common technique used by mobile game developers due to its low runtime cost. However, this affordability comes at the expense of runtime dynamism. With baked lighting, the global illumination cannot update when lights and materials change at runtime: either they must remain fixed, or the user must accept incorrect results.

Baked global illumination fully pre-calculates lighting information for static scenes and stores the output in static data structures which are then consumed by fragment shaders. The most commonly used data structures are textures, spherical harmonics probes and cubemaps.
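To make the trade-off concrete, here is a minimal sketch (in C++, with hypothetical names rather than any engine's actual format) of what consuming baked lighting amounts to: the shader simply reads pre-computed irradiance from a texel, and nothing in that data can react to runtime changes.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Hypothetical baked lightmap: a grid of RGB irradiance values computed
// offline for a static scene and never touched again at runtime.
struct BakedLightmap {
    int width = 0, height = 0;
    std::vector<float> rgb;  // width * height * 3, linear irradiance

    // Nearest-texel lookup by lightmap UV, roughly what a fragment shader does
    // (a real shader would filter bilinearly and decode an HDR encoding).
    void sample(float u, float v, float out[3]) const {
        const int x = std::clamp(static_cast<int>(u * width), 0, width - 1);
        const int y = std::clamp(static_cast<int>(v * height), 0, height - 1);
        const float* texel = &rgb[(static_cast<std::size_t>(y) * width + x) * 3];
        out[0] = texel[0]; out[1] = texel[1]; out[2] = texel[2];
    }
};
```

Move a light or change a material and sample() keeps returning the same values, which is exactly the limitation described above.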

Faking it

Much as a photographer dresses a set for a photoshoot to obtain a particular effect, so too can an artist set up the lighting in a scene by placing “fake” lights that mimic the effect of bounced light. This is a common technique among game studios as it is quick for artists to iterate on to achieve a result. However, it is very hard to produce physically correct results this way. And, especially on platforms where performance is more limited, the engine may limit the number of direct light sources available. For example, Unreal Engine 4.9 allowed for only four dynamic point lights on each object being illuminated.

Enlighten

Looking back at the rendering equation at the start of this section, what would happen if you had a finite set of elements and diffuse transport only? Things become quite a bit simpler.

$$B_i \;=\; E_i \;+\; \rho_i \sum_{j} F_{ij}\, B_j$$

The material dependency can be moved outside the integral, which becomes a sum over the radiosities of the other elements, each weighted by the form factor between the two elements. The form factor is the fraction of light leaving element j that arrives at element i. Multiply this sum by the material properties, add whatever light the element emits itself, and you have the radiosity for the scene.
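In code, one step of that sum might look like the following sketch (C++ with hypothetical names, not Enlighten's actual data layout): each element gathers the radiosity of the elements it can see, weighted by the precomputed form factors, scales the total by its material colour and adds its own emission.

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// One scene element in the radiosity formulation described above.
struct Element {
    float emission[3];   // light the element emits itself
    float albedo[3];     // diffuse material term (moved outside the integral)
    float radiosity[3];  // current estimate of outgoing light
    // Precomputed offline: the elements this one can see, with the form
    // factor, i.e. the fraction of light leaving that element which arrives here.
    std::vector<std::pair<std::size_t, float>> visibleElements;
};

// One gather pass of: radiosity_i = emission_i + albedo_i * sum_j F_ij * radiosity_j.
// Reads the previous estimates and writes fresh ones (Jacobi-style iteration).
void gatherPass(const std::vector<Element>& prev, std::vector<Element>& next) {
    for (std::size_t i = 0; i < prev.size(); ++i) {
        float sum[3] = {0.f, 0.f, 0.f};
        for (const auto& [j, formFactor] : prev[i].visibleElements) {
            for (int c = 0; c < 3; ++c)
                sum[c] += formFactor * prev[j].radiosity[c];
        }
        for (int c = 0; c < 3; ++c)
            next[i].radiosity[c] = prev[i].emission[c] + prev[i].albedo[c] * sum[c];
    }
}
```

Run enough gather passes and the radiosities converge; that is the handle-cranking described below.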

This is how Enlighten solves the global illumination problem. Below, the same equation is represented in a more visual form. For each pixel, Enlighten calculates offline which elements it can see, and at runtime it adds up all the component colours.

[Figure: a visual representation of the radiosity sum, showing the elements each pixel can see and whose colours are added together at runtime]

The handle may need to be cranked a few times for the solution to converge, but the solve can run on the CPU over several frames, entirely independently of the render frame rate. This makes it ideal for mobile, where the developer can set a budget for Enlighten and not worry about it bottlenecking the GPU and causing stutters.
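A sketch of how that budgeting might be organised (illustrative only; the class and parameters below are hypothetical, not Enlighten's API): the solver is handed a per-frame CPU budget, does as much gather work as fits inside it, and simply resumes where it left off on the next frame, so the GPU and the render frame rate are never affected.

```cpp
#include <chrono>
#include <cstddef>
#include <functional>
#include <utility>

// Spreads an iterative lighting solve over many render frames by doing only
// as much work per frame as a fixed CPU budget allows.
class BudgetedSolver {
public:
    BudgetedSolver(std::size_t workItems, double budgetMs,
                   std::function<void(std::size_t)> solveItem)
        : workItems_(workItems), budgetMs_(budgetMs), solveItem_(std::move(solveItem)) {}

    // Call once per render frame, on the CPU, independently of the GPU.
    void update() {
        using clock = std::chrono::steady_clock;
        const auto start = clock::now();
        while (cursor_ < workItems_) {
            solveItem_(cursor_++);  // e.g. gather the radiosity for one element
            const double elapsedMs =
                std::chrono::duration<double, std::milli>(clock::now() - start).count();
            if (elapsedMs >= budgetMs_)
                return;  // budget spent; pick up from here next frame
        }
        cursor_ = 0;  // a full pass finished; start converging the next one
    }

private:
    std::size_t workItems_;
    double budgetMs_;
    std::function<void(std::size_t)> solveItem_;
    std::size_t cursor_ = 0;
};
```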

Ongoing mobile development

One of the challenges of mobile development is the breadth of devices available. The smartphone market does not just consist of the latest release, such as the Samsung Galaxy S7; most gamers own a phone from what ARM defines as the “mid-range” or “entry level”, something that costs less than $350 to purchase. These devices normally come with previous-generation processor technology, previous-generation API support, or simply fewer cores.

Enlighten has long been able to compute dynamic global illumination on mobile devices. The challenge has been running dynamic global illumination on all mobile devices, so that any gamer can take advantage of its benefits no matter the device. This is a much larger undertaking.

In the first half of 2016 the Enlighten team made some changes to the product to address this.

Surface relighting model

While the shader cost of the latest Enlighten surface relighting model is acceptable on PC and console, a more efficient approach is preferred for mobile. We have introduced to the Unreal Engine integration a mobile relighting model that still relies on directional irradiance but simply stores less data.

Cubemap support

Enlighten cubemaps, which are used to capture indirect lighting changes in real-time reflections, can now capture the emissive environment in real time in the mobile path.

Enlightened Mobile Developers


The Unity engine is the leading global game development software. More games are made with Unity than with any other game technology, more players play games made with Unity, and more developers rely on its tools and services to drive their business. It has a 45% share of the global game engine market, approximately three times that of its nearest competitor.

For mobile in particular, the majority of the top-grossing 3D mobile games are made using Unity[11].


In 2015 Unity released the fifth instalment of their engine with a major focus on graphics fidelity, and lighting in particular. Lighting in Unity had been limited to baked lightmaps since Unity 3, but after Enlighten was integrated as the engine’s real-time global illumination solution, dynamic lighting became possible in games made with Unity. This technology now ships to all Unity customers as standard[12].


Exient are UK-based developers behind best-selling titles such as Angry Birds Go! and Angry Birds Transformers.

The company licensed Enlighten for two core reasons. Firstly, they rely on continuous production of new content for their current games in order to keep customers engaged. Thanks to Enlighten’s in-editor dynamism, their lighting artists are able to produce publishable levels in unprecedented time.

“Enlighten’s real-time lighting workflow is a game changer. Baked lighting is a huge production bottleneck for any team; it’s very difficult to efficiently produce levels to the quality that players expect.” said Simon Benge, Lead Technical Artist at Exient. “A single artist using Enlighten can work without any disruption to the art pipeline, and can react to lighting and level changes faster than a team of 100 using a baking technology – and even produce higher quality content!”

Secondly, their upcoming title, due for release on iPhone and Android in the second half of 2016, contains gameplay elements that give the player control over the materials and lights in a scene. This title will be a showcase of the capabilities of their internal engine; accurate global illumination responses to these updates were a must. Enlighten enabled them to achieve this at runtime even on a mobile platform.


Perfect World is a major Chinese developer and publisher of video games. Its origins are in the MMO space, but it has grasped the mobile opportunity with both hands. Yet, as outlined earlier, competition in the Chinese mobile gaming market is intense and games need to differentiate to succeed. For Perfect World, the use of Enlighten in its games acts as a quality badge, reassuring users that the graphics inside will be of the highest quality.

“Enlighten is redefining the way computer game lighting is delivered to the screen and sets a new standard for global illumination,” said Ming Cui, vice president of technology, Perfect World. “Using Enlighten in our mobile gaming portfolio delivers the highest quality graphics and assures our customers that we are investing in technology that can give us unique advantages over our competition.”

 

Enlighten on Mobile Case Studies

Transporter

Engine: Unity 5 pre-alpha build

Platform: Samsung Galaxy Note 10.1 (2014 Edition) Exynos version

Concept: A space centre flying past a bright star

[Screenshots: the Transporter demo]

Enlighten can compute both baked and dynamic lighting. Some of the light sources in Transporter are static and so are baked to improve performance; the global illumination for those that flash or move at runtime is updated in real time. The result is accurate even on surfaces that are occluded from the direct light source.

There are a couple of interesting Enlighten features to note in Transporter. The first is the inclusion of dynamic specular effects; these are tricky to compute in real time on mobile, where the screen-space reflection techniques used on next-gen platforms are too expensive. The demo is built using a physically based shading model, and the assets are highly metallic and therefore reflective; take, for example, the image above, where both the floor and the wall contain specular reflections. By using Enlighten cubemaps it was possible to efficiently capture the effect of changing lighting conditions in these reflections.

Transporter also takes advantage of Enlighten’s support for emissive surfaces. These emissive surfaces are simple to set up, trivial to animate in a script and have minimal impact on GPU workload because they count as an indirect light source and so are computed entirely by Enlighten on the CPU. An example of an emissive material in Transporter is the white lights on the door posts in the second image; these are animated to turn red and flash when triggered.

Ice Cave

Engine: Unity 5 with custom shader

Platform: Samsung Galaxy S6 with Samsung Exynos 7 Octa chipset

Concept: A mysterious, wintry cave with Chinese zodiac inspired statues and a ghost tiger

[Screenshot: the Ice Cave demo]

There are a number of dynamic lighting effects in Ice Cave which made Enlighten a sensible choice as the global illumination solution. Firstly, the development team wanted a time-of-day cycle outside the cave with realistic bounced lighting inside it. Secondly, the fireflies you can see in the image above act as a dynamic light source, and the scene, which abounds with reflections, needed those reflections to respond accurately to the changing lighting conditions caused by the moving fireflies. Finally, the application wanted to give control of the lighting to the player, so that they could choose to turn certain lights on and off or change their colour.

Medusa

Engine: Unreal Engine

Platform: Arndale development board

Concept: An enclosed space with complex stone statues lit by dynamic lights

[Screenshot: the Medusa demo]

This demo shows real-time global illumination on mobile from 20 dynamic light sources at 1080p. All the lights in this room are moving, and Enlighten global illumination fills the scene with realistic bounced light. There is no ambient lighting in this scene.

The demo took advantage of the on-chip raw tile buffer extensions available in ARM Mali GPUs in order to improve its performance. It uses deferred rendering, a very popular technique on PCs and consoles, to reach the number of dynamic lights shown on screen. However, deferred rendering is expensive in terms of memory bandwidth. By using the raw tile buffer on mobile devices, the vast majority of that bandwidth can be saved: 1 GB/s is the minimum you would expect to save, and the figure increases with screen resolution.
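As a rough, back-of-the-envelope illustration of where a figure like 1 GB/s comes from (an assumed configuration for illustration, not a measurement from the demo): writing and then reading back a single 32-bit-per-pixel G-buffer attachment through main memory at 1080p and 60FPS already moves about

$$1920 \times 1080 \ \text{pixels} \times 4\ \text{bytes} \times 2\ \text{(write + read)} \times 60\ \text{fps} \approx 1.0\ \text{GB/s},$$

and a real deferred G-buffer has several such attachments, so keeping that traffic on-chip in the tile buffer saves correspondingly more, with the saving growing as resolution increases.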

Subway Mobile

Engine: Unreal Engine

Platform: Samsung Galaxy S6

Concept: A run-down underground station with fluorescent light boxes

[Screenshot: the Subway Mobile demo]

This was a port of the PC version of Subway to mobile, in order to test the boundaries of smartphone performance. We wanted to turn on as many of the original Enlighten features as possible, including player-controlled lighting (carrying a torch and interacting with large lightboxes) and destruction (the removal of a wall at runtime, with consistent global illumination between the two previously separated areas).

Due to the restrictions on movable spotlights on Android in Unreal Engine at the time (this was developed in Unreal Engine 4.8, before the mobile improvements that came in 4.9), we chose to light the scene entirely with emissive surfaces acting as area lights. This meant that while the lights could be animated or changed at runtime (for example, the player could set their colour as in the original PC demo), they could not move; the dynamic lights in the cubes picked up by the character were no longer available.

This application also shows off the real-time indirect lighting of dynamic objects. In the image above you can see a group of rocks which can be moved by the player. As the colour of the area lights on the side walls changes, the rocks pick up the new colour and so react realistically and consistently with the changing light conditions.

The expensive part of porting this demo to mobile was the shader cost. The fundamental rule of Enlighten is that if you are able to render the scene, you’ll be able to light it.

The Crystal Cave (mobile and mobile VR)

Engine: Unreal Engine

Platform: Samsung Galaxy S7 (Exynos Edition) (with Samsung Galaxy Gear VR)

Concept: A boat ride through a lake cavern filled with magical crystals that light up on sight

[Screenshot: The Crystal Cave demo]

The key concept of this demo is the natural interaction with the crystals and the control the player has over which of them light up at any point in time, all while maintaining a sense of immersion that borders on eerie.

Dynamic area lights were ideal for achieving this effect on a mobile platform as they do not have the GPU cost or limitations that direct lights do. Using this effect we were able to quickly and easily port the original scene onto mobile with slight tweaks to optimize the geometry and user interface (the original used a LEAP motion controller).

Given that this early port to mobile was running at over 60FPS, the team decided to see whether it would work with a Samsung Galaxy Gear VR. The main issue in this process was not the lighting of the scene – indeed, Enlighten dynamic area lights were working very efficiently – but rendering the scene itself in Unreal Engine. The engine still had tight restrictions on the number of draw calls possible on the Gear VR and limited occlusion support. To address this challenge, the team used Enlighten in a novel way: they introduced lighting-based occlusion, a technique made possible because the entire scene was lit dynamically with Enlighten. When part of the scene did not need to be rendered, its lights were turned off and the area was occluded in darkness. When the area was required again, the lights faded gradually back on and it was rendered. Lights and geometry are turned off and on depending on the position of the player, allowing for more aggressive culling and hiding obvious geometry popping.
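A minimal sketch of the lighting-based occlusion idea (C++, with hypothetical names; an illustration of the technique as described, not the demo's actual code): each region tracks its distance to the player, far regions fade their lights to black before their geometry is disabled, and regions coming back into range re-enable geometry first while still dark, so the pop is hidden.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Per-region state for lighting-based occlusion: geometry is only drawn while
// there is light to see it by, so culling never visibly pops.
struct Region {
    float centre[3];
    float lightIntensity = 1.0f;   // 0 = fully dark, 1 = fully lit
    bool  geometryVisible = true;
};

void updateLightingOcclusion(std::vector<Region>& regions, const float player[3],
                             float activeRadius, float fadePerSecond, float dt) {
    for (Region& r : regions) {
        const float dx = r.centre[0] - player[0];
        const float dy = r.centre[1] - player[1];
        const float dz = r.centre[2] - player[2];
        const bool wanted = std::sqrt(dx * dx + dy * dy + dz * dz) < activeRadius;

        // Fade the dynamic area lights towards fully on or fully off.
        const float target  = wanted ? 1.0f : 0.0f;
        const float maxStep = fadePerSecond * dt;
        r.lightIntensity += std::clamp(target - r.lightIntensity, -maxStep, maxStep);

        // Draw geometry whenever the region is wanted or not yet completely dark.
        r.geometryVisible = wanted || r.lightIntensity > 0.0f;
    }
}
```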

The VR version of the demo runs at 60FPS on a Samsung Galaxy S7 with 18 dynamic light sources, 28,544 lightmap pixels and an average Enlighten solve time of 7.6ms.

Performance Example

It is useful for evaluators to understand the performance impact of Enlighten on mobile. The following measurements were taken on a Samsung Galaxy S7 using the mobile version of Photons, an application originally developed for Windows PC. We ran the scene at both a fixed and an unlimited frame rate.

One of the mantras of Enlighten is that you only pay for what you change. If your scene has a lot of fast moving dynamic light effects, such as within the Photons demo, then Enlighten will have a proportionately higher cost than if the scene had a single, slow moving dynamic light source such as a sun.

This scene has the following stats:

  • Resolution: native (2560×1440)
  • Primitives: 150 per frame
  • Triangles per frame: 425,036 maximum
  • Draw calls per frame: 142 maximum
  • Enlighten complexity:
    • 28,480 lightmap pixels
    • 81 dynamic light sources
    • Solve time: 24ms maximum


The user can set a CPU budget for Enlighten and solve the global illumination over multiple frames. Because indirect lighting is a soft, low-frequency effect, the user experience does not suffer if it is not updated instantly. A solve time of 24ms within a render frame rate of 60FPS could easily be spread over 4 render frames, giving an Enlighten update rate of 15 solves per second. This delivers a smooth user experience and costs only 6ms of CPU time per frame.
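Worked through explicitly (an illustrative helper, not part of any shipping API), the budgeting arithmetic above looks like this:

```cpp
#include <cmath>
#include <cstdio>

// Given a total solve cost and a per-frame CPU budget, report how many render
// frames one Enlighten update spreads over and how often the lighting refreshes.
void printSolveBudget(double solveMs, double budgetMsPerFrame, double renderFps) {
    const int framesPerSolve = static_cast<int>(std::ceil(solveMs / budgetMsPerFrame));
    const double updatesPerSecond = renderFps / framesPerSolve;
    std::printf("%.0fms solve at %.0fms/frame: %d frames per solve, %.0f updates/s\n",
                solveMs, budgetMsPerFrame, framesPerSolve, updatesPerSecond);
}

// printSolveBudget(24.0, 6.0, 60.0) -> "24ms solve at 6ms/frame: 4 frames per solve, 15 updates/s"
```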

The Enlighten documentation contains further advice for tuning a scene to achieve the best quality / performance balance and the final chapter of this white paper outlines some case studies of how previous applications have achieved their design goals within the necessary performance budget.

Download the PDF here: Enlighten: Mobilised

Trademarks

The trademarks featured in this document are registered and/or unregistered trademarks of ARM Limited (or its subsidiaries) in the EU and/or elsewhere.  All rights reserved.  All other marks featured may be trademarks of their respective owners. For more information, visit arm.com/about/trademarks.

References:

[1] https://newzoo.com/insights/articles/global-games-market-reaches-99-6-billion-2016-mobile-generating-37/

[2] http://www.smsglobal.com/thehub/smartphone-ownership-usage-and-penetration/

[3] http://www.deloitte.co.uk/mobileuk/

[4] http://stakeholders.ofcom.org.uk/market-data-research/market-data/communications-market-reports/cmr15/uk/

[5] http://tech.firstpost.com/news-analysis/7-technology-shifts-to-impact-us-from-smartphones-cloud-iot-and-more-310411.html

[6] https://newzoo.com/insights/rankings/top-ios-games-revenues-downloads/

[7] https://community.arm.com/servlet/JiveServlet/downloadBody/11397-102-2-23235/Expanding%20Virtual%20Horizons.pdf

[8] https://newzoo.com/insights/articles/global-games-market-reaches-99-6-billion-2016-mobile-generating-37/

[9] https://newzoo.com/insights/articles/global-games-market-reaches-99-6-billion-2016-mobile-generating-37/

[10] http://venturebeat.com/2015/09/18/what-happened-to-the-growth-in-mobile-gaming/

[11] https://unity3d.com/public-relations/

[12] http://blogs.unity3d.com/2014/09/18/global-illumination-in-unity-5/