
GREEN GAMING NEWS

Issue Number 5 - April 3, 2018

Green Gaming News covers green-gaming research at Lawrence Berkeley National Laboratory.

Our motto is “Gaming Energy Efficiency without Performance Compromise”. This work is sponsored by the California Energy Commission, and covers the full spectrum of non-battery-charged gaming platforms, as well as gaming applications.

Meet our team and find out more about our project here.

Back Issues


Contents

Trendsetter Interview

    • Anjul Patney, NVIDIA, Foveal Reconstruction

Energy Factoid

    • Running Skyrim, active power across 20 consoles, laptops, and desktops varied from 11 to 221 W -- a 20-fold range

Market Metrics

    • The installed base of 15 million gaming platforms in California today is dominated by consoles, but their share of aggregate energy is proportionately smaller given their generally lower unit energy consumption

Research Results

    • Console and media-streaming device game energy varies widely depending on which platform a game is played on

Notable Industry Activities

    • Vega graphics cards achieve a big improvement in efficiency

Emerging Technologies

    • Cloud gaming - is it more or less energy intensive than local gaming?

Good Reads

    • Fraunhofer USA doesn't disappoint with their latest survey and analysis for the Consumer Electronics Association

Green-up Your Game

    • Tuning up your rig - in-game settings influence energy use significantly

    • Buying new gear - look for high ratios of active to idle power

Trendsetter Interview

  • Green Gaming News interviewed Anjul Patney, Senior Research Scientist at NVIDIA. We talked with Anjul about the exciting emergence of techniques that improve the user experience of virtual reality and save energy by focusing rendering resources primarily on pixels near the center of the gamer’s gaze.

GGN: Give us a bird's-eye view of how foveated reconstruction works and what the state of the art is in terms of the R&D process.

AP: From a thousand feet, foveated rendering is a technique that improves image quality and rendering performance by spending more time drawing pixels which you are looking at, and less on pixels which you aren't. Foveated rendering algorithms use eye-tracking to identify gaze location, and adapt to produce peripheral pixels in an efficient and fast manner. Due to the inherently lower acuity of human peripheral vision, the effect is hard to perceive, and due to the large fraction of pixels in the periphery, the performance benefit is significant. Foveated rendering is a key technique that can enable highly efficient rendering for high-field-of-view applications like virtual and augmented reality.

Researchers have proposed several foveated rendering algorithms, all of which use different techniques to lower the peripheral pixel quality. So far the goal has always been to maximize overall performance improvement with minimal loss of image quality. The most straightforward techniques simply reduce image resolution in the visual periphery, while more advanced ones either reduce the "pixel shading" frequency or rendering complexity (material quality, light count).

"Foveated Reconstruction" is one such technique, which reduces the computational cost of rendering peripheral pixels by doing so in a noisy, lower-quality fashion, and using image post-processing to "denoise" the resulting image. We demonstrated an example of this technique at NVIDIA GTC 2017, where we rendered peripheral pixels by randomly lighting a scene for only 1 out of 16 light sources per pixel, and using post-processing to generate a stable, noise-free image.

In addition to foveated rendering that improves performance without sacrificing quality, ongoing research is also investigating foveation approaches that are easier to integrate within existing game development pipelines.
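
For readers who want something more concrete, the simplest flavor of foveation Anjul describes -- lowering shading resolution with distance from the tracked gaze point -- can be sketched in a few lines. This is purely an illustrative Python sketch; the falloff thresholds, block sizes, and pixels-per-degree value are hypothetical assumptions, not NVIDIA's implementation.

    import math

    # Gaze-contingent shading-rate selection (illustrative sketch only; the radii,
    # block sizes, and pixels-per-degree value are hypothetical assumptions).
    def shading_rate(pixel_xy, gaze_xy, pixels_per_degree=15.0):
        """Return a shading coarseness factor: 1 = full rate near the gaze point,
        larger values = one shading sample shared by an NxN block of pixels."""
        dx = pixel_xy[0] - gaze_xy[0]
        dy = pixel_xy[1] - gaze_xy[1]
        eccentricity_deg = math.hypot(dx, dy) / pixels_per_degree

        if eccentricity_deg < 5.0:      # foveal region: full shading rate
            return 1
        elif eccentricity_deg < 15.0:   # near periphery: shade 2x2 blocks
            return 2
        else:                           # far periphery: shade 4x4 blocks
            return 4

    # Example: with the gaze at screen center (960, 540), a far-corner pixel
    # gets coarse shading while a pixel near the gaze keeps full quality.
    print(shading_rate((1900, 1060), (960, 540)))   # -> 4
    print(shading_rate((1000, 560), (960, 540)))    # -> 1

Because the far periphery accounts for the bulk of a frame's pixels, even a modest coarsening there removes a large share of the total shading work, which is where the performance (and energy) benefit comes from.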


GGN: You mentioned that FR can actually improve image quality. That’s fascinating, and a bit counter-intuitive. How does that work?

AP: In addition to improving performance, one advantage of foveated rendering is that you can additionally put "higher-quality" pixels in the center of the view, so you get overall higher quality while reducing overall compute workload. In other words, foveated rendering provides the tools to modulate image quality in the center of your vision independently from the periphery. One can use it to purely increase performance (by reducing peripheral quality), purely increase quality (by increasing central quality), or a bit of both.


GGN: By what mechanisms can FR potentially reduce energy use? Any measured data yet?

AP: By doing less overall work, most foveated rendering algorithms are intrinsically energy-efficient solutions. In the future, we can expect further improvements as the domain of foveated rendering expands to include 'foveated displays', in which the display participates in the foveation process so we not only reduce the rendering workload, we also transmit and display fewer overall pixels.

To my knowledge, none of the results for energy usage of foveated rendering are currently public. However, the preliminary data that I have come across strongly supports the energy efficiency of foveation algorithms.


GGN: What are the potential costs of this technology, and are there other benefits to be had in terms of user experience, or other factors beyond energy savings?

AP: The main cost of foveated rendering in the upcoming head-mounted displays (HMDs) will be additional hardware, namely an eye tracker, and all of the software accessories that enable high-accuracy and low-latency eye tracking. Eye tracking, however, has lots of benefits beyond foveated rendering as it will likely enable other novel user experiences and interactions, and potentially also assist other improvements to rendering quality and performance for VR and AR workloads.

Among the opportunities for novel applications are user interaction and input (e.g. "look here to proceed", "stare at the door you want to open", "make eye-contact with the enemy", etc). You can also imagine VR movies where the story only proceeds after you have seen the important parts.


GGN: We’re starting to see rudimentary efforts on the software side (e.g. Batman Arkham VR) to establish a gradient of pixels from the center of view out towards the periphery, plus control of pixel quality. Do you have a sense of how user experience and workload on the GPU for those approaches will differ from those like FR that are managed from the display side?

AP: The motivation for foveation in contemporary games is largely the same as for foveated rendering, and these techniques are clearly already very effective. However, since no HMDs currently support eye-tracking, foveated rendering in today’s games must be conservative in reducing peripheral image quality. Note that you can directly look at the low-quality pixels, and the game wouldn’t know to update the location of foveation! I expect more performance gains and overall higher quality in the future.

Energy Factoid

One thing we've encountered in our testing process is that it is difficult to find games that can run across a wide range of platform types. The best exception is Skyrim, which we have thus far been able to run on 20 of our 26 systems, with striking results. Among desktops, power requirements during active gameplay varied from 50 to 221 watts (a ~4-fold spread). Among laptops the values were 32 W to 85 W (a nearly 3-fold spread), and among consoles the range was 11 W to 143 W (a ~13-fold spread). Of course user experience varies, even within categories, although, unlike in most other games, frame rates are more or less pegged at 60 FPS in each case (with the exceptions noted below). The same 1080p display was used in each case (its power not counted here). Efficiencies (FPS/W) varied by about 21-fold across all the platforms, and widely even within the product sub-categories, with the highest coming in at 5.7 FPS/watt.

Average power during gameplay across all systems that support Skyrim

Note that Skyrim is one of the less energy-intensive games in the study, but it is available across the broadest variety of systems and hence appropriate for the analysis depicted here. The power metric is the average power measured over an approximately 6-minute test of a tunneled gameplay section, specifically the Helgen Keep escape through the dungeon and caves near the beginning of the game. Skyrim is generally capped at 60 FPS, but laptops L1 and L2 and desktop E2 experienced bottlenecks that resulted in lower frame rates. FPS could not be measured for the consoles (C1-9) or the Macs (L3 and M1). The display used during testing is 1080p; these measurements exclude display and network energy. A key to the system codes is here.
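
For reference, the efficiency metric is simply frame rate divided by average active power. A quick illustration in Python (the numbers below are hypothetical and do not correspond to any particular system in the figure):

    # Efficiency = frames per second / average active power (FPS/W).
    # Hypothetical numbers for illustration only -- not our measured systems.
    measurements = {
        "console_X": (60, 12),    # (frame rate in FPS, active power in W)
        "laptop_Y":  (60, 70),
        "desktop_Z": (55, 200),
    }

    efficiency = {name: fps / watts for name, (fps, watts) in measurements.items()}
    for name, fps_per_watt in efficiency.items():
        print(f"{name}: {fps_per_watt:.2f} FPS/W")

    # "Fold" spread: how many times more efficient the best system is than the worst.
    print(round(max(efficiency.values()) / min(efficiency.values()), 1))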

Market Metrics

In our gaming market characterization project (described in Issue #3 of this newsletter), we found the estimated installed base of over 15 million gaming systems in California today to be dominated by consoles. Desktop systems, laptops, and media streaming devices (MSDs) together make up only about 22% of all gaming systems in use. Of course total energy shares are higher for desktops and laptops given their (mostly) higher per-unit energy requirements. MSDs consume very little power on the customer side, but substantial amounts in the data center and even along the network. We project the numbers of MSDs to grow faster than those of all other devices in the future, which will further shape the allocation of aggregate energy demand.

Research Results

As we did earlier in the project for desktop and laptop systems, we have now measured console and media streaming device power during active gameplay across a range of game titles. We found that power requirements can vary by a factor of ten for a given game depending on which console it is played on, and by several fold for a given console depending on game choice.

Power in active gameplay by game for 21 popular games for consoles and media streaming devices

Notes: Not all systems are able to play all games. The Apple TV and NVIDIA SHIELD shift workload to data centers (not counted here). The display used during testing is 1080p, but these measurements exclude the display itself. Active-gameplay power levels are the average power measured during the entire test cycle. A key to the system codes is here.

Notable Industry Activities

AMD's new Vega graphics cards employ a next-generation pixel engine that includes a “Draw-Stream Binning Rasterizer” (DSBR), which improves performance and saves power by teaming with the GPU's integrated HBM2 memory and high-bandwidth cache controller to process a scene more efficiently.

After the geometry engine performs its (already reduced amount of) work, the DSBR uses a “deferred pixel shading” process that identifies overlapping pixels and renders only the top-layer pixels, allowing the GPU to discard the non-visible pixels rather than wasting energy rendering them.

As an illustration of these improved GPU opportunities, our upgrade of a high-end DIY system (H1) achieved impressive power savings by changing from two AMD R9 Fury X GPUs (our base system) to one liquid-cooled RX Vega 64 GPU. Power reductions for actual games ranged from 8% to 65%, with an average of 32% when powering 1080p displays and 46% when powering 4K displays.

While dual-GPU systems have fallen out of vogue, they were popular in the recent past and so still exist as an element of the installed base in the marketplace. Their original popularity arose from desire for improved performance.

A look at the published hardware and performance specifications of the Fury X versus the Vega 64 shows some impressive improvements with the Vega generation, but not enough to explain the full power savings we measured relative to the baseline dual Fury X configuration. The improved rendering process of the DSBR appears to be what allows the single Vega 64 to achieve a much better power-draw profile across all games tested, as well as the Fire Strike benchmark.

Our array of tests enabled us to quantify impacts on power requirements for systems driving 1080p displays versus 4K displays. As can be seen from the previous figure, power requirements often increased significantly: in four of the eight cases, increases ranged from 15% to 64%, while in the remaining cases reductions of two to fourteen percent were observed, presumably corresponding to the lower frame rates achieved.

Looking at the frame rates, we found that the Vega achieved performance as good as or better than the baseline in all games, with the exception of Witcher 3 on the 4K display, where rates dropped from 49 to 39 FPS. The Vega also offers superior metrics of user experience and image quality, including substantially greater shader throughput, texture filtering, memory bandwidth, and memory capacity.

In terms of the combined performance metric of FPS/W, we observed impressive improvements across all the real-world games and the Fire Strike benchmark. These improvements ranged from 19% to 211%, demonstrating that improved efficiency can be achieved in tandem with improved performance.
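
For those curious how that combined metric behaves, here is a minimal worked example in Python with made-up numbers (not our H1 measurements): efficiency improves whenever frame rate rises faster than power, or power falls faster than frame rate.

    # FPS/W improvement from a hypothetical GPU swap (illustrative numbers only).
    def fps_per_watt(fps, watts):
        return fps / watts

    baseline = fps_per_watt(fps=60, watts=450)   # hypothetical dual-GPU configuration
    upgraded = fps_per_watt(fps=62, watts=300)   # hypothetical single-GPU configuration

    improvement_pct = (upgraded / baseline - 1) * 100
    print(f"{improvement_pct:.0f}% improvement in FPS/W")   # ~55% in this example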

Emerging Technologies

Media streaming devices (e.g., Apple TV or Android TV devices) are the least energy-intensive gaming technology locally, although their workload is largely shifted to data centers. The client side in cloud-based gaming typically requires minimal power since the majority of the computer processing occurs away from the user; however, the amount of data streamed to and from the client device is significant. The NVIDIA Shield, for example, streams at an average rate of 15 Mbps, or 6.75 GB transmitted hourly. A meta-analysis of the energy use associated with data transfer across the Internet, from the point at which the data leaves the client’s router to where it enters the data center, estimated this energy use at 0.03 kWh/GB in 2017 (Aslan et al., 2017), which corresponds to 202.5 Wh during an hour of game play.
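
The arithmetic behind those streaming figures is simple enough to show directly; here is a quick Python sketch using only the values cited above:

    # Internet-transfer energy for one hour of cloud gameplay, using the values above:
    # a 15 Mbps average stream and 0.03 kWh per GB transferred (Aslan et al., 2017).
    stream_rate_mbps = 15
    hours_of_play = 1

    data_gb = stream_rate_mbps * hours_of_play * 3600 / 8 / 1000   # megabits -> gigabytes
    transfer_energy_wh = data_gb * 0.03 * 1000                     # kWh/GB -> Wh

    print(data_gb, round(transfer_energy_wh, 1))   # 6.75 GB and 202.5 Wh per hour of play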

The majority of the computation occurs in the data center, a building dedicated to housing racks of servers and the infrastructure that keeps them cool and moves data around. The NVIDIA Shield service currently uses rack servers enhanced with eight NVIDIA Tesla P40 GPUs. Average server electricity use, excluding GPUs, is assumed to be 257W, based on typical hardware and operating characteristics found in large data centers.

Example of cloud-gaming server with 8 NVIDIA Tesla P40 GPUs. The NVIDIA Shield is on the gamer (client) side and the server containing all the processing is in the data center. The gaming session is thus streamed or "cloud-based".

Network power for switches and routers within the data center is estimated as a 15% overhead on server electricity use, excluding GPUs. Each GPU increases server electricity demand by an additional 150W during active use (the card is rated at 250W TDP by NVIDIA) and 50W during idle periods (167W and 56W, respectively, when accounting for PSU losses). Users of the NVIDIA Shield service are provided a dedicated GPU, meaning that up to eight users can access a server at any time. If at capacity, the server electricity demand associated with each player would be 199W; however, continuous full capacity is unlikely. NVIDIA aims for use at 80% of capacity, though actual utilization could be much lower, or possibly higher, depending on how well server expansion matches demand for the service. Assuming 80% utilization, each hour of game play must also account for an additional 15 minutes of server time with an idle GPU (i.e., for every 75 minutes of server time, 60 minutes are spent in play while the other 15 minutes are idle), adding about 22W per player.

Data centers require a significant amount of auxiliary power at the facility level for cooling and electrical support of the IT equipment. While this auxiliary power can range from 10% of IT power for best practices to many times that amount, this analysis assumes 50%, representative of the mid- to large-size colocation facilities (i.e., space rented out by a third party) where gaming servers often reside to obtain wide geographic distribution and minimize latency. When accounting for data center server, network, and auxiliary power, as well as the data center power when gaming services are not being utilized, an hour of cloud-based game play corresponds to 340Wh.
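
Putting the pieces above together, the per-player data-center figure can be roughly reconstructed as follows. This is only a sketch of the accounting described in the text, using its stated assumptions:

    # Rough reconstruction of the per-player data-center power accounting described
    # above (all inputs are the assumptions stated in the text; a sketch, not a model).
    server_w           = 257         # rack server power, excluding GPUs
    players_per_server = 8           # one dedicated GPU per player
    gpu_active_w       = 167         # per-GPU draw in active use, incl. PSU losses
    gpu_idle_w         = 56          # per-GPU draw at idle, incl. PSU losses
    network_overhead   = 0.15        # in-data-center switches/routers, on server power excl. GPUs
    aux_overhead       = 0.50        # facility cooling and electrical support
    idle_fraction      = 15 / 60     # extra idle server time allocated per hour of play (80% utilization)

    server_share_w  = server_w / players_per_server                    # ~32 W per player
    network_share_w = network_overhead * server_share_w                # ~5 W per player
    idle_share_w    = idle_fraction * (server_share_w + gpu_idle_w)    # ~22 W per player
    it_power_w      = server_share_w + gpu_active_w + network_share_w + idle_share_w
    total_w         = it_power_w * (1 + aux_overhead)

    print(round(it_power_w), round(total_w))   # ~226 W of IT power, ~339 W total
                                               # (roughly the 340 Wh per hour cited above)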

All told, for configurations like those described above, we estimate that a 10W local media streaming device can entail roughly 510 additional watts of power draw in the upstream network and the data center hosting the servers that perform the graphics processing. Clearly it's important to think "outside the box" when assessing the energy use associated with gaming on this new generation of devices.

Good Reads

Fraunhofer USA has once again provided a remarkable window into the energy-relevant structure of the consumer electronics marketplace and its associated energy demands. You can download their latest rich report here. They updated a very interesting earlier chart showing trends in console power during active gameplay. We've added the Switch, which was released while their report was already in press. Nintendo has further pushed the frontier in terms of low energy use, but all consoles are continuing their dramatic rates of improvement, lowering energy while at the same time enhancing user experience. They tell an interesting story for the US: while the console installed base nearly doubled over the past decade, absolute national energy demand fell by half.

Console power during active gameplay, by platform, generation, and year (Source: Fraunhofer USA). Switch values added based on LBNL testing.

Console installed base, unit energy consumption, and aggregate energy consumption by year in the US (Source: Fraunhofer USA).

With this latest report, Fraunhofer expanded their consideration of gaming to include desktop systems. They field a nationally representative survey of households to gather information on equipment ownership and use, and this year they asked about PCs with discrete graphics cards. From the table below, we can see that systems with discrete graphics cards are used for gaming an average of 1.4 h/day for desktops and 1.1 h/day for portables. These are the most rigorous publicly available survey data we are aware of regarding time in active gameplay for such machines.

Active and gaming time (hours/day) and dedicated graphics cards for computers (Source: Fraunhofer USA).

Green-up Your Game

    • Tuning up your existing rig: In the course of our testing, we looked at changes in energy use as a function of thirteen different in-game settings (anti-aliasing, tessellation, shadow mapping, color saturation, depth of field, etc.) during active gameplay, using the Fire Strike benchmark on our mid-range "M2" and high-end "H2" systems. Many settings yielded a 5%-10% reduction in energy use, but some yielded a good deal more. Vertical sync was the stand-out exception, saving 18% on the M2 system and a whopping 46% on the H2 system.

    • When buying gear: Across the 26 systems we've tested, we found a large variation in the ratio of power use in active mode to that in on-but-idle mode. High ratios indicate that power management is effective; less power should be needed when the system isn't working. A couple of the consoles show virtually no difference in power use between the two modes, while others use nearly twice as much power in active mode. The ratio for desktops ranges from about 1.3 to 4, while that for laptops ranges from 2.4 to 6. The problem for buyers is that this information isn't readily available at the point of sale, so do ask for it or seek out measurements by third parties.

More gamer tips here.

* * *

You’ll find lots of information about green gaming at our website.

Send feedback and suggestions of topics you'd like to see us cover to: Evan Mills