One of the topics that attracts the most interest in the "green gaming" space is cloud gaming. Does it help? Does it hurt?
We took a long look [see article in Computer Games Journal] at this in our lab, running scores of tests with different hardware and software configurations, and different games. Here's a high-level summary; more details are in our reports. Our "Cloud-gaming" scenario for the entire United States assumes 75% of all gaming hours are played in the cloud (vs. 20% in the Baseline). It results in gaming energy demand rising 17% above baseline over the five years between 2016 and 2021. Note that this is in isolation from other changes that may be happening in parallel (e.g., a shift from consoles to PCs, or vice versa).
The two charts below tell the story. The chart on the left shows the "Strong Uptake of Cloud-based Gaming" scenario in context with other possible scenarios. These are book-ended by the dotted green line (the low case), with high efficiencies across all platform types and a transition to greater market share for consoles, and by the "Frozen Efficiency and Market Shares" scenario (the high case), with no improvements at all in efficiencies or changes in the mix of systems people use to game. The chart on the right shows the effect at the individual system level of cloud gaming versus purely local gaming.
Caption: Values shown include all user modes for gaming devices: gaming, non-gaming, video streaming, web browsing, idle, off. Cloud gaming and video-streaming values include network energy and energy used in the data center. Lower values for Entry-level systems reflect the relatively high proportion of “Light gaming” user types (fewer hours spent gaming). There is currently no cloud-based gaming option for PS3, Xbox 360, Nintendo devices, or Apple TV. Display energy not included.
These values are averages, weighted by the proportion of "light", "moderate", "intensive", and "extreme" gamers in the user base for each system type. "Light" users of "Entry-level" systems consume 287 kWh/year while "Extreme" users of "High-end" systems consume 2,111 kWh/year, more than a 7-fold variation. For laptops, the variation is almost 17-fold.
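To make the weighting concrete, here is a minimal sketch of how a per-system average might be computed across user types. The user-type shares and the per-type annual energy values in it are hypothetical placeholders chosen only to illustrate the arithmetic, not our measured data.

```python
# Illustrative only: how a per-system average is weighted by user types.
# The shares and kWh/year values below are hypothetical placeholders,
# not measurements from our study.

user_types = {
    # type:      (share of user base, kWh/year for this system type)
    "light":      (0.35,  400),
    "moderate":   (0.30,  700),
    "intensive":  (0.25, 1100),
    "extreme":    (0.10, 1600),
}

weighted_average = sum(share * kwh for share, kwh in user_types.values())
print(f"Weighted average: {weighted_average:.0f} kWh/year")
```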
Our project consistently looks at all uses and modes of gaming devices, so we consider gaming (local and cloud) but also modes such as video streaming, web browsing, idle, and off. We're interested in the total energy use of the equipment across all its modes of use. These modes are more fully defined in our reports. We also break down the market and "fleet" of gaming devices into several user types (reflecting intensity of use).
"Streaming"
in the figure above refers to video streaming. The red strip in the bars is the network component of energy used when in video-streaming
mode. "NonCloud" is energy use when no cloud services are being
utilized. That's the reference point we compare against to see how much
more energy, in aggregate, these systems use when connected to cloud
services.
Of course, any given system will use more or less than these averages -- significantly more in many cases. The bookends of "Light" users on "Entry-Level" systems versus "Extreme" users of "High-End" systems represent a 7-fold difference in energy use for desktops and a 17-fold difference for laptops (see caption). Even greater variation would no doubt be found in the wild, as our numbers reflect 'only' the 23 representative systems we tested in detail.
Here are the questions that come up most often:
- Does cloud-based gaming save energy?
- Unfortunately, no. In fact, in all of our testing we found that cloud-based gaming requires significantly more energy than similarly powerful equipment located in the gamer's home -- three times as much in the most extreme cases we identified (see Fig. 5 here).
A key reason for this is that data centers hosting the high-power gaming servers require very substantial ventilation and air conditioning. To this one must add the non-trivial network energy between the server and the gamer. And, of course, the connected user-side equipment still uses some energy, even when the heavier lifting is shifted to a remote server. The added energy is even greater if the home system is less graphically powerful than the cloud-based system and/or the games run in the cloud are more compute-intensive than what the gamer would select on their own local equipment. The impact will be minimized by very "thin" clients on the user side. New research from other scientists in our lab has found that data center energy efficiency continues to improve in leaps and bounds. That said, increased throughput is largely offsetting these gains -- which isn't good news for the climate.
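To make the components just described concrete, here is a minimal sketch of a per-user power accounting for local versus cloud gaming. The structure (the gamer's share of the server, a data-center overhead factor for cooling and ventilation, network energy, and a still-powered local client) follows the reasoning above, but every number in it is a hypothetical placeholder rather than one of our measurements, and it is not intended to reproduce the results in Fig. 5.

```python
# A rough per-user power accounting for local vs. cloud gaming, following
# the components discussed above. All numbers are hypothetical placeholders
# chosen only to show the structure of the comparison.

def local_gaming_watts(system_watts=150, display_watts=30):
    """Power drawn at the gamer's home while gaming locally."""
    return system_watts + display_watts

def cloud_gaming_watts(server_share_watts=200,  # gamer's share of the remote server
                       pue=1.5,                 # data-center overhead (cooling, ventilation, etc.)
                       network_watts=100,       # network between server and gamer
                       client_watts=50,         # "thin" client still draws some power
                       display_watts=30):
    """Power attributable to one gamer while gaming in the cloud."""
    return server_share_watts * pue + network_watts + client_watts + display_watts

print(f"Local: {local_gaming_watts():.0f} W, Cloud: {cloud_gaming_watts():.0f} W")
```

With these placeholder values the cloud path draws roughly two to three times the power of the local one; the actual ratio depends on the real magnitude of each component.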
- How much of this extra energy is associated with the network as distinct from the data centers?
- In our calculations for PC cloud gaming, the data center is responsible for about 340 watts of power per user and the network an additional 180 watts. So, both components of the overall gaming environment are quite important. The corresponding values for console cloud gaming are 180 and 120 watts, respectively. Internet network energy is covered more deeply in this article.
It is important to note that efforts to make networks more energy efficient have achieved dramatic improvements in recent years and can be expected to continue to do so.
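As a rough way to put those power figures into annual energy terms, the sketch below multiplies them by a hypothetical number of cloud-gaming hours per year. Only the 340/180 watt (PC) and 180/120 watt (console) values come from the calculations above; the hours are an illustrative assumption.

```python
# Converting the per-user power figures above into annual energy.
# The watt values come from the calculations described in the text;
# the annual gaming hours are a hypothetical placeholder.

cloud_overhead_watts = {
    # platform: (data-center watts per user, network watts per user)
    "PC cloud gaming":      (340, 180),
    "Console cloud gaming": (180, 120),
}

hours_per_year = 500  # hypothetical: roughly 1.4 hours of cloud gaming per day

for platform, (dc_w, net_w) in cloud_overhead_watts.items():
    kwh = (dc_w + net_w) * hours_per_year / 1000.0
    print(f"{platform}: {kwh:.0f} kWh/year beyond the local device")
```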
- Does cloud-based gaming (potentially) provide environmental benefits by reducing the need for local hardware, which means less e-waste, etc.?
- We haven't evaluated the solid-waste side of this topic, but it's a good question. Of course, if componentry is shared among multiple gamers, that can translate into less gear per gamer (although in the first generation of remote implementations only one gamer can use a given GPU at a time, so those servers are enormous -- typically 8 GPUs). This will hopefully change for the better. Two countervailing factors come to mind. On the one hand, if centralized technology lasts longer there is less solid waste per hour of gaming use. However, turnover could well be faster in data centers, given the pressure to keep up with the latest equipment and longer operating hours than home-based systems. Also, the data center facility itself and the associated infrastructure -- not to mention land use -- as well as the networking gear represent equipment that is not needed for distributed gaming in existing buildings.
- How can this new source of gaming energy use best be managed?
- As with most energy-using activities, a given amount of work can be achieved with widely ranging amounts of energy input. That holds for refrigerators, light bulbs, cars, and servers alike... Herein lies the main opportunity of energy efficiency in general and green gaming in particular. Energy use for cloud gaming will depend on the specs of the servers, how fully the servers are utilized, the efficiency of the data centers, the efficiency of the network equipment used to link gamers to the data center, and the energy used by the gamer's local device. The local device should ideally employ power management that minimizes that particular load. The extent of power management varies widely across gaming devices (see Fig 22 in our report).
Cleaning up the grid that serves data centers is important, as it is for home-based gaming. When cloud gaming, the client on the user's side should be as "thin" and efficient as possible and the display as efficient as possible, as both of course use power even when the rendering is performed remotely.
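As one illustration of why local-device power management matters, the sketch below compares the annual energy of time actually spent gaming against time the device sits idle, with and without power management. All of the values in it are hypothetical placeholders, not measurements from our study.

```python
# Illustration of why local-device power management matters: a device left
# at full idle for many hours a day can use more energy per year than the
# hours actually spent gaming. All values below are hypothetical placeholders.

gaming_hours_per_day = 2
idle_hours_per_day   = 6    # powered on but not in use

gaming_watts       = 150
idle_watts_no_pm   = 60     # no power management
idle_watts_with_pm = 10     # aggressive sleep / low-power state

def annual_kwh(watts, hours_per_day):
    return watts * hours_per_day * 365 / 1000.0

print(f"Gaming:        {annual_kwh(gaming_watts, gaming_hours_per_day):.0f} kWh/yr")
print(f"Idle, no PM:   {annual_kwh(idle_watts_no_pm, idle_hours_per_day):.0f} kWh/yr")
print(f"Idle, with PM: {annual_kwh(idle_watts_with_pm, idle_hours_per_day):.0f} kWh/yr")
```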