Essential terminology
Game Performance Management is not just a science … It’s also something of an art. Pioneering GPM since 2013, GameBench has identified, defined and captured all the metrics that matter to ensure the best possible user experience. These are the fundamental concepts – along with some helpful extra links, and a summary of the best GPM practices – that every gaming professional needs to understand.
Battery drain: User-perceived battery power as measured by the device OS. Corroborated with milliamp readings (where available).
Battery ratings: The average mA consumption of the whole system, measured during gameplay and combined with battery capacity (mAh) to rate the expected number of gameplay hours on a charge.
CPU usage: The amount of work the CPU is doing as a percentage, normalised against the available CPU cores and their given frequencies.
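To illustrate the battery rating above: a minimal Python sketch (with hypothetical figures, not GameBench's own implementation) that combines an average current draw with battery capacity to estimate gameplay hours per charge.

def estimated_gameplay_hours(avg_draw_ma, capacity_mah):
    # Expected hours of gameplay on a full charge, assuming a constant average draw.
    return capacity_mah / avg_draw_ma

print(estimated_gameplay_hours(avg_draw_ma=900, capacity_mah=4500))  # 5.0 hours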
Frame rate (FPS): The number of frames shown in a given second.
Frame times: Tied to frame rate, the number of milliseconds taken to draw a single frame. For example, a frame every 16.6667ms reflects 60 frames per second.
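The two are reciprocals, so one can be derived from the other; a minimal sketch in Python (illustrative only):

def fps_from_frame_time(frame_time_ms):
    # Convert a frame time in milliseconds to the equivalent frames per second.
    return 1000.0 / frame_time_ms

print(fps_from_frame_time(16.6667))  # ~60 FPS
print(fps_from_frame_time(33.3333))  # ~30 FPS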
Frame rate variability: The average jump between consecutive frame rate readings taken each second, reflecting the amount of variation in visual fluidity that the gamer experiences (lower is better).
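One plausible reading of that definition, assuming per-second FPS samples and a simple mean of absolute differences (a sketch, not necessarily GameBench's exact formula):

def frame_rate_variability(fps_per_second):
    # Average absolute jump between consecutive per-second FPS readings.
    jumps = [abs(b - a) for a, b in zip(fps_per_second, fps_per_second[1:])]
    return sum(jumps) / len(jumps)

print(frame_rate_variability([60, 58, 60, 45, 60]))  # 8.5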
Frame rendering: What the user sees while playing a game. Measured using Median FPS, FPS Stability, and Variability Index.
Gameplay: The core, interactive component of a game, excluding loading screens, menu screens, and advertisements.
GPU latency: The demand that a game’s software makes on the graphical processor.
Image consistency ratings: Algorithms are used to monitor key metrics of image consistency across a gameplay session. Differences between tests (e.g., in average edge or colour complexity) are highlighted.
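One crude way to quantify edge complexity for a captured frame, using Pillow's built-in edge filter (a sketch under assumed inputs; the file names are hypothetical and this is not GameBench's algorithm):

from PIL import Image, ImageFilter, ImageStat

def edge_complexity(path):
    # Rough edge-complexity score: mean brightness of an edge-detected greyscale frame.
    edges = Image.open(path).convert("L").filter(ImageFilter.FIND_EDGES)
    return ImageStat.Stat(edges).mean[0]

# Compare the same scene captured in two different test runs.
print(edge_complexity("run_a_frame_0100.png"))
print(edge_complexity("run_b_frame_0100.png"))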
Input latency: The measured time for a response to a user input, rated according to fixed benchmarks set by research. Example: the time between a player clicking the mouse button and seeing the muzzle flash of an in-game gun.
Jank: In GPM, an isolated, long pause between two frames, usually caused by dropped frames. (“Janky” is also used, less specifically, by gamers to denote poor game quality.)
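A simple way to flag such pauses, given a list of frame times in milliseconds (the 2x-median cutoff below is an assumed threshold, not a GameBench definition):

def find_janks(frame_times_ms):
    # Flag isolated frames that take far longer than the typical frame.
    ordered = sorted(frame_times_ms)
    median = ordered[len(ordered) // 2]
    threshold = 2 * median  # assumed cutoff for an "isolated, long pause"
    return [(i, t) for i, t in enumerate(frame_times_ms) if t > threshold]

print(find_janks([16.7, 16.7, 16.7, 120.0, 16.7]))  # [(3, 120.0)]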
Jitter: Negative impacts on video and audio quality, caused by delays in data packet arrival, due commonly to network congestion and/or route changes. (A key network metric; see also packet loss and latency.)
Launch time: The time taken to launch an app after it has been removed from memory, but not freshly installed (i.e., a “cold launch” but not a “first launch”). Measured from tapping the game icon on the home screen to seeing the first interactive screen.
Median frame rate: The middle-most frame rate during gameplay, representing the typical visual fluidity that the gamer experiences (higher is better).
Minimum frame rate: The worst frame rate experienced during gameplay, typically occurring during a moment of heightened gamer activity or a computational bottleneck (higher is better).
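Both the median and the minimum can be read directly from per-second FPS samples; a minimal sketch using Python's statistics module (illustrative readings, not GameBench's implementation):

import statistics

fps_samples = [60, 57, 59, 31, 60, 58]  # hypothetical per-second FPS readings
print(statistics.median(fps_samples))   # median frame rate: 58.5
print(min(fps_samples))                 # minimum frame rate: 31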
Network latency: Also known as lag, the time for a packet of data to be captured, transmitted and processed, then received and decoded. (See also packet loss and jitter.)
Packet loss: A key network metric (see also jitter and latency): the proportion of data packets that fail to reach their destination, measured as a percentage of packets lost against the number of packets sent.
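Expressed as a simple proportion; a one-line sketch with hypothetical counts:

def packet_loss_pct(packets_sent, packets_received):
    # Percentage of sent packets that never arrived.
    return 100.0 * (packets_sent - packets_received) / packets_sent

print(packet_loss_pct(packets_sent=2000, packets_received=1990))  # 0.5%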
Performance rating: Graphical performance rated by measuring multiple frame rate metrics during gameplay (median frame rate, minimum frame rate, variability) and comparing them to established gaming benchmarks and user data analysis.
Pixel shader load: The operating load for running pixel (fragment) shaders, which update colours and textures on scene geometry.
Vertex shader load: The operating load for running vertex shaders, which shape scene geometry.
Best practices
Experience analytics: Data should reflect the experience of real gamers who are playing to win, i.e. no bots, scripts or device farms.
Natural gameplay: Each gamer should follow a strict methodology that does not interfere with natural gameplay, but does ensure that certain parameters are matched across tests (e.g., game and device configuration, game scenarios covered). Sessions of at least 15 minutes are essential.
Testing methods: The testing elements of GPM should be accomplished using objective tools, with no reliance on traditional testers or subjective opinions.
Game-device pair: Every gaming experience relies on multiple hardware and software components working in harmony, so valid testing is always done in game-device pairs.
Validated metrics: Metrics should be validated using multiple independent methods. However, GPM relies to some extent on the accuracy of underlying metrics produced by, e.g., the Windows, Android or iOS operating systems, or cloud gaming platform telemetry.
Real-world devices: Real-world, unrooted and non-jailbroken devices should be used, reflecting actual gamer experience as closely as possible.
This is the first of three Performance IQ bulletins dedicated to the essentials of GPM. PIQ 21.06 will unpack and explain the key benchmarks that define and differentiate the “Great” (enthusiast), “Good” (standard), “Basic” (casual) and “Poor” (out of scope) levels of actual gamer experience. What’s more, we’ll be bringing you both drops combined into a single, simple PDF for reference.
Performance IQ by GameBench is a hassle-free, one-minute read that keeps you abreast of precisely what you need to know. Every two weeks, right to your inbox. Here’s a quick and easy sign-up for your personal invite to the sharp end of great gamer experience.
And of course, get in touch anytime for an informal chat about your performance needs.
The intelligence behind outstanding performance