Part One – Visual Experience
Game development and commercial teams may live on different planets a lot of the time, but they do share one universal pet hate: negative reviews. Complaints mean that gamers don't want to play the game anymore. They'll be telling their friends, as well as posting those damning scores and comments. For a new game, this is worse than money left on the table … it's money swept off the table altogether.

Inspired by our new and radically updated tools, in this new PIQ bulletin series we focus on minimizing – wherever possible, preventing altogether – the most common App Store performance complaints from mobile gamers. Putting aside for now the worst case of a game crash, four critical factors make up the experience/performance dynamic:

1. Visual experience
2. Device feel (primarily temperature)
3. Device power (duration of play)
4. Network

What are the dimensions of experience quality? How do performance issues impact those? And which particular metrics do we need to understand to uncover the causes and execute the necessary game optimisations? We start the series with a deep dive into visual experience.
The make or breaks
What makes for a good, bad – or just “meh …” – visual experience? Of course, different game formats dictate widely different aesthetic expectations. While the gamer’s requirements of a typical FPS – such as the eternally popular Call of Duty – revolve predominantly around speed of action and reaction, here we focus on the more nuanced demands of the MMORPG.
The player of an RPG is looking above all for an experience that is immersive. The game's creators are, in effect, developing a "coherent and cohesive world". While these games will normally be played alongside many other players – which naturally creates its own performance challenges – certain elements not being in perfect sync is generally forgivable. That said, the essential immersiveness of the MMORPG experience is severely compromised by, for example, delays in speaking to another character. Poorly designed assets, levels that fail to load, and jerky character movement will send gamers straight to the App Store to vent their frustrations.
The back and forth
When QA identifies an experience problem, an iterative series of formal test stages begins: Replication, Diagnosis, and back to Development.
First, the issue is replicated by QA, and the results are sent back to the development team so they can start looking at possible optimizations. It's not for GameBench to dictate how to optimize the game, but we provide the analytics to detect and analyze issues further, using metrics such as frame rate, CPU and GPU utilization, and network activity. Visual experience problems are normally highlighted through the traditional FPS measurement.

Once we have detected a problem, there may be several possible causes requiring further data collection:

1. If CPU utilization is high but network and GPU are low, we look at physics or computation issues in the game.
2. If GPU utilization is high but network and CPU are low, we look into asset and rendering issues.
3. Network is always worth considering, since downloaded data needs processing, which involves the CPU. If the CPU falls behind on physics work, the visual experience will falter.

Now we know that we need to collect FPS, CPU, GPU and network data. We can establish a predefined test that sets a performance threshold and checks the effect on frame rate while observing changes in the other metrics. Importantly, each collection of valid and useful test data reduces the number of cycles between replication, diagnosis and development.
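The triage logic above can be sketched in a few lines. This is a minimal, hypothetical illustration of the reasoning, not the GameBench API: the field names and the utilization thresholds are assumptions chosen for the example.

```python
# Hypothetical triage sketch. GameBench's dashboard surfaces this analysis for
# you; the metric names ("cpu", "gpu", "net") and the 80/40/60 % thresholds
# below are illustrative assumptions, not real product values.

def triage(session):
    """Suggest a likely bottleneck from a session's averaged metrics (0-100 %)."""
    cpu, gpu, net = session["cpu"], session["gpu"], session["net"]
    if cpu > 80 and gpu < 40 and net < 40:
        return "physics/computation"   # CPU-bound: look at game logic and physics
    if gpu > 80 and cpu < 40 and net < 40:
        return "assets/rendering"      # GPU-bound: look at assets and draw calls
    if net > 60:
        return "network"               # downloaded data still needs CPU processing
    return "inconclusive: collect more data"

print(triage({"cpu": 92, "gpu": 25, "net": 10}))  # physics/computation
```

In practice the classification would be run over time-series data rather than single averages, but the principle – cross-referencing FPS drops against CPU, GPU and network – is the same.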
The next-gen tools
Let’s put this into practice using the GameBench SDK. You can integrate the package into your game engine as a library, or import it into a Unity project via the package manager. That done, you can configure your collection, or just leave it to collect automatically.
All the data recorded by the SDK is sent to the GameBench Unified Web Dashboard. From here we can start to inspect each recording (or "session", as we call it). We should now be able to replicate all the relevant performance problems. On your Dashboard home page, select your package and apply your test thresholds, focusing on all the test data from a particular version.
Start by defining the target FPS – in this case, for an MMORPG, 60 FPS is a standard goal. Tweak your thresholds for the CPU and GPU to highlight sessions that are failing. From here, you can immediately isolate the problem devices, so a reliable decision can be made about the severity of the experience problem. Finally, share these views with your development team, and/or raise a Jira ticket against the most important sessions.
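The thresholding step can be illustrated with a short sketch. Again, this is a hypothetical example of the idea, not the dashboard's implementation: the session structure and the `median_fps` field are assumptions made for the illustration.

```python
# Illustrative sketch of threshold-based session filtering, assuming each
# session record carries a device name and a median FPS. The 60 FPS target
# matches the MMORPG goal discussed in the text.

TARGET_FPS = 60

def failing_sessions(sessions, target=TARGET_FPS):
    """Return sessions whose median FPS misses the target, worst first."""
    misses = [s for s in sessions if s["median_fps"] < target]
    return sorted(misses, key=lambda s: s["median_fps"])

sessions = [
    {"device": "Device A", "median_fps": 59},
    {"device": "Device B", "median_fps": 31},
    {"device": "Device C", "median_fps": 60},
]

for s in failing_sessions(sessions):
    print(s["device"], s["median_fps"])  # worst offenders listed first
```

Sorting worst-first mirrors the triage order you want in practice: the sessions furthest below target are the ones to share with development or raise a Jira ticket against.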
Conclusion
All of the work we’ve done this year on the GameBench Studio tools has been focused on both enriching and accelerating this back and forth between testing and development. While these are the day-to-day outcomes we’re targeting, the bigger prize – where technology and commercial concerns really connect – is the rapid and confident release of a game that suffers as few negative App Store reviews as possible.
Performance IQ by GameBench is a hassle-free, one-minute read that keeps you abreast of precisely what you need to know. Every two weeks, right to your inbox. Here’s a quick and easy sign-up for your personal invite to the sharp end of great gamer experience.
And of course, get in touch anytime for an informal chat about your performance needs.
The intelligence behind outstanding performance