“I suddenly find myself needing to know the plural of apocalypse.” – Riley Finn
This week, let’s dive into one of the key test suites that lives at the core of any QA effort: Performance and Stability.
Performance and Stability
If you’re new to QA, you might be wondering: what exactly lives in this suite? At a glance, it might not seem as flashy as testing a boss fight or a new feature, but this suite is home to some of the most invisible-yet-critical systems in a project. These are the quiet heroes: the parts of a game that, when working correctly, players never notice. But when they fail? Everyone notices.
This suite is responsible for checking the systems that directly affect frame rate, visual fidelity, responsiveness, and robustness. All vital foundations if you want a polished user experience.
Here’s a high-level overview of some of the generic test cases you’d typically find in this suite:
🔹 Optimisation
🔹 FPS (Frames Per Second)
🔹 Level Of Detail (LOD)
🔹 Occlusion
🔹 Error Handling
Now, I’m not saying this list will cover every project I ever work on — and yeah, maybe I’ve cut a few sneaky extras for now (they’re secret, shhh) — but this set has me covered for the moment.
I’m not going to go into every item here, but I’ll outline a few, then really dig into one and blow your mind with what’s hiding underneath. No, really. It is that exciting. (Well, for me.)
Optimisation
This test case ensures the game is running efficiently — and not just when things are on fire. Players might not notice the features covered here, but that’s kind of the point. We’re checking things like:
🔹 Draw calls and batching
🔹 Memory usage and garbage collection
🔹 Async loading
🔹 Hardware utilization (CPU/GPU spikes, throttling, etc.)
Even if these issues don’t show up right away, they can bite you hard later in development — so early coverage here saves time, money, and post-launch heartache.
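To make the memory bullet a bit more concrete, here’s a rough sketch of the kind of check a soak test might run over memory readings. The sample values and the 0.5 MB-per-sample limit are invented for illustration; real tooling would pull readings from the engine or the OS.

```python
# Toy soak-test memory check: fit a straight line through memory
# readings and flag a run whose trend climbs too fast. A steadily
# rising trend across an otherwise idle soak usually points at a
# leak or runaway garbage generation.

def memory_trend(samples_mb):
    """Least-squares slope (MB per sample) of a list of memory readings."""
    n = len(samples_mb)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(samples_mb) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples_mb))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

def looks_leaky(samples_mb, slope_limit=0.5):
    """Flag a run as suspicious if memory grows faster than the limit."""
    return memory_trend(samples_mb) > slope_limit
```

A flat-but-noisy trace passes; a steady climb gets flagged, even if no single reading looks alarming on its own.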
FPS (Frames Per Second)
You probably know what this is, but testing it is more than just watching the number in the corner.
We’re looking for:
🔹 Isolated FPS drops under specific conditions (e.g., explosions, AI-heavy areas)
🔹 Sustained drops over time (e.g., during long play sessions due to memory leaks or resource build-up)
🔹 Platform-specific dips or thermal throttling effects
This test case helps identify where the game’s performance budget is being blown and why.
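As a toy illustration, here’s how the first two bullets above might be told apart from a stream of per-second FPS samples. The target, drop ratio, and window size are all made-up example values.

```python
# Hypothetical sketch: classifying FPS problems from per-second samples.
# Isolated dips point at specific triggers (explosions, AI-heavy areas);
# a depressed rolling average points at something accumulating over time.

def analyse_fps(samples, target=60, drop_ratio=0.9, window=30):
    """Return (isolated_drops, sustained) for a list of FPS samples.

    isolated_drops: indices where FPS dipped below target * drop_ratio.
    sustained:      True if a rolling average over `window` samples ever
                    sits below the threshold, suggesting a leak or
                    build-up rather than a one-off spike.
    """
    threshold = target * drop_ratio
    isolated = [i for i, fps in enumerate(samples) if fps < threshold]
    sustained = False
    for i in range(len(samples) - window + 1):
        if sum(samples[i:i + window]) / window < threshold:
            sustained = True
            break
    return isolated, sustained
```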
Level Of Detail (LOD)
Ever notice objects popping in or out as you move through a game world? That’s the LOD system in action. LOD systems swap models and textures for lower-poly, lower-resolution versions at set distances to improve performance.
This test case ensures:
🔹 Transitions between LOD levels are smooth
🔹 High-quality assets are used when needed
🔹 Lower LODs don’t break visuals or feel too obvious
A good LOD system is invisible, and that’s exactly what this test helps protect.
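For the curious, here’s a minimal sketch of distance-based LOD selection with a hysteresis band — the thing that stops a model flickering between two levels when the camera hovers right on a boundary. The distances are invented, and real engines typically switch on screen-relative size rather than raw distance.

```python
# Illustrative LOD picker. Level 0 is full detail; each cutoff is the
# distance (metres, made up) at which the next-lower level kicks in.

LOD_DISTANCES = [10.0, 30.0, 60.0]

def select_lod(distance, current_lod=0, hysteresis=2.0):
    """Choose an LOD index for a given camera distance.

    The hysteresis band is biased in favour of the level the object is
    already on, so tiny camera movements near a cutoff don't cause
    visible popping back and forth.
    """
    for level, cutoff in enumerate(LOD_DISTANCES):
        # Widen the band for the level we're currently at or above.
        bias = hysteresis if current_lod <= level else -hysteresis
        if distance < cutoff + bias:
            return level
    return len(LOD_DISTANCES)  # lowest detail beyond the last cutoff
```

Note how the same distance (11 m) keeps whichever level the object already has — that stickiness is the whole point of the band.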
Occlusion
Fun fact: when you’re playing a game, most objects you’re not looking at aren’t even being rendered. That’s occlusion culling at work.
This test case checks whether:
🔹 Hidden objects are properly culled
🔹 Nothing is unintentionally occluded (e.g., a wall vanishing when it shouldn’t)
🔹 The occlusion system actually helps performance rather than hurting it
Bad occlusion can tank your frame rate or hide crucial gameplay elements. We don’t want either.
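Here’s a tiny sketch of what an occlusion sanity check boils down to: compare what the engine actually rendered against what should be visible. The object names and the visibility oracle are hypothetical — in practice the "truly visible" set comes from a reference camera pass or hand-authored test scenes.

```python
# Toy occlusion-culling check. Both failure modes from the bullets
# above fall out of two set differences.

def check_culling(rendered, truly_visible, all_objects):
    """Return (wrongly_culled, wasted_draws) for one camera position.

    wrongly_culled: visible objects the culler skipped — a visual bug,
                    like a wall vanishing when it shouldn't.
    wasted_draws:   hidden objects still being drawn — pure perf cost.
    """
    # Sanity: the engine can't render objects that don't exist.
    assert rendered <= all_objects
    wrongly_culled = truly_visible - rendered
    wasted_draws = rendered - truly_visible
    return wrongly_culled, wasted_draws
```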
Now, those are just quick summaries of what’s inside those Test Cases. If you want to know more about one of these, just let me know — I’ll happily nerd out with you.
Next, we’re going to flesh out a really juicy one and give all the details.
Error Handling, you’re up.
Error Handling
Here we are, the Almighty Test Case, what QA is built on!
Aaaaaaaand it’s basically just a container for Test Scripts in my setup!

But I jest! It does serve a few other very important purposes.
It’s a hub for bi-directional traceability, plus a few catch-alls and key info. Here’s what you can find in one of my Test Cases.
Premise
This section gives a quick summary of when or if the Test Case should be run. I also use it to link to design documentation — for example:
“If the game uses Feature X and the linked doc is signed off, run this case.”
For Error Handling, though? There’s no “if.” This Test Case is always relevant.
Expected Behaviour
This is where I define high-level outcomes that must always be true — even if they aren’t explicitly written into every Test Script. Things like:
🔹 The game does not remain unresponsive for long periods
🔹 The game does not self-terminate
🔹 Error Logs contain the required information (I heard some eyes go wide with that one😁)
QA Notes
A quick section for context, tips, or tools that testers might need. For this one, it could include:
🔹 What to watch for on different platforms
🔹 When to use a test matrix for coverage
🔹 Which tools to install (e.g., Klogg for log analysis)
Labelling
This is crucial. Without consistent bug labels, the entire tracking system would fall apart. This section spells out:
🔹 Any naming conventions or tags for test sessions, logs, or test runs
🔹 What labels must be applied to bugs found via this Test Case
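As a sketch of what “consistent labels” might mean in practice, here’s a toy validator. The label names are invented, and the script-ID pattern is my guess at a CR1000-style convention.

```python
# Illustrative label check: every bug filed from this Test Case must
# carry the required tags and reference a well-formed script ID.
import re

REQUIRED_LABELS = {"error-handling", "performance-stability"}
SCRIPT_ID_PATTERN = re.compile(r"^CR\d{4}[A-Z]$")  # e.g. CR1000C

def validate_bug(labels, script_id):
    """Return a list of labelling problems (empty list = all good)."""
    problems = []
    missing = REQUIRED_LABELS - set(labels)
    if missing:
        problems.append(f"missing labels: {sorted(missing)}")
    if not SCRIPT_ID_PATTERN.match(script_id):
        problems.append(f"bad script id: {script_id}")
    return problems
```

A check like this could run on every bug submission, so a mislabelled report never silently falls out of the tracking system.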
Test Case Version
A bullet-point changelog of any updates to the Test Case or its child Test Scripts. Simple but essential.
Ask any coder; keeping track of what changed, when, and why will save you so much pain later.
Test Scripts
This is where the Test Case gets specific. Each script is a focused check, tightly scoped, clearly named, and traceable.
Let’s say our Error Handling Test Case is called CR1000. Its scripts might look like this:
CR1000A – The game does not crash to system.
CR1000B – The game does not crash and reload.
CR1000C – The game does not hang indefinitely.
CR1000D – The game does not become unresponsive during user interactions.
CR1000E – Error Logs can be retrieved.
And this — this — is where that Riley quote comes in.
Any one of these failing is a mini-apocalypse.
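If you ever wired those scripts into an automated harness, the shape might look something like this. The game-state probes (process_alive, responsive, and friends) are entirely hypothetical; in practice, several of these checks are manual or tool-assisted.

```python
# Sketch of the CR1000 scripts as named checks over a (made-up)
# game-state snapshot. Each entry pairs a human-readable summary
# with a predicate that must hold for the script to pass.

TEST_SCRIPTS = {
    "CR1000A": ("does not crash to system",
                lambda game: game["process_alive"]),
    "CR1000B": ("does not crash and reload",
                lambda game: not game["reloaded"]),
    "CR1000C": ("does not hang indefinitely",
                lambda game: game["responsive"]),
    "CR1000D": ("does not become unresponsive during user interactions",
                lambda game: game["input_responsive"]),
    "CR1000E": ("error logs can be retrieved",
                lambda game: bool(game["logs"])),
}

def run_case(game_state):
    """Run every script and return {script_id: passed}."""
    return {sid: check(game_state) for sid, (_, check) in TEST_SCRIPTS.items()}
```

The nice side effect: every result maps straight back to a script ID, which is exactly the traceability the Labelling section is there to protect.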
Wrapping Up
Okay, I’m calling it — it’s late and I could go on and on about this stuff.
In fact, I think it might help if I mock up a real example soon — something that shows a full Test Suite > Test Case > Test Script > Test Method chain in action. That way, you can see how it all fits together instead of me just making it sound more complex than it really is.
Either way — next week, we’ll dig into the Test Scripts themselves!
Until next time, QA peeps!
