The Erosion of Console Game Quality
There was a time around the mid-2000s when console game quality reached its height. Developers worked hard to squeeze the most out of the PlayStation 2's 32 MB of system RAM, creating large open worlds like the one in Grand Theft Auto: San Andreas. The XBOX 360 and PlayStation 3 carried that spirit into the next generation. But it wasn't just about increased graphics performance.
Working on console games at the height of this console generation was exhilarating. The feeling of contributing to these great ecosystems really meant something. Once you crossed the threshold and actually published a game, it was there to stay, for millions of players to enjoy.
In my years of playing games on an XBOX 360, I rarely experienced a crash. Games were free of game-breaking bugs and tested to be enjoyable. It was possible to crunch your way through Gears of War 2's co-op campaign on "insane" difficulty without restarting the game or console once.
This level of quality wasn't achieved through developers' benevolence alone. One reason for it was the strict approval processes that Microsoft, Sony, and Nintendo ran to ensure that games matched the high quality standards set for their gaming ecosystems.
If a game crashed, it was rejected. If a game exposed a game-breaking bug, it was rejected. If a game depicted a controller button inconsistently or used brand terminology in a way inconsistent with the rest of the ecosystem, it was rejected.
Game console manufacturers charged third-party publishers for approval submissions. As an independent game company, you wanted to make sure your game was bug-free before you submitted it to the dreaded Nintendo Lotcheck. You spent weeks checking off and triple-testing the list of Microsoft TCRs, because one small mistake or one small bug would mean a costly re-submission. Every single update to the game meant that you needed to make sure it was still possible to unlock all the achievements in one sitting, which, in the case of some larger games, meant a lot of testing.
Fast forward a decade and everything has started going downhill.
My XBOX One X, although many times faster than the XBOX 360, doesn't convey the same sense of quality.
You can go to the Microsoft store on XBOX and download games that don't even run, crashing on startup. Many games require day-one updates just to be playable. Some games ship with game-breaking bugs. Some games have achievements that are impossible to unlock and never get patched.
To be clear, a gaming console still offers a superior experience to the garage-sale-like game flood of the Google Play Store, riddled with ecosystem fragmentation, pay-to-win mechanics, game-breaking ad banners, and games prone to breaking within a couple of years of release.
But in an attempt to stay competitive with the release cycles, diversity, and lower barriers to entry for players and developers in the mobile games market, Microsoft and Sony seem to have softened their stringent quality control in the last few years.
Developers complained for years that the approval processes were too tedious and costly. But were they? Given that all developers had to go through the same process on the same platform, there was nothing anticompetitive about the practice. If anything, it ensured quality. Having to test the entire product with every update is really annoying, but it's the reality of software development. You never know what you might break.
Having strict quality standards from the start means that fewer post-release updates are required, keeping re-submission costs down. Patching a game more often obviously raises overall re-submission costs, but cutting back on testing to compensate means even more patches are likely to be needed, leading to a self-reinforcing loop of faster updates and lower quality.
One could argue that times have changed and games nowadays receive more content patches than they did ten years ago, in order to stay relevant to the gaming community longer. But does that mean we should lower our quality standards? If it's so important to patch in another holiday gift pack, why not test the updated product properly? If you insist your game shouldn't be re-tested after such a small change because nothing will break, then what do you have to fear? And if you fear that something will break and the game might indeed crash after 8 hours of testing wall-hugging edge cases, then maybe it shouldn't be submitted in the first place.
The price of submission used to be affordable even for serious independent game studios, yet it still acted as a gatekeeper keeping low-quality content out.
Nintendo, however, preferring quality over quantity, hasn't softened its standards to the same extent and has made huge gains: selling millions of Switch units and maintaining a high-price mandate for games, while Microsoft and Sony fight over who has the most stuff for sale and Microsoft lets players play AAA games for cheap with Game Pass.
Nintendo knows that the quality of a single game reflects on the quality of the whole ecosystem. One buggy or crashing game could mean that users lose trust and become less likely to pay $59.99 for the latest Mario game, even if the crash or loss of save data is the fault of a third-party developer.
I myself, owning a Switch as well as an XBOX, find it acceptable to pay this price for Nintendo quality. But would I spend $59.99 for an XBOX game? Frankly, I rarely do, and even though 98% of my playtime is on Xbox Live, I spend just as much money on the Switch as I do in the XBOX ecosystem. Why is that?
It's because Nintendo has demonstrated its commitment to quality, whereas Microsoft has shifted from quality control on par with Nintendo's to a quantity-over-quality marketplace.
This might have boosted the ecosystem for a while, but I believe it's a flash in the pan.
The traditional console companies need to keep up their quality game if they want to maintain consumer trust in their ecosystems.