Have Modern Game Budgets Made First-Party Games Too Financially Risky?

[Cover image courtesy of Deviant Art user ManicFelicity]

One of the criticisms often leveled at the current generation of consoles is their lack of an identity. What, really, separates the PS4 from the XB1? And moreover, what separates either of them from a PC? I can’t think of any other time in video game history when the decision between two consoles meant less than the choice between PS4 and XB1. The differences between the two are largely arbitrary, and every arbitrary reason one is better is balanced out by an equally arbitrary reason it is worse.

To be fair, it has been a good long while since one console differed from another in hugely tangible, measurable ways, as in, say, the PS1 vs. N64 battle. But even as consoles lost their unique personalities and became more alike (and more like PCs and multimedia machines that happen to also play video games), one thing still kept each console distinct and gave it a sense of “brand”: first-party games. You could argue all day about which console was more powerful, which had the better controller, which had more multimedia functions–what ultimately set one console apart from the other was its first-party lineup. It used to be exclusives in general – first-party or otherwise – but not since the aforementioned PS1/N64/Saturn generational battle have true third-party exclusives been an especially prevalent factor in a console race. The PS2/Xbox/GC generation was the first in which third-party exclusives began to become a thing of the past: whereas it once wasn’t uncommon for huge, AAA third-party games like Final Fantasy VII and Metal Gear Solid to be completely exclusive to a single console, you saw far less of that beginning in the sixth generation. That was the generation in which the “timed exclusive” came into prominence and became much more the norm than a third-party game remaining permanently exclusive to a single console.

What Sony and Microsoft did still have during that generation, though, was a wealth of first-party games and franchises. The PS2, for instance, had Jak, Ratchet, Sly Cooper, Gran Turismo, Ico, Shadow of the Colossus, God of War, Twisted Metal, Mark of Kri, MLB: The Show, Killzone, Syphon Filter, Dark Cloud, Hot Shots Golf, and more, many of which had sequels and several of which were trilogies. Compare that to the PS3: far fewer first-party games came to it, and of those, fewer still were original IP rather than sequels to PS2 franchises. That decline certainly isn’t because Sony doesn’t have the IP, or the talent, or the resources. And the problem has only gotten worse on the current-gen systems. Now, in place of new installments of first-party games, we get remasters and/or compilations of last gen’s games. We saw collections/remasters of God of War, Gears of War, and many other franchises (some of which don’t have “of War” in their titles) from the previous gen’s consoles arrive on the new ones before we even got a new game–and in some cases, before a new game in that franchise had even been officially announced. Half the time, they aren’t even bothering to try and convince us that the remasters are just stop-gaps until the new games arrive; they’re simply milking extra cash out of their old first-party games and aren’t pretending otherwise, because there’s no need to. That’s just the state of first-party franchises these days.

What it comes down to is that first-party games exist only on their own native system. The PS4’s 35-million-plus worldwide install base is nothing to scoff at. But add in the Xbox One’s roughly 20 million, and it doesn’t take a math whiz to see the benefit of selling a game to a potential 55 million people rather than 35 million. This is why virtually all third-party companies release the majority of their games on both systems sooner or later, with only a handful of rare exceptions. When you spend tens of millions of dollars on a game, you need to sell as many units as possible just to recoup your development costs, let alone make a comfortable profit. And that isn’t even counting the impossible-to-measure number of gaming PCs out there, which add tens of millions more gamers to the pool of potential buyers–another luxury first-party games haven’t traditionally had. Tellingly, that has changed since Microsoft entered the console race, as it often releases “first-party” Xbox games on PC as well, birthing another subset of exclusives-that-aren’t-really-exclusives: the “console exclusive.” Yes, those existed well before Xbox, but they are much higher-profile now and often launch day-and-date with the console release at comparable quality, rather than as the often delayed–and often sub-par–PC ports of console games that largely used to comprise that market.

So what about Nintendo? Nintendo tends to be the outlier in pretty much any conversation about overall video game trends, and that holds true here, as it continues to place a much heavier emphasis on its first-party lineup than Sony or Microsoft. That said, look at how lacking the Wii U’s first-party library is for a Nintendo system: we’re almost three years in (and possibly in the final year of major Wii U releases) with no plans for an Animal Crossing or core Metroid game, and we’ve yet to see the console’s first original Zelda title – all of which both the Wii and GameCube could claim less than two years in. Sure, a lot of that stems from the Wii U’s especially low install base of just under 12 million, but it still speaks to an overall scaling back of first-party game production, even from a company that has made first-party games its bread and butter for decades – including at other low points in hardware sales. Nintendo finally joined the HD gaming world, which is where development costs really begin to ramp up, and it’s definitely feeling the pinch that Sony and Microsoft had already been feeling for a while.

While it’s easy to knock the big three for not pouring more money into first-party development and maintaining the healthy, thriving first-party lineups that they and other console makers have had in the past, you can’t entirely blame them. Counting on a game selling to half of your console’s install base just to turn a profit is a risky proposition, especially when those install bases are still relatively low in the grand scheme of things. And as game development only continues to get more expensive and time-consuming, it seems less and less likely that first-party games will ever regain the presence they used to have. All logic and business-related common sense aside, that really sucks, because it means consoles will have less and less personality for as long as dedicated consoles continue to exist. The “NX” is currently the only new console officially on the horizon, and it will be interesting to see where this trend away from heavy first-party libraries goes when it hits stores – especially with it being Nintendo, and especially with Nintendo already making the previously impossible-to-fathom move of putting some of its properties on non-Nintendo hardware (CD-i fluke nonsense notwithstanding).