Yes, video game budgets are skyrocketing, but the reason goes beyond graphics

Skye Jacobs

In context: A recent report about the video game industry put the spotlight on skyrocketing development costs and the pursuit of ever more realistic graphics. While graphical fidelity certainly contributes to inflated budgets and extended development timelines, other factors play equally significant roles in driving up costs. These include the creation of massive open worlds and broader, non-gaming-specific issues such as resource misallocation and poor management.

There is a compelling case to be made that the gaming industry's bloated budgets are driven by a frenzied quest for ever-greater graphical fidelity. This push for realistic visuals has been a trend for decades as gaming giants sought to captivate audiences. And for many, the strategy paid off: it transformed simplistic 2D environments into richly detailed, lifelike worlds, drawing in legions of players.

However, as the cost of achieving such realism surged, the returns started to diminish. At the same time, a new generation of gamers entered the market, and their tastes gravitated toward games with simpler graphics but robust social features.

"It's very clear that high-fidelity visuals are only moving the needle for a vocal class of gamers in their 40s and 50s," Jacob Navok, a former executive at Square Enix who left that studio to start his own media company, told The New York Times. "But what does my 7-year-old son play? Minecraft. Roblox. Fortnite."

While graphical fidelity undoubtedly plays a role in inflating budgets, it is not the sole culprit. An equally compelling case can be made that the primary drivers of rising video game development costs are labor expenses and resource mismanagement. In high-cost cities like Los Angeles, each employee can cost a studio between $15,000 and $20,000 per month, including salary, benefits, and overhead. As development teams have grown significantly in size, these costs have multiplied accordingly.

Whatever the cause of the soaring development costs, some argue that the gaming industry faces a critical juncture because its current path is not sustainable. Video game journalist Jason Schreier illustrated this point in his Bloomberg report: "Let's do some quick napkin math. If you have 100 employees and you're estimating $15,000 a month (a conservative guess) for each one, you're spending $18 million a year. But these days, the top game studios are much bigger than that. So if you have 300 employees and you're estimating $20,000 a month for each one (got to pay good wages to compete in 2025), you're spending $72 million a year."

Recent revelations from an Activision executive's deposition in a lawsuit shed further light on the staggering costs of modern game development. Call of Duty: Black Ops III (2015) cost $450 million to develop, Call of Duty: Modern Warfare (2019) cost $640 million, and Call of Duty: Black Ops Cold War (2020) cost $700 million. With over 3,000 people working on the Call of Duty franchise, labor costs are undeniably substantial.

There are other factors at play. Modern games often feature massive levels and sprawling open worlds, requiring significant development time and resources. Management issues, such as inefficient workflows, technological shifts, and indecisive executives, can also lead to wasted time and inflated budgets.

Abrupt changes in direction – such as pivoting from single-player to multiplayer games-as-a-service – can further compound inefficiencies and costs.

Industry veterans frequently share stories of wasteful practices, such as features being canceled due to executive whims or teams continuing work on levels slated for removal because of poor communication. While iteration is a natural part of game development, excessive "wasted" work often results in crunch time and budget overruns.

As production expenses for major gaming titles breach the hundred-million-dollar threshold, game companies would be wise to engage in serious self-reflection and examine their internal processes if they hope to maintain a sustainable and innovative industry.


 
Just like the Real Estate Market, the Video Game Industry is in much need of a crash in order to become affordable and less volatile again.

Would that mean thousands of game developers would have to lose their jobs? Yes, just as thousands of Real Estate agents would. But when we live in a world with massively overblown gaming budgets and salaries, as well as ridiculously overpriced housing, there's only one thing to hope for... a massive crash in both industries to readjust everything.
 
Look, I'm glad there's at least some reflection about graphical fidelity: I'd be perfectly happy if Elder Scrolls 6 looked like Morrowind did 20 years ago but had the scope and mechanics of Daggerfall, yes, down to procedurally generated dungeons and non-geographic landscape features.

However, this stuff about employee salaries is just extremely deceptive rhetoric. First, employee salaries are like a few drops of water in the veritable pool that companies spend on executive bonuses and salaries. So if you truly want to save money, fire all of your CEOs and upper managers, and keep maybe a few middle managers at most, and only if they have a track record of being reasonable and not the petty tyrants they usually turn into.

Second, and perhaps most important: notice how they mention Fortnite and Roblox here, and not the countless other success stories of smaller indie games, except for maybe Minecraft. The truth is that most indie developers can and often do just fine with moderate success on moderately complex games that are not that hard to develop: you don't need to sell a million copies at 60 USD (which would probably be a massive failure for any publisher, big or small) if your studio is like 5 people not living inside a US urban center paying extortion-level rent, utilities and other personal expenses.
 
"It's very clear that high-fidelity visuals are only moving the needle for a vocal class of gamers in their 40s and 50s"

IMO there's something wrong with this statement.

I don't play games daily anymore like I did when I was in my 20s, and when I do, I play old games made in 2016 or before. Certainly not anything new, or even anything made in the past few years.

From what I remember, a few days ago there was a big discussion on that topic here and most people said the same.

I don't know where they get this idea that a vocal minority of Gen X'ers and Gen Y'ers is pushing for development of graphically intense games.

I, for one, and many other people who post here, could not care less, in fact.
 
This explains it. They are hiring the best and most qualified...A true meritocracy. To lower costs they must all employ DEI. Hey it worked in LA

For everyone who complains about DEI, all I have to say is: learn to code and build better games. Indie games. I don't complain - I just do better.
 
Management issues, such as inefficient workflows, technological shifts, and indecisive executives, can also lead to wasted time and inflated budgets.

Yeah, like the corporate implementation of AGILE concepts, which does the exact OPPOSITE of what it's supposed to do. Turns out, sitting around in a 90-minute meeting every day just to give a status of "it's not done yet" is not a good use of my time.

And don't even get me started on SCRUM masters, whose only job is to oversee the mess corporations have made of AGILE processes.
 
Seems like an argument a 7-year-old would come up with. Does the son consult the father on this? :)

No, I agree with this 100%. I'd argue the vast majority of history's games were very unrealistic and cartoony. Look at the numbers: Mario, Pokemon, Minecraft, TF2, etc. The top games are all cartoony, with the exception of a few like COD.
 
Afaik, nowadays the biggest budget item for AA/AAA games is marketing and advertising by a wide margin, not programmer salaries or anything related to actual content. I'm honestly baffled by how many dozens or even hundreds of millions they can spend on marketing.

Marketing is the actual elephant in the room, and in that field yeah, you'll probably find lots of overpaid people and contractors like social media influencers and streamers.
 
Yeah, like the corporate implementation of AGILE concepts, which does the exact OPPOSITE of what it's supposed to do. Turns out, sitting around in a 90-minute meeting every day just to give a status of "it's not done yet" is not a good use of my time.

And don't even get me started on SCRUM masters, whose only job is to oversee the mess corporations have made of AGILE processes.

OMG, you could have been channeling my inner Business Systems Analyst when you stated this... I HATE AGILE... I still use good old-fashioned requirements processes (user, system, logical process...) as often as I can get away with it... (Visio is my best friend!)
 
If we're ready to start paying more for games, as was so famously said, I think developers are ready to start paying more for their employees.
 
The last article said it was because of graphics, now it's because of salaries. No, no, no, it's because of the corporations that take everything they can to give it to the billionaires who own them. They have armies of Chinese, Indian, and even South American workers building their assets remotely for very cheap.

All prices are inflated by corporations, and corporations work together, which means they can easily justify any additional cost using anything in the production line.
 
It isn't ballooning for indie studios, just for AAA. And given that games are getting bigger and more expensive, but almost zero noticeable improvements can be seen outside of (often pre-rendered) tech demos, it's not because GAMES are getting more expensive. It's because bureaucratic business majors with zero passion for games are getting put in charge of "the next Hollywood" and are burning the studios' money like it's the 1920s silver screen era. Just read about where all the money for Ubisoft's supposed "first AAAA game of all time" went. And why are they blowing it? Because they think you're stupid and will buy any crap they deliver, regardless of quality. So why put the money into making the game?
 
Yeah... so $18 million on 100 employees. That would probably be the same amount being syphoned off by each so-called 'executive' on the "board of directors", with their "executive whims" being about as much use as a lead weight around a drowning man's neck.
 
Don't charge more, trim the unnecessary fat.

Look at what something like COD spends on advertising. You're already a massive franchise; being splattered over every billboard and bus stop doesn't do much. Bet the majority of your target demographic doesn't even watch TV, so why advertise there?
Someone mentioned Sweet Baby Inc before; imagine spending money on that. A consultancy firm that you have to pay to make your game actively worse by force-inserting a narrative. Then it results in more work for the people actually doing useful work, and they have to spend their time listening to some speech or sitting through some presentation. A colossal waste of resources.

Games used to be made by a bunch of passionate neckbeards in a basement, and it resulted in some real gems. Now you have a research team analysing what would be the most profitable, then you try to have it developed by a bunch of people without passion for the industry who live in expensive areas. Not to mention they're under far too many layers of management.

Just go back to passionate people making games they want to make themselves. Baldur's Gate 3 has proved that can still be done very successfully. Don't all try to be the next Fortnite money printer; there is only room for a few of those.

Oh, and allow for remote work. It lets you recruit people who work for a lot less, and you don't have to pay for an expensive office building. Good for diversity as well: you can recruit from anywhere there are passionate people who want to work on your game.
 
So many great games with sub-$100M budgets, like all the Remedy games or RoboCop. Some of my favorite games. The bigger budgets of some other games make no difference whatsoever. On the contrary, it seems that the more money they have, the more clueless they are.
 
"It's very clear that high-fidelity visuals are only moving the needle for a vocal class of gamers in their 40s and 50s"
Are they sure they got the age range right? I'm one of those late-40s Gen X'ers who's been gaming since 1987 and lived through the whole EGA -> VGA -> SVGA and DOS -> Windows transition, and far from demanding "ever better graphics", I'd go as far as to say that beyond a certain "good enough" point (late 2000s to mid 2010s), ultra-real visuals have mattered to me less than ever. I often even prefer the "cleaner" look of older 2000-era games that weren't drowning in 26 layers of blur, DOF, TAA, upscaling, etc, as well as their better writing. Looking back, the most visually memorable games to me are the ones that have different / interesting looking art styles irrespective of "fidelity" (eg, Bioshock's Art Deco style, Grim Fandango's "Mexican Day of the Dead" setting, Cuphead, Limbo, Psychonauts, Don't Starve, The Witness, etc). No-one ever had to spend $500m to make a game that will be liked for, and remembered by, its visuals.

"Modern games often feature massive levels and sprawling open worlds"
Which, again, no-one asked for in every game. Open-world games have their place, but there are more than enough examples of "open world for the sake of it" that it's a self-inflicted problem. The most immersive games I've played are often the pseudo-linear ones like Deus Ex, Dishonored & Prey (2017), which aren't open-world but offer more than one option / approach, such that they never actually feel cramped / claustrophobic whilst allowing the developer a far more immersive level design vs "500 square miles and it all looks the same". The least immersive games are the ones that smell like "me too, checkbox design" churned out by a managerial committee who just copy-paste 'industry trends' whether they actually fit the game or not...
 
OMG, you could have been channeling my inner Business Systems Analyst when you stated this... I HATE AGILE... I still use good old-fashioned requirements processes (user, system, logical process...) as often as I can get away with it... (Visio is my best friend!)
Really, AGILE has just turned into "Multiple Waterfalls" at this point.

I especially love trying to explain, "No, my current task doesn't neatly fit into a 10-day Story, and no, it can't be broken down any further. Call me in three months."

The "real" purpose of AGILE is to protect management from their own piss-poor decisions by spreading authority between multiple individuals so no one individual can take a fall for managements collective failings.
 
Some of you have talked about game price increases... ummm, games are cheaper than they have ever been when adjusted for inflation. It's the one product that has been nearly inflation-proof. We were paying $60-70 for games back in the '90s and 2000s. The price is about the same as it was back then.

When adjusted for inflation, games should be costing over $150, and I am not talking about the live-service, three-DLC, all-kinds-of-digital-goodies editions.

Then add in the fact that we have a MASSIVE indie scene developing games costing anywhere from a few bucks to 20-30 bucks. Never have we had so many cheap games. Millions of games under $50.

Simply put, games are cheaper than they ever have been. Stop fooling yourself; you are not fooling anyone who has been around.

The other thing is, the comment about graphics was about as stupid as it can get. Older people are the ones who tolerate bad graphics and favor gameplay. Retro games we grew up with that have modern flair tend to do very well. Younger generations won't play games that look retro or "old". Try to get a zoomer to play an Atari game and come tell me the results.
 