Apologies for the delay; my instance is having problems with communities, so I can’t reply with that account.
To answer the question: not anymore.
Crunch culture was a big part of why I left.
Honestly, it’s not that different in kind from non-game dev houses; the difference is in the magnitude.
I understand why these things happen; the reasons just aren’t good enough for me.
Poor planning compounds with ridiculous timeframes to create an almost immutable deadline for delivering unrealistic goals.
The problem is, they’ll jump right back into the next project and make exactly the same mistakes. At what point do they stop being mistakes and start being “just how things are done”?
One of the main reasons this works at all is that they take young, idealistic programmers who want to work in their dream industry and throw them into a cult of crunch, where the thinking becomes “everyone is doing it, so it must be OK” or “this is the price of having my dream job”.
It’s certainly not all studios, and it seems to have gotten marginally better at indie to small-medium houses, but it’s prevalent enough that it’s still being talked about.
When you worked in games, how did your team deal with the unplanned scenarios where a feature, or even the core game, wasn’t fun and you needed to go back to the drawing board?
Depends on the team.
On paper, what you’re “supposed” to do is iterate through gameplay mechanics and scenarios by building up the bare minimum needed to get a feel for them; once you have something viable, you proceed further along the development process.
In reality it depends heavily on context: sometimes a particular scenario works fine standalone but not as part of the whole, or a needed balancing change elsewhere breaks the fun of something established. Late additions can cause this too.
But again, that depends heavily on the type of game; RPGs are more sensitive to balancing changes than racing sims, for example.
Specifically, we’d usually weigh how badly something didn’t work against how much work it would take to “fix” it. Sometimes it’d get cut completely, sometimes it’d get scaled back, sometimes we’d re-evaluate the feature or scenario for viability and decide from there, and sometimes we’d just bite the bullet and work through it.
Over time you get a bit more cautious about committing to things without thinking through the potential consequences, but sometimes it just isn’t possible to see the future.
I understand the realities of managing a project like that. At the same time, these kinds of things are known upfront to a degree, and yet people always seem surprised that the cone of uncertainty on a project like that is huge.
As I said, I have no problem with re-use; I have a problem with saying re-use is “essential” to stopping crunch, as if the management of a project like that weren’t the core of the problem.