[For anyone who's come here recently because of the IGDA forum, let me say right up front that after working with this a while I've changed the factor to a more optimistic 0.8 - this number, btw, will vary from studio to studio, depending on various factors - but I'm going to leave the text of the original post intact:]
Looking at a preview and then hitting "back" erases your post? WTF? Okay, starting over...
I had a friend, George, in high school who was jealous of guys like Planck and Avogadro for having their own constants named after them. So he decided he was going to have a constant and it was going to be 5. 5 was George's Constant. Whenever our math teacher wrote 5 on the chalkboard, George would pipe up, "That's my constant!" It seemed pretty funny at the time.
Anyhow...
We've all heard of Brooks' Law, right? From The Mythical Man-Month:
Oversimplifying outrageously, we state Brooks' Law: adding manpower to a late software project makes it later.
Turns out Brooks' Law isn't really true. Hey, he said it was an outrageous simplification. I've been on multiple projects where we realized we weren't quite going to make it so we added a guy or three and hit our date. Still, the spirit of Brooks' Law is true - adding more people to a project makes it less efficient, as communication overhead goes up. Instead of coding, the guys are asking each other what to code, how to code; telling each other what to code, how to code; going to meetings, training, trying to understand each other's code, and so on.
I read, over a decade ago in some software engineering magazine I've since forgotten, that we could estimate the effect of "Brooks' Law" by taking the square root of the number of coders on the team. So, 4 people can only do the job of 2; 9 people can only do the job of 3; and so on.
That's too conservative. Maybe if management is absolute crap: there's no lead, no source control, no planning, nothing.
On the other hand, most people estimate as if there's no communication inefficiency at all. This is not something that's modeled into MS Project. Add a guy to your Project and you don't see all of your bars automatically getting slightly bigger. Some teams might estimate the inefficiency by not assigning the lead any tasks - so n programmers are now doing the jobs of (n-1) programmers. That's too optimistic.
We need some middle ground.
So, simplifying not-as-outrageously, we state Fristrom's Law: if there are n people on your team, expect (n ^ .75) people's worth of work out of them.
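The law is simple enough to sketch in a couple of lines of Python (the function name is mine, not anything official):

```python
def effective_headcount(n: float) -> float:
    """Fristrom's Law: a team of n produces n ** 0.75 people's worth of work."""
    return n ** 0.75

# Compare with the square-root rule, which would give 4 -> 2 and 9 -> 3:
for team in (4, 9, 16):
    print(team, round(effective_headcount(team), 2))
```

So 4 people give you about 2.83 people's worth of output, 9 give you about 5.2, and 16 give you exactly 8 - less pessimistic than the square-root rule, but far from a free lunch.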
This law roughly matches my experience:
- Die By The Sword. 8 people. 8 ^ .75 ≈ 4.76. DBTS had 8 levels.
- Draconus. 16 people. 16 ^ .75 = 8. Draconus had 15 levels.
- Spider-Man. 30 people. 30 ^ .75 = 12.8. Spider-Man had around 20 levels.
The "levels" may be an apples-and-oranges thing; really, I'm mostly going by feel. Spider-Man felt about 50% bigger than Draconus.
This law is useful for planning. Say you've made a rough estimate of your entire game - and you think it'll be 240 man-months of work. That means you could finish it all by yourself in 20 years. How many people will you need to get it done in a year? If there was no communication overhead at all, it would take 20 people.
But, applying Fristrom's Law:
n ^ .75 = 20
n = 20 ^ ( 1 / .75 )
n ≈ 54
Fifty-four people - to do only twenty man-years worth of work.
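Inverting the law makes a handy planning sketch (again, a hypothetical helper of my own naming, not anything from the post):

```python
def staff_needed(effective: float) -> float:
    """Invert Fristrom's Law: how many people it takes to get
    `effective` people's worth of work out of the team."""
    return effective ** (1 / 0.75)

# Twenty man-years of work crammed into one calendar year:
print(round(staff_needed(20)))  # roughly 54 people
```

Note how steep the penalty is: doubling the effective work you need (say, from 8 to 16 people's worth) requires going from 16 actual people to about 40.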
You might say, "That's a pretty rough estimate! You want me to bet my project on that?" To that I'd say, "Studies have shown that rigorously planned estimates are only marginally more likely to be accurate than rough estimates (Schwaber, *Agile Software Development With Scrum*). So why waste the extra effort? Besides, most teams don't even take this factor into account, so we're being extra-conservative."
You can also use Fristrom's Law if you know your team's velocity, feel it's not good enough, and want to figure out how much staff to add to get to the velocity you're looking for. I'll leave that as an exercise for the student.
There are some corollaries here:
As you add more staff your scope-for-buck goes down dramatically. Yes, you're making a marginally bigger game, but your cost went up linearly. People will be less impressed. You'd think, because of this, that publishers would want to make 100 games with 10 man teams instead of 10 games with 100 man teams, but the best game is the one that everyone buys...the second best not-so-much. So that marginal additional scope may have been worth it, if it means you beat out your competitor.
Time is more valuable than money. Scope increases almost linearly with time invested. Two guys working for one year make a smaller game than one guy working for two. Too bad that one guy's game looked a little outdated by the time it hit the shelves.
As the industry trends towards larger projects, demand for talent goes up, because of the reduced efficiency. So although these big budget projects may be bad for the industry, they're good for the individual. As our companies flounder and our projects get canceled, we'll all still be able to land well-paying jobs.
So, there it is, Fristrom's Law. Unless somebody beat me to it. Remember when I thought I invented the term "Heisenbug"?
That's not too bad of a theory. It is similar to something I read some time ago (blast my memory for not remembering anymore).
How much time you can expect/schedule people for (on average):
Contract work - 100% of their time. If they are there, they should be working or you're losing $$
Non-management employees - 80% of their time. This takes into account meetings and stuff. Basically what you're saying - they will do stuff other than work during the day - so .75 isn't that far off.
Management - 50% of their time scheduled. They will have more fires and meetings and stuff, so to schedule them for an entire week will probably end in disaster.
So I do think Fristrom's Law has some merit to it. As you said, there are other factors involved, but it's not too bad for a rough estimate. I may have to try this one out in secret and see what happens...
Posted by: Liam Hislop | May 17, 2006 at 07:44 AM
A big part of it depends upon the length of time involved. While the productivity may average out to that formula, in the short term it is often worse, because of the "spool up time" required for a new team member to get up to speed and be productive.
I haven't read Brooks' book, but I suspect that's a big part of the "making it later." For the first 1-4 weeks, the new team member may be more of a productivity sink as he pulls other team members away from their tasks to help train him on what he should be doing, processes, how the code is organized, etc.
Posted by: Jay Barnson | May 17, 2006 at 12:33 PM
Huh, levels? "Number of Levels" isn't going to be at all correlated with the amount of programming effort on the project, only the amount of design (and maybe art) effort.
Posted by: Sean Barrett | May 17, 2006 at 07:15 PM
Well, although Brooks is talking about just programming effort, I'm talking about the whole shebang. Artists & designers are just as impacted by communication overhead as coders are. And number of levels IS *correlated* with amount of programming effort on the project - because number of programmers is usually correlated with the size of the rest of the staff. Granted, there's no causation.
Finally, I did say it was just a "feel" thing.
Posted by: Jamie Fristrom | May 18, 2006 at 11:54 AM
"Artists & designers are just as impacted by communication overhead as coders are."
I hear this as wild crazy talk.
I'm not going to claim that artists, specifically, operate in a communication vacuum, per se, but my experience has definitely been that most artists, once they understand art style and technical constraints and engine features and tools (and yeah - all of that can take a non-trivial amount of time) can work pretty decoupled from each other. Doubling the amount of artists _can_ double, say, the number of textures that are produced. It slows, too, if there are complex specialist interdependencies (animations that rely on tweaks to animation systems and specifics in level architecture, for example).
I don't think artists scale anywhere near as terribly as programmers do in the wild.
Designers probably are about as bad, though, or at least have the potential to be, depending on what it is they're designing.
Posted by: Nathan McKenzie | May 18, 2006 at 11:20 PM
Easy for a non-artist to say. Even with texture artists, the art position most easily made into galley-slaves, you've got "Is this texture good enough?", "Does it match the concept art?", "Does it fit the color key?", "You forgot to make it double-sided, alpha, not-alpha, this shader, that shader." At Treyarch, the texture artists also would do the texture mapping and refine the terrain meshes so their textures looked good, so you'd get, "Hey, when you added this little ridge here you made it so Spider-Man couldn't crawl over it."
Also, rule-of-thumb, folks! I agree that you *can* double work when you double manpower, as in, it's *possible* - but that's not the way to plan.
Posted by: Jamie Fristrom | May 19, 2006 at 10:26 PM
Feh!
I do have production artwork in shipped games, actually - it's all particle system stuff and related source art. And as other effects artists made new effects, they had nearly no chance of breaking or mucking up my stuff. Effects (and textures and sounds and music and models) aren't totally decoupled from other disciplines - as you say, texture choices can drastically affect gameplay - and they're not decoupled from each other - it is important that textures all fit the art style. But in my experience, they still are far more decoupled than code tends to be. Isn't this part of the reason for all the current interest in data-driven approaches - they can be scaled up more successfully (if, often, more generically)?
Posted by: Nathan McKenzie | May 20, 2006 at 12:11 PM