This one's going to be obvious to most people, but I have read a book that says to do the opposite, so:
A lot of the scrum / agile guys advise against splitting your schedule up by resource: don't have a bunch of "QA" tasks that are estimated separately from your "Designer" tasks that are estimated separately from your "Coder" tasks, and are all tracked independently, because that creates an us-against-them mindset. "We designer types are done with our responsibilities, it's you coder types that are holding us back." It should be one-for-all, all-for-one: if you're falling behind, your designers and QA guys and whatnot should find ways to help in whatever way they can, even if it's just getting coffee for and giving back massages to those poor coders. *Agile Estimating And Planning* is one book that recommends this practice.
"Man, that's beautiful," I thought, when I read this. And that's how I scheduled Schizoid, lumping the art tasks and the code tasks and the music tasks into one big pile. Of course, in reality, there's no way James Chao, once he was done making his beautiful art, was going to help me fix network play bugs. Even if he did want to roll up his sleeves and start programming. (Or get me coffee and give me back massages.)
So, for games anyway, don't do that! Typically what we do - Schizoid being the exception - is schedule in a matrix that looks something like this:
|               | Code | Art | Design |
|---------------|------|-----|--------|
| Main Menu     | 2    | 5   | 0      |
| Eyeball Enemy | 5    | 8   | 3      |
We're still using a lot of techniques from *Agile Estimating* - such as planning poker, Fibonacci-sized numbers, estimation in generic work units instead of days, and a serious aversion to Gantt charts. But the result we end up with gives us a better idea of how many coders vs. artists we need, and lets us track velocity separately.
But is tracking velocity separately a good thing? Some would say it divides the team. (And there's often a rift between coders and artists to begin with.) But it's probably worth knowing whether art or code is behind and by how much, so you can adjust staff and cut features accordingly! At least you're not singling out individuals.
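To make the bookkeeping concrete, here's a rough sketch in Python - the task names, numbers, and helper functions are just illustrative, not our actual tooling:

```python
# Rough, illustrative sketch (not our actual tooling) of the per-discipline
# schedule matrix above: each task carries separate Code / Art / Design
# estimates in generic work units, and velocity is tracked per discipline
# so you can see which group is the long pole.

from collections import defaultdict

# Remaining estimates per task, in generic work units (made-up numbers).
schedule = {
    "Main Menu":     {"Code": 2, "Art": 5, "Design": 0},
    "Eyeball Enemy": {"Code": 5, "Art": 8, "Design": 3},
}

# Work units each discipline actually finished in past sprints (also made up).
completed_per_sprint = [
    {"Code": 4, "Art": 6, "Design": 1},
    {"Code": 3, "Art": 5, "Design": 2},
]

def velocity_by_discipline(history):
    """Average work units completed per sprint, for each discipline."""
    totals = defaultdict(float)
    for sprint in history:
        for discipline, units in sprint.items():
            totals[discipline] += units
    return {d: total / len(history) for d, total in totals.items()}

def sprints_remaining(schedule, velocity):
    """Estimated sprints left per discipline: who is behind, and by how much."""
    remaining = defaultdict(float)
    for estimates in schedule.values():
        for discipline, units in estimates.items():
            remaining[discipline] += units
    return {d: units / velocity[d] for d, units in remaining.items() if velocity.get(d)}

velocity = velocity_by_discipline(completed_per_sprint)
print(sprints_remaining(schedule, velocity))
# -> {'Code': 2.0, 'Art': 2.36..., 'Design': 2.0}: art is the long pole here,
#    which is the signal you'd use to adjust staffing or cut features.
```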
Interestingly, Clinton Keith told me recently that he now works specialization into his agile consulting and training. I wonder how his approach works?
The matrix approach is what we are doing too, but inter-dependencies are getting in the way. For example, a task with 2 days of coding and 5 days of art might require 1 day of coding to be done up front so the artist can work, then 1 day to integrate the result.
We ended up dealing with this by putting little "on hold" stickers over such tasks ("half done, waiting for art"), and that looks a bit messy, but I have to confess it's convenient and works quite well for us.
Posted by: Drealmer | December 18, 2009 at 01:59 AM
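To illustrate Drealmer's point, here's one hypothetical way to model a split task and its "on hold" state - none of this is from Drealmer's actual setup, which is stickers on a board:

```python
# Purely illustrative sketch of the "on hold" bookkeeping described in the
# comment above -- the Phase/Task classes and status strings are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Phase:
    discipline: str      # e.g. "Code" or "Art"
    estimate: float      # generic work units (or days)
    done: bool = False

@dataclass
class Task:
    name: str
    phases: list = field(default_factory=list)  # ordered: upfront code, art, integration

    def status(self) -> str:
        if all(p.done for p in self.phases):
            return "done"
        if any(p.done for p in self.phases):
            # e.g. the upfront code is finished but the art is still in flight
            blocking = next(p for p in self.phases if not p.done)
            return f"on hold, waiting for {blocking.discipline}"
        return "not started"

# The example from the comment: 1 day of code up front so the artist can start,
# 5 days of art, then 1 day of code to integrate the result.
task = Task("Example Feature", [
    Phase("Code", 1, done=True),
    Phase("Art", 5),
    Phase("Code", 1),
])
print(task.status())  # -> "on hold, waiting for Art"
```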
I do enterprise software development and I'm helping us move to agile/iterative. I've come up with the same thing. It's impossible for us to get business & analysts to break big use cases up into user stories that can be done in an iteration (done = requirements, design, code, test, accept). So I give user stories staged definitions of done: "dev complete" = coded, unit tested, and an ad hoc test session with QA; "qa complete" = all happy-path/alternate flows able to complete, even if still buggy; then UAT, etc.; and then a "home run" stretch where we retest and fix all final showstoppers.
I have to deal with resource management by type of resource as well so I need to know each team's velocity and resource needs, and we do have to do some ordering of dependencies. I think doing iterative handoffs and working these separate tasks concurrently as much as possible helps to avoid the blame game.
Posted by: Bobby Lewis | December 18, 2009 at 04:16 AM
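Bobby's staged definitions of done could be sketched like this - the stage names are paraphrased from the comment, and the structure itself is purely hypothetical:

```python
# Toy version of the staged "definition of done" pipeline described above.

STAGES = [
    ("dev complete", "coded, unit tested, ad hoc test session with QA"),
    ("qa complete",  "all happy-path/alternate flows able to complete (bugs allowed)"),
    ("uat",          "accepted by the business"),
    ("home run",     "retested, final showstoppers fixed"),
]

def next_stage(current: str):
    """Return the stage after `current`, or None if the story is fully done."""
    names = [name for name, _ in STAGES]
    index = names.index(current)
    return names[index + 1] if index + 1 < len(names) else None

print(next_stage("dev complete"))  # -> "qa complete"
```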
Jamie,
Good post!
Yes, a lot of agile texts assume that the "teams" consist mostly of programmers and that specialization slowly gives way to generalization over time (programmers do more testing, QA can review unit tests, etc.).
In game development this doesn't quite apply. I spend a third of my book (out ~May '10) talking about team structures and how to address this for game specialists who just can't generalize that much. Things have to change.
Both commenters above point out the obvious problems: inter-dependencies and definitions of done that change as a feature emerges. Cross-discipline teams and variable "definitions of done" solve these. They also help make sure the unplanned but necessary work actually gets done rather than slipping through the cracks because the boss didn't schedule those tasks.
Cross-discipline teams work for games in pre-production by doing better release and look-ahead planning, reorganizing between sprints, and creating some shared resources/pools when necessary. Then you can measure velocity of real things (user stories). Measuring velocity of component or functional teams is useless IMO. Besides, I'd rather measure the velocity of the game than of the teams.
In production, I don't recommend the base scrum practices (http://tinyurl.com/6dtb5j). Things start to look more like functional management at that point.
Clint
Posted by: Clinton Keith | December 18, 2009 at 08:52 AM