
September 03, 2007



What's wrong with having a simple script that runs on your machine at about 6 A.M. before you arrive, and builds all the different versions of the game you'll need during the day?

That alone would solve a majority of the problems you mention.
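A morning script like that can be tiny. Here's a minimal sketch, assuming a hypothetical build.py entry point and made-up platform/config names; schedule it with cron (e.g. `0 6 * * 1-5`) or Windows Task Scheduler:

```python
#!/usr/bin/env python
# Nightly "build every version of the game" sketch.
# build.py, the platform names, and the config names are all hypothetical.
import subprocess
import sys

CONFIGS = ["debug", "release", "final"]
PLATFORMS = ["pc", "xbox360"]

def build_commands(configs=CONFIGS, platforms=PLATFORMS):
    """Return one command line per (platform, config) combination."""
    return [
        [sys.executable, "build.py", "--platform", p, "--config", c]
        for p in platforms
        for c in configs
    ]

def main():
    for cmd in build_commands():
        # check=True stops on the first failure, so the morning log
        # shows exactly which build broke overnight.
        subprocess.run(cmd, check=True)
```

By the time anyone arrives, either every version exists or the log points at the one command that failed.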

As for the other issues, it seems that too much emphasis is placed on the "build" process. Sure, you need it for self-contained deployments, but for incremental development why not build the resources on the fly, as id's Tech5 and other engines have done for years?

~ alexjc

Max Battcher

It very obviously depends on the system you are building, as you point out. For small projects the XNA content pipeline more than suffices, and the time for an incremental build isn't a large expense. For larger projects you probably want to look into more team-oriented build systems, such as a parallel/background continuous integration engine like CruiseControl.NET. Getting a decent CruiseControl.NET setup for XNA projects shouldn't be too tough (thanks to MSBuild) and could help quite a bit on larger projects. (My own so far are "small", so I haven't yet resorted to something like CruiseControl myself...)

But no approach is perfect; as with anything else, each has its drawbacks.


OT: I haven't seen much activity on sweng-gamedev. Did something change?

Jeremy Statz

I'd argue that the "perfect world" build system you're describing is actually still pretty terrible. A perfect world content pipeline wouldn't have a pipeline at all, it'd be loading the content directly so there's no separation between source and game-ready asset.

Not going to work well in practice, unfortunately. Maybe it could be faked by having the game automatically compile and load from the source asset on startup if its timestamp is newer than the existing compiled version's?
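That fake is just a timestamp comparison. A minimal sketch, where the compile and load hooks are hypothetical stand-ins for whatever the game's pipeline actually does:

```python
import os

def needs_recompile(source_path, compiled_path):
    """True if the compiled asset is missing or older than its source."""
    if not os.path.exists(compiled_path):
        return True
    return os.path.getmtime(source_path) > os.path.getmtime(compiled_path)

def load_asset(source_path, compiled_path, compile_fn, load_fn):
    # compile_fn(src, dst) and load_fn(dst) are hypothetical hooks into
    # the game's content pipeline; only the staleness check is real.
    if needs_recompile(source_path, compiled_path):
        compile_fn(source_path, compiled_path)
    return load_fn(compiled_path)
```

The obvious caveat is that mtimes miss indirect changes (a tool or shared include changing), which is where the hashing schemes discussed further down come in.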

All the pipelines I've worked with lately are in the "spend a half hour compiling in the morning" category, though at least we've gotten away from the nightmare hell that is expecting everybody to build their own EXEs. One of the biggest problems I've noticed with this sort of pipeline is that the compiling step takes long enough that a lot of the artists simply won't run the game. They'll stick to their little test map or something and that's the end of it, because recompiling and running the production maps takes too long. I'd really like to see this go away, and I suspect on the next project with this sort of pipeline I'll put some substantial time into getting a cache server going, if we can't do anything better about it.

Josh Szepietowski

Variety is the spice of life.

One good option is to check both the source and processed assets into version control. Set up a build machine to regularly (constantly) build the incremental changes to the source assets. This machine should hold a lock on the final assets in the repository, so that the only way a final asset filters out to the rest of the team is if the source asset was put into the repository.

On individual boxes, allow building only the assets the user is interested in (for fast artist iteration), and on every box allow a 'full incremental build' to take place (as an option).

Often programmers will be happy to plug away with just the final assets sent down from the build machine, and that's fine. Slowing their build process down by having them build the art assets is a mistake, IMHO, but disconnecting the 'source' from the 'final' asset is also a mistake.

This has been a theorypost, unfortunately, as I have not seen a build system set up this way just yet! :)

Mat Noguchi

If there were no bottlenecks the Perfect Build System would be the closest to ideal, since everything is described at the highest level of abstraction and you can generate the final build instantly.

Unfortunately, we have two main bottlenecks: I/O and runtime processing.

Any system that is designed ignoring those two bottlenecks is doomed to scaling hell.



Wow, I almost violently disagree. First off,

"But if that's the case, how come most of the successful studios I know do it the "antipattern" way (anyone using Unreal with its wad files, for example) and the teams I've heard of with the big fat build systems are catastrophic?"

Hmmm, nearly every team I've worked on and every product I've shipped has done the big fat build. Our processes were closer to the one Josh mentions, but you could type "build all" and get all the assets rebuilt from source. Having artists manually shepherd stuff is the ultimate no-no in my experience.

That said, I think we might be mixing ideas. The ideas above can be broken down as follows:

1) It should be possible to automatically rebuild all assets with one command from source materials

This is so you can make huge architectural changes in safety.

2) It should be possible to quickly build any single asset

Otherwise you have the long waits.

3) When building a single asset, it should be possible for the system to automatically build everything else that asset needs (i.e., if the Island level needs the palm-tree asset, then building the Island level should also (optionally) build the palm tree if it needs to; and if the palm tree needs the palm-leaf texture, that should get converted too).

Otherwise you have to hunt down the missing unbuilt assets.

4) Assets should be built from source

Otherwise they become impossible to edit, because all the stuff that made them easy to edit in the first place can get lost (constraints, layers, IK rigs, expressions).

On top of that, building from source means the source gets checked in, which solves the "I lost the source" problem you hear about on other teams.

5) Updating a source asset should rebuild any assets that depend on it.

Otherwise you have to manually remember to rebuild those other assets, which means hunting things down.

None of this is that hard to set up, and there are lots of easy ways to optimize, safeguard, etc.
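Points 3) and 5) together are just a dependency graph: building pulls dependencies in, touching a source pushes dirtiness out. A toy sketch, with the asset names and the build_one hook made up for illustration:

```python
# Toy dependency graph for points 3) and 5): building an asset first
# builds what it needs; touching a source dirties everything downstream.
class AssetGraph:
    def __init__(self):
        self.deps = {}      # asset -> list of assets it needs
        self.dirty = set()  # assets whose baked output is stale

    def add(self, asset, needs=()):
        self.deps[asset] = list(needs)
        self.dirty.add(asset)   # never built yet, so dirty

    def build(self, asset, build_one, built=None):
        """Point 3: depth-first, dependencies before the asset itself."""
        if built is None:
            built = []
        for dep in self.deps.get(asset, []):
            self.build(dep, build_one, built)
        if asset in self.dirty:
            build_one(asset)
            self.dirty.discard(asset)
            built.append(asset)
        return built

    def touch(self, source):
        """Point 5: mark `source` and everything depending on it dirty."""
        self.dirty.add(source)
        for asset, needs in self.deps.items():
            if source in needs:
                self.touch(asset)
```

So `build("island_level", ...)` bakes the palm-leaf texture, then the palm tree, then the level, and `touch("palm_leaf.tex")` makes the next build of the level redo all three.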



*) Building some assets (a level) can be slow, so in our current system, when any user builds an asset, the command lines, source files, and tools are MD5ed (MD5s are cached so files don't have to be re-hashed) and the MD5s compared. If they match, then the baked file on the server will match what the user is trying to build, so it's copied off the server instead of built locally.
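A minimal sketch of that MD5 trick, with the cache abstracted as a dict (in practice it would be a network share or server keyed by hash); the tool bytes, command line, and fetch/build hooks here are illustrative:

```python
import hashlib

def build_signature(tool_bytes, command_line, source_blobs):
    """Hash the tool, the command line, and every source file together.
    Any change to any input produces a different signature."""
    md5 = hashlib.md5()
    md5.update(tool_bytes)
    md5.update(command_line.encode("utf-8"))
    for blob in source_blobs:
        md5.update(blob)
    return md5.hexdigest()

def build_or_fetch(signature, server_cache, build_fn):
    """server_cache maps signature -> baked bytes (e.g. a share on the
    build server). Hit: copy off the server. Miss: build locally and
    publish so the next person gets the fast path."""
    if signature in server_cache:
        return server_cache[signature]
    baked = build_fn()
    server_cache[signature] = baked
    return baked
```

Because the tool itself is part of the hash, upgrading the exporter automatically invalidates everything it produced, which is exactly the property timestamps alone don't give you.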

-starting from source-

*) Exporting is easy to automate. Maya has mayabatch, or you can write your exporter to run using OpenMaya as an executable, so there's no need to have artists export manually. Max has MAXScript, which can be called on launch through the command line, so it can be automated as well.

-being flexible-

*) Version FOLDERS can be useful: if a major format update happens, you change the path the tools write to by version. So, for example, instead of storing the result in data/models/tree.bin, it's in data/(version)/models/tree.bin. Then the code is also updated to load from that version. The result is that people who have not updated the code or tools still build to and run from the old folders, while programmers working on tools, and artists working with them on new features, can safely move ahead.

-starting from source-

*) If Excel is easier for artists/designers to use for data entry, it's trivial to parse its XML (see my website for examples). Its XML stores everything, so there's no reason to ever use binary XLS files, which means no manual steps for that data either. Exporting to CSV or running some macro will both end up with chances for error and other issues, and should be avoided.
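For instance, Excel's 2003 "XML Spreadsheet" format can be read with nothing but the standard library. A toy sketch; the worksheet and its data are made up, but a real file saved in that format has the same Row/Cell/Data structure:

```python
import xml.etree.ElementTree as ET

# Hypothetical sample of what Excel writes when saving as "XML Spreadsheet".
SAMPLE = """<?xml version="1.0"?>
<Workbook xmlns="urn:schemas-microsoft-com:office:spreadsheet"
          xmlns:ss="urn:schemas-microsoft-com:office:spreadsheet">
 <Worksheet ss:Name="Weapons">
  <Table>
   <Row><Cell><Data ss:Type="String">name</Data></Cell>
        <Cell><Data ss:Type="String">damage</Data></Cell></Row>
   <Row><Cell><Data ss:Type="String">sword</Data></Cell>
        <Cell><Data ss:Type="Number">10</Data></Cell></Row>
  </Table>
 </Worksheet>
</Workbook>"""

SS = "{urn:schemas-microsoft-com:office:spreadsheet}"

def read_rows(xml_text):
    """Return the sheet as a list of rows, each a list of cell strings."""
    root = ET.fromstring(xml_text)
    return [
        [data.text for data in row.iter(SS + "Data")]
        for row in root.iter(SS + "Row")
    ]
```

So designers keep working in Excel, and the build reads the saved file directly with no export step to forget.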

That's just off the top of my head, but anyway, the point is: yes, if you have a system where you type "build all" every morning and then wait 1-2 hours, that sucks. But the only alternative is not "hand build everything and have no build-all ability". There are plenty of other and *better* options.


I noticed that the browser or the blog software ate my version example. The second path should be data/(version)/models/tree.bin, where (version) changes with each file format change.

Oh, and as for Unreal people using the non-big build: every Unreal project appears to be 6 to 12 months late using that approach. The projects I worked on shipped on time. (Yeah, I know that's an anecdotal point, but then so is yours :-p)

Jamie Fristrom

That does sound really good...
I'm jealous.


The main problem with the from-source asset build systems I have worked on is that there usually isn't any dependency management, which means incremental builds are unsafe and thus full builds become the norm. If you look at source-code compilation, you can see that dependency management, when implemented well, is a real time saver. Just as with much code in games, the build system is hacked on as an afterthought.

