Something I've been deliberating lately: I've done console development for so long that I've never had to worry about different performance characteristics on different machines.
One nice thing about web development is that I can keep track of statistics about the games being played and what framerate players are seeing. Good news - so far, it seems like most people are playing on better machines than my laptop. (Though that may be an early-adopter thing.)
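As a sketch of what that stat-gathering can look like: record per-frame times on the client, then summarize them before reporting. The key is to track a high percentile rather than just the average, since the occasional long frame is exactly what gets hidden by a mean. Everything here is illustrative - the function names and the idea of a reporting endpoint are assumptions, not the game's actual code.

```javascript
// Collect per-frame times and summarize them for telemetry.
// Names here are hypothetical - a sketch, not the game's real code.
const frameTimes = [];
let last = (typeof performance !== "undefined" ? performance.now() : 0);

function onFrame(now) {
  frameTimes.push(now - last); // milliseconds since the previous frame
  last = now;
  requestAnimationFrame(onFrame);
}

// Only start the loop in a browser; requestAnimationFrame is a browser API.
if (typeof requestAnimationFrame === "function") {
  requestAnimationFrame(onFrame);
}

// Reduce raw samples to a few numbers worth sending home.
function summarize(samples) {
  const sorted = [...samples].sort((a, b) => a - b);
  const pick = (q) =>
    sorted[Math.min(sorted.length - 1, Math.floor(q * sorted.length))];
  return {
    median: pick(0.5),
    p95: pick(0.95), // spikes show up here, not in the average
    mean: samples.reduce((s, x) => s + x, 0) / samples.length,
  };
}
```

A session that runs mostly at 16ms with one 66ms zinger will report a median of 16 but a p95 that exposes the spike - which is the number that actually predicts whether the game feels bad.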
So, for them, I'd like to turn on some of the cooler features I've worked on - in particular, post-process glow and blur. On my machine, they make things run just a little too slow: when most of your frames are under 33ms but once or twice a second you get a 66ms zinger, it feels worse than running at a consistent 50ms.
So...assuming I don't come up with any genius optimizations, people with machines like mine should have post-processing turned off, or play in a smaller window. It would be really nice if the game could detect that and adapt appropriately - "Am I running below 30fps? Turn these features off!" - but what would that look like to the player? Some players would be playing along with the effects on, then they'd hit a bad frame or a series of bad frames in some gnarly stretch of game, and suddenly the effects would turn off. It would look like the game broke - like a bug.
Games back in the early days of hardware acceleration used to do things like that, but I think PC games have mostly moved away from it - these days they make a best guess up front, stick with it, and let you adjust settings if you want.
So, for now, I guess that's what I'm going to do - run with the full suite of effects and let players turn them off if they want a better framerate. But that's putting a lot of responsibility in the hands of the player - they'd probably have a better time if they turned off the glow and played at 60, but will they realize that?
So, I don't know. What do you guys think?