Tracking down a weird bug with Tim today. All of the global variables look fine but the computation is coming out wrong, and it's an optimized build so we can't look at the local variables. So we start stepping through the disassembly, watching the floating-point registers, and eventually discover that the / 2.0f in the code is really multiplying by 0.25. Where's it getting that? From a totally unrelated global variable that Tim had declared for debugging purposes: initialized to 0.5, but he was changing it on the fly to test stuff. Apparently the compiler had decided that since, as far as it could tell, that 0.5 was constant, it could use it for whatever it wanted...
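A minimal sketch of the kind of code involved, with hypothetical names (one plausible reading of what the compiler did):

```cpp
// Debug tweakable (hypothetical name). Never written by the program
// itself; only poked from the debugger at runtime.
float g_debugTweak = 0.5f;

float Halve(float x)
{
    // The optimizer turns this divide into a multiply by 0.5f. Since
    // g_debugTweak "always" holds 0.5f as far as the compiler can tell,
    // it loads the constant from g_debugTweak's storage instead of
    // emitting a separate one, so changing g_debugTweak in the debugger
    // silently changes every divide compiled this way.
    return x / 2.0f;
}
```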
Was it const or not const?
Posted by: MSN | October 11, 2010 at 02:03 PM
Not const.
Posted by: Jamie Fristrom | October 11, 2010 at 02:12 PM
I wonder if declaring it volatile would fix the issue.
Posted by: Jay K. | October 11, 2010 at 08:28 PM
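Presumably something like the following, a one-line sketch with a hypothetical name:

```cpp
// volatile tells the compiler the value may change behind its back, so
// it can't assume the object still holds its 0.5f initializer, and
// can't reuse that storage as a constant anywhere else.
volatile float g_debugScale = 0.5f;
```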
Wow, bizarre... what compiler is that in? Why would it choose to use a variable for .5 instead of an immediate value?
Posted by: Chris Jurney | October 12, 2010 at 11:59 AM
I'm sure it would, but there isn't really an issue to fix, other than that we spent a lot of time trying to figure out what was causing a bug that wouldn't happen in the field.
And the compiler is Visual Studio for the Xbox, and I don't know enough about the floating-point unit to venture a guess.
Posted by: Jamie Fristrom | October 12, 2010 at 12:15 PM
Round your global debug variables up into a singleton, and try not to let many compilation units know about its existence. Let them see only a pointer, or a function that "just happens to" always return the singleton.
If compilation units can't see the singleton's definition, they can't make dangerous assumptions about it... barring some really good global optimization.
Posted by: Bryan McNett | October 13, 2010 at 01:04 PM
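A sketch of what Bryan describes, with hypothetical names; other translation units only ever see the accessor's declaration, so the compiler has nothing there to fold into a constant:

```cpp
// debug_tweaks.h: the only thing other compilation units ever see.
struct DebugTweaks
{
    float scale;
    float bias;
};

// Opaque accessor: callers can't see the object behind it, so their
// compiler can't assume anything about the values it holds.
DebugTweaks& GetDebugTweaks();

// debug_tweaks.cpp: the one file that knows where the object lives.
DebugTweaks& GetDebugTweaks()
{
    static DebugTweaks tweaks = { 0.5f, 0.0f };
    return tweaks;
}
```

Callers then write GetDebugTweaks().scale instead of touching a global directly; as Bryan notes, link-time or whole-program optimization could still see through the indirection.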