One of the many ways in which crap can just not render is incorrect transforms - particularly common when porting to a new platform, because ...
You might have different coordinate systems: z-up vs. y-up ... left-handed vs. right-handed ...
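For what it's worth, the coordinate-system part is just a change of basis, and getting a sign wrong flips handedness. A minimal C++ sketch - the axis meanings here are one common choice, not something either API mandates:

#include <cstdio>

struct Vec3 { float x, y, z; };

// z-up right-handed (x right, y forward, z up) to
// y-up right-handed (x right, y up, z toward the viewer):
// up stays up, old forward becomes -z. The determinant of
// this basis change is +1, so handedness is preserved.
Vec3 zUpToYUp(const Vec3& p) {
    return Vec3{ p.x, p.z, -p.y };
}

int main() {
    Vec3 p = zUpToYUp(Vec3{ 1.0f, 2.0f, 3.0f });
    std::printf("%.1f %.1f %.1f\n", p.x, p.y, p.z);  // prints 1.0 3.0 -2.0
    return 0;
}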
And here's one that bit me many years ago and that I'd forgotten about until today - row-major vs. column-major matrix multiplication! Why do I always assume everything is row-major and completely forget it ain't necessarily so... I guess because I've been using DirectX most of my life.
So, note to self: DirectX is row-major, OpenGL is column-major.
Um, hmmm.
It's more than that. There are 2 bits.
bit 0: is the math library column-major or row-major? Generally the issue only comes up in multiplication, in that you want to know whether it's A * B or B * A that gives you the correct answer.
bit 1: in what order are your values stored? In other words, does the translation go in elements 12, 13, 14 or in elements 3, 7, 11?
Those are separate issues: regardless of how you store the values, the correct way they are multiplied might be different. (See the sketch below.)
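Concretely, here's a minimal C++ sketch of the two bits. The Vec4/Mat4 types are hypothetical, not any particular library's, and m is read as m[row * 4 + col], which is itself one of the choices:

#include <cstdio>

struct Vec4 { float v[4]; };
struct Mat4 { float m[16]; };  // 16 floats; what m[i] means is the whole question

// bit 1 -- storage: translation in elements 12, 13, 14 (bottom row here)...
Mat4 translationAt12_13_14(float tx, float ty, float tz) {
    return Mat4{{ 1,0,0,0,  0,1,0,0,  0,0,1,0,  tx,ty,tz,1 }};
}
// ...or in elements 3, 7, 11 (right column here)
Mat4 translationAt3_7_11(float tx, float ty, float tz) {
    return Mat4{{ 1,0,0,tx, 0,1,0,ty, 0,0,1,tz, 0,0,0,1 }};
}

// bit 0 -- multiplication order.
// Row-vector convention, v' = v * M (pairs with translation in 12, 13, 14):
Vec4 mulRowVector(const Vec4& v, const Mat4& M) {
    Vec4 r{};
    for (int col = 0; col < 4; ++col)
        for (int i = 0; i < 4; ++i)
            r.v[col] += v.v[i] * M.m[i * 4 + col];
    return r;
}
// Column-vector convention, v' = M * v (pairs with translation in 3, 7, 11):
Vec4 mulColumnVector(const Mat4& M, const Vec4& v) {
    Vec4 r{};
    for (int row = 0; row < 4; ++row)
        for (int i = 0; i < 4; ++i)
            r.v[row] += M.m[row * 4 + i] * v.v[i];
    return r;
}

int main() {
    Vec4 p{{ 1, 2, 3, 1 }};
    Vec4 a = mulRowVector(p, translationAt12_13_14(10, 20, 30));
    Vec4 b = mulColumnVector(translationAt3_7_11(10, 20, 30), p);
    // Both print 11 22 33 1 -- the same transform, built and multiplied
    // with opposite conventions. Mix the two bits up and you get garbage.
    std::printf("%g %g %g %g\n", a.v[0], a.v[1], a.v[2], a.v[3]);
    std::printf("%g %g %g %g\n", b.v[0], b.v[1], b.v[2], b.v[3]);
    return 0;
}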
And then.....
At least when it comes to shaders, OpenGL and DirectX are the same at this point, because YOU WRITE ALL THE MATH. I haven't had to change any of my math libraries between OpenGL and DirectX; the only thing I had to change was my shaders, from clipPosition = position * worldViewMatrix to clipPosition = worldViewMatrix * position;
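The reason just flipping the multiply can work: if the two sides end up interpreting the same 16 floats as transposes of each other, moving the vector to the other side of the matrix compensates, because v * M equals transpose(M) * v. A minimal C++ sketch of that identity, standing in for the shader math (types hypothetical, m read as m[row * 4 + col]):

#include <cstdio>

struct Vec4 { float v[4]; };
struct Mat4 { float m[16]; };

Vec4 vecTimesMat(const Vec4& v, const Mat4& M) {  // position * worldViewMatrix
    Vec4 r{};
    for (int col = 0; col < 4; ++col)
        for (int i = 0; i < 4; ++i)
            r.v[col] += v.v[i] * M.m[i * 4 + col];
    return r;
}
Vec4 matTimesVec(const Mat4& M, const Vec4& v) {  // worldViewMatrix * position
    Vec4 r{};
    for (int row = 0; row < 4; ++row)
        for (int i = 0; i < 4; ++i)
            r.v[row] += M.m[row * 4 + i] * v.v[i];
    return r;
}
Mat4 transpose(const Mat4& M) {
    Mat4 t{};
    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col)
            t.m[col * 4 + row] = M.m[row * 4 + col];
    return t;
}

int main() {
    Mat4 M{{ 1,0,0,0, 0,1,0,0, 0,0,1,0, 10,20,30,1 }};
    Vec4 p{{ 1, 2, 3, 1 }};
    Vec4 a = vecTimesMat(p, M);             // v * M
    Vec4 b = matTimesVec(transpose(M), p);  // transpose(M) * v
    std::printf("%g %g %g %g\n", a.v[0], a.v[1], a.v[2], a.v[3]);  // 11 22 33 1
    std::printf("%g %g %g %g\n", b.v[0], b.v[1], b.v[2], b.v[3]);  // 11 22 33 1
    return 0;
}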
Posted by: Greggman | May 18, 2011 at 06:55 PM
What platforms store translation in 3, 7, 11 these days? Nintendo? I thought at this point we were in an OpenGL vs. DirectX world.
Posted by: Jamie | May 18, 2011 at 08:22 PM