I'm very confused as to what my problem is here. I've set up a matrix which converts global/world coordinates into the local coordinate space of an object. This conversion matrix is constructed from four vectors describing the object (forward, up, side and position). This localization matrix is then passed to glMultMatrixf() at draw time for each object so that I can draw simple axes around each object to visualize its local coordinate system. This works completely fine and as expected: as the objects move and rotate in the world, so do their local coordinate axes.
The problem is that when I take this same matrix and multiply it by a column vector (to convert the global position of one object into the local coordinate system of another object), the result is not at all what I would expect. For example:
My localize matrix is as follows:
 0.84155   0.138    0.5788    0.0
 0.3020    0.8428  -0.5381    8.5335
 0.4949   -0.5381  -0.6830  -11.6022
 0.0       0.0      0.0       1.0
I input the position column vector:
-30.0
-30.0
-30.0
1.0
And get the output of:
-99.2362
-1.0199
4.8909
1.0000
As my object's position at this point in time is (-50.8, 8.533, -11.602, 1), I know that the output x coordinate cannot possibly be as great as -99.2362. Furthermore, when I find the distance between the two global points and the distance between the localized point and the origin, they are different.
I've checked this in Matlab, and my matrix multiplication appears to be correct (note: in Matlab you first have to transpose the localize matrix). So I'm left thinking that my localize matrix is not being constructed correctly, and yet OpenGL successfully uses this very matrix to draw the local coordinate axes!
I've tried not to include unnecessary details in this question, but if you feel that you need more, please don't hesitate to ask! :)
Many thanks.
I have to guess, but I would like to point out two common sources of problems with OpenGL matrix multiplication:
the modelview matrix transforms to a coordinate system where the camera is always at the origin (0,0,0) looking along the z-axis. So if you applied some transformations to "move the camera" before applying the local->global transformations, you must compensate for the camera movement, or you will get coordinates expressed in the camera's coordinate space. Did you include camera transformations when you constructed the matrix?
Matrices in OpenGL are COLUMN-major. If you have an array with 16 values, the elements are ordered as follows:
[0][4][ 8][12]
[1][5][ 9][13]
[2][6][10][14]
[3][7][11][15]
Your matrix also seems strange. The first three columns tell me that you applied some rotation or scaling transformations. The last column shows the amount of translation applied to each coordinate element, and those numbers are the same as your object's position. That means that for the output x coordinate to come out as -50.8, the first three terms in the first row would have to add up to zero:
-30*0.8154 + -30*0.3020 + -30*0.4939 + 1*(-50.8967)
<---- this part should be zero ---->  but is -48.339.
So I think there really is a problem in how the matrix is constructed. Perhaps you can explain how you construct the matrix...