More help with maths.

K.I.L.E.R

This is in relation to vectors, scalars and MATRICES.
Oh Goodness I had to read over 200 pages to attempt to understand something basic.
I'm just going to give you guys an abstract explanation of my understanding as a picture and you guys can correct my wrong answer. :)

In the matrix (the modelview matrix), assume x, y, and z are equal to 0.

I assume the matrix acts like a function mapping one frame to another?
i.e. like f(x) = x^2.

Anyone have an easy way to reverse the transformation?
Calculating inverses with determinants is a headache, unless someone can explain it well; most sites just make my head spin.
 

Attachments

  • understanding.JPG (19.2 KB)
K.I.L.E.R said:
Anyone have an easy way to reverse the transformation?
Build the inverse in reverse order. In graphics you generally don't use an arbitrary matrix; you assemble it by multiplying simple matrices. E.g. if you are creating

Object-2-world = scale(2,5,1/2) * translate(10,5,7) * ...... * rotate(30 around axis BLAH)

then the inverse is created with

Inv = rotate(-30 around axis BLAH) * ... * translate(-10, -5, -7) * scale(1/2, 1/5, 2)

Does that help?
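To make the idea concrete, here is a sketch (not code from the thread) that builds a model matrix from simple transforms and its inverse from the inverted factors in reverse order; the helper names scale/translate/rotate_z are hypothetical, built on NumPy:

```python
import numpy as np

def scale(sx, sy, sz):
    # 4x4 homogeneous scale matrix
    return np.diag([sx, sy, sz, 1.0])

def translate(tx, ty, tz):
    # 4x4 homogeneous translation matrix
    m = np.eye(4)
    m[:3, 3] = [tx, ty, tz]
    return m

def rotate_z(deg):
    # 4x4 rotation about the z axis
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    m = np.eye(4)
    m[:2, :2] = [[c, -s], [s, c]]
    return m

# Forward: object-to-world, built from simple factors
obj2world = scale(2, 5, 0.5) @ translate(10, 5, 7) @ rotate_z(30)

# Inverse: each factor inverted individually, applied in reverse order
inv = rotate_z(-30) @ translate(-10, -5, -7) @ scale(0.5, 0.2, 2)

# The factors cancel pairwise, so the product is the identity
assert np.allclose(obj2world @ inv, np.eye(4))
```

The adjacent factors cancel pairwise (rotate_z(30) @ rotate_z(-30) = I, and so on), which is why reversing the order works.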
 
No, I'm almost certain that Simon's way is much faster. Most simple matrix inversion algorithms that you will see on the web are variants of Gaussian elimination, which IIRC has cubic asymptotic complexity growth [i.e., O(n^3)]. I've heard of faster methods, but none were better than O(n^2.5), and it's debatable whether they would even be noticeably faster on a relatively small matrix.
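For reference, the Gaussian-elimination family mentioned above looks roughly like this minimal Gauss-Jordan inversion with partial pivoting (a NumPy sketch, not code from the thread; production code would also detect singular matrices):

```python
import numpy as np

def gauss_jordan_inverse(a):
    # Invert a by reducing the augmented matrix [A | I] to [I | A^-1].
    # Three nested loops over n -> the O(n^3) cost mentioned above.
    n = a.shape[0]
    aug = np.hstack([a.astype(float), np.eye(n)])
    for col in range(n):
        # Partial pivoting: swap up the row with the largest entry
        pivot = col + np.argmax(np.abs(aug[col:, col]))
        aug[[col, pivot]] = aug[[pivot, col]]
        aug[col] /= aug[col, col]
        # Eliminate this column from every other row
        for row in range(n):
            if row != col:
                aug[row] -= aug[row, col] * aug[col]
    return aug[:, n:]

a = np.array([[2.0, 1.0], [1.0, 3.0]])
assert np.allclose(gauss_jordan_inverse(a) @ a, np.eye(2))
```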
 
Thanks guys, I wish I could give you guys more rep but it won't allow me. :(

Another question.
A modelview matrix is just the concatenated result of many other transformations, right?
 
akira888 said:
No, I'm almost certain that Simon's way is much faster. Most simple matrix inversion algorithms that you will see on the web are variants of Gaussian elimination, which IIRC has cubic asymptotic complexity growth [i.e., O(n^3)]. I've heard of faster methods, but none were better than O(n^2.5), and it's debatable whether they would even be noticeably faster on a relatively small matrix.
Well, there are "direct" methods for small matrices, but I'd say the biggest advantage of creating the inverse as you go is that it is possibly more numerically stable.


Killer, I just remembered. One thing you can do is flag special cases. For example, if R is a rotation matrix (i.e. any concatenation of rotation matrices), then the Transpose(R) == Inverse(R). You can't get much cheaper than that :)

You can then apply similar tricks if you know that your matrix has only been formed from rotations and translations. I'll let you think about that though, but as a hint remember that Inverse(X*Y) = Inverse(Y) * Inverse(X).
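Both tricks above can be checked numerically. A sketch (NumPy, with a hypothetical rotate_x helper): for a pure rotation the transpose is the inverse, and for a rotation-plus-translation matrix the hint Inverse(X*Y) = Inverse(Y)*Inverse(X) gives an inverse with rotation part Transpose(R) and translation part -Transpose(R)*t:

```python
import numpy as np

def rotate_x(deg):
    # 4x4 rotation about the x axis
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    m = np.eye(4)
    m[1:3, 1:3] = [[c, -s], [s, c]]
    return m

# Pure rotation: transpose == inverse
R = rotate_x(40)
assert np.allclose(R.T, np.linalg.inv(R))

# Rotation followed by translation: M = T * R
t = np.array([3.0, -1.0, 5.0])
M = R.copy()
M[:3, 3] = t

# Inverse via the hint: M^-1 = R^-1 * T^-1, i.e. [R^T | -R^T t]
Minv = np.eye(4)
Minv[:3, :3] = R[:3, :3].T
Minv[:3, 3] = -R[:3, :3].T @ t
assert np.allclose(M @ Minv, np.eye(4))
```

No determinants and no elimination needed, just a transpose and one matrix-vector product.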
 
Simon F said:
Killer, I just remembered. One thing you can do is flag special cases. For example, if R is a rotation matrix (i.e. any concatenation of rotation matrices), then the Transpose(R) == Inverse(R). You can't get much cheaper than that :)

As a Postscriptum: This is true for all hermitian matrices.
 
hupfinsgack said:
As a Postscriptum: This is true for all hermitian matrices.
I couldn't remember the term :) (Assuming that is the term)
 
hupfinsgack said:
As a Postscriptum: This is true for all hermitian matrices.
huh ?

[ 1 -i]
[ i 2]
is an hermitian matrix. So how does that fit with the description inverse = transpose ?
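This counterexample is easy to verify numerically (a NumPy sketch, not code from the thread): the matrix is Hermitian, yet its transpose times itself is nowhere near the identity.

```python
import numpy as np

# LeGreg's example matrix
H = np.array([[1, -1j], [1j, 2]])

# Hermitian: equal to its own conjugate transpose
assert np.allclose(H, H.conj().T)

# But the plain transpose is NOT its inverse
assert not np.allclose(H.T @ H, np.eye(2))
```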
 
LeGreg said:
huh ?

[ 1 -i]
[ i 2]
is an hermitian matrix. So how does that fit with the description inverse = transpose ?

I'm not sure that is correct either..... :???:

I don't recall ever using complex numbers in my Linear Algebra courses, but Mathworld has this to say about Hermitian matrices:

It doesn't appear that hupfinsgack's definition is correct either :cry:
 
Simon F said:
It doesn't appear that hupfinsgack's definition is correct either :cry:

Oops, totally my bad. Note to self: don't answer questions right after coming home from a night out. I mixed up orthogonal eigenvectors with orthogonal matrices...

This is pretty embarrassing. :oops: Especially after spouting big words. Anyway, nobody's perfect ...
 
K.I.L.E.R said:
Orthogonal matrix?
Eigenvector?
I hope nobody minds me doing that, hehe

Orthogonal matrix => A*Transpose(A) = Identity
Identity(2) = [1 0]
              [0 1]



Eigenvalue:
A, B are quadratic matrices, w is a scalar, x is a vector

general Eigenvalue problem:
A*x=w*B*x
or
(A-w*B)*x=0

special Eigenvalue problem:
(A-w*Identity)*x=0

w=Eigenvalue, x=Eigenvector
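Both definitions above can be illustrated numerically (a NumPy sketch, not code from the thread): an orthogonal matrix satisfies A*Transpose(A) = Identity, and each eigenpair (w, x) of the special eigenvalue problem satisfies (A - w*Identity)*x = 0.

```python
import numpy as np

# A 2x2 rotation matrix is orthogonal: A @ A.T == Identity(2)
c, s = np.cos(0.7), np.sin(0.7)
A = np.array([[c, -s], [s, c]])
assert np.allclose(A @ A.T, np.eye(2))

# Special eigenvalue problem for a small symmetric matrix
S = np.array([[2.0, 1.0], [1.0, 2.0]])
w, X = np.linalg.eig(S)          # eigenvalues w, eigenvectors as columns of X
for wi, xi in zip(w, X.T):
    # (A - w*Identity) * x == 0 for every eigenpair
    assert np.allclose((S - wi * np.eye(2)) @ xi, 0)
```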
 
K.I.L.E.R said:
Orthogonal matrix?
Eigenvector?

Ahhh kiler, the joys you have ahead of you. :p

hupfinsgack said:
Eigenvalue:
A, B are quadratic matrices, w is a scalar, x is a vector
I think you meant to say "square matrices".
 