Hello Everyone,
I am currently working on a problem in Linear Algebra, and it is starting to drive me crazy.
I have done my calculations over and over again, and I keep getting the same answer,
which is not what Wolfram Alpha gets when I double-check my work.
I did verify this problem by computing the regular determinant without reducing to echelon form, and there I get the same answer as Wolfram, so I am wondering if I am doing the echelon form wrong.
I have to find the determinant by row reduction to echelon form.
So the matrix is:
[1 5 -3
3 -3 3
2 13 -7]
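(For reference, here is the direct check I mentioned above, expanding along the first row:
det = 1·[(-3)(-7) - (3)(13)] - 5·[(3)(-7) - (3)(2)] + (-3)·[(3)(13) - (-3)(2)]
    = 1·(-18) - 5·(-27) + (-3)·(45)
    = -18 + 135 - 135 = -18,
which matches Wolfram.)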
The first step I do is (1/3)R2 -> R2:
[1 5 -3
1 -1 1
2 13 -7]
Next I take -1·R1 + R2 -> R2:
[1 5 -3
0 -6 4
2 13 -7]
Then I take -2·R1 + R3 -> R3:
[1 5 -3
0 -6 4
0 3 -1]
After that I take (1/2)R2 -> R2:
[1 5 -3
0 -3 2
0 3 -1]
Then I take R2 + R3 -> R3:
[1 5 -3
0 -3 2
0 0 1]
Now that I have my echelon form, I try to get my determinant:
det = 1·det[-3 2; 0 1] - 5·det[0 2; 0 1] + (-3)·det[0 -3; 0 0]
    = 1·(-3) - 5·(0) + (-3)·(0)
Which makes my det = -3???
My direct calculation and Wolfram both give -18.
Could you let me know where I am going wrong with this problem? :(
Thanks!!
Answers & Comments
Verified answer
The determinant of the echelon form and the determinant of the original matrix are not usually equal, because the echelon form of a matrix is not unique. For example:
[1 5 -3
0 -1 2/3
0 0 7]
This is also an echelon form of your matrix, and here the determinant is -7. The rule is: swapping two rows flips the sign of the determinant, and scaling a row by c multiplies the determinant by c; only adding a multiple of one row to another leaves it unchanged. In your reduction you scaled R2 by 1/3 and later by 1/2, so your echelon form's determinant is (1/3)(1/2) = 1/6 of the original's. Undo those scalings: det = 3·2·(-3) = -18, which matches Wolfram.
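If you want to sanity-check this numerically, here is a minimal Python sketch (assuming numpy is available; the names A and E are just for illustration):

import numpy as np

# The original matrix from the question.
A = np.array([[1.0, 5.0, -3.0],
              [3.0, -3.0, 3.0],
              [2.0, 13.0, -7.0]])
print(np.linalg.det(A))  # -> -18.0 (up to floating-point error)

# The echelon form reached above, after scaling R2 by 1/3 and then by 1/2.
E = np.array([[1.0, 5.0, -3.0],
              [0.0, -3.0, 2.0],
              [0.0, 0.0, 1.0]])

# Each row scaling by c multiplies the determinant by c,
# so det(E) = (1/3)*(1/2)*det(A); undo the scalings to recover det(A).
print(3 * 2 * np.linalg.det(E))  # -> -18.0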