Talk:Invertible matrix: Difference between revisions

From Wikipedia, the free encyclopedia
{{WikiProject banner shell|class=B|
{{WikiProject Mathematics|priority = high }}
}}
{{annual readership|scale=log}}


{{User:MiszaBot/config
| algo = old(365d)
| archive = Talk:Invertible matrix/Archive %(counter)d
| counter = 1
| maxarchivesize = 150K
| archiveheader = {{Automatic archive navigator}}
| minthreadstoarchive = 1
| minthreadsleft = 10
}}
{{Archive box |search=yes |bot=Lowercase sigmabot III |age=12 |units=months |auto=yes }}

==Wiki Education Foundation-supported course assignment==
[[File:Sciences humaines.svg|40px]] This article was the subject of a Wiki Education Foundation-supported course assignment, between <span class="mw-formatted-date" title="2021-09-08">8 September 2021</span> and <span class="mw-formatted-date" title="2021-12-19">19 December 2021</span>. Further details are available [[Wikipedia:Wiki_Ed/Northeastern_University/ENGW3307_Adv_Writing_for_the_Sciences_(Fall2021)|on the course page]]. Student editor(s): [[User:Laoer22|Laoer22]].

{{small|Above undated message substituted from [[Template:Dashboard.wikiedu.org assignment]] by [[User:PrimeBOT|PrimeBOT]] ([[User talk:PrimeBOT|talk]]) 00:45, 17 January 2022 (UTC)}}
== Sigh ==


My request to the fantastic math heads writing this page: put the high-school stuff first, since that's what a lot of people will want to see. Make it clear to people with minimal math background, and keep it simple when possible.


Thanks for the hard work. I'm just trying to help make the information useful for everyone.


[[User:Hawkeyek|Hawkeyek]] ([[User talk:Hawkeyek|talk]]) 05:51, 10 April 2008 (UTC)
:I think this information is important to include, but that earlier in the article we can describe it in simpler terms, such as those used by 67.9.148.47 above. [[User:Dcoetzee|Dcoetzee]] 08:46, 29 November 2008 (UTC)


::I think it is important to include that singularity is rare. Maybe we should add something small to make sure we're not overgeneralizing. Singularity over a euclidean field is rare. But isn't a square matrix over z2 almost always singular? <span style="font-size: smaller;" class="autosigned">— Preceding [[Wikipedia:Signatures|unsigned]] comment added by [[Special:Contributions/132.235.46.80|132.235.46.80]] ([[User talk:132.235.46.80|talk]]) 04:05, 9 November 2011 (UTC)</span><!-- Template:Unsigned IP --> <!--Autosigned by SineBot-->
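The Z2 question can be settled by brute force; a quick sketch in plain Python (the enumeration code is mine, not from the thread):

```python
from itertools import product

# Count singular 2x2 matrices over Z2 = {0, 1}, where the
# determinant ad - bc is computed modulo 2.
singular = 0
total = 0
for a, b, c, d in product((0, 1), repeat=4):
    total += 1
    if (a * d - b * c) % 2 == 0:
        singular += 1

print(singular, total)  # 10 16: 10 of the 16 matrices are singular
```

So over Z2 a majority of 2x2 matrices is singular (10 of 16), but not "almost all": the 6 invertible ones are exactly the elements of GL(2, 2).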


:::Prompted by the discussion here, I have moved it to later. It does not seem to be important enough to be in the lead of the article. Incidentally, the lead doesn't exactly comply with the [[WP:LEAD]] guideline anyway. It should probably be rewritten and the existing content relocated elsewhere. [[User:Silly rabbit|<span style="color:#c00000;">siℓℓy rabbit</span>]] ([[User talk:Silly rabbit|<span style="color:#FF823D;font-family:Monotype Corsiva;cursor:help"><span style="color:#c00000;">talk</span></span>]]) 13:53, 29 November 2008 (UTC)


::::Could someone add a simple example with real numbers? That was what I was looking for, just some actual example not involving variables. <!-- Template:Unsigned IP --><small class="autosigned">—&nbsp;Preceding [[Wikipedia:Signatures|unsigned]] comment added by [[Special:Contributions/60.166.111.122|60.166.111.122]] ([[User talk:60.166.111.122#top|talk]]) 14:19, 16 September 2015 (UTC)</small>
:I agree, this page does have a lot of high level terms. Though the concept itself is pretty layered, maybe we should introduce a basic introduction section? I think it would appeal to a wider audience. [[User:SriCHaM|SriCHaM]] ([[User talk:SriCHaM|talk]]) 13:32, 30 June 2024 (UTC)

== Statement equivalence ==

The formulae given for inverting 2x2 and 3x3 matrices are not valid for matrices with non-commutative elements and maybe this should be made clear. On the other hand, there are many articles on matrices and to qualify every statement with words like 'where the matrix elements are real, complex, or multiply commutatively' would make the pages cumbersome for most readers. Perhaps a footnote?

I cancelled out these two lines because they are not equivalent to the invertibility of a matrix. Only if both are true is the matrix invertible, and that is already given in the next line (..exactly one solution..).
*The equation ''Ax = b'' has at most one solution for each ''b'' in ''K''<sup>''n''</sup>.
*The equation ''Ax = b'' has at least one solution for each ''b'' in ''K''<sup>''n''</sup>.
I also removed
*The linear transformation ''x'' <tt>|-></tt> ''Ax'' from ''K''<sup>''n''</sup> to ''K''<sup>''n''</sup> is [[injective function|one-to-one]].
*The linear transformation ''x'' <tt>|-></tt> ''Ax'' from ''K''<sup>''n''</sup> to ''K''<sup>''n''</sup> is [[surjective function|onto]].
because these are not equivalent, and also not equivalent to the other statements.

:What? I can prove that the transformation is both onto and one-to-one for square matrices. These two statements are, in fact, equivalent iff A is a square matrix. The onto statement is equivalent to saying that Span{Col ''A''} = ''K''<sup>''n''</sup>, whereas the one-to-one statement implies that Nul ''A'' = {0}, both of which are given in the parts that are listed. If you want the full formal proof, you'll have to wait until I get my Linear Algebra textbook out. [[User:IMacWin95|IMacWin95]] 01:36, 28 April 2007 (UTC)

::Agreed. [[User:Oli Filth|Oli Filth]] 11:56, 28 April 2007 (UTC)



== What? ==

Are large parts of this article copied and pasted from answers.com, or did answers.com take large parts of this article?
Can someone explain the statement: "As a rule of thumb, almost all matrices are invertible. Over the field of real numbers, this can be made precise as follows: the set of singular ''n''-by-''n'' matrices, considered as a subset of ''R''<sup>''n''×''n''</sup>, is a null set, i.e., has Lebesgue measure zero. Intuitively, this means that if you pick a random square matrix over the reals, the probability that it will be singular is zero."

Does this mean the matrix

:(0, 2)
:(0, 0)

is invertible, even though it is not row reducible to the identity? Somehow I don't think so, but for a person not steeped in mathematical know-how, it is misleading and suggests that is indeed the case. Basically, this article messed me over because I used it as a study help. Someone who knows what they're talking about needs to re-write it, or else sue answers.com.

[[User:18.251.6.142|18.251.6.142]] 08:41, 10 March 2006 (UTC)

: Answers.com copied this article under the [[GFDL]], so all is fine. :)

: That matrix is not invertible, and it is not row reducible either, so I don't see a problem. That text just says it is more likely that a given matrix is invertible than that it is not; it does not mean all matrices are invertible.

: I suggest you go over things which you don't know and read only the parts you understand. The article is written such that it gives some information both to people who know nothing about this stuff, and to people who know a lot; it is not tailored specifically to you. If overall this article manages to answer some of your questions, I guess you should be happy with that. [[User:Oleg Alexandrov|Oleg Alexandrov]] ([[User talk:Oleg Alexandrov|talk]]) 17:41, 10 March 2006 (UTC)

:: Thank you Oleg. I am sorry if I seemed a bit upset, but other parts of this site have been very helpful in my coursework and this particular statement seemed a bit misleading and cost me a lot of time. [[User:18.251.6.142|18.251.6.142]] 17:59, 10 March 2006 (UTC)
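For what it's worth, the specific matrix from the question can be checked directly; a short plain-Python sketch (my code, not from the article):

```python
# The matrix from the question above: rows (0, 2) and (0, 0).
a, b, c, d = 0, 2, 0, 0

# A 2x2 matrix is invertible exactly when ad - bc != 0.
det = a * d - b * c
print(det)       # 0, so the matrix is singular and has no inverse
```

This does not contradict the "almost all" statement: singular matrices certainly exist, they just form a measure-zero subset of all real matrices.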

=== Invertible != regular ? ===

The article currently equates invertibility and regularity with the statement beginning "In linear algebra, an n-by-n (square) matrix A is called invertible, non-singular, or regular if..." in the first line. However, this interpretation of "regular" conflicts with the definition "A stochastic matrix P is regular if some matrix power P^k contains only strictly positive entries" given on the [[Stochastic matrix]] page. I've not seen the term "regular matrix" before, but it seems that that page uses "regular" where I would use "ergodic", while this page uses it as a synonym for "invertible".

As far as I can see, one of the definitions must be wrong, as each can trivially be shown to exclude the other. In the unlikely event that both meanings are in common use, then it is an error on the Stochastic matrix page that "regular" is linked to this page. [[User:152.78.191.84|152.78.191.84]] 12:08, 13 March 2006 (UTC)

: I'm struggling with this seeming contradiction as well. I can't find any references to regular matrices in my (somewhat limited) library, but I found a few conflicting references online. Wolfram's site (http://mathworld.wolfram.com/RegularMatrix.html) just redirects me to the entry for "Nonsingular Matrix", which gives credence to the equivalence of invertibility and regularity. I've found a few other definitions as well which don't seem quite as reputable. A book transcript from Springer-Verlag (http://www.vias.org/tmdatanaleng/hl_regularmatrix.html) gives the definition found on the [[Stochastic matrix]] page. Finally, Thinkquest (http://library.thinkquest.org/28509/English/Transformation.htm) gives the definition of a regular matrix as a matrix whose inverse is itself (<math>A^{-1} = A</math>). This definition seems pretty off to me; wouldn't that imply that the only "regular" matrix is the identity or a permutation matrix?

: I looked around for references on the Stochastic Matrix Theorem cited on the [[Stochastic matrix]] page but was unsuccessful as the page gives no references. Perhaps someone with more knowledge could point to a source for this theorem, which likely would clarify the confusion? [[User:Mateoee|Mateoee]] 17:54, 17 November 2006 (UTC)
:: I never heard of invertible matrices being called regular. I will remove that from the article. [[User:Oleg Alexandrov|Oleg Alexandrov]] ([[User talk:Oleg Alexandrov|talk]]) 02:45, 18 November 2006 (UTC)

== Rephrase a sentence? ==

I find the sentence "The equation Ax = 0 has infinitely many the trivial solutions x = 0 (i.e. Null A = 0)." very confusing. Why not rephrase it to "The equation Ax = 0 has only the trivial solution x = 0."? If the former sentence is more correct I apologize for my lack of knowledge in this subject. :/ [[User:Karih|Karih]] 18:18, 3 December 2006 (UTC)

:You're absolutely right, it is very confusing to put it mildly. I fixed it; thanks for bringing this to our attention. -- [[User:Jitse Niesen|Jitse Niesen]] ([[User talk:Jitse Niesen|talk]]) 02:10, 4 December 2006 (UTC)


== Inversion of 3 x 3 matrices ==


:There are links to two different methods for solving systems that involve inverse matrices, as well as a description of the general analytic method for obtaining the ''n''x''n'' inverse. It would be completely unnecessary to show an example of 3x3 in this article, IMO. The only reason that the 2x2 is shown is because it's trivially simple. [[User:Oli Filth|Oli Filth]] 10:54, 22 February 2007 (UTC)
::From http://www.dr-lex.34sp.com/random/matrix_inv.html:

:::<math>
A^{-1} =
\begin{bmatrix}
a & b & c \\
d & e & f \\
g & h & i \\
\end{bmatrix}^{-1} </math>
:::<math>=
\frac{1}{a(ie-hf)-d(ib-hc)+g(fb-ec)}
\begin{bmatrix}
ie-hf & hc-ib & fb-ec \\
fg-id & ia-gc & cd-af \\
dh-eg & gb-ha & ae-bd \\
\end{bmatrix}
</math>


::Hopefully I copied it over correctly. Looking at the letters like this makes the pattern of 2x2 matrices excluding the row and column of the element in question, turned sideways, seem much more intuitive, although it sure sounds complicated when I say it out like that. [[User:72.224.200.135|72.224.200.135]] 02:44, 12 June 2007 (UTC)


:::I agree that this article doesn't need an explicit formula for every single dimension, but I would still include a formula for the 3x3 case. Not the one above, which is admittedly long-winded & inconvenient, just asking for copying errors. But there is a much simpler one, only using [[cross product]] & [[triple product]]: <br /> If an invertible matrix A consists of the column vectors <math>\mathbf{x_0},\;\mathbf{x_1},\;\mathbf{x_2}</math>, its inverse consists of the row vectors <math>\mathbf{x_1}\times \mathbf{x_2},\;\mathbf{x_2}\times \mathbf{x_0},\;\mathbf{x_0}\times\mathbf{x_1} </math>, multiplied by the inverse determinant of A, where the determinant just "happens" to be equal to the triple product of x0, x1 & x2: <math>\det(A) =\mathbf{x_0}\cdot(\mathbf{x_1}\times\mathbf{x_2})</math>. That this matrix is a left inverse of A can be checked easily by using basic properties of the cross & triple products. And since left inverses & right inverses are identical for all groups, this is indeed the inverse of A.<br />If no one objects, I'm gonna include the formula. [[User:Catskineater|Catskineater]] ([[User talk:Catskineater|talk]]) 03:42, 6 February 2010 (UTC)
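Catskineater's cross-product construction is easy to sanity-check numerically; a plain-Python sketch (the helper functions and sample matrix are mine):

```python
def cross(u, v):
    """Cross product of two 3-vectors."""
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def dot(u, v):
    return sum(a*b for a, b in zip(u, v))

# Sample invertible matrix A, written as three COLUMN vectors.
x0, x1, x2 = (1, 0, 5), (2, 1, 6), (3, 4, 0)

det = dot(x0, cross(x1, x2))     # triple product = det(A); here det = 1
rows = [cross(x1, x2), cross(x2, x0), cross(x0, x1)]
inv = [[r[j] / det for j in range(3)] for r in rows]

print(inv)  # [[-24.0, 18.0, 5.0], [20.0, -15.0, -4.0], [-5.0, 4.0, 1.0]]
```

Multiplying these rows against the columns x0, x1, x2 gives the identity (row i times column j is a triple product that is det for i = j and zero otherwise), which is exactly the left-inverse argument above.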


::::No one protested, so I added the formula. [[User:Catskineater|Catskineater]] ([[User talk:Catskineater|talk]]) 22:42, 21 February 2010 (UTC)


:::::You guys should probably switch the matrix formula for the inversion of 3x3 matrices on the wiki page.

:::::i.e.

::::::<math>\mathbf{A}^{-1} = \begin{bmatrix}
a & b & c\\ d & e & f \\ g & h & k\\
\end{bmatrix}^{-1} =
\frac{1}{Z}
\begin{bmatrix}
\, A & \, B & \,C \\ \, D & \, E & \,F \\ \, G & \,H & \, K\\
\end{bmatrix}</math>
:::::where
::::::<math>Z = a(ek-fh)+b(fg-dk)+c(dh-eg)</math>
:::::which is the determinant of the matrix. If <math>Z</math> is non-zero, the matrix is invertible, with the elements of the above matrix on the right side given by
::::::<math>\begin{matrix}
A = (ek-fh) & D = (ch-bk) & G = (bf - ce) \\
B = (fg-dk) & E = (ak-cg) & H = (cd-af) \\
C = (dh-eg) & F = (bg-ah) & K = (ae-bd) \\
\end{matrix}</math>


:::::I'm not 100% sure, but I'm pretty sure that that gives the solution for the transpose of the inverse, not the inverse itself. A pretty easy and quick fix, but I'm pretty lazy and don't have the time to formulate it into wiki and make sure it's all right. <!-- Template:Unsigned --><small class="autosigned">—&nbsp;Preceding [[Wikipedia:Signatures|unsigned]] comment added by [[User:Njc69|Njc69]] ([[User talk:Njc69#top|talk]] • [[Special:Contributions/Njc69|contribs]]) 16:35, 3 October 2010 (UTC)</small>


::::::I agree with the above. The formula on the main page caused me lots of problems since it is actually transpose of the inverse, but the above formula seems to work. <span style="font-size: smaller;" class="autosigned">— Preceding [[Wikipedia:Signatures|unsigned]] comment added by [[Special:Contributions/65.60.221.79|65.60.221.79]] ([[User talk:65.60.221.79|talk]]) 10:59, 8 March 2013 (UTC)</span><!-- Template:Unsigned IP --> <!--Autosigned by SineBot-->


:::::::The formula on the page has been corrected more than two years ago, so you are probably misreading something.—[[User:EmilJ|Emil]]&nbsp;[[User talk:EmilJ|J.]] 12:25, 8 March 2013 (UTC)
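Njc69's transpose observation is easy to verify numerically; a plain-Python sketch (the sample matrix is mine):

```python
# Sample matrix [a b c; d e f; g h k], chosen to have determinant 1.
a, b, c, d, e, f, g, h, k = 1, 2, 3, 0, 1, 4, 5, 6, 0

Z = a*(e*k - f*h) + b*(f*g - d*k) + c*(d*h - e*g)   # determinant

# The capital-letter entries exactly as posted above.
A, D, G = e*k - f*h, c*h - b*k, b*f - c*e
B, E, H = f*g - d*k, a*k - c*g, c*d - a*f
C, F, K = d*h - e*g, b*g - a*h, a*e - b*d

M = [[A, B, C], [D, E, F], [G, H, K]]       # this is the cofactor matrix...
MT = [list(col) for col in zip(*M)]         # ...and its TRANSPOSE is Z * inverse

orig = [[a, b, c], [d, e, f], [g, h, k]]
prod = [[sum(orig[i][t] * MT[t][j] for t in range(3)) / Z
         for j in range(3)] for i in range(3)]
print(prod)  # the identity matrix: (1/Z) * M^T is the inverse, not (1/Z) * M
```

So the posted [A B C; D E F; G H K] matrix, divided by Z, is indeed the transpose of the inverse, as Njc69 said.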

== Inversion of 4 x 4 matrices ==
please!

: Although it is possible to derive equations for the inversion of 3x3 and 4x4 matrices like the one for the 2x2 matrix, they will be huge (and therefore not really suitable for the article). The generalised analytic form is already given (i.e. in terms of determinant and co-factors), and furthermore, inversion may be achieved more practically using an algorithm such as Gaussian elimination. [[User:Oli Filth|Oli Filth]] 09:01, 19 January 2007 (UTC)
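As the reply notes, elimination is the practical route for 4x4 and larger; a minimal Gauss-Jordan sketch in plain Python (function name and sample matrix are mine, with only simple partial pivoting):

```python
def invert(m):
    """Invert a square matrix by Gauss-Jordan elimination on [M | I]."""
    n = len(m)
    # Augment each row with the corresponding row of the identity.
    aug = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
           for i, row in enumerate(m)]
    for col in range(n):
        # Partial pivoting: bring the largest pivot into place.
        piv = max(range(col, n), key=lambda r: abs(aug[r][col]))
        if aug[piv][col] == 0.0:
            raise ValueError("matrix is singular")
        aug[col], aug[piv] = aug[piv], aug[col]
        p = aug[col][col]
        aug[col] = [x / p for x in aug[col]]
        # Eliminate this column from every other row.
        for r in range(n):
            if r != col and aug[r][col] != 0.0:
                factor = aug[r][col]
                aug[r] = [x - factor * y for x, y in zip(aug[r], aug[col])]
    return [row[n:] for row in aug]

print(invert([[2.0, 1.0], [1.0, 1.0]]))  # [[1.0, -1.0], [-1.0, 2.0]]
```

The same function works unchanged for 3x3, 4x4, and beyond, which is why the article points to elimination rather than ever-larger closed-form formulae.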

==Note==

I think making a statement like "As a rule of thumb, almost all matrices are invertible" is vague and not accurate. There may be more invertible matrices than not, but a statement like that will certainly confuse many readers, especially those that are new to the subject. [[User:69.107.60.124|69.107.60.124]] 18:31, 28 January 2007 (UTC)

: I agree. I will remove that. [[User:Oleg Alexandrov|Oleg Alexandrov]] ([[User talk:Oleg Alexandrov|talk]]) 23:15, 28 January 2007 (UTC)
:: Actually, after reading the text, I disagree. That statement is definitely not precise, but it is made precise in the next sentence, and that rather vague statement is used to motivate the numerical issues below.

:: I don't much like the current intro, but it has its good points and I can't think of anything better to replace it with. [[User:Oleg Alexandrov|Oleg Alexandrov]] ([[User talk:Oleg Alexandrov|talk]]) 23:19, 28 January 2007 (UTC)

: I think that "As a rule of thumb" is misleading. (At least one of my students was slightly confused by it.) I deleted that and rephrased/reordered parts of the paragraph. Hopefully, the new version is less confusing. [[User:Fgdorais|Fgdorais]] 21:02, 17 September 2007 (UTC)

I would like to add that "almost all square matrices are invertible" can also be interpreted in the sense of category, i.e., the set of invertible matrices is open dense. This is more intimately connected with perturbation of coefficients and numerical considerations. Perhaps this material should be removed from the introduction and have its own paragraph where these two (and perhaps other) interpretations can be discussed. I may decide to write such a paragraph when I'm less busy, but I would be very happy if someone else were to volunteer. [[User:Fgdorais|Fgdorais]] 14:55, 23 September 2007 (UTC)

== Account for other systems than R ==

in "Inversion of 2 x 2 matrices", 1/(ad-bc) is used. Should it be (ad-bc)<sup>-1</sup> to account for other systems such as [[Ring (mathematics)|rings]]? I'm in no way a mathematician so I ask someone more knowledgeable to consider. [[User:Dubonbacon|Dubonbacon]] 18:17, 23 February 2007 (UTC)

:I'd guess that people sufficiently advanced to know about such stuff will readily convert between these notations. I think that one would probably need to assume that the entries come from a [[field (mathematics)|field]] instead of a ring. But I'm in no way a pure mathematician so all my matrices have numbers in them. -- [[User:Jitse Niesen|Jitse Niesen]] ([[User talk:Jitse Niesen|talk]]) 12:34, 26 February 2007 (UTC)
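The (ad-bc)<sup>-1</sup> notation does generalize whenever the determinant has a multiplicative inverse; a plain-Python sketch over the finite field Z/7Z (the example values are mine):

```python
p = 7  # work over the field Z/7Z

# Invert [[1, 2], [3, 4]] mod 7 using the 2x2 adjugate formula:
# A^-1 = (ad - bc)^-1 * [[d, -b], [-c, a]].
a, b, c, d = 1, 2, 3, 4
det = (a * d - b * c) % p
det_inv = pow(det, -1, p)          # multiplicative inverse of det mod p

inv = [[(det_inv * d) % p, (det_inv * -b) % p],
       [(det_inv * -c) % p, (det_inv * a) % p]]
print(inv)  # [[5, 1], [5, 3]]
```

Here 1/(ad-bc) becomes the modular inverse (ad-bc)<sup>-1</sup>, which exists exactly when ad-bc is a unit; over a field that just means non-zero, matching Jitse's comment that one would assume field entries.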

== Gaussian elimination example ==

I've reverted the addition of an example of Gaussian elimination, because that is already covered in the [[Gaussian elimination]] article. That article would be the appropriate place to add an example. This article is concerned with the mathematics of inverse matrices, not the numerical intricacies of how to obtain them. (Otherwise, for parity, we'd need step-by-step numerical examples of Newton's method and LU decomposition as well, which would hideously bloat the article.)

Adding an example that has no explanation of the steps involved is certainly not helpful! [[User:Oli Filth|Oli Filth]] 23:44, 3 May 2007 (UTC)

== "regular" revisited ==

As I found in a former discussion (2006), the term "regular" was removed from the introduction because it does not seem equivalent to "invertible". But "Regular matrix", as it is referred to in "Irregular matrix", e.g., still redirects to this article. First, I was quite confused that the opposite of a matrix with "a different number of elements in each row" should be an invertible matrix. Second, I was all the more confused when I could not find the term "regular" anywhere in the whole article to which I was redirected from "Regular matrix"...

--chiccodoro <small>—Preceding [[Wikipedia:Signatures|unsigned]] comment added by [[Special:Contributions/131.152.34.104|131.152.34.104]] ([[User talk:131.152.34.104|talk]]) 09:29, 26 May 2008 (UTC)</small><!-- Template:UnsignedIP --> <!--Autosigned by SineBot-->

:I'm not sure what the best thing to do is here. Firstly, the term "regular matrix" is used in the meaning of "invertible matrix", so I initially added this to the article. However, reading the 2006 discussion made me realize that this usage is very rare and that it would be misleading to add it as a synonym in the first sentence, so I reverted myself. Secondly, the article [[stochastic matrix]] does no longer mention the meaning of "regular matrix" that the 2006 discussion refers to. The best solution I could think of is to turn [[regular matrix]] in some kind of disambiguation page. I think that should help against any confusion. -- [[User:Jitse Niesen|Jitse Niesen]] ([[User talk:Jitse Niesen|talk]]) 12:38, 18 December 2008 (UTC)

::The disambiguation page is already done. Should this comment be removed from here?-- [[User:Arauzo|Arauzo]] ([[User talk:Arauzo|talk]]) 11:57, 1 November 2011 (UTC)

:::As I understand it, talk pages are never pruned down. [[User:Austinmohr|Austinmohr]] ([[User talk:Austinmohr|talk]]) 19:13, 1 November 2011 (UTC)

== Invertible matrices are common ==

After the discussion at [[#Note]] above, Silly Rabbit moved the paragraph about almost all matrices being invertible to a separate section. Recently, an IP editor added the sentence "Informally speaking, invertible matrices are common whereas singular matrices are rare (the precise meaning of this statement is given below)." I think that even with all the qualifiers, this sentence is misleading. For instance, singular matrices are not rare in exams. I replaced it with the statement that random matrices are singular with probability zero, which is a formulation that I hope most people can understand. -- [[User:Jitse Niesen|Jitse Niesen]] ([[User talk:Jitse Niesen|talk]]) 12:46, 18 December 2008 (UTC)
:I rephrased it using the words [[almost surely]], which are both formally defined and informally understandable. If anyone thinks that's not OK, feel free to revert or change it. [[User:Oliphaunt|Oliphaunt]] ([[User talk:Oliphaunt|talk]]) 22:59, 18 December 2008 (UTC)
::I understand the purpose behind your edit - "rare" and "common" are general ideas and do not have an exact mathematical meaning in this context. However, there needs to be a balance between being precise and conveying the idea to a layman. The point of deferring the exact meaning of the statement to later sections is so that we can be more colloquial in the introduction. In fact, there is more than one sense in which singular matrices are rare (e.g. singular matrices are nowhere dense), and as it currently stands the introduction simply picks one precise explanation and ignores the other. For these reasons we should leave the technical details about Lebesgue measure, almost surely picking random matrices, density, and so forth until later, and keep the tone of the introduction informal. [[Special:Contributions/67.9.148.47|67.9.148.47]] ([[User talk:67.9.148.47|talk]]) 10:33, 20 December 2008 (UTC)
:::No, my problem is the probable interpretation of laymen when reading that sentence is wrong. Most laymen (or, at least, a lot of them) will interpret it as saying that you will rarely encounter a singular matrix in practice, and I believe that in fact quite a lot of the matrices that are encountered in practice are singular. Oliphaunt, I like the "almost surely". -- [[User:Jitse Niesen|Jitse Niesen]] ([[User talk:Jitse Niesen|talk]]) 16:34, 20 December 2008 (UTC)
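The "almost surely" formulation can be illustrated by sampling; a plain-Python sketch (seeded RNG, code mine):

```python
import random

random.seed(0)

# Draw 10,000 random 2x2 matrices with entries uniform in [0, 1)
# and count how many are exactly singular (ad - bc == 0).
singular = 0
trials = 10_000
for _ in range(trials):
    a, b, c, d = (random.random() for _ in range(4))
    if a * d - b * c == 0:
        singular += 1

print(singular)  # almost surely 0: a continuous random matrix is invertible
```

Matrices met in practice (and in exams), of course, are not drawn from a continuous distribution, which is essentially Jitse's point above.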

== The basics? ==


::::::::I have only a moderate objection to the above cross/dot product representation: it's useless. If what we're looking for is a REPRESENTATION of the inverse of a matrix, then A<sup>-1</sup> is hard to beat. I'm writing this 6½ years after Catskineater's post, but I don't see how an alternative representation helps either clarify the topic or aid in computing the terms of the inverse matrix. I found the inclusion of the specific element by element expansion of the 3x3 matrix quite useful. I also strongly disagree with those who claim such a representation is prone to errors. That is, assuming copy and paste is something you are able to handle it is trivial and NOT "asking for making copy errors". I strongly doubt whether either the information about the Cayley-Hamilton decomposition or the information about the representation in terms of 3 column vectors is useful to 99.99% of the readers. I propose the 3x3 section be shortened by removing all the material after the Det(A)=aA+bB+cC AND I also propose abandoning the use of the elements A,B,...,I. They add very little in terms of conciseness; compare Det(A) = a(ei-fh)- b(di-fg)+c(dh-eg) with the above, there is very little space savings but the cost of including 9 more (extraneous) variables is significant in terms of clarity and simplicity. What purpose does it serve?[[Special:Contributions/71.30.36.108|71.30.36.108]] ([[User talk:71.30.36.108|talk]]) 00:31, 9 August 2016 (UTC)
I actually came here to check my memory that A · A<sup>-1</sup> = I. After checking the whole page and the page for Matrix I gave up looking and started working through the example there. They give an A and an A<sup>-1</sup>; after the first row it's clear it is true. But it's not on the wiki entry, so I checked the German page, and sure enough there it is. Sure, it's not for all matrices, but it is the basic idea, isn't it? <span style="font-size: smaller;" class="autosigned">—Preceding [[Wikipedia:Signatures|unsigned]] comment added by [[Special:Contributions/137.248.1.11|137.248.1.11]] ([[User talk:137.248.1.11|talk]]) 10:01, 3 November 2009 (UTC)</span><!-- Template:UnsignedIP --> <!--Autosigned by SineBot-->

== Matrix inverses in MIMO wireless communication ==


The statement that "It is crucial for the matrix H to be invertible for the receiver to be able to figure out the transmitted information." is just plain wrong. The matrix H is not always square, and there are several better ways of decoding the transmitted signal, instead of inverting the channel matrix. I have never edited a Wikipedia page before, so I will figure that out before I make some correcting changes. <span style="font-size: smaller;" class="autosigned">— Preceding [[Wikipedia:Signatures|unsigned]] comment added by [[Special:Contributions/89.160.119.209|89.160.119.209]] ([[User talk:89.160.119.209|talk]]) 11:00, 13 December 2013 (UTC)</span><!-- Template:Unsigned IP --> <!--Autosigned by SineBot-->
== Matrix inverses in real-time simulations ? ==

I'm not against the paragraph, but the sentence "Compared to matrix multiplication or creation of rotation matrices, matrix inversion is several orders of magnitude slower" looks quite unfounded. As far as I can tell, you need 8 muls & 4 adds for a matrix multiplication in the 2x2 case, compared to 1 reciprocal, 6 muls & 1 add for an inversion. For 3x3, it's 27 muls & 18 adds compared to 1 reciprocal, 30 muls & 11 adds. Doesn't look like "several orders of magnitude" (which I interpret as a factor of at least 50, but more like >100) to me. For 4x4 matrices, it's 64 muls & 48 adds versus 1 reciprocal, 164 muls & 83 adds. If every operation counts as 1 FLOP, that's 112 FLOPs versus 247 FLOPs - still only a factor of less than 3.
<br />If the original author could please explain what he/she meant. [[User:Catskineater|Catskineater]] ([[User talk:Catskineater|talk]]) 22:40, 21 February 2010 (UTC)

: Maybe the author is referring to how the operations for matrix-product can be done in parallel, more so than inverse operations. Either way it needs a citation needed tag, which I'll add. [[User:Antares5245|Antares5245]] ([[User talk:Antares5245|talk]]) 22:21, 15 June 2010 (UTC)

:: I removed the sentence (and the sentence before it) because as far as I know there is no performance problem if you use the equation for the 3x3 inverse. One might point out that computing the inverse of rigid body transformations in homogeneous coordinates can be implemented by a transposition of a 3x3 (rotation) matrix and a negation of a (translation-)vector, which is significantly faster than a general 4x4 inverse (maybe even orders of magnitude), but it is trivial that you avoid more expensive computations if there is a cheaper alternative. --[[User:Martin Kraus|Martin Kraus]] ([[User talk:Martin Kraus|talk]]) 13:56, 20 July 2010 (UTC)

== No mention of linear independence anywhere ==

A square matrix is singular if and only if its determinant is zero.

A family of vectors is linearly independent if and only if the determinant of their matrix is non-zero.

I really think there should be a mention of linear independence here after the reference to singular matrices since singularity is equivalent to linear dependence, which also ties in to the discussion later about eigenvectors and such. There's already a page on linear independence so just a quick link would be nice.

[[User:SomeHandyGuy|SomeHandyGuy]] ([[User talk:SomeHandyGuy|talk]]) 00:33, 4 June 2014 (UTC)

==blockwise inversion==
currently, 2 by 2 block wise inversion is provided, can someone write down the 3 by 3 block wise inversion? [[User:Jackzhp|Jackzhp]] ([[User talk:Jackzhp|talk]]) 23:39, 18 April 2010 (UTC)

: I'm not sure there actually is a "3x3" block inverse. You can come close by partitioning one of the 2x2 blocks further into another 2x2 block. What's more interesting is doing the block inverse by calculating the inverse of B or C, instead of A or D. This is possible because the inverse of a horizontal reversal of matrices is equal to the vertical reversal of the inverse, and can be useful when A or D is singular. Probably not interesting enough to add to the article though. [[User:Antares5245|Antares5245]] ([[User talk:Antares5245|talk]]) 22:25, 15 June 2010 (UTC)


== Article Name: why not '''inverse matrix''' or '''matrix inverse''' ? ==
As another note on this section of the article, I think there needs to be some clarification of what is meant when it is said that the matrix is invertible if and only if A and D - CA^{-1}B are invertible. The decomposition into the submatrices is arbitrary. It's easy to come up with an invertible matrix for which no upper left square submatrix is invertible (think of the identity with the first and last columns switched - any block decomposition gives a singular "A" matrix). Clearly the statement cannot be true then; negate it: the matrix is singular if and only if either A or D - CA^{-1}B is singular. But my example contradicts that. Is there some extra condition that I am missing and which is implied? [[Special:Contributions/18.63.6.219|18.63.6.219]] ([[User talk:18.63.6.219|talk]]) 19:20, 22 August 2011 (UTC)


I think those phrases would appear more in text, i.e. someone reading and clicking wants to know ''what is an "inverse matrix"'', or ''how do I invert a matrix''.
== Etymology of "Singularity" ==
Would it make sense to rename and reword this article? :-


"The inverse of a matrix 'A' is a matrix 'B' such that ... AB=I ; a matrix for which such an inverse exists is called 'invertible'..."
Could someone please point out what it is that is "single" in "singular matrix"? Thanks [[User:Gwideman|Gwideman]] ([[User talk:Gwideman|talk]]) 00:24, 20 September 2010 (UTC)
[[User:Fmadd|Fmadd]] ([[User talk:Fmadd|talk]]) 08:38, 4 February 2017 (UTC)


== External links modified ==
: Probably nothing very close. It's the same "singular" as in "singularity", which is more like "exceptional". [[Special:Contributions/94.255.156.147|94.255.156.147]] ([[User talk:94.255.156.147|talk]]) 21:31, 26 January 2011 (UTC)


Hello fellow Wikipedians,
== Analytic solution? ==


I have just modified one external link on [[Invertible matrix]]. Please take a moment to review [[special:diff/810551545|my edit]]. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit [[User:Cyberpower678/FaQs#InternetArchiveBot|this simple FaQ]] for additional information. I made the following changes:
For some reason, the subsection about Cramer's rule (and special cases thereof) is called "Analytic solution", which feels very confusing as this method has nothing whatsoever to do with analysis. I suppose "closed form formulas" might be what the editor wanted to express, but simply [[Cramer's rule]] is probably most in line with its sibling subsection headings. [[Special:Contributions/94.255.156.147|94.255.156.147]] ([[User talk:94.255.156.147|talk]]) 21:27, 26 January 2011 (UTC)
*Added archive https://web.archive.org/web/20111103045802/http://www.khanacademy.org/video/inverse-matrix--part-1?playlist=Linear%20Algebra to http://www.khanacademy.org/video/inverse-matrix--part-1?playlist=Linear+Algebra


When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.
:As explained in [[analytical expression]], the terms [[closed-form expression]] and analytical expression are almost equivalent. Here, "analytic solution" might be preferable to emphasize the difference to a "numerical solution". Cramer's might be too specialized since there probably are other forms of analytic solutions that are equivalent to Cramer's rule but not covered by it. --[[User:Martin Kraus|Martin Kraus]] ([[User talk:Martin Kraus|talk]]) 17:35, 24 August 2011 (UTC)


{{sourcecheck|checked=false|needhelp=}}
== 'Integral representation' method ==


Cheers.—[[User:InternetArchiveBot|'''<span style="color:darkgrey;font-family:monospace">InternetArchiveBot</span>''']] <span style="color:green;font-family:Rockwell">([[User talk:InternetArchiveBot|Report bug]])</span> 23:35, 15 November 2017 (UTC)
With regards to [http://en.wikipedia.org/enwiki/w/index.php?title=Invertible_matrix&diff=469678256&oldid=469359086 this addition], after some discussion [http://en.wikipedia.org/enwiki/w/index.php?title=User_talk:JohnBlackburne&oldid=469490338#Integral_Representation_of_Matrix_Inverse on my talk page] it's clear it has a number of problems.


== Formula not showing correctly ==
Mostly it's not in the [http://www.weylmann.com/gaussian.pdf source], or anything like it. The source does discuss an integral involving a matrix but not any of the formulas here and the matrix in the source is not a general square matrix but a symmetric one. The source contains nothing identified as a method for inverting matrices, the point of that section of the article. Further the notation is very unclear and non-standard: it looks like the integral of a scalar ('''A'''''x'' being a vector), so cannot be matrix valued. With so many issues it definitely needs a source giving it as a method for inverting general matrices, otherwise it is original research.--<small>[[User:JohnBlackburne|JohnBlackburne]]</small><sup>[[User_talk:JohnBlackburne|words]]</sup><sub style="margin-left:-2.0ex;">[[Special:Contributions/JohnBlackburne|deeds]]</sub> 08:47, 5 January 2012 (UTC)


:: ok, I try again to explain.. the integral consists of a product of two factors 1) an exponential of a scalar value exp(-0.5*(Ax)^2) = exp(-0.5*x'A'Ax) (I use x' and A' to denote the transpose of x, A), I hope you see that x'A'Ax which is (Ax)'(Ax) is a scalar, and 2) a rank 1 matrix x(Ax)' . thus the result of the integral is clearly matrix valued. The other only fact needed to see it is that the expectation value of xx' of a multivariate normal distribution with mean 0 is its covariance matrix (this is line one of the contribution) ok man, this was my last try, if you do not understand now, than I am just sorry.. but it is not a good reason to remove it. thank you. <span style="font-size: smaller;" class="autosigned">— Preceding [[Wikipedia:Signatures|unsigned]] comment added by [[Special:Contributions/131.246.191.181|131.246.191.181]] ([[User talk:131.246.191.181|talk]]) 08:18, 6 January 2012 (UTC)</span><!-- Template:Unsigned IP --> <!--Autosigned by SineBot-->
Somehow the formula for the inverse of a general 2x2 matrix is not showing correctly. The latex has a minus before the c but the output does not. I can't seem to fix it.<!-- Template:Unsigned IP --><small class="autosigned">—&nbsp;Preceding [[Wikipedia:Signatures|unsigned]] comment added by [[Special:Contributions/92.194.124.84|92.194.124.84]] ([[User talk:92.194.124.84#top|talk]]) 15:18, 14 January 2021 (UTC)</small>


:Thank you for your concern about the minus signs, but Wikipedia has a software bug. Editors have tried several times to kludge it to no avail. This has been reported multiple times now on [[WP:VPT]]. We'll just have to wait until the administrators fix it properly.—[[User:Anita5192|Anita5192]] ([[User talk:Anita5192|talk]]) 16:57, 14 January 2021 (UTC)
:::<small>Please sign your talk page messages with four tildes (<nowiki>~~~~</nowiki>). Thanks.</small>
:::The idea is not that you explain, but that you provide a source. Please ''do'' read the policies as explained in [[wp:NOR]], [[wp:V]] and [[wp:BURDEN]]. That is the way things go here. We don't have to understand anything — you have to provide a source, so we can directly verify it, and then, per [[wp:consensus]], talk about whether it is sufficiently [[wp:notable]] to be included here. It's all just basic Wikipedia policy. - [[User:DVdm|DVdm]] ([[User talk:DVdm|talk]]) 08:56, 6 January 2012 (UTC)


== Theorem, Explanations, and Applications ==
:::: yes, ok, unfortunately I am not able to name a source.. thought it is elementary enough, but this is of course relative.. bye


I will reorder the theorems to show audiences that they can be explained by each others first. In addition, the formulas are hard to most audiences for understanding, so I will add some explanations on these formulas. Also, Invertible Matrix can be used in many concrete ways in real life but the application part only talks about few of it. In sub-title least square solutions, the invertible matrix is always used for data analyzing for predicting future data, and people always used it to create model of the relationship between the variables and output. For example, to analyze the price of house, the variables should be area, layout, swimming pool, and place and the output should be price. So, I will add more content on application part.
:Looks like a classic case of [[wp:OR]]. There's way to many of those here, specially in the math articles, where we always have to fight that "''But, it's trivial. Every amateur mathematician can verify it.''" I wouldn't even look at these things, let alone verify or even discuss them: [[wp:BURDEN|it's up to them to get a source, not up to us to beg for one.]] - [[User:DVdm|DVdm]] ([[User talk:DVdm|talk]]) 21:43, 5 January 2012 (UTC)
== "[[:Nonsingular]]" listed at [[Wikipedia:Redirects for discussion|Redirects for discussion]] ==
[[File:Information.svg|30px]]
The redirect <span class="plainlinks">[//en.wikipedia.org/enwiki/w/index.php?title=Nonsingular&redirect=no Nonsingular]</span> has been listed at [[Wikipedia:Redirects for discussion|redirects for discussion]] to determine whether its use and function meets the [[Wikipedia:Redirect|redirect guidelines]]. Readers of this page are welcome to comment on this redirect at '''{{slink|Wikipedia:Redirects for discussion/Log/2023 April 25#Nonsingular}}''' until a consensus is reached. <!-- from Template:RFDNote --> [[User:1234qwer1234qwer4|1234qwer]][[User talk:1234qwer1234qwer4|1234qwer]][[Special:Contribs/1234qwer1234qwer4|4]] 23:04, 25 April 2023 (UTC)

Latest revision as of 13:33, 30 June 2024
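The 2x2 blockwise inversion discussed in the blockwise inversion thread above can be checked numerically. The sketch below is only an illustration (the matrix sizes and values are arbitrary), and it assumes both A and the Schur complement D - CA^{-1}B are invertible, which is exactly the caveat raised in that thread:

```python
import numpy as np

def blockwise_inverse(A, B, C, D):
    """Invert [[A, B], [C, D]] via the Schur complement S = D - C A^-1 B.

    Assumes A and S are both invertible; as noted in the discussion,
    this decomposition fails when A is singular even though the full
    matrix may still be invertible.
    """
    A_inv = np.linalg.inv(A)
    S = D - C @ A_inv @ B                  # Schur complement of A
    S_inv = np.linalg.inv(S)
    top_left = A_inv + A_inv @ B @ S_inv @ C @ A_inv
    top_right = -A_inv @ B @ S_inv
    bottom_left = -S_inv @ C @ A_inv
    return np.block([[top_left, top_right], [bottom_left, S_inv]])

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5)) + 5 * np.eye(5)   # well-conditioned test matrix
A, B = M[:3, :3], M[:3, 3:]
C, D = M[3:, :3], M[3:, 3:]
assert np.allclose(blockwise_inverse(A, B, C, D), np.linalg.inv(M))
```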




Thanks for the hard work. I'm just trying to help make the information useful for everyone.

Hawkeyek (talk) 05:51, 10 April 2008 (UTC)[reply]

The purpose is to say that singular (non-invertible) matrices are very very very rare. If you choose a matrix with random real entries (say, between 0 and 1), then the probability it is singular is literally zero. That is not to say that non-invertible matrices can't happen, just that they are infinitely unlikely. (if this seems like a contradiction, consider throwing a dart at a dartboard - what is the probability that the dart will hit a particular point?) 67.9.148.47 (talk) 11:50, 28 November 2008 (UTC)[reply]
I think this information is important to include, but that earlier in the article we can describe it in simpler terms, such as those used by 67.9.148.47 above. Dcoetzee 08:46, 29 November 2008 (UTC)[reply]
I think it is important to include that singularity is rare. Maybe we should add something small to make sure we're not overgeneralizing. Singularity over a euclidean field is rare. But isn't a square matrix over z2 almost always singular? — Preceding unsigned comment added by 132.235.46.80 (talk) 04:05, 9 November 2011 (UTC)[reply]
Prompted by the discussion here, I have moved it to later. It does not seem to be important enough to be in the lead of the article. Incidentally, the lead doesn't exactly comply with the WP:LEAD guideline anyway. It should probably be rewritten and the existing content relocated elsewhere. siℓℓy rabbit (talk) 13:53, 29 November 2008 (UTC)[reply]
Could someone add a simple example with real numbers? That was what I was looking for, just some actual example not involving variables. — Preceding unsigned comment added by 60.166.111.122 (talk) 14:19, 16 September 2015 (UTC)[reply]
I agree, this page does have a lot of high level terms. Though the concept itself is pretty layered, maybe we should introduce a basic introduction section? I think it would appeal to a wider audience. SriCHaM (talk) 13:32, 30 June 2024 (UTC)[reply]
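The two claims in the replies above — that a randomly chosen real matrix is essentially never singular, while over Z2 a sizable fraction of matrices is singular — can be illustrated with a quick numerical sketch (the sample size and the choice of 2x2 for the Z2 count are arbitrary):

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(42)
n, trials = 4, 10_000

# Real random matrices: count rank-deficient (singular) samples.
singular = sum(
    np.linalg.matrix_rank(rng.random((n, n))) < n for _ in range(trials)
)
assert singular == 0  # every sampled real matrix was invertible

# Over Z2, by contrast, many matrices ARE singular: count the 2x2
# matrices over {0, 1} whose determinant is 0 mod 2.
sing_z2 = sum((a * d - b * c) % 2 == 0 for a, b, c, d in product((0, 1), repeat=4))
assert sing_z2 == 10  # 10 of the 16 matrices are singular
```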

Inversion of 3 x 3 matrices


please!

I understand why you would not wish to post a general form for the inversion of a 3x3 matrix, but maybe a step by step with a simple example? Nightwindzero 05:52, 22 February 2007

There are links to two different methods for solving systems that involve inverse matrices, as well as a description of the general analytic method for obtaining the nxn inverse. It would be completely unnecessary to show an example of 3x3 in this article, IMO. The only reason that the 2x2 is shown is because it's trivially simple. Oli Filth 10:54, 22 February 2007 (UTC)[reply]
From http://www.dr-lex.34sp.com/random/matrix_inv.html:
Hopefully I copied it over rightly. Looking at the letters like this makes the pattern of 2x2 matrices excluding the row and column of the element in question, turned sideways, seem much more intuitive, although it sure sounds complicated when I say it out like that. 72.224.200.135 02:44, 12 June 2007 (UTC)[reply]
I agree that this article doesn't need an explicit formula for every single dimension, but I still would include a formula for the 3x3 case. Not the one above, which is admittedly long winded & inconvenient, just asking for making copying errors. But there is a much simpler one, only using cross product & triple product:
If an invertible matrix A consists of the column vectors x0, x1, x2, its inverse consists of the row vectors x1 × x2, x2 × x0, x0 × x1, multiplied by the inverse determinant of A, where the determinant just "happens" to be equal to the triple product of x0, x1 & x2: det(A) = x0 · (x1 × x2). That this matrix is a left inverse of A can be checked easily by using basic properties of the cross & triple products. And since left inverses & right inverses are identical for all groups, this is indeed the inverse of A.
If no one objects, I'm gonna include the formula. Catskineater (talk) 03:42, 6 February 2010 (UTC)[reply]
No one protested, so i added the formula. Catskineater (talk) 22:42, 21 February 2010 (UTC)[reply]
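The cross-product construction proposed above can be sketched in code. This is only an illustration of the formula as described in the comment (columns x0, x1, x2; rows of the inverse built from cross products; determinant as the triple product), not an implementation from the article:

```python
import numpy as np

def inverse_3x3(A):
    """Inverse of a 3x3 matrix via cross and triple products.

    If A has columns x0, x1, x2, the rows of A^-1 are x1 x x2,
    x2 x x0, x0 x x1, each divided by the triple product
    det(A) = x0 . (x1 x x2).
    """
    x0, x1, x2 = A[:, 0], A[:, 1], A[:, 2]
    det = x0 @ np.cross(x1, x2)            # triple product = det(A)
    if det == 0:
        raise ValueError("matrix is singular")
    return np.array([np.cross(x1, x2), np.cross(x2, x0), np.cross(x0, x1)]) / det

A = np.array([[1.0, 2.0, 0.0], [0.0, 1.0, 3.0], [4.0, 0.0, 1.0]])
assert np.allclose(inverse_3x3(A) @ A, np.eye(3))
```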
You guys should probably switch the matrix formula for the inversion of 3x3 matricies on the wiki page.
i.e
where
which is the determinant of the matrix. If the determinant is finite (non-zero), the matrix is invertible, with the elements of the above matrix on the right side given by
I'm not 100% sure, but i'm pretty sure that that gives out the solution for the transpose of the inverse, not the inverse itself. A pretty easy and quick fix, but i'm pretty lazy and don't have the time to formulate it into wiki and make sure its all right. — Preceding unsigned comment added by Njc69 (talkcontribs) 16:35, 3 October 2010 (UTC)[reply]
I agree with the above. The formula on the main page caused me lots of problems since it is actually transpose of the inverse, but the above formula seems to work. — Preceding unsigned comment added by 65.60.221.79 (talk) 10:59, 8 March 2013 (UTC)[reply]
The formula on the page has been corrected more than two years ago, so you are probably misreading something.—Emil J. 12:25, 8 March 2013 (UTC)[reply]
I have only a moderate objection to the above cross/dot product representation: it's useless. If what we're looking for is a REPRESENTATION of the inverse of a matrix, then A^-1 is hard to beat. I'm writing this 6½ years after Catskineater's post, but I don't see how an alternative representation helps either clarify the topic or aid in computing the terms of the inverse matrix. I found the inclusion of the specific element by element expansion of the 3x3 matrix quite useful. I also strongly disagree with those who claim such a representation is prone to errors. That is, assuming copy and paste is something you are able to handle, it is trivial and NOT "asking for making copy errors". I strongly doubt whether either the information about the Cayley-Hamilton decomposition or the information about the representation in terms of 3 column vectors is useful to 99.99% of the readers. I propose the 3x3 section be shortened by removing all the material after the Det(A)=aA+bB+cC AND I also propose abandoning the use of the elements A,B,...,I. They add very little in terms of conciseness; compare Det(A) = a(ei-fh) - b(di-fg) + c(dh-eg) with the above: there is very little space savings, but the cost of including 9 more (extraneous) variables is significant in terms of clarity and simplicity. What purpose does it serve? 71.30.36.108 (talk) 00:31, 9 August 2016 (UTC)[reply]

Matrix inverses in MIMO wireless communication


The statement that "It is crucial for the matrix H to be invertible for the receiver to be able to figure out the transmitted information." is just plain wrong. The matrix H is not always square, and there are several better ways of decoding the transmitted signal, instead of inverting the channel matrix. I have never edited a Wikipedia page before, so I will figure that out before I make some correcting changes. — Preceding unsigned comment added by 89.160.119.209 (talk) 11:00, 13 December 2013 (UTC)[reply]
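As a sketch of the point above: when H is not square, a zero-forcing receiver can use the Moore-Penrose pseudo-inverse in place of a matrix inverse. The antenna counts and symbols below are made up for illustration, and noise is omitted for clarity:

```python
import numpy as np

rng = np.random.default_rng(1)
tx, rx = 2, 4                          # 2 transmit, 4 receive antennas: H is not square
H = rng.standard_normal((rx, tx))      # channel matrix
x = np.array([1.0, -1.0])              # transmitted symbols
y = H @ x                              # received signal (noise omitted for clarity)

# Zero-forcing receiver: the Moore-Penrose pseudo-inverse exists even when
# H is not square; it only needs H to have full column rank.
x_hat = np.linalg.pinv(H) @ y
assert np.allclose(x_hat, x)
```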

No mention of linear independence anywhere


A square matrix is singular if and only if its determinant is zero.

A family of vectors is linearly independent if and only if the determinant of their matrix is nonzero.

I really think there should be a mention of linear independence here after the reference to singular matrices since singularity is equivalent to linear dependence, which also ties in to the discussion later about eigenvectors and such. There's already a page on linear independence so just a quick link would be nice.

SomeHandyGuy (talk) 00:33, 4 June 2014 (UTC)[reply]
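The equivalence described above — singular, zero determinant, linearly dependent columns — can be demonstrated with a small example (the particular matrices are arbitrary):

```python
import numpy as np

# Columns of A: the third is the sum of the first two, so the columns are
# linearly dependent and A is singular.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 3.0, 5.0]])
assert np.isclose(np.linalg.det(A), 0.0)        # determinant is zero
assert np.linalg.matrix_rank(A) < 3             # columns linearly dependent

# Replace the third column to break the dependence: the determinant becomes
# nonzero and the matrix is invertible.
B = A.copy()
B[:, 2] = [0.0, 0.0, 1.0]
assert not np.isclose(np.linalg.det(B), 0.0)
```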

Article Name: why not inverse matrix or matrix inverse ?


I think those phrases would appear more in text, i.e. someone reading and clicking wants to know what is an "inverse matrix", or how do I invert a matrix. Would it make sense to rename and reword this article? :-

"The inverse of a matrix 'A' is a matrix 'B' such that ... AB=I ; a matrix for which such an inverse exists is called 'invertible'..." Fmadd (talk) 08:38, 4 February 2017 (UTC)[reply]


Formula not showing correctly


Somehow the formula for the inverse of a general 2x2 matrix is not showing correctly. The latex has a minus before the c but the output does not. I can't seem to fix it.— Preceding unsigned comment added by 92.194.124.84 (talk) 15:18, 14 January 2021 (UTC)[reply]

Thank you for your concern about the minus signs, but Wikipedia has a software bug. Editors have tried several times to kludge it to no avail. This has been reported multiple times now on WP:VPT. We'll just have to wait until the administrators fix it properly.—Anita5192 (talk) 16:57, 14 January 2021 (UTC)[reply]
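For reference, the 2x2 formula at issue, with the minus signs on both b and c, verified numerically (the sample entries are arbitrary):

```python
import numpy as np

def inverse_2x2(a, b, c, d):
    """[[a, b], [c, d]]^-1 = (1 / (ad - bc)) * [[d, -b], [-c, a]].

    Note the minus signs on both b and c -- the signs at issue in the
    rendering bug discussed above.
    """
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular")
    return np.array([[d, -b], [-c, a]]) / det

M = np.array([[4.0, 7.0], [2.0, 6.0]])
assert np.allclose(inverse_2x2(4.0, 7.0, 2.0, 6.0) @ M, np.eye(2))
```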

Theorem, Explanations, and Applications


I will first reorder the theorems to show readers that they can be explained in terms of each other. In addition, the formulas are hard for most readers to understand, so I will add some explanations of them. Also, invertible matrices can be used in many concrete ways in real life, but the application section only mentions a few of them. Under the subsection on least-squares solutions, invertible matrices are routinely used in data analysis to predict future data and to model the relationship between variables and an output. For example, to analyze house prices, the variables could be area, layout, swimming pool, and location, and the output would be the price. So, I will add more content to the application section.
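The least-squares use mentioned above can be sketched via the normal equations, where the invertibility of X^T X is what matters. The house-price columns and coefficients below are entirely made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical house-price data: columns = [area, bedrooms, has_pool].
X = np.column_stack([rng.uniform(50, 200, 30),
                     rng.integers(1, 6, 30).astype(float),
                     rng.integers(0, 2, 30).astype(float)])
true_w = np.array([3.0, 20.0, 50.0])
y = X @ true_w + rng.normal(0, 1e-6, 30)   # near-noiseless so the fit is checkable

# Normal equations: w = (X^T X)^-1 X^T y -- this is where the
# invertibility of X^T X matters for least-squares fitting.
w = np.linalg.inv(X.T @ X) @ X.T @ y
assert np.allclose(w, true_w, atol=1e-3)
```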
