Talk:Kernel (matrix)
This redirect does not require a rating on Wikipedia's content assessment scale. It is of interest to the following WikiProjects: WikiProject Mathematics.
== untitled ==

Can somebody make a multi-dimensional example here? (more than one free variable....)
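For what it's worth, here is a sketch of the kind of multi-variable example being asked for, with two free variables. The matrix and the hand computation are my own illustration, and SymPy is used only to check them:

```python
# A 2 x 4 system A x = 0 whose solution needs two free variables.
# Solving by hand: x1 = -2*x2 + x4 and x3 = -2*x4, with x2 and x4 free,
# so the null space is spanned by (-2, 1, 0, 0) and (1, 0, -2, 1).
import sympy as sp

A = sp.Matrix([[1, 2, 0, -1],
               [0, 0, 1, 2]])
print(A.nullspace())
# [Matrix([[-2], [1], [0], [0]]), Matrix([[1], [0], [-2], [1]])]
```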
I am for the idea of merging this article (Null Space) with Kernel. -- minghan 15:23, 26 January 2006 (UTC)
I would prefer to have kernel (mathematics) continue to focus on the more general case while this article addresses the restricted case of kernels of matrix operators in linear algebra. Deco 02:33, 27 January 2006 (UTC)
- Changed my mind. This article presents itself as the same concept as kernels, so a merge is warranted. Deco 02:35, 27 January 2006 (UTC)
- A slight philosophical difference is that a matrix is not a map, whereas a linear transformation is, even though the two are commonly identified. The null space of a matrix refers to what would be the kernel if we thought of it as a linear transformation. Whether the articles are written this way, I don't know, but I think that at least in theory the two should be considered separate concepts that you realize are the same after a little bit of thinking.
Hmmm. I still suggest that the merge discussion be centralized in kernel (mathematics). — Arthur Rubin | (talk) 02:57, 17 March 2006 (UTC)
== Division by Zero ==

"Nullity" should no longer redirect here, since it appears that someone has managed to solve the "divide by zero problem"... and the solution is called Nullity. (http://www.bbc.co.uk/berkshire/content/articles/2006/12/06/divide_zero_feature.shtml) Medevilenemy 19:24, 6 December 2006 (UTC)
- No offense, but that "solution" is total bullcrap. --Wooty Woot? contribs 09:54, 7 December 2006 (UTC)
- It's not Wikipedia's job to decide whether a theory is correct or not. Leave that to discussion within the mathematics community and related journals. See the note at the bottom of WP:NOR. --Tjohns ✎ 12:03, 7 December 2006 (UTC)
- I think discussion regarding this should take place over at Talk:Nullity. --Tjohns ✎ 12:00, 7 December 2006 (UTC)
== Major revision ==

I just posted a major revision to this page, and I added the WikiProject Mathematics template above. Jim 03:06, 9 September 2007 (UTC)
- That is not an "expansion" as your edit summary says, but a "limitation" to the special case of a finite dimensional linear operator. The definition is much wider than for matrices only. −Woodstone 06:05, 9 September 2007 (UTC)
I don't agree that the introduction to the article should focus on the general case like this. Here's why:
1. The null space of a matrix is an important idea, and most of the article is devoted to it. Null spaces of matrices are central to elementary linear algebra, and I think they deserve their own article.
2. Introducing null spaces in the context of general operators makes the article less accessible to a general audience.
3. For general linear operators, the null space is more often referred to as the kernel.
In an attempt to resolve this disagreement, I've created an article called kernel (linear algebra) that discusses the general case, and added a disambig template to the top of this article. Jim 18:45, 9 September 2007 (UTC)
- You are missing the point again. Linear algebra is not the general case. Also non-linear operators can have a null space. Did you notice that there are already quite a few "kernel (something)" articles? Why add another one? In my opinion the titles should be reversed from your idea. Null space is generic. The "Matrix" case is quite specific and should have an appropriate title if you insist on splitting it from the generic one. −Woodstone 07:09, 10 September 2007 (UTC)
- I'm not sure I've ever heard of anyone talk about the null space of a nonlinear operator. Most of the time people use the word "operator" to mean "linear operator", and even in that case the "null space" is most commonly called the "kernel". My experience has been that the word "null space" is used most of the time to talk about the null space of a matrix.
- Google backs me up on this. There are 476,000 results for "null space", of which 347,000 involve the word "matrix", and 413,000 involve either the word "matrix" or the word "linear". The remaining hits don't particularly seem to be about the null space of a nonlinear operator.
- The Math World article on null spaces restricts to "linear transformations", as does the Merriam-Webster definition. I looked at the Wikipedia articles that link to "Null space", and I didn't notice any that have to do with nonlinear operators.
- I like the current organization of the articles on null space and kernel (linear algebra), but if you strongly object, then we could describe our disagreement on Wikipedia talk:WikiProject Mathematics, and ask what the other members think. Jim 17:17, 10 September 2007 (UTC)
Since you've reverted the introduction again, I've asked for an outside opinion from Wikipedia talk:WikiProject Mathematics. (Those coming from the outside should take a look at the current version and previous version of the introduction.) Jim 01:18, 14 September 2007 (UTC)
- I don't believe the set of solutions of f(x) = 0 for a general f is called null space. It's only called a null space if f is linear. I think it's called a kernel if f preserves the relevant algebraic structure, and the expression for the general situation is zero set.
- I'm not so sure about having separate articles on null space (of matrices) and kernel (linear algebra) (of linear operators). There is a big overlap and I'm not convinced that the article would be less accessible if we were to treat linear operators here, provided that we do introduce null spaces for matrices first and that we delineate the more abstract parts which rely on identifying matrices with linear operators. If we do decide to have separate articles, then I think they should be linked more than by just a note at the top. -- Jitse Niesen (talk) 02:24, 14 September 2007 (UTC)
The introduction in terms of a matrix looks better to me, more accessible. It's OK to have a separate matrix-oriented article, and a separate operator-oriented article, with more of a functional analysis and algebra flavor (not the college lower division linear algebra). These really serve distinct audiences. I am not sure if the kernel vs nullspace nomenclature has anything to do with this distinction - they are exactly the same to me - so perhaps some renaming is in order. And, yes, I agree with Jitse Niesen that both refer to linear operators only. By the way, computing the basis of the nullspace of a matrix by means of elementary transformations as done in the article now and in undergraduate textbooks is unfortunate and of (perhaps) didactic value only, because of numerical stability problems; practical computation should be done via QR decomposition or better SVD, like in the Matlab function null. Jmath666 04:34, 14 September 2007 (UTC)
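To make the SVD route concrete, here is a minimal sketch of my own (NumPy rather than Matlab; the tolerance rule is an assumption, not Matlab's exact one) of taking the right singular vectors with negligible singular values as a null space basis:

```python
import numpy as np

def nullspace_svd(A, rtol=1e-12):
    """Orthonormal null space basis from the SVD, in the spirit of Matlab's null()."""
    A = np.atleast_2d(A)
    _, s, Vt = np.linalg.svd(A)
    tol = rtol * (s[0] if s.size else 1.0)   # crude tolerance, for illustration only
    rank = int(np.sum(s > tol))
    return Vt[rank:].T                       # rows of Vt past the rank span null(A)

A = np.array([[1., 2., 0., -1.],
              [2., 4., 0., -2.]])            # rank 1, so the nullity is 3
Z = nullspace_svd(A)
print(Z.shape, np.allclose(A @ Z, 0))        # (4, 3) True
```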
- I agree that there's a problem with presenting only the row reduction perspective, and I've been trying to figure out how to deal with this problem for many different linear algebra articles (cf. column space, row space, Euclidean subspace, and system of linear equations). On the one hand, row reduction is the standard algorithm given in most introductions to linear algebra; on the other hand, it apparently has serious numerical stability problems, making it relatively useless for practical applications. See Talk:System of linear equations for more discussion about this issue. Jim 20:45, 14 September 2007 (UTC)
- I did not say there is a problem with row reduction as opposed to columns; there is a problem with the reduction approach at all. This results in unfortunate engineers who try to implement those naive methods out of the book in software. But the educational establishment deems it important so students are subjected to it. Hence it has a place on Wikipedia where those students may look. If I have time I may add a paragraph here about computing the nullspace in a numerically sound manner. Yes the issues in solving systems are similar, but simpler. See also the paragraph on numerical computation in Moore-Penrose pseudoinverse. For example Trefethen-Bau ISBN 978-0898713619 do not even consider Gaussian elimination (a.k.a. reduction) for anything and start with QR. Matlab switched from QR to SVD (more expensive but more reliable) for things like the nullspace many years ago; reduction methods are not even a remote consideration for many decades now if reliability is important. Jmath666 04:48, 15 September 2007 (UTC)
- Sorry, we seem to be miscommunicating. I was agreeing with you—by "row reduction" I mean Gaussian elimination, or any method that involves elementary row and/or column operations. I'm calling it "row reduction" just because it's usually done with rows in introductory courses. I know about the numerical problems with reduction, and I've been trying to figure out how this should be reflected in the articles on elementary linear algebra. See my last post. Jim 09:07, 15 September 2007 (UTC)
- I'm not sure we need a separate article kernel (linear algebra) next to kernel (algebra)#Linear operators, but if kept, there should be cross references. --Lambiam 08:58, 14 September 2007 (UTC)
- See also Wikipedia:Reference desk/Archives/Mathematics/2007 August 12#Null space and Kernel (algebra). --Lambiam 09:05, 14 September 2007 (UTC)
- Even if I understand that matrices are in more common use, the nullspace does not only concern finite-dimensional spaces, and not only real vector spaces (in particular, complex matrices and matrices over finite fields are also widely used). Hence the definition, at least, should not restrict the term to such a particular situation. Moreover, in the article, the term matrix is used in place of "real matrix", leading to strange assertions like "The null space of an m × n matrix is a subspace of Rn". Also, the introduction of the term "Euclidean space" is misleading: Why should the natural metric be taken into account? This is particularly funny if A is used to define a quadratic form associated to a non-Euclidean distance. Should we consider that the isotropic subspace of a quadratic form is a Euclidean subspace??? pom 21:33, 18 September 2007 (UTC)
- What's happening is that I'm trying to make this article as accessible as possible to non-mathematicians (e.g. second-year students in an introductory linear algebra course). The most basic case of a matrix is a real matrix, so this is what gets talked about in the article. Jmath666 has now implemented a partial solution to this problem (see below).
- Also, "Euclidean space" is used here as a general synonym for Rn, primarily because Euclidean space is currently the main article on n-dimensional space. It's possible that "coordinate space" would be better terminology, but the article on coordinate space is currently written from a more general standpoint, and real coordinate space is a redirect to Euclidean space. (There's also a very odd article entitled n-dimensional space which should probably be merged with Euclidean space, but could alternatively be developed into a main article on n dimensions.) Jim 05:19, 19 September 2007 (UTC)
I agree with Taxipom. Thanks to the references to Euclidean space, it is proving very difficult indeed to work out how much of the article applies to, say, matrices containing complex values, or over finite fields. This is most certainly not "as accessible as possible", it is extremely confusing. 92.30.174.205 (talk) 23:09, 27 September 2011 (UTC)
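As a side note on the complex case raised above: the row-reduction and null-space mechanics are unchanged over the complex numbers (or any field); only the ambient space changes from Rn to Cn. A throwaway check with SymPy (my own example, exact arithmetic over the Gaussian rationals):

```python
import sympy as sp

A = sp.Matrix([[1, sp.I],
               [sp.I, -1]])        # second row is I times the first, so rank 1
print(A.rank())                    # 1
print(A.nullspace())               # [Matrix([[-I], [1]])]: x1 = -I*x2, x2 free
```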
== Renaming proposed ==

In view of the discussion above I propose there should be:
- kernel (linear algebra), including elementary linear algebra and numerical linear algebra aspects (this article)
- kernel (functional analysis), or operator theory (relating/including Fredholm theory etc)
- kernel (algebra) (as is now, perhaps incorporating kernel (operator theory))
with proper redirects from the matching variants of nullspace, and a dab page. Each with a distinct audience and at a distinct level. For example, referring to the post by Taxipom, the words "natural metric" and all the considerations that follow will make no sense whatsoever to a typical 2nd year US college student who just takes a first course in linear algebra and wants to use Wikipedia to look up something. While it would be all fine and of interest to a grown mathematician or a student in a classical, pure math oriented program. Or... maybe I'll just be bold and do it and see if anyone objects. Jmath666 04:12, 19 September 2007 (UTC)
Done, Kernel (mathematics) is the main article now and everything links to/from there. Hopefully it makes more sense now. This article, now called Kernel (matrix), is about matrices only, and more abstract stuff belongs elsewhere. Jmath666 04:56, 19 September 2007 (UTC)
- I really like this suggestion. However, I think that null space (matrix) would be a better title for this article than kernel (matrix). "Null space" is the terminology used in most 2nd-year linear algebra books, and "null space of a matrix" returns twenty times as many Google hits as "kernel of a matrix". Would you object if I moved the page? Jim 05:03, 19 September 2007 (UTC)
- There are Null space (matrix) and also Nullspace (matrix), namely redirects to Kernel (matrix). If you want to switch it around, you will need the help of an admin to delete the redirect page first. Also, Kernel (matrix) is consistent with Kernel (mathematics). I do not think it is a big deal either way; do what you think is right. Jmath666 07:37, 19 September 2007 (UTC)
- (edit conflict)
- There seem to have been some renames/redirects already, but I lost track. However, redirecting "null space" to "kernel (matrix)" is very wrong. The concept of a null space is much wider than its application to matrices alone. I agree with redirecting all "null space" articles to corresponding "kernel" articles. But the bare "kernel" article should have a fully generic definition and can then descend into or refer to specific cases, e.g. for matrices. Especially so, because the general definition is in no way more complex than the specialised one. −Woodstone 05:05, 19 September 2007 (UTC)
- Yes, there is already a bare "kernel" article, namely the main article Kernel (mathematics), and it does exactly as you say. Perhaps Nullspace should redirect there, so I have done that now too. Thanks for noticing. Feel free to change whatever you like. Jmath666 07:37, 19 September 2007 (UTC)
This is a mistake. A linear transformation has a kernel. A matrix is not a linear transformation, so it does not have a kernel. There is a lot of slop in the literature about this, but if you go back to basics you will find this to be true. In particular, a matrix may represent a linear transformation in more than one way, so it is intrinsically nonsensical to refer to its "kernel". There is a perfectly good term for the null space, namely "null space". This page should at least have a correct definition and only then cite Kernel (linear algebra). Zaslav (talk) 07:53, 23 February 2020 (UTC)
== Proof that A Q2 = 0 ==

I am sure this is very obvious to most of you, but I was curious to see at the end of the article a method for computing the null space using QR factorization. I am not a mathematician myself, but was wondering: what is the proof that A Q2 = 0? It may be staring me in the face but I don't see it at the moment; I am sure it is correct, as I tried the method using SciLab and it worked very nicely. --Rhoddydog (talk) 17:35, 14 August 2008 (UTC)
PS I just realized why A Q2 = 0, for those who might be interested: multiply both sides of A^T P = [Q1 Q2] [R 0]^T by [Q1 Q2]^T (since Q is orthogonal) and multiply out the resulting expression; one of the results is A Q2 = 0. --Rhoddydog (talk) 10:05, 14 August 2008 (UTC)
" multiply both sides of " --> "left multiply both sides of ..." --Mangledorf (talk) 15:29, 20 August 2012 (UTC)
== Split Left Nullspace ==

Can I remove the paragraph from this page and put it on a separate "left nullspace" page? It's nice when you search for something if you just get the information you want instead of a whole bunch of other stuff. daviddoria (talk) 15:46, 19 September 2008 (UTC)
== Relation to eigenvalues ==

Hi there,
I thought it would be nice if there were a section explaining the relation between the nullspace and the eigenvalues/eigenvectors of the matrix.
As far as I know, the dimension of the nullspace (nullity?) of a matrix is equal to the number of linearly independent eigenvectors with eigenvalue 0. And the nullspace is spanned by these vectors, too. Drugbird (talk) 07:56, 27 April 2010 (UTC)
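Agreed that a short section could help. The precise statement is that, for a square matrix, the null space is exactly the eigenspace for the eigenvalue 0, so the nullity equals the geometric multiplicity of 0. A small check, as a sketch of my own (scipy.linalg.null_space assumed available):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 0., 1.]])           # rank 2, so nullity 1

Z = null_space(A)                      # orthonormal basis of the null space
print(Z.shape[1])                      # 1: geometric multiplicity of eigenvalue 0
print(np.allclose(A @ Z, 0))           # the basis vectors are eigenvectors for 0

w, _ = np.linalg.eig(A)
print(np.sum(np.abs(w) < 1e-10))       # one (numerically) zero eigenvalue here
```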
== Using RRE to calculate Nullspace ==

The current algorithm given using RRE to find a nullspace has a step "Interpreting the reduced row echelon form as a homogeneous linear system" and then "determine which of the variables are free". That's a bit of a stretch for someone who is just learning how to calculate a nullspace. I'd suggest a more concrete algorithm for it, like the following:
1. If the input matrix is wider than it is tall, pad zero rows to make it square.
2. Put the matrix in RRE form, keeping the pivots along the diagonal.
3. If the matrix is taller than it is wide, crop the lower rows to make it square (they will all be zero rows if the RRE was done correctly).
4. Subtract the identity matrix.
That produces a null space basis from the nonzero columns of the result. It is simple, uses RRE, and doesn't require the "now write out tons of expressions and modify them" step. Antares5245 (talk) 06:37, 28 August 2010 (UTC)
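For concreteness, a rough sketch of that recipe follows (my own implementation; SymPy's exact rref() stands in for the unspecified row-reduction step). The reason it works: once the pivots sit on the diagonal of a square RREF matrix R, R is idempotent, so R(R − I) = 0 and the nonzero columns of R − I lie in the null space.

```python
import sympy as sp

def nullspace_via_rref(A):
    A = sp.Matrix(A)
    m, n = A.shape
    R, pivot_cols = A.rref()               # reduced row echelon form
    S = sp.zeros(n, n)                     # square matrix with pivots on the diagonal
    for i, j in enumerate(pivot_cols):
        S[j, :] = R[i, :]                  # pad/crop and diagonal placement in one step
    N = S - sp.eye(n)                      # "subtract the identity matrix"
    return [N[:, j] for j in range(n) if j not in pivot_cols]   # its nonzero columns

A = [[1, 2, 0, -1],
     [0, 0, 1, 2]]
for v in nullspace_via_rref(A):
    print(v.T, (sp.Matrix(A) * v).T)       # each product is the zero row vector
```

The caveat is the one already raised above: in exact arithmetic this is fine, but in floating point the reduction step itself is numerically fragile.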
== LQ Factorization ==

It seems that using the LQ factorization would be faster at finding the null space than the QR factorization, since we can apply it directly to the matrix $A$ instead of to $A^T$. LAPACK has routines for computing the LQ factorization as well. Vinzklorthos (talk) 15:13, 6 February 2012 (UTC)
Also, I have heard that QR is faster than SVD (presumably, LQ would be faster as well). The reason to use SVD relates to the possibility that the matrix does not have full row rank, i.e., at least one row is a linear combination of other rows; if the matrix is a constraint matrix in an optimization problem, then the matrix can be reduced and still yield an equivalent problem. SVD would catch this by having more zero singular values, but QR or LQ would only indicate this by having a degenerate R or L matrix, and this would have to be checked. Using SVD allows the user to not worry about this. In conclusion, as I understand it, if the user knows the matrix has full row rank, then it is faster to use an LQ factorization of $A$ (or a QR factorization of $A^T$). Vinzklorthos (talk) 15:13, 6 February 2012 (UTC)
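To illustrate the rank-deficiency point, here is a sketch of my own: SciPy's column-pivoted QR plays the role of the "check the R factor" step, and the tolerances are arbitrary choices, not recommendations.

```python
import numpy as np
from scipy.linalg import qr, svdvals

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 6))
A = np.vstack([B, B[0] + B[1]])          # 5 x 6 constraint matrix of rank 4

s = svdvals(A)
print(np.sum(s > 1e-10 * s[0]))          # 4: SVD exposes the deficiency directly

_, R, _ = qr(A, pivoting=True)           # column-pivoted QR
print(np.sum(np.abs(np.diag(R)) > 1e-10 * abs(R[0, 0])))   # 4, but only after inspecting R
```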
== Numerical computation of null space ==

I have completely rewritten this section and renamed it "Computation of the null space in a computer". In fact, the previous version did not respect WP:NPOV by supposing that every numerical computation is a floating-point computation, which is a blatant mistake. On the other hand, I have removed the large part of the section devoted to specific numerical algorithms, because it is outside the scope of this article. In fact, null space computation is a special instance of solving a homogeneous linear system. Deciding which algorithm is the best in each case is a highly technical matter, whose answer depends on various parameters of the input matrix (its structure) as well as the architecture (cache(s), vectorization, number of cores, size of the memory, ...) of the computer. Implementing a high-performance linear algebra package is itself a research project for a whole team! Thus the questions that were addressed in the part I have removed have their place not here but in an article high performance linear algebra, yet to be written. Also, the suppressed part presented a single algorithm as the state of the art. This is WP:OR. --D.Lazard (talk) 16:42, 19 October 2012 (UTC)
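As a tiny illustration of the exact vs. floating-point distinction drawn above (my own sketch; SymPy and SciPy are just convenient stand-ins for "computer algebra" and "numerical" computation):

```python
import numpy as np
import sympy as sp
from scipy.linalg import null_space

rows = [[1, 2, 0, -1],
        [0, 0, 1, 2]]

exact = sp.Matrix(rows).nullspace()            # exact basis over the rationals
print([list(v) for v in exact])                # [[-2, 1, 0, 0], [1, 0, -2, 1]]

numeric = null_space(np.array(rows, float))    # orthonormal floating-point basis
print(np.allclose(np.array(rows, float) @ numeric, 0))    # True
```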
== Assessment comment ==

The comment(s) below were originally left at Talk:Kernel (matrix)/Comments, and are posted here for posterity. Following several discussions in past years, these subpages are now deprecated. The comments may be irrelevant or outdated; if so, please feel free to remove this section.

Tried to make this better. Probably needs more references. Jim 03:09, 9 September 2007 (UTC)
Last edited at 01:43, 1 January 2012 (UTC). Substituted at 02:16, 5 May 2016 (UTC)