
Talk:Gram matrix


Limits

In the expression for , what is and ? Deepak 16:54, 31 March 2006 (UTC)

Too physics-oriented

The Gramian matrix can be computed, and is important, in any inner product space. The integral of the product of two functions shown in the article is just one case of an inner product.
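A minimal sketch of that point (my own illustration, not from the article): the same Gram-matrix construction works for ordinary vectors with the dot product and for functions with the L2 inner product, here approximated by numerical integration on a grid.

```python
import numpy as np

# Gram matrix for an arbitrary inner product: G[i, j] = <v_i, v_j>.
def gram(vectors, inner):
    n = len(vectors)
    G = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            G[i, j] = inner(vectors[i], vectors[j])
    return G

# Case 1: vectors in R^3 with the standard dot product.
vecs = [np.array([1.0, 0.0, 2.0]), np.array([0.0, 1.0, 1.0])]
print(gram(vecs, np.dot))

# Case 2: functions on [0, 1] with the L2 inner product
# <f, g> = integral of f(t) g(t) dt, approximated by the trapezoid rule.
t = np.linspace(0.0, 1.0, 1001)
l2 = lambda f, g: np.trapz(f(t) * g(t), t)
funcs = [np.sin, np.cos, lambda x: x]
print(gram(funcs, l2))
```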

Here is a good page on the subject: http://www.jyi.org/volumes/volume2/issue1/articles/barth.html

Yes, I agree. Gram matrices also show up in machine learning, where a number of methods depend on a set of input vectors in R^n (for some finite n) only through the Gram matrix of this set. One may construct the Gram matrix using the standard inner product (dot product) on R^n, or -- very usefully -- an arbitrary inner product, which corresponds to mapping the input vectors nonlinearly into some, usually higher-dimensional, space and taking the dot product there (known as the kernel trick). Either way, integrals are not involved. Eclecticos 05:04, 24 September 2006 (UTC)
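To illustrate the kernel-trick remark (my own sketch; the Gaussian/RBF kernel and the sample points are arbitrary choices): a kernel method only ever sees the Gram matrix, so replacing the dot product with a kernel amounts to taking inner products in an implicit, higher-dimensional feature space.

```python
import numpy as np

X = np.array([[0.0, 1.0],
              [1.0, 0.0],
              [1.0, 1.0]])  # three input vectors in R^2

# Linear Gram matrix: K[i, j] = x_i . x_j
K_linear = X @ X.T

# Kernel Gram matrix with a Gaussian (RBF) kernel:
# K[i, j] = exp(-||x_i - x_j||^2 / (2 * sigma^2)),
# i.e. a dot product in an implicit feature space.
sigma = 1.0
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K_rbf = np.exp(-sq_dists / (2 * sigma ** 2))

print(K_linear)
print(K_rbf)
```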
Also, the usual name in the machine learning literature, and AFAIK in the linear algebra literature too, is "Gram matrix." I have never run across the variant "Gramian matrix" before, but perhaps it is used in physics? Eclecticos 05:04, 24 September 2006 (UTC)

I am acquainted with "Gramian" from mathematics. The main point is that the Gramian matrix of some basis (not necessarily orthonormal) of a Euclidean (= inner product) space contains all the information about the geometry (the inner products) of that space.

In addition, checking for linear dependencies is only a special case of determining the volume of the parallelepiped spanned by some vectors, which can be done easily with the Gramian matrix. It is different from the determinant, since it applies to non-square matrices as well.

For example, calculating the area of a parallelogram sitting inside 3-dimensional space via a determinant is hard, because we need to find an orthonormal basis for the plane in which the parallelogram lies and transform the vectors into that basis; using the Gramian matrix it is very simple (see the external link, and the sketch below).
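A short sketch of that computation (the example vectors are my own): the squared area of the parallelogram spanned by u and v is the determinant of their 2x2 Gram matrix, with no change of basis needed; the cross-product check below works only because the ambient space happens to be R^3.

```python
import numpy as np

# Two vectors spanning a parallelogram inside R^3.
u = np.array([1.0, 2.0, 2.0])
v = np.array([2.0, 0.0, 1.0])

V = np.vstack([u, v])        # 2 x 3 matrix with u, v as rows
G = V @ V.T                  # 2 x 2 Gram matrix of inner products
area = np.sqrt(np.linalg.det(G))

# Cross-check against the cross-product formula, valid only in R^3.
print(area, np.linalg.norm(np.cross(u, v)))   # both give sqrt(29)
```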

What does (xi|xj) mean?

Need a definition of this term. —Preceding unsigned comment added by 128.163.144.104 (talk) 19:49, 12 November 2007 (UTC)