Wikipedia:Reference desk/Mathematics

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by 70.54.112.243 (talk) at 04:15, 21 November 2015. The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Welcome to the mathematics section
of the Wikipedia reference desk.
Want a faster answer?

Main page: Help searching Wikipedia


How can I get my question answered?

  • Select the section of the desk that best fits the general topic of your question (see the navigation column to the right).
  • Post your question to only one section, providing a short header that gives the topic of your question.
  • Type '~~~~' (that is, four tilde characters) at the end – this signs and dates your contribution so we know who wrote what and when.
  • Don't post personal contact information – it will be removed. Any answers will be provided here.
  • Please be as specific as possible, and include all relevant context – the usefulness of answers may depend on the context.
  • Note:
    • We don't answer (and may remove) questions that require medical diagnosis or legal advice.
    • We don't answer requests for opinions, predictions or debate.
    • We don't do your homework for you, though we'll help you past the stuck point.
    • We don't conduct original research or provide a free source of ideas, but we'll help you find information you need.



How do I answer a question?

Main page: Wikipedia:Reference desk/Guidelines

  • The best answers address the question directly, and back up facts with wikilinks and links to sources. Do not edit others' comments and do not give any medical or legal advice.

November 15

piecewise polynomial least squares

(I tried math.stackexchange and got no response for a week. Boo hoo.)

I have in mind a project involving a least-squares fit using piecewise polynomials; at a finite number of known arguments x_j, the k_j-th derivative is discontinuous.

How many basis functions are needed? My guess is: x^n for 0 ≤ n < min_j(k_j), and then, for each j, n such that k_j ≤ n ≤ the maximum degree, a pair of functions which are zero on one side and (x − x_j)^n on the other. Is that right?

In general, I welcome any pointers that might reduce the number of wheels I'll reinvent. —Tamfang (talk) 07:00, 15 November 2015 (UTC)[reply]

Just trying to understand the problem here. You are trying to create a spline composed of multiple polynomial arcs, right ? The adjacent arc endpoints must have point continuity, of course, but how about tangent & curvature continuity, etc. ? Since you are using least squares method, I assume you don't need an exact fit. So, how many points would each arc run through ? (Just offhand, this method sounds like it would generate an extremely "lumpy" spline.) I assume you already know how the number of constraints relates to the degree of the polynomial ? StuRat (talk) 07:12, 15 November 2015 (UTC)[reply]
Sure, let's say I'm trying to create a spline composed of multiple polynomial arcs, and the degree of continuity is k_j − 1. Maybe I like it lumpy; if it's lumpier than I like, I'll increase k_j. Rather than discrete points, my input is piecewise continuous, so the algo involves integrals rather than sums. Number of constraints, in the sense I think you mean, is not meaningful here. —Tamfang (talk) 08:51, 15 November 2015 (UTC)[reply]
If the function is on the domain [a, b] with breakpoints x_1, …, x_m, and the polynomials are of degree at most d, and for each j the derivatives at x_j are expected to be continuous up to order k_j − 1 (k_j constraints), then I'm pretty sure the number of degrees of freedom is (m + 1)(d + 1) − Σ_j k_j. -- Meni Rosenfeld (talk) 09:53, 15 November 2015 (UTC)[reply]
And I think the following basis functions will work (probably the same as what you wrote, but I think is clearer): Letting x_0 = a with k_0 = 0, for each 0 ≤ j ≤ m and k_j ≤ n ≤ d, the function which is 0 for x < x_j and (x − x_j)^n for x ≥ x_j. This also means their number can be rewritten as Σ_{j=0}^{m} (d + 1 − k_j). -- Meni Rosenfeld (talk) 10:06, 15 November 2015 (UTC)[reply]
Hm ... thanks, yes, I think that does work; the k_0 = 0 convention is a good gimmick (removing some special cases from the description). You've saved me some redundancy. —Tamfang (talk) 08:11, 16 November 2015 (UTC)[reply]
This is above my head (I don't know why I even look at this notice board!) but does Savitzky–Golay filter help? Thincat (talk) 09:00, 20 November 2015 (UTC)[reply]
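A minimal numpy sketch of the one-sided truncated-power basis described above (breakpoints, degrees, and the test function are illustrative; the fine sample grid stands in for the integrals Tamfang mentions):

```python
import numpy as np

def truncated_power_basis(breakpoints, k, d):
    """One-sided basis for piecewise polynomials of degree <= d,
    where the k[j]-th derivative may jump at breakpoints[j].
    Convention from the thread: breakpoints[0] = a with k[0] = 0,
    so the ordinary polynomial terms need no special case."""
    funcs = []
    for xj, kj in zip(breakpoints, k):
        for n in range(kj, d + 1):
            # 0 for x < x_j, (x - x_j)^n for x >= x_j
            funcs.append(lambda x, xj=xj, n=n: np.where(x >= xj, (x - xj) ** n, 0.0))
    return funcs

# Example: domain [0, 3], cubic pieces (d = 3), and the second
# derivative (k_j = 2) allowed to jump at x = 1 and x = 2.
a, d = 0.0, 3
breaks, ks = [a, 1.0, 2.0], [0, 2, 2]
basis = truncated_power_basis(breaks, ks, d)

# Degrees of freedom: (m+1)(d+1) - sum(k_j) = 3*4 - 4 = 8.
assert len(basis) == 8

# Least-squares fit against samples of an arbitrary target function.
x = np.linspace(a, 3.0, 400)
y = np.sin(2 * x)
A = np.column_stack([f(x) for f in basis])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
```

(For production use, B-splines are better conditioned than the truncated power basis, but the latter matches the construction in the thread.)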

November 19

Finite endomorphism ring

Is there an infinite abelian group with a finite endomorphism ring? GeoffreyT2000 (talk) 01:17, 19 November 2015 (UTC)[reply]

I don't think so. If an abelian group has a finite endomorphism ring, then it is necessarily a torsion abelian group. (Otherwise multiplication by an integer gives an obvious injection from the set of integers into the endomorphism ring.) In fact, for the same reason, the elements must have bounded order. By the first Prüfer theorem, a torsion abelian group of this kind is isomorphic to a direct sum of cyclic groups. Because the order of these cyclic groups is bounded, infinitely many direct factors are repeated by the pigeonhole principle, and so in that case the endomorphism ring is uncountably infinite. Sławomir Biały 12:07, 19 November 2015 (UTC)[reply]
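The growth behind that last step can be checked on a toy case: endomorphisms of (Z/2Z)^k are exactly the k × k matrices over F_2, so there are 2^(k²) of them, and repeating factors blows the ring up fast. A brute-force sketch (illustrative helper names, small k only):

```python
from itertools import product

def endomorphisms(k):
    """Brute-force count of group endomorphisms of (Z/2Z)^k:
    enumerate every function G -> G and keep the additive ones."""
    G = list(product((0, 1), repeat=k))
    add = lambda a, b: tuple((x + y) % 2 for x, y in zip(a, b))
    count = 0
    for images in product(G, repeat=len(G)):
        f = dict(zip(G, images))
        if all(f[add(a, b)] == add(f[a], f[b]) for a in G for b in G):
            count += 1
    return count

# Matches 2^(k^2): 2 for k = 1, 16 for k = 2.
assert endomorphisms(1) == 2
assert endomorphisms(2) == 16
```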

Probability distributions, frequency of events

Hi, I'd like to create an IID stochastic process X_1, X_2, … taking values in {0, 1}, where 1 indicates that an event has occurred, and 0 that it has not. I had been using X_n ~ B(p), where B(p) is the Bernoulli distribution with parameter p. This works ok, and I can vary the mean frequency of events, but the variance is p(1-p), and so the variance increases as I increase frequency - I am primarily interested in p < 1/2. What distribution should I use that would allow me to manipulate the frequency and variance independently? Or at least have a fixed variance for all frequencies? I need at the end to have a string of length N like 010...001, so counting events in an interval is not helpful. To be clear, I'd like to have the time between events (expected value of 1/p) have the same variance for all frequencies p. I can think of a few ways to generate strings with the necessary properties programmatically, but it would be far better if I could do it with a simple distribution. Any ideas? I feel like there must be something simple I'm forgetting about. Thanks, SemanticMantis (talk) 16:14, 19 November 2015 (UTC)[reply]

That's impossible. The only distribution with support {0, 1} is Bernoulli. To get what you described (which may or may not be what you want), you'll have to make your events dependent (contrary to the assumption of IID).
And the best way to do that is probably to start with the distribution you want for the time between successful events, with a given mean and variance (plenty of choice there - a good choice is the maximum entropy distribution), and simply running that and deriving the process (with 0's to fill in the gaps between successive 1's). But again, this will mean that there will be dependence between the events at given times (the dependence will be stronger for nearby events). -- Meni Rosenfeld (talk) 18:53, 19 November 2015 (UTC)[reply]
@Meni Rosenfeld: D'oh! Thanks, that makes sense, I forgot to leave out option "c) Is this impossible?" -- of course I see now that what I asked for is indeed impossible. To clarify, you're suggesting that I could instead create an IID process S_n for the spaces, then create O=1...1...1, where the number of zeros in the ... is S_n. Then O_n is not IID, but I could create S_n such that the mean and variance are independent. But can I do that with maximal entropy? E.g. I thought the geometric distribution had maximal entropy on {(0),1,2,...}, and that won't let me pick mean and variance independently. If I want to demand independent mean and variance, what are my options for support on \mathbb{N}? I think maybe I can re-parameterize the Beta-binomial distribution by mean and variance like you can do with the Beta distribution, but that's only quasi-independent, because once you pick the mean it bounds the allowable variances. I think I may well stick with my Bernoulli set up for now, but I'm interested in the time-dependent case as possible future refinement. SemanticMantis (talk) 19:54, 19 November 2015 (UTC)[reply]
I guess I could pick the mean \mu and then let S_n = DiscreteUniform(\mu-k, \mu+k). Variance could then at least be arbitrarily large or small. SemanticMantis (talk) 20:01, 19 November 2015 (UTC)[reply]
Yes, that is the process I proposed.
I meant, "maximum entropy for given mean and variance supported on positive integers" (geometric is max. ent. for given mean, without variance specification). This kind of distribution has the same form as the normal distribution, but the scale & shift parameters will not be exactly the mean and s.d. (since the restriction to positive integers changes the mean and variance for a given formula). You'd have to do a bit of work to find the correct parameters.
Alternatively, a combination of binomial distribution (for low variance) and negative binomial (for high variance) can work, and it's easier to find the parameters, but they don't cover the entire possibilities of mean & variance. -- Meni Rosenfeld (talk) 20:18, 19 November 2015 (UTC)[reply]
Note also that a uniform distribution will be fairly restrictive - you must keep it positive, so it limits how wide an interval you can take, and hence you can't have high variance. Negative binomial doesn't have this problem. -- Meni Rosenfeld (talk) 20:31, 19 November 2015 (UTC)[reply]
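For the negative-binomial option, matching a target mean μ and variance σ² > μ is a short calculation: in numpy's (n, p) convention the mean is n(1−p)/p and the variance is n(1−p)/p², so p = μ/σ² and n = μ²/(σ² − μ). An illustrative check:

```python
import numpy as np

def nb_params(mean, var):
    """Negative binomial parameters (numpy's (n, p) convention)
    matching a target mean and variance; requires var > mean."""
    p = mean / var
    n = mean * mean / (var - mean)
    return n, p

rng = np.random.default_rng(1)
n, p = nb_params(mean=5.0, var=9.0)
samples = rng.negative_binomial(n, p, size=100_000)
# Sample mean and variance should land close to 5.0 and 9.0.
```

(Support is {0, 1, 2, ...}; shift by 1 if gaps must be strictly positive. As Meni notes, this covers only var > mean; the binomial handles the low-variance side.)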
Thanks again Meni, very helpful. SemanticMantis (talk) 22:31, 19 November 2015 (UTC)[reply]


November 20

Calculus formula on back of jacket

I saw a guy on a train wearing a fairly plain black jacket, but there was a mathematical formula embroidered on the back.

Does anyone recognize what this is and why would someone have it on their jacket? I'm guessing it's some math "in joke". Vespine (talk) 03:36, 20 November 2015 (UTC)[reply]

It's the Black-Scholes equation. As for what the intended message of putting it on a jacket was, your guess is as good as mine.--Antendren (talk) 08:29, 20 November 2015 (UTC)[reply]
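For reference, the Black–Scholes equation is usually written as the partial differential equation

```latex
\frac{\partial V}{\partial t}
  + \frac{1}{2}\sigma^2 S^2 \frac{\partial^2 V}{\partial S^2}
  + r S \frac{\partial V}{\partial S} - r V = 0
```

where V is the price of the option, S the price of the underlying asset, σ its volatility, and r the risk-free interest rate.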

November 21

Hello! I'm a first year university student, and while practicing integration I made up a question and tried to solve it using integration by parts. I got nowhere after a short while and, after looking the integral up, it turns out there is no elementary integral for the problem. However, integration by parts gives me this infinite series:

I was just wondering, is there a way to write this expression as an infinite sum? I can kind of see a pattern but I don't have enough experience with infinite sums to generate one (if it even exists!). Thanks for your help. 70.54.112.243 (talk) 04:15, 21 November 2015 (UTC)[reply]