
Talk:Alphabet (formal languages)

From Wikipedia, the free encyclopedia

This is the current revision of this page, as edited by Epachamo (talk | contribs) at 15:21, 23 May 2024 (Cardinality: Reply). The present address (URL) is a permanent link to this version.


Combine with Symbol


I recommend that this page be combined with the Symbol page. 97.65.82.66 (talk) 7:11, 8 September 2010

An argument against merging is that the article about "symbol" is fairly general and also covers uses outside the area of computer science. Also, the explanation in Symbol (formal) is not written in the style of a rigorous mathematical definition. In the context of (theoretical) computer science, just pointing to alphabet (computer science) eliminates the need for inlining the mathematical definition every time the term is used. Hermel (talk) 21:48, 13 September 2010 (UTC)[reply]
In no way should this be merged with symbol. I suspect that the commenter is not familiar with the huge amount of mathematics that is based on the concept of an alphabet. It's a foundational idea. Jason Quinn (talk) 12:00, 13 July 2011 (UTC)[reply]

Title


I dislike the current title Alphabet (computer science). I think "Alphabet (mathematics)" would be more appropriate. Other opinions? Jason Quinn (talk) 12:02, 13 July 2011 (UTC)[reply]

Definition


I recently changed the definition based on one given in a mathematical logic textbook (Ebbinghaus's Mathematical Logic). This definition agrees with what I usually see in mathematical texts. Logically there is no problem with using an infinite (even uncountable) alphabet. I suspect that at least some computer science texts require an alphabet to be finite by definition. It makes sense to me to use the more general definition in the article and, if necessary, to qualify statements made in a computer science context. Ebbinghaus also disallows an "empty alphabet", and the article now also adopts that, since I don't anticipate a problem with this. Correct me if you know better. Jason Quinn (talk) 12:11, 13 July 2011 (UTC) EDIT: Added link to the mentioned diff. Jason Quinn (talk) 14:01, 24 August 2016 (UTC)[reply]

Alphabets with ambiguous strings


Suppose an alphabet is the set {|,||} and we are interested in the string "|||". This string is ambiguous: it cannot be parsed in a unique way, since it factors as | followed by || or as || followed by |. Does someone here know what this class of "ambiguous" alphabets is called? I am curious about special sub-classes and so forth. Jason Quinn (talk) 12:14, 13 July 2011 (UTC)[reply]
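For concreteness, here is a minimal sketch (Python; all names are my own, not from any standard library) that enumerates the factorizations of "|||" over {|, ||}. In coding-theory terms, a set of strings over which some concatenation has more than one factorization fails to be a uniquely decodable code:

```python
def factorizations(s, symbols):
    """Enumerate every way to split s into a sequence of the given symbols."""
    if s == "":
        return [[]]
    result = []
    for sym in symbols:
        if s.startswith(sym):
            # Peel off one symbol and factor the remainder recursively.
            result.extend([sym] + rest
                          for rest in factorizations(s[len(sym):], symbols))
    return result

# "|||" admits three distinct factorizations over {"|", "||"}:
print(factorizations("|||", ["|", "||"]))
# [['|', '|', '|'], ['|', '||'], ['||', '|']]
```

A set of symbols over which every string factors uniquely would return at most one factorization for every input.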

Target audience?


Regarding a recent revert that eliminated some introductory notation material, I'm not convinced the removal makes the article better. Without it, the article assumes the reader is highly proficient in the technical aspects of formal languages, but I don't think that is the best decision here. I think this article can be made to educate readers of all levels. The article is in terrible shape right now because editors have effectively written it as if the topic were strings rather than alphabets themselves. The reverted Notation section also treats an alphabet as a derived concept rather than a fundamental one; that is, it assumes formal languages and strings have already been defined (and that the reader is familiar with them!) and then (implicitly) defines an alphabet from those two concepts. It can be done the other way around, where the alphabet is the fundamental thing and strings and formal languages are defined based upon it. I believe that is also the more common approach in textbooks. A major advantage of doing it this way is that it takes this intrinsically simple topic and keeps it accessible, whereas the other way around makes it very exclusive to people already in the know. But a person who is studying formal languages will instantly grasp what a formal alphabet is and have no need of this Wikipedia article. On the other hand, a person who is not an expert in higher mathematics might need this article and might need some basic help with notation (high schoolers, etc.). They could certainly learn about things like strings and applications of languages from this article if it's done right, since those can be introduced as defined via alphabets.

I have to say that the title "Alphabet (formal languages)" is troublesome and encourages editors to write this article with too narrow an audience in mind. I've considered "Alphabet (mathematics)", "Alphabet (logic)", and "Alphabet (computer science)" before, but those suffer from the same problem. Among all these, I like the current title best, as long as it's remembered that a diverse group of people will be reading this, and not just people who are neck deep in formal languages.

So who is our target audience here? Are we going to treat alphabets as fundamental objects or subservient ones? Jason Quinn (talk) 03:23, 28 March 2021 (UTC)[reply]

I would like it to be readable by computer science undergraduates, at least, among others. I don't think introducing mathematical formalisms about how to denote sets is particularly helpful in making it more readable to that audience, though. —David Eppstein (talk) 07:39, 28 March 2021 (UTC)[reply]
@Jason Quinn: In general, I appreciate your recent edits to make the article more readable to (e.g.) high schoolers. However, I believe we shouldn't repeat the basics of set theory (like methods to denote sets) here. You wouldn't explain set union here either, although it is even less basic than set representation. This was the motivation for my revert. - Jochen Burghardt (talk) 10:44, 28 March 2021 (UTC)[reply]

Combine with Terminal and nonterminal symbols


This article should be combined with Terminal and nonterminal symbols. I'm not convinced there is any difference between the two. Any naysayers? Epachamo (talk) 00:15, 10 July 2023 (UTC)[reply]

Yes, I object. Terminal and nonterminal symbols are a specific distinction among symbols of an alphabet when those symbols are used for different purposes as parts of a grammar. If any merge makes sense, it is to merge Terminal and nonterminal symbols into the article about formal grammars, where they are used. Alphabets are used all over the place in the theory of formal languages, combinatorics on words, string processing algorithms, and computational complexity theory, in contexts where there is no concept of a terminal or nonterminal, just symbols of an alphabet. —David Eppstein (talk) 00:29, 10 July 2023 (UTC)[reply]
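To illustrate the distinction described above, a small sketch (Python; all names hypothetical): the same symbols a and b can serve as a plain alphabet in a string algorithm with no grammar in sight, or as the terminal set of a grammar whose nonterminal S drives the rewriting:

```python
# Role 1: {"a", "b"} as a bare alphabet -- just a set of symbols.
ALPHABET = {"a", "b"}

def is_word(s):
    """True if s is a string over the alphabet (no grammar involved)."""
    return all(c in ALPHABET for c in s)

# Role 2: the same symbols as *terminals* of the grammar S -> aSb | epsilon,
# where "S" is a nonterminal that never appears in a finished word.
PRODUCTIONS = ["aSb", ""]

def expand(s, depth):
    """Rewrite the leftmost 'S' up to `depth` times; collect terminal strings."""
    if "S" not in s:
        return {s}
    if depth == 0:
        return set()
    out = set()
    for rhs in PRODUCTIONS:
        out |= expand(s.replace("S", rhs, 1), depth - 1)
    return out

print(sorted(expand("S", 4), key=len))
# ['', 'ab', 'aabb', 'aaabbb']
```

The terminal/nonterminal split only exists in the second role; in the first, there is nothing but an alphabet.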
That makes sense, thanks for taking the time to explain it to me. I rescind my proposal. Epachamo (talk) 14:28, 10 July 2023 (UTC)[reply]

Cardinality


First you say that an alphabet is a non-empty set, and then you say it may have any cardinality. By the way, sometimes it is allowed to be empty. --Andres (talk) 01:13, 23 May 2024 (UTC)[reply]

Source? Epachamo (talk) 15:21, 23 May 2024 (UTC)[reply]