Talk:Denotational semantics

Actors, concurrency etc

Hi. I have moved the material giving details on concurrency and actor semantics back to the page on Denotational semantics of the Actor model. This page has to give a high level overview of the subject. Actors, concurrency and interaction get a fair mention as it is, and there are links for those readers who want to know more. No offense intended to the editor who moved the material here. Sam (talk) 20:43, 9 June 2009 (UTC)[reply]

As you should know from your previous work, sequential and functional denotational semantics are special cases that are interesting mainly in how they illustrate and motivate the general case.75.10.249.77 (talk) 03:18, 11 June 2009 (UTC)[reply]

Current article is a mess

The current article is a mess. It treats all kinds of special cases in a disorganized fashion and doesn't properly develop and motivate the general case.67.180.94.147 (talk) 16:57, 13 June 2009 (UTC)[reply]

The problem seems to have been resolved.70.231.253.115 (talk) 19:13, 21 June 2009 (UTC)[reply]
Perhaps unfortunately, the current state of the subject is a little disorganized, so this is unavoidable. The idea that the Actor Model is the "general case" is a valid position but it is not universally held. Sam (talk) 18:16, 2 September 2009 (UTC)[reply]
Unfortunately, Sam has imposed his very narrow notion of compositionality in terms of the composition of continuous functions. This is entirely insufficient in general. Someone should restore the more general case and include Sam's new material as a special case.71.198.220.76 (talk) 19:45, 2 September 2009 (UTC)[reply]
As I have written, the approach is concerned with composition of morphisms in a category, and has little to do with continuous functions per se. This is the general approach to denotational semantics that is taken by Abramsky, Girard, Gunter, Hennessy, Plotkin, Stoy, Tennent, Winskel, and many others. It is the style of denotational semantics that is used in logic, especially categorical logic, see for instance the books by Lambek and Scott, Jacobs, Johnstone. You might be interested to note that Hildebrandt uses this style in his semantics of fairness. It is certainly not a narrow notion, nor a special case. Sam (talk) 19:33, 3 September 2009 (UTC)[reply]
I wonder whether your ideas might actually fit into this framework... I will think about it. Sam (talk) 19:39, 3 September 2009 (UTC)[reply]
The undue emphasis on the actor model comes from the days when Carl Hewitt was editing here. Feel free to give it proper WP:WEIGHT. The "further reading" list could use some filtering as well. You are certainly more knowledgeable than me in this area. Pcap ping 23:54, 6 September 2009 (UTC)[reply]

Compositionality

The article mentions that the denotations of programs should be compositional in terms of their subphrases but doesn't explain how this is accomplished.67.180.94.147 (talk) 17:09, 13 June 2009 (UTC)[reply]

Compositionality previously appeared in the article as follows:
Compositionality in programming languages
An important aspect of denotational semantics of programming languages is compositionality, by which the denotation of a program is constructed from the denotations of its parts. For example, consider the expression "<expression1> + <expression2>". Compositionality in this case means providing a meaning for "<expression1> + <expression2>" in terms of the meanings of <expression1> and <expression2>.
Scott and Strachey [1971] proposed that the semantics of programming languages be reduced to the semantics of the lambda calculus and thus inherit the denotational semantics of the lambda calculus. However, it turned out that general computation could not be implemented in the lambda calculus (see Indeterminacy in concurrent computation). Thus there arose the problem of how to provide modular denotational semantics for concurrent programming languages. One solution to this problem is to use Eval messages with an environment so that programs obtain their denotational semantics from the methods explained earlier in this article.
Environments
Environments hold the bindings of identifiers. When an environment is sent a Lookup message for an identifier x, it returns the latest (lexical) binding of x.
As an example of how this works, consider the lambda expression <L> below, which implements a tree data structure when supplied with parameters for a leftSubTree and rightSubTree. When such a tree is given the parameter message "getLeft", it returns leftSubTree; likewise, when given the message "getRight", it returns rightSubTree.
                 λ(leftSubTree, rightSubTree)
                     λ(message)
                            if (message == "getLeft") then leftSubTree
                            else if (message == "getRight") then rightSubTree
Consider what happens when an expression of the form "(<L> 1 2)" is sent an Eval message with environment E. One semantics for application expressions such as this one is the following: <L>, 1 and 2 are each sent Eval messages with environment E. The integers 1 and 2 immediately reply to the Eval message with themselves.
However, <L> responds to the Eval message by creating a closure C that has a body <L> and environment E. "(<L> 1 2)" then sends C the message [1 2].
When C receives the message [1 2], it creates a new environment F which behaves as follows:
  1. When it receives a Lookup message for the identifier leftSubTree, it responds with 1
  2. When it receives a Lookup message for the identifier rightSubTree, it responds with 2
  3. When it receives a Lookup message for any other identifier, it forwards the Lookup message to E
C then sends an Eval message with environment F to the following:
                 λ(message)
                       if (message == "getLeft") then leftSubTree
                       else if (message == "getRight") then rightSubTree
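The Eval/Lookup walkthrough above can be sketched in Python. This is only an illustration of the mechanism described in the extract; the names Env, Closure, lookup, and apply are inventions for this sketch, not from any published Actor semantics.

```python
# Minimal sketch of the environment/closure evaluation described above.
# Env, Closure, and all method names are hypothetical illustrations.

class Env:
    """Holds identifier bindings; forwards failed Lookups to a parent Env."""
    def __init__(self, bindings, parent=None):
        self.bindings, self.parent = dict(bindings), parent

    def lookup(self, identifier):
        if identifier in self.bindings:          # rules 1 and 2: local binding
            return self.bindings[identifier]
        if self.parent is not None:              # rule 3: forward the Lookup to E
            return self.parent.lookup(identifier)
        raise NameError(identifier)

class Closure:
    """Closure C: a parameter list and body paired with environment E."""
    def __init__(self, params, body, env):
        self.params, self.body, self.env = params, body, env

    def apply(self, args):
        # Receiving the argument message creates the new environment F,
        # binding parameters to arguments and chaining to E.
        return self.body(Env(zip(self.params, args), parent=self.env))

# The tree example <L>: the outer lambda returns the inner one as a closure.
E = Env({})
L = Closure(
    ["leftSubTree", "rightSubTree"],
    lambda F: Closure(
        ["message"],
        lambda G: G.lookup("leftSubTree") if G.lookup("message") == "getLeft"
                  else G.lookup("rightSubTree"),
        F),
    E)

tree = L.apply([1, 2])           # "(<L> 1 2)": F binds leftSubTree=1, rightSubTree=2
print(tree.apply(["getLeft"]))   # → 1
print(tree.apply(["getRight"]))  # → 2
```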
Arithmetic expressions
For another example, consider the expression "<expression1> + <expression2>", which has the subexpressions <expression1> and <expression2>. When the composite expression receives an Eval message with an environment E and a customer C, it sends Eval messages to <expression1> and <expression2> with environment E and a newly created customer C0. When C0 has received back two values N1 and N2, it sends C the value N1 + N2.
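A hedged sketch of this customer-style evaluation, assuming a thread per subexpression and a Python queue standing in for the newly created customer C0 (the function and variable names are illustrative):

```python
# Sketch: evaluate "<expression1> + <expression2>" with a customer.
# The queue c0 plays the role of the newly created customer C0.
import threading, queue

def eval_plus(eval1, eval2, customer):
    c0 = queue.Queue()
    t1 = threading.Thread(target=lambda: c0.put(eval1()))
    t2 = threading.Thread(target=lambda: c0.put(eval2()))
    t1.start(); t2.start()
    n1, n2 = c0.get(), c0.get()   # C0 receives back two values N1 and N2 ...
    customer(n1 + n2)             # ... and sends C the value N1 + N2

received = []
eval_plus(lambda: 1 + 1, lambda: 3 * 4, received.append)
print(received)  # → [14]
```

Because addition is commutative, the result is independent of which subexpression finishes first.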
Other programming language constructs
The denotational compositional semantics presented above is very general and can be used for functional, imperative, concurrent, logic, etc. programs.[1] For example, it easily provides denotational semantics for constructs that are difficult to formalize using other approaches, such as delays and futures.
205.158.58.226 (talk) 22:49, 14 June 2009 (UTC)[reply]
Good point. I've had a go at explaining the usual approach to compositionality. This is quite general and might even include your ideas to a certain extent. Sam (talk) 18:16, 2 September 2009 (UTC)[reply]

Progressivity

Progressivity is different from constructivism. Constructivism is concerned with whether domain elements can be shown to exist by constructive methods. Progressivity[2] is concerned with the time progression of systems.67.180.94.147 (talk) 17:25, 13 June 2009 (UTC)[reply]

  1. ^ Carl Hewitt. Common sense for concurrency and strong paraconsistency using unstratified inference and reflection Arxiv 0812.4852. 2008.
  2. ^ "What is Commitment? Physical, Organizational, and Social" LNAI 4386. Springer-Verlag. 2007.


Progressivity previously appeared in the article as follows:
  1. Progressivity:[1] For each system S, there is a progressionS function for the semantics such that the denotation (meaning) of a system S is ⊔i∈ω progressionSi(⊥S), where ⊥S is the initial configuration of S.
205.158.58.226 (talk) 22:37, 14 June 2009 (UTC)[reply]
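As a toy illustration of a denotation of the form ⊔i∈ω progressionSi(⊥S), one can iterate a progression function from an initial configuration until the chain stabilises. The "system" below (reachable states 0..5) is entirely my own hypothetical example, not from the cited reference:

```python
# Toy fixed-point iteration: the "denotation" is the limit of the chain
# bottom, progression(bottom), progression^2(bottom), ...
# This assumes the chain stabilises after finitely many steps.

def limit(progression, bottom):
    config = bottom
    while True:
        nxt = progression(config)
        if nxt == config:      # chain has stabilised: the least fixed point
            return config
        config = nxt

# Hypothetical system: each step adds the successors of states below 5.
step = lambda s: frozenset(s | {x + 1 for x in s if x < 5})
print(sorted(limit(step, frozenset({0}))))  # → [0, 1, 2, 3, 4, 5]
```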

Denotational Semantics of Exceptions (was Division by Zero)

User 162.135.0.12 added statements about division by zero being a problem for compositionality in denotational semantics. It is true that division by zero has to be handled carefully in computer science. As far as I am aware, though, this is not generally regarded as one of the most serious problems for denotational semantics. I think that mentioning it here gives it undue weight. If I am mistaken, please provide a reference to a textbook. Sam (talk) 07:58, 15 September 2009 (UTC)[reply]

Sam, Unfortunately, you are dead wrong. Exceptions are crucial to Software Engineering. You might want to take a tutorial on Software Engineering. --70.132.17.91 (talk) 02:11, 16 September 2009 (UTC)[reply]
Of course, I know that exceptions are extremely important in programming and software engineering! I am just not sure that they have historically played a crucial and fundamental role in denotational semantics. (We can handle exceptions with the exceptions monad, for instance.) Sam (talk) 07:11, 16 September 2009 (UTC)[reply]
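For readers unfamiliar with the term, the exceptions monad Sam mentions can be sketched in Python as an Either-style tagged value; the names unit, bind, div, and example are illustrative, not standard API:

```python
# Sketch of the exceptions monad: a computation is ("ok", value) or ("err", e).

def unit(v):
    return ("ok", v)

def bind(m, k):
    """Sequence a computation: propagate an exception, else apply k."""
    tag, _ = m
    return m if tag == "err" else k(m[1])

def div(n, d):
    return ("err", "division by zero") if d == 0 else ("ok", n // d)

# Evaluate "(10 / x) + (10 / y)" monadically; note how bind fixes a
# left-to-right evaluation order (the sequentiality at issue in this thread).
def example(x, y):
    return bind(div(10, x), lambda a:
           bind(div(10, y), lambda b:
           unit(a + b)))

print(example(2, 5))  # → ('ok', 7)
print(example(2, 0))  # → ('err', 'division by zero')
```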
Sam: Unfortunately, use of the exceptions monad forces sequential execution, which is unacceptable. You are mistaken in your assertion that the functional approach to compositionality of programs works in general.70.132.17.91 (talk) 22:24, 16 September 2009 (UTC)[reply]
Yes, I know that in the concurrent context, exceptions are difficult to get right in a language design. I didn't mean that the simple exceptions monad was a universal solution! However, I think the page is at a stage that reflects the current status of textbooks, undergraduate/graduate courses etc. and in my view this is what is needed in a wikipedia page (regardless of how non-general this might be). There is definitely some academic research still to be done, though. Can we continue this discussion somewhere else, more appropriate? eg by email or on your blog? I would very much like to understand your objections to the conventional notions of compositionality, and I don't understand why you are referring to this as "functional". I would also be interested in discussing semantics for ActorScript. Cheers, Sam (talk) 18:57, 17 September 2009 (UTC)[reply]
If you knew that the exceptions monad doesn't work, why did you suggest it? There is no excuse for Wikipedia to have obsolete incorrect information. You might contact Professor Hewitt with your questions about ActorScript. 70.132.17.91 (talk) 22:18, 17 September 2009 (UTC)[reply]
Uh? The exceptions monad is not obsolete or incorrect. Sam (talk) 10:24, 18 September 2009 (UTC)[reply]
Sam,
Unfortunately, you are wrong again. The exceptions monad is not faithful because it forces sequential execution. For example, using the exceptions monad for an expression of the form f(E1, ..., En) forces sequential execution of E1, ..., En. 12.234.41.239 (talk) 18:07, 18 September 2009 (UTC)[reply]
But that is exactly the behaviour specified by modern mainstream languages such as Java or C#. Sure, some languages are deliberately nondeterministic (often for rather dubious reasons), but that's a different issue. Hqb (talk) 18:55, 18 September 2009 (UTC)[reply]
Unfortunately, the above opinion has been rendered totally obsolete because of the advent of many-core architecture. Also, it is very important not to confuse concurrency with nondeterministic computation. 12.234.41.239 (talk) 19:12, 18 September 2009 (UTC)[reply]
I'm not sure what "opinion" you refer to there. It's a fact that Java and C# explicitly require left-to-right evaluation (or do you dispute that?), and I wouldn't expect that aspect of their specification to change in the foreseeable future. (Again, if you have well-sourced information to the contrary, please cite.) The point is, as long as exception monads faithfully model the semantics of today's most popular languages, it's hard to dismiss them as "obsolete" without getting into crystal ball territory. Hqb (talk) 19:46, 18 September 2009 (UTC)[reply]

It is widely recognized that the many-core revolution has rendered Java and C# obsolete. So a new generation of programming languages (e.g. ActorScript) has been developed that integrates local and nonlocal computation for client-cloud computing.12.234.41.239 (talk)

Monads in general are not rendered obsolete by the ever-increasing ubiquity of parallel computing. To the contrary, GHC can leverage the provable state separation between different monads to automatically parallelize execution; see for instance [1]. There are quite a few proposals in this area. An overview of the more practical ones is here. So I'm not sure what Carl Hewitt's point is. Do you want exceptions to run concurrently with the code/thread that generated them? What would be the purpose of that? Pcap ping 00:46, 19 September 2009 (UTC)[reply]
The consensus is that a procedure invocation should evaluate its arguments concurrently. In ActorScript notation, this means that f(E1, ..., En) is equivalent to f <- [E1, ..., En]. Unfortunately, monads are obsolete.71.198.220.76 (talk) 18:12, 20 September 2009 (UTC)[reply]
"The consensus is that a procedure invocation should evaluate its arguments concurrently". Probably true in your research group, but I strongly doubt such universal consensus exists even amongst PLT researchers. You can easily prove me wrong: start a thread on LtU on this. Pcap ping 18:44, 20 September 2009 (UTC)[reply]
In this day and age, there is no realistic alternative to concurrent execution. 71.198.220.76 (talk) 20:17, 21 September 2009 (UTC)[reply]

Material in "Compositionality in Programming Language" removed by Sam Staton

(Article extract moved to /Compositionality in Programming Language to keep this page manageable. Piet Delport (talk) 2009-10-22 23:14)

Reply to Carl

Hi, as you know, wikipedia has to report the generally accepted state of the art, regardless of what the most general case or the very best approach might be. This means that we have to report what is in the textbooks and the university courses, and in the style that is usually reported at scientific conferences, rather than what is in your very recent preprints. We can discuss how things should be at a different forum -- you can come to visit Cambridge if you like. Anyway, that is why I'm about to put the page back to how it was before.

Let me just point out that the usual way of handling compositionality, that is used by almost all researchers in denotational semantics, can be made so general as to be almost vacuous. I agree that classical domain theory doesn't work for everything, that is well known, but the usual approach to compositional semantics has actually nothing to do with functions or domains. Whenever you have a compositional denotational semantics, you can fit it into this framework, as follows.

Consider an untyped language for simplicity. Then build a category: objects are natural numbers. A morphism m -> n is an n-tuple of programs that have m free variables. We now equate two morphisms m -> n if the corresponding programs have the same denotations (so a morphism can be thought of as a denotation). The composition of two morphisms is built by substituting programs for variables.

This all makes sense provided the semantics respect substitution, roughly:

  • if〚 x ⊢ P 〛 = 〚 x ⊢ P' 〛 and 〚 y ⊢ Q 〛 = 〚 y ⊢ Q' 〛 then 〚 y ⊢ P[Q/x] 〛= 〚 y ⊢ P'[Q'/x] 〛.
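As a toy illustration (my own, with invented term constructors, not from this discussion) of the substitution property Sam states, here is a tiny arithmetic language whose environment-based denotations respect substitution:

```python
# Terms: ("num", n) | ("var", x) | ("add", t1, t2); a denotation is the
# function from environments to values computed by denote. Names are mine.

def denote(term, env):
    tag = term[0]
    if tag == "num":
        return term[1]
    if tag == "var":
        return env[term[1]]
    return denote(term[1], env) + denote(term[2], env)

def subst(term, x, q):
    """P[Q/x]: replace the free variable x in term by the term q."""
    tag = term[0]
    if tag == "num":
        return term
    if tag == "var":
        return q if term[1] == x else term
    return ("add", subst(term[1], x, q), subst(term[2], x, q))

# P = x + 0 and P' = x have equal denotations; so do Q = y + 1 and Q' = 1 + y.
P, P2 = ("add", ("var", "x"), ("num", 0)), ("var", "x")
Q, Q2 = ("add", ("var", "y"), ("num", 1)), ("add", ("num", 1), ("var", "y"))

envs = [{"x": v, "y": w} for v in range(-2, 3) for w in range(-2, 3)]
assert all(denote(P, e) == denote(P2, e) for e in envs)
assert all(denote(Q, e) == denote(Q2, e) for e in envs)
# ... hence P[Q/x] and P'[Q'/x] also have equal denotations:
assert all(denote(subst(P, "x", Q), e) == denote(subst(P2, "x", Q2), e)
           for e in envs)
print("substitution respected")
```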

Do you see what I mean? Sam (talk) 20:45, 16 October 2009 (UTC)[reply]

We learned that the functional approach is inadequate for expressions like "return also become(contents=newContents)" (see above). Of course general computation does not respect substitution.63.249.108.250 (talk) 20:10, 17 October 2009 (UTC)[reply]
I think you are saying that, in your opinion, denotational semantics should not respect substitution. I have never heard anyone else say this; I have never read it in a textbook; I have never seen it in a published research paper. It's not fair to rewrite the article based on this idea. (Of course, you have a right to maverick ideas, and I would like to hear them. But this is not the place.) Sam (talk) 11:38, 19 October 2009 (UTC)[reply]
Sam: The student is correct and you are wrong. They are correct in saying Computation does not respect substitution. The approach that you are advocating has no chance with expressions like "return also become(contents=newContents)" 70.231.250.190 (talk) 14:12, 19 October 2009 (UTC)[reply]
Hi. I'm not trying to advocate any approaches, I'm just reporting what is the generally accepted as constituting "denotational semantics". That is what this wikipedia article has to focus on. I'm afraid I don't understand the precise meaning of "computation does not respect substitution". You may be right, but unfortunately what is "right" or "most general" is not the issue here. We should talk about it somewhere else. Sam (talk) 16:03, 19 October 2009 (UTC)[reply]

The part of the article that you deleted explains why the functional approach that you advocate is inadequate. Furthermore, the sentence (above) that you provided:

  • if〚 x ⊢ P 〛 = 〚 x ⊢ P' 〛 and 〚 y ⊢ Q 〛 = 〚 y ⊢ Q' 〛 then 〚 y ⊢ P[Q/x] 〛= 〚 y ⊢ P'[Q'/x] 〛

is nonsense. Why are you censoring the general approach that everyone can understand? —Preceding unsigned comment added by 99.29.247.230 (talk) 17:09, 19 October 2009 (UTC)[reply]

I am not advocating anything, I am not censoring anything; I am trying to make an article that summarizes what is generally understood as "denotational semantics". I have ascertained this by looking at textbooks, university courses, reading academic papers, etc. If you have published your criticisms of the state of the subject and your new ideas, we could add a brief paragraph summarizing them. Readers can then read the references to find out more.
(Are you complaining that I have missed the indices out of the sentence? If so, here is the unabbreviated version:
  • if 〚 x1...xn ⊢ P 〛 = 〚 x1...xn ⊢ P' 〛 and 〚 y1...ym ⊢ Q1 〛 = 〚 y1...ym ⊢ Q'1 〛 ... and 〚 y1...ym ⊢ Qn 〛 = 〚 y1...ym ⊢ Q'n 〛
then 〚 y1...ym ⊢ P[Q1/x1]...[Qn/xn] 〛 = 〚 y1...ym ⊢ P'[Q'1/x1]...[Q'n/xn] 〛.
Here 〚 x1...xn ⊢ P 〛 means "the meaning of program P, which has free variables x1...xn", as usual. P[Q1/x1] means "substitute Q1 for the free variable x1 in P", as usual.) Sam (talk) 17:46, 19 October 2009 (UTC)[reply]

People are frustrated because you keep evading the issues that they have raised.

Sorry, I don't understand. Which issue?

The notation that you are using is quite awkward and inelegant by comparison with the use of environments. Furthermore, environments are needed for programming languages like JavaScript.

Also, the version of the substitution rule that you are advocating is too simplistic and can produce incorrect conclusions. For example, 〚 x - x 〛 = 〚 0 〛, but if f is indeterminate, then 〚 f() - f() 〛 ≠ 〚 0 〛.

If x ranges over indeterminate computations, then I would not expect to have 〚 x ⊢ x - x 〛= 〚x ⊢ 0 〛.
Just to clarify, the editor above was correct: taken as a program expression, 〚 x - x 〛 = 〚 0 〛, but if f is indeterminate, then taken as a program expression 〚 f() - f() 〛 ≠ 〚 0 〛.
No, this depends on how you interpret the expressions. One way to interpret non-determinism here is by using powerset of integers as a domain. Then 〚 x ⊢ x - x 〛≠ 〚 x ⊢ 0 〛. Sam (talk) 13:45, 28 October 2009 (UTC)[reply]
It looks like you have made a mistake. If x is a number, then the program expression "x-x" means the same as "0" 12.234.41.221 (talk) 17:20, 30 October 2009 (UTC)[reply]
No, actually. The usual powerset interpretation of non-determinism would give us that 〚 x ⊢ x - x 〛 is the function f : P(Z) -> P(Z) given by f(X) = { m − n | m, n ∈ X }. On the other hand, 〚 x ⊢ 0 〛 is the function g : P(Z) -> P(Z) given by g(X) = { 0 }. These functions are not equal. Sam (talk) 12:38, 2 November 2009 (UTC)[reply]
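Sam's powerset interpretation can be sketched concretely; the function names below are mine, and sets of Python ints stand in for P(Z):

```python
# Each free occurrence of x is drawn independently from the input set X.

def denote_x_minus_x(X):
    # 〚 x ⊢ x - x 〛: f(X) = { m - n | m, n in X }
    return {m - n for m in X for n in X}

def denote_zero(X):
    # 〚 x ⊢ 0 〛: g(X) = { 0 }
    return {0}

# On a singleton the two agree, which is why "x - x" looks like "0" ...
print(denote_x_minus_x({5}))       # → {0}
# ... but on a nondeterministic input they differ, so f ≠ g.
print(denote_x_minus_x({1, 2}))    # → {-1, 0, 1}
print(denote_zero({1, 2}))         # → {0}
```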

Sam: Unfortunately, you have missed it again. If you work through the semantics in ActorScriptTM: Industrial strength integration of local and nonlocal concurrency for Client-cloud Computing ArXiv. 0907.3330, then the program expression "x-x" means essentially the same as "0" when x is a number. This is why compilers are able to optimize programs.

BTW, where did you get your ridiculous notation? From a program, the free identifiers should be immediately apparent, so why should you have to redundantly specify them?

Also, it seems incredible that you could have missed important concepts like environments (from Lisp, etc., in the 1960s) and customers (from Actors in the 1970s). You might want to consult your colleague Robin Milner, who is knowledgeable about these matters.192.18.37.228 (talk) 18:43, 2 November 2009 (UTC)[reply]

Also you have ignored program expressions like 〚 return also become(contents=newContents)〛. 171.66.107.147 (talk) 17:43, 24 October 2009 (UTC)[reply]
Yes, I would like to discuss this, but it's a research question, and Wikipedia isn't a forum for research. Sam (talk) 13:45, 28 October 2009 (UTC)[reply]
No research is needed. The solution has been published and seems quite intuitive.12.234.41.221 (talk) 17:20, 30 October 2009 (UTC)[reply]

Are you really so ignorant of environments and customers? This stuff has been published for decades.99.29.247.230 (talk) 20:17, 20 October 2009 (UTC)[reply]

Could you maybe send some published references on how environments and customers are used in denotational semantics? Then we can put them in the article.
As far as I know, there has been no extensive study of denotational semantics for JavaScript. Maybe you have a nice treatment in mind. I'm just saying, it's not fair to change the main thrust of the article on that basis. The article currently describes what is most commonly understood as constituting denotational semantics. Are you disputing that? Sam (talk) 18:17, 21 October 2009 (UTC)[reply]

Who is Carl?171.66.32.60 (talk) 19:28, 28 October 2009 (UTC)[reply]

I am wondering that myself.12.234.41.221 (talk) 17:20, 30 October 2009 (UTC)[reply]

I think they mean Professor Carl Hewitt and his colleagues. From his homepage, i found the following reference that seems to be relevant:

ActorScriptTM: Industrial strength integration of local and nonlocal concurrency for Client-cloud Computing arXiv:0907.3330 —Preceding unsigned comment added by 99.29.247.230 (talk) 17:56, 1 November 2009 (UTC)[reply]
(To clarify: these last comments by anonymous editors seem to have been written as a joke.) Sam (talk) 10:30, 21 November 2009 (UTC)[reply]

Semi-protection requested

I have requested semi-protection of this article, at least until Wikipedia:Sockpuppet investigations/CarlHewitt is cleared up.

To the anonymous user(s) above: please respect Wikipedia's policies of consensus, neutrality, and no original research. In particular, do not waste time trying to argue whether ideas are right or wrong: Wikipedia is not a debating ground, and the threshold for inclusion is verifiability, not truth.

Finally, if you are serious about contributing to Wikipedia, please consider reducing confusion by creating accounts: this allows everyone to keep track of who's saying what in discussions and collaboration. Piet Delport (talk) 2009-10-22 23:54

Reply to "madmediamaven"

I have just undone the recent edit by Madmediamaven. This seems to be a pseudonym of Carl Hewitt.

I feel I should respond to the edit summary "Sam Staton's point of view is extremely obsolete.". First, the current state of the article does not reflect any "point of view"; second, it is not "obsolete" in any way. This is the established state of the art as reported in textbooks, lectured in courses at universities around the world, and forms the basis for all research in denotational semantics, as published in journals and reported at conferences.

The edits that Madmediamaven wants to reinstate change the perspective of the article. They make the article entirely based around a recent unpublished paper by Carl Hewitt, that has not been peer reviewed. If these ideas are to appear in the article, we need references to published articles that form the basis of a body of research by more than one author.

To Carl: if you want to discuss your recent research and how it fits in with other research, we can do it by email or you can visit me in Cambridge. That would be interesting. Sam (talk) 10:50, 21 November 2009 (UTC)[reply]

Mediamaven is over at Media X. Hewitt is in CS.171.66.33.137 (talk) 02:07, 23 November 2009 (UTC)[reply]

It looks like Staton doesn't believe that denotational semantics should cover modern programming languages. What kind of sense does this make? BTW, how can I find the above referenced paper by Hewitt? —Preceding unsigned comment added by 68.127.102.201 (talk) 20:01, 23 November 2009 (UTC)[reply]

I think that the paper that you want has been published on arXiv Common sense for concurrency and strong paraconsistency using unstratified inference and reflection76.254.235.105 (talk) —Preceding undated comment added 21:07, 23 November 2009 (UTC).[reply]
Actually, the paper with denotational semantics is ActorScriptTM: Industrial strength integration of local and nonlocal concurrency for Client-cloud Computing arXiv:0907.3330. Maybe you can work something out. Sam Staton's position seems a little reactionary. The others seem a bit avant-garde but are more technically correct than Staton. Maybe you can meet somewhere in between instead of just reverting each other. --68.170.176.166 (talk) 23:52, 23 November 2009 (UTC)[reply]

Indeed, hopefully we can find some way to resolve this! and I am sorry if I am reactionary or technically incorrect!

To the anonymous editor(s): What is it that you think needs to be mentioned? Can we discuss it? There are some standard and basic ideas that are related to what is in the text I deleted: continuations, closures, etc. Possibly these things should be mentioned; I am not sure there needs to be much detail because there are other pages on these topics. My complaint is that, in trying to be as "general" and "modern" as possible, the text becomes entirely based on the recent unpublished papers by Carl Hewitt, written in the language of those papers, and sadly unreadable to anyone who is unfamiliar with Carl's latest outputs. Sam (talk) 20:44, 3 December 2009 (UTC)[reply]

(Just to clarify: when I said "vandalism" in my edit summary, I was talking about the person who wrote "i hate ben french", not the other anonymous editor who reverted my earlier edits; I mean no disrespect to them. Sam (talk) 20:46, 3 December 2009 (UTC))[reply]

Like many others, Hewitt publishes in arXiv, which is then picked up by Google Scholar and DBLP, so his research is well known in the field. So your objection that his work is "unpublished" seems off the mark. Hewitt's approach seems general enough to cover all modern programming languages including futures, concurrency, classes, instances, etc. If denotational semantics is going to be worth anything, then it has to be general enough to cover modern programming languages. Unfortunately, the current version of the article is entirely inadequate. So there would seem to be two alternatives: better explain Hewitt's approach or use a different one that has the requisite generality. —Preceding unsigned comment added by 171.66.84.169 (talk) 00:00, 4 December 2009 (UTC)[reply]

Hi. (It would be nice to know who I am talking to: is it Carl or someone else?) There is a lot of previous and ongoing work that uses the traditional and standard approaches to denotational semantics, as reported in this article. Carl Hewitt's very recent paper is the first place I have ever seen that suggests that we tear up the standard notions of compositionality. We don't yet know what the community as a whole makes of Carl's output. It is too soon, but as far as I can tell the paper has not been peer-reviewed and nobody has cited it yet. At this moment in time, it is not fair on the subject or on the semantics community, to rewrite the whole article on the basis of this one paper. It is better that we stick to the common, generally accepted material.
There is nothing in the wikipedia guidelines to require that articles have to be centered around the most general solution, nor that they should be written from the point of view of the very latest research. There are other forums where arguments about generality and inadequacy should be made; not here. I would be interested to discuss the full generality of Carl's paper and how it fits in with earlier work, but we will have to do that somewhere else. You're welcome to send me an email. Sam (talk) 20:05, 4 December 2009 (UTC)[reply]
It seems odd to be waiting around to notice that someone has put in a reference to Hewitt's paper. How do we know that it hasn't already happened? The paper seems to be very well known.99.29.247.230 (talk) 19:04, 5 December 2009 (UTC)[reply]
Hewitt did not suggest that we "tear up the standard notions of compositionality." For example, his approach gives the same semantics to an applicative expression f(E1,E2) in the case covered by the restricted semantics favored by Staton. However, it also gives a compositional semantics for f(E1,E2) in the case of exceptions that may arise in the concurrent evaluation of E1 and E2. —Preceding unsigned comment added by 68.170.178.152 (talk) 23:53, 5 December 2009 (UTC)[reply]

If it can't do modern programming languages what good is it?65.106.72.229 (talk) 03:08, 8 December 2009 (UTC)[reply]

Editor Sam Staton is being obstructionist. It is absolutely necessary for the article to describe the subject in generality. Appeals to the mythical glories of obsolete earlier work do not suffice. Why does he think that Hewitt's approach is unnatural and difficult to understand?205.248.100.252 (talk) 03:23, 9 December 2009 (UTC)[reply]
Actually, there's nothing requiring the article to describe the subject "in generality" (much less making it "absolutely necessary"). The relevant (and mandatory for all editors) policy here is that of Neutral Point of View, which requires including all "significant views that have been published by reliable sources". This *must* be done "in proportion to the prominence of each" (see WP:Weight), which means that less frequent views must be given less coverage. If a new, more general definition of Denotational semantics is found by a researcher, it can be included in the article as an addendum, but it *must not* be used as the basis of the article. This is by Wikipedia policy, and in fact is one of the five founding pillars of Wikipedia. Diego (talk) 15:42, 10 December 2009 (UTC)[reply]

Am I "obstructionist"? I am simply saying this: It is only necessary for the article to describe the subject as it is widely understood, as it is taught worldwide and in the form that it is being investigated. Wikipedia is not the right place for you to convince others about the generality and importance of your new ideas, or for you to argue that the standard and widely used ideas are wrong. I would like to hear these arguments and have this discussion, but in a more appropriate place. Sam (talk) 02:17, 10 December 2009 (UTC)[reply]

  1. ^ Cite error: The named reference Hewitt06 was invoked but never defined (see the help page).