Sign language
Sign languages (also known as signed languages) are languages that use the visual-manual modality to convey meaning. Sign languages are expressed through manual articulations in combination with non-manual elements. Sign languages are full-fledged natural languages with their own grammar and lexicon.[1] Sign languages are not universal and they are not mutually intelligible with each other,[2] although there are also striking similarities among sign languages.
Linguists consider both spoken and signed communication to be types of natural language, meaning that both emerged through an abstract, protracted aging process and evolved over time without meticulous planning.[3] Sign language should not be confused with body language, a type of nonverbal communication.
Wherever communities of deaf people exist, sign languages have developed as useful means of communication, and they form the core of local Deaf cultures. Although signing is used primarily by the deaf and hard of hearing, it is also used by hearing individuals, such as those unable to physically speak, those who have trouble with spoken language due to a disability or condition (augmentative and alternative communication), or those with deaf family members, such as children of deaf adults.
It is unclear how many sign languages currently exist worldwide. Each country generally has its own native sign language, and some have more than one. The 2021 edition of Ethnologue lists 150 sign languages,[4] while the SIGN-HUB Atlas of Sign Language Structures lists over 200 of them and notes that there are more which have not been documented or discovered yet.[5]
Some sign languages have obtained some form of legal recognition.[6]
Linguists distinguish natural sign languages from other systems that are precursors to them or obtained from them, such as invented manual codes for spoken languages, home sign, "baby sign", and signs learned by non-human primates.
History
Groups of deaf people have used sign languages throughout history. One of the earliest written records of a sign language is from the fifth century BC, in Plato's Cratylus, where Socrates says: "If we hadn't a voice or a tongue, and wanted to express things to one another, wouldn't we try to make signs by moving our hands, head, and the rest of our body, just as dumb people do at present?"[7]
Until the 19th century, most of what is known about historical sign languages is limited to the manual alphabets (fingerspelling systems) that were invented to facilitate transfer of words from a spoken language to a sign language, rather than documentation of the language itself. Pedro Ponce de León (1520–1584) is said to have developed the first manual alphabet.[8]
In 1620, Juan Pablo Bonet published Reducción de las letras y arte para enseñar a hablar a los mudos (‘Reduction of letters and art for teaching mute people to speak’) in Madrid.[9] It is considered the first modern treatise of sign language phonetics, setting out a method of oral education for deaf people and a manual alphabet.
In Britain, manual alphabets were also in use for a number of purposes, such as secret communication,[10] public speaking, or communication by deaf people.[11] In 1648, John Bulwer described "Master Babington", a deaf man proficient in the use of a manual alphabet, "contryved on the joynts of his fingers", whose wife could converse with him easily, even in the dark through the use of tactile signing.[12]
In 1680, George Dalgarno published Didascalocophus, or, The deaf and dumb mans tutor,[13] in which he presented his own method of deaf education, including an "arthrological" alphabet, where letters are indicated by pointing to different joints of the fingers and palm of the left hand. Arthrological systems had been in use by hearing people for some time;[14] some have speculated that they can be traced to early Ogham manual alphabets.[15][16]
The vowels of this alphabet have survived in the modern alphabets used in British Sign Language, Auslan and New Zealand Sign Language. The earliest known printed pictures of consonants of the modern two-handed alphabet appeared in 1698 with Digiti Lingua (Latin for Language [or Tongue] of the Finger), a pamphlet by an anonymous author who was himself unable to speak.[17][18] He suggested that the manual alphabet could also be used by mutes, for silence and secrecy, or purely for entertainment. Nine of its letters can be traced to earlier alphabets, and 17 letters of the modern two-handed alphabet can be found among the two sets of 26 handshapes depicted.
Charles de La Fin published a book in 1692 describing an alphabetic system where pointing to a body part represented the first letter of the part (e.g. Brow=B), and vowels were located on the fingertips as with the other British systems.[19] He described such codes for both English and Latin.
By 1720, the British manual alphabet had found more or less its present form.[20] Descendants of this alphabet have been used by deaf communities (or at least in classrooms) in former British colonies India, Australia, New Zealand, Uganda and South Africa, as well as the republics and provinces of the former Yugoslavia, Grand Cayman Island in the Caribbean, Indonesia, Norway, Germany and the United States. During the Polygar Wars against the British, Veeran Sundaralingam communicated with Oomaithurai, the mute younger brother of Veerapandiya Kattabomman, using their own sign language; both men died in 1801.
Frenchman Charles-Michel de l'Épée published his manual alphabet in the 18th century, which has survived largely unchanged in France and North America until the present time. In 1755, Abbé de l'Épée founded the first school for deaf children in Paris; Laurent Clerc was arguably its most famous graduate. Clerc went to the United States with Thomas Hopkins Gallaudet to found the American School for the Deaf in Hartford, Connecticut, in 1817.[21][22] Gallaudet's son, Edward Miner Gallaudet, founded a school for the deaf in 1857 in Washington, D.C., which in 1864 became the National Deaf-Mute College. Now called Gallaudet University, it is still the only liberal arts university for deaf people in the world.
Sign languages generally do not have any linguistic relation to the spoken languages of the lands in which they arise. The correlation between sign and spoken languages is complex and varies depending on the country more than the spoken language. For example, Australia, Canada, New Zealand, the UK and the US all have English as their dominant language, but American Sign Language (ASL), used in the US and English-speaking Canada, is derived from French Sign Language[22] whereas the other three countries use varieties of British, Australian and New Zealand Sign Language, which is unrelated to ASL.[23] Similarly, the sign languages of Spain and Mexico are very different, despite Spanish being the national language in each country,[24] and the sign language used in Bolivia is based on ASL rather than any sign language that is used in any other Spanish-speaking country.[25] Variations also arise within a 'national' sign language which don't necessarily correspond to dialect differences in the national spoken language; rather, they can usually be correlated to the geographic location of residential schools for the deaf.[26][27]
International Sign, formerly known as Gestuno, is used mainly at international deaf events such as the Deaflympics and meetings of the World Federation of the Deaf. While recent studies claim that International Sign is a kind of pidgin, they conclude that it is more complex than a typical pidgin and indeed is more like a full sign language.[28][29] While the more commonly used term is International Sign, it is sometimes referred to as Gestuno,[30] International Sign Pidgin,[29] or International Gesture (IG).[31] International Sign is the term used by the World Federation of the Deaf and other international organisations.
Linguistics
In linguistic terms, sign languages are as rich and complex as any spoken language, despite the common misconception that they are not "real languages". Professional linguists have studied many sign languages and found that they exhibit the fundamental properties that exist in all languages.[32][1][33] Such fundamental properties include duality of patterning[34] and recursion.[35] Duality of patterning means that languages are composed of smaller, meaningless units which can be combined into larger units with meaning (see below). Recursion means that languages exhibit grammatical rules whose output can serve as input to the same rule. It is, for example, possible in sign languages to create subordinate clauses, and a subordinate clause may itself contain another subordinate clause.
Sign languages are not mime—in other words, signs are conventional, often arbitrary and do not necessarily have a visual relationship to their referent, much as most spoken language is not onomatopoeic. While iconicity is more systematic and widespread in sign languages than in spoken ones, the difference is not categorical.[36] The visual modality allows the human preference for close connections between form and meaning, present but suppressed in spoken languages, to be more fully expressed.[37] This does not mean that sign languages are a visual rendition of a spoken language. They have complex grammars of their own and can be used to discuss any topic, from the simple and concrete to the lofty and abstract.
Sign languages, like spoken languages, organize elementary, meaningless units called phonemes into meaningful semantic units. (These were once called cheremes, from the Greek word for "hand", in the case of sign languages, by analogy to the phonemes, from Greek for "voice", of spoken languages, but now also called phonemes, since the function is the same.) This is often called duality of patterning. As in spoken languages, these meaningless units are represented as (combinations of) features, although crude distinctions are often also made in terms of handshape (or handform), orientation, location (or place of articulation), movement, and non-manual expression.[38] More generally, both sign and spoken languages share the characteristics that linguists have found in all natural human languages, such as transitoriness, semanticity, arbitrariness, productivity, and cultural transmission.[clarification needed]
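As a rough illustration of duality of patterning in this parametric view, the sketch below (a hypothetical model, not a tool used in sign language research) represents a sign as a bundle of the parameters just listed and tests whether two signs form a minimal pair differing in exactly one parameter; the glosses and parameter values are invented examples.

```python
# A hypothetical, simplified model of a sign as a bundle of the parameters
# named above: handshape, orientation, location, movement, non-manual.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Sign:
    gloss: str                         # conventional label for the sign
    handshape: str
    orientation: str
    location: str                      # place of articulation
    movement: str
    non_manual: Optional[str] = None   # facial expression, mouthing, etc.

def minimal_pair(x: Sign, y: Sign) -> bool:
    """True if the two signs differ in exactly one phonological parameter."""
    params = ["handshape", "orientation", "location", "movement", "non_manual"]
    return sum(getattr(x, p) != getattr(y, p) for p in params) == 1

# Invented example: two signs identical except for their location.
a = Sign("EXAMPLE-A", "flat", "palm-down", "chin", "straight")
b = Sign("EXAMPLE-B", "flat", "palm-down", "forehead", "straight")
print(minimal_pair(a, b))  # True: meaningless units recombine into new meanings
```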
Common linguistic features of many sign languages are the occurrence of classifier constructions, a high degree of inflection by means of changes of movement, and a topic-comment syntax. More than spoken languages, sign languages can convey meaning by simultaneous means, e.g. by the use of space, two manual articulators, and the signer's face and body. Though there is still much discussion on the topic of iconicity in sign languages, classifiers are generally considered to be highly iconic, as these complex constructions "function as predicates that may express any or all of the following: motion, position, stative-descriptive, or handling information".[39] Note, however, that the term classifier is not used by everyone working on these constructions; across the field of sign language linguistics the same constructions are also referred to with other terms.[which?]
Today, linguists study sign languages as true languages, part of the field of linguistics. However, the category "sign languages" was not added to the Linguistic Bibliography / Bibliographie Linguistique until the 1988 volume,[40] when it appeared with 39 entries.
Relationships with spoken languages
There is a common misconception that sign languages are somehow dependent on spoken languages: that they are spoken language expressed in signs, or that they were invented by hearing people.[41] Similarities in language processing in the brain between signed and spoken languages further perpetuated this misconception. Hearing teachers in deaf schools, such as Charles-Michel de l'Épée or Thomas Hopkins Gallaudet, are often incorrectly referred to as "inventors" of sign language. Instead, sign languages, like all natural languages, are developed by the people who use them, in this case, deaf people, who may have little or no knowledge of any spoken language.
As a sign language develops, it sometimes borrows elements from spoken languages, just as all languages borrow from other languages that they are in contact with. Sign languages vary in how and how much they borrow from spoken languages. In many sign languages, a manual alphabet (fingerspelling) may be used in signed communication to borrow a word from a spoken language, by spelling out the letters. This is most commonly used for proper names of people and places; it is also used in some languages for concepts for which no sign is available at that moment, particularly if the people involved are to some extent bilingual in the spoken language. Fingerspelling can sometimes be a source of new signs, such as initialized signs, in which the handshape represents the first letter of a spoken word with the same meaning.
On the whole, though, sign languages are independent of spoken languages and follow their own paths of development. For example, British Sign Language (BSL) and American Sign Language (ASL) are quite different and mutually unintelligible, even though the hearing people of the United Kingdom and the United States share the same spoken language. The grammars of sign languages do not usually resemble those of spoken languages used in the same geographical area; in fact, in terms of syntax, ASL shares more with spoken Japanese than it does with English.[42]
Similarly, countries which use a single spoken language throughout may have two or more sign languages, or an area that contains more than one spoken language might use only one sign language. South Africa, which has 11 official spoken languages and a similar number of other widely used spoken languages, is a good example of this. It has only one sign language with two variants due to its history of having two major educational institutions for the deaf which have served different geographic areas of the country.
Spatial grammar and simultaneity
Sign languages exploit the unique features of the visual medium (sight), but may also exploit tactile features (tactile sign languages). Spoken language is by and large linear; only one sound can be made or received at a time. Sign language, on the other hand, is visual and, hence, can use a simultaneous expression, although this is limited articulatorily and linguistically. Visual perception allows processing of simultaneous information.
One way in which many sign languages take advantage of the spatial nature of the language is through the use of classifiers. Classifiers allow a signer to spatially show a referent's type, size, shape, movement, or extent.
The large focus on the possibility of simultaneity in sign languages in contrast to spoken languages is sometimes exaggerated, though. The use of two manual articulators is subject to motor constraints, resulting in a large extent of symmetry[43] or signing with one articulator only. Further, sign languages, just like spoken languages, depend on linear sequencing of signs to form sentences; the greater use of simultaneity is mostly seen in the morphology (internal structure of individual signs).
Non-manual elements
Sign languages convey much of their prosody through non-manual elements. Postures or movements of the body, head, eyebrows, eyes, cheeks, and mouth are used in various combinations to show several categories of information, including lexical distinction, grammatical structure, adjectival or adverbial content, and discourse functions.
At the lexical level, signs can be lexically specified for non-manual elements in addition to the manual articulation. For instance, facial expressions may accompany verbs of emotion, as in the sign for angry in Czech Sign Language. Non-manual elements may also be lexically contrastive. For example, in ASL (American Sign Language), facial components distinguish some signs from other signs. An example is the sign translated as not yet, which requires that the tongue touch the lower lip and that the head rotate from side to side, in addition to the manual part of the sign. Without these features the sign would be interpreted as late.[44] Mouthings, which are (parts of) spoken words accompanying lexical signs, can also be contrastive, as in the manually identical signs for doctor and battery in Sign Language of the Netherlands.[45]
While the content of a signed sentence is produced manually, many grammatical functions are produced non-manually (i.e., with the face and the torso).[46] Such functions include questions, negation, relative clauses and topicalization.[47] ASL and BSL use similar non-manual marking for yes/no questions, for example. They are shown through raised eyebrows and a forward head tilt.[48][49]
Some adjectival and adverbial information is conveyed through non-manual elements, but what these elements are varies from language to language. For instance, in ASL a slightly open mouth with the tongue relaxed and visible in the corner of the mouth means 'carelessly', but a similar non-manual in BSL means 'boring' or 'unpleasant'.[49]
Discourse functions such as turn taking are largely regulated through head movement and eye gaze. Since the addressee in a signed conversation must be watching the signer, a signer can avoid letting the other person have a turn by not looking at them, or can indicate that the other person may have a turn by making eye contact.[50]
Iconicity
Iconicity is similarity or analogy between the form of a sign (linguistic or otherwise) and its meaning, as opposed to arbitrariness. The first studies on iconicity in ASL were published in the late 1970s and early 1980s. Many early sign language linguists rejected the notion that iconicity was an important aspect of sign languages, considering most perceived iconicity to be extralinguistic.[51][32] However, mimetic aspects of sign language (signs that imitate, mimic, or represent) are found in abundance across a wide variety of sign languages. For example, when deaf children learning sign language try to express something but do not know the associated sign, they will often invent an iconic sign that displays mimetic properties.[52] Though it never disappears from a particular sign language, iconicity is gradually weakened as forms of sign languages become more customary and are subsequently grammaticized. As a form becomes more conventional, it becomes disseminated in a methodical way phonologically to the rest of the sign language community.[53] Nancy Frishberg concluded that though originally present in many signs, iconicity is degraded over time through the application of natural grammatical processes.[51]
In 1978, psychologist Roger Brown was one of the first to suggest that the properties of ASL give it a clear advantage in terms of learning and memory.[54] In his study, Brown found that when a group of six hearing children were taught signs that had high levels of iconic mapping they were significantly more likely to recall the signs in a later memory task than another group of six children that were taught signs that had little or no iconic properties. In contrast to Brown, linguists Elissa Newport and Richard Meier found that iconicity "appears to have virtually no impact on the acquisition of American Sign Language".[55]
A central task for the pioneers of sign language linguistics was trying to prove that ASL was a real language and not merely a collection of gestures or "English on the hands." One of the prevailing beliefs at this time was that 'real languages' must consist of an arbitrary relationship between form and meaning. Thus, if ASL consisted of signs that had an iconic form-meaning relationship, it could not be considered a real language. As a result, iconicity as a whole was largely neglected in research on sign languages for a long time. However, iconicity also plays a role in many spoken languages. Spoken Japanese, for example, exhibits many words mimicking the sounds of their potential referents (see Japanese sound symbolism). Later researchers thus acknowledged that natural languages do not need to consist of an arbitrary relationship between form and meaning.[56] The visual nature of sign language simply allows for a greater degree of iconicity compared to spoken languages: most real-world objects can be described by a prototypical shape (e.g., a table usually has a flat surface), but most real-world objects do not make prototypical sounds that spoken languages could mimic. Sign languages, however, are not fully iconic. On the one hand, there are also many arbitrary signs in sign languages, and on the other hand, the grammar of a sign language places limits on the degree of iconicity: all known sign languages, for example, express lexical concepts via manual signs. In a truly iconic language, one would expect a concept like SMILING to be expressed by mimicking a smile (i.e., by performing a smiling face). No known sign language, however, expresses the concept of SMILING with a smiling face; all use a manual sign instead.[57]
The cognitive linguistics perspective rejects a more traditional definition of iconicity as a relationship between linguistic form and a concrete, real-world referent. Rather it is a set of selected correspondences between the form and meaning of a sign.[37] In this view, iconicity is grounded in a language user's mental representation ("construal" in cognitive grammar). It is defined as a fully grammatical and central aspect of a sign language rather than a peripheral phenomenon.[58]
The cognitive linguistics perspective allows for some signs to be fully iconic or partially iconic given the number of correspondences between the possible parameters of form and meaning.[59] In this way, the Israeli Sign Language (ISL) sign for "ask" has parts of its form that are iconic ("movement away from the mouth" means "something coming from the mouth"), and parts that are arbitrary (the handshape, and the orientation).[60]
Many signs have metaphoric mappings as well as iconic or metonymic ones. For these signs there are three-way correspondences between a form, a concrete source and an abstract target meaning. The ASL sign LEARN has this three-way correspondence. The abstract target meaning is "learning". The concrete source is putting objects into the head from books. The form is a grasping hand moving from an open palm to the forehead. The iconic correspondence is between form and concrete source. The metaphorical correspondence is between concrete source and abstract target meaning. Because the concrete source is connected to two correspondences, linguists refer to metaphorical signs as "double mapped".[37][59][60]
Classification
Although sign languages have emerged naturally in deaf communities alongside or among spoken languages, they are unrelated to spoken languages and have different grammatical structures at their core.
Sign languages may be classified by how they arise.
In non-signing communities, home sign is not a full language, but closer to a pidgin. Home sign is amorphous and generally idiosyncratic to a particular family, where a deaf child does not have contact with other deaf children and is not educated in sign. Such systems are not generally passed on from one generation to the next. Where they are passed on, creolization would be expected to occur, resulting in a full language. However, home sign may also be closer to full language in communities where the hearing population has a gestural mode of language; examples include various Australian Aboriginal sign languages and gestural systems across West Africa, such as Mofu-Gudur in Cameroon.
A village sign language is a local indigenous language that typically arises over several generations in a relatively insular community with a high incidence of deafness, and is used both by the deaf and by a significant portion of the hearing community, who have deaf family and friends.[61] The most famous of these is probably the extinct Martha's Vineyard Sign Language of the US, but there are also numerous village languages scattered throughout Africa, Asia, and America.
Deaf-community sign languages, on the other hand, arise where deaf people come together to form their own communities. These include school sign, such as Nicaraguan Sign Language, which develop in the student bodies of deaf schools which do not use sign as a language of instruction, as well as community languages such as Bamako Sign Language, which arise where generally uneducated deaf people congregate in urban centers for employment. At first, Deaf-community sign languages are not generally known by the hearing population, in many cases not even by close family members. However, they may grow, in some cases becoming a language of instruction and receiving official recognition, as in the case of ASL.
Both contrast with speech-taboo languages such as the various Aboriginal Australian sign languages, which are developed by the hearing community and only used secondarily by the deaf. It is doubtful whether most of these are languages in their own right, rather than manual codes of spoken languages, though a few such as Yolngu Sign Language are independent of any particular spoken language. Hearing people may also develop sign to communicate with users of other languages, as in Plains Indian Sign Language; this was a contact signing system or pidgin that was evidently not used by deaf people in the Plains nations, though it presumably influenced home sign.
Language contact and creolization is common in the development of sign languages, making clear family classifications difficult – it is often unclear whether lexical similarity is due to borrowing or a common parent language, or whether there was one or several parent languages, such as several village languages merging into a Deaf-community language. Contact occurs between sign languages, between sign and spoken languages (contact sign, a kind of pidgin), and between sign languages and gestural systems used by the broader community. One author has speculated that Adamorobe Sign Language, a village sign language of Ghana, may be related to the "gestural trade jargon used in the markets throughout West Africa", in vocabulary and areal features including prosody and phonetics.[62][63]
- BSL, Auslan and NZSL are usually considered to be a language known as BANZSL. Maritime Sign Language and South African Sign Language are also related to BSL.[64]
- Danish Sign Language and its descendants Norwegian Sign Language and Icelandic Sign Language are largely mutually intelligible with Swedish Sign Language. Finnish Sign Language and Portuguese Sign Language derive from Swedish SL, though with local admixture in the case of mutually unintelligible Finnish SL.[clarification needed] Danish SL has French SL influence and Wittmann (1991) places them in that family,[63] though he proposes that Swedish, Finnish, and Portuguese SL are instead related to British Sign Language.
- Indian Sign Language (ISL) is similar to Pakistani Sign Language. (ISL fingerspelling uses both hands, similarly to British Sign Language.)
- Japanese Sign Language, Taiwanese Sign Language and Korean Sign Language are thought to be members of a Japanese Sign Language family.[65]
- French Sign Language family. There are a number of sign languages that emerged from French Sign Language (LSF), or are the result of language contact between local community sign languages and LSF. These include: French Sign Language, Italian Sign Language, Quebec Sign Language, American Sign Language, Irish Sign Language, Russian Sign Language, Dutch Sign Language (NGT), Spanish Sign Language, Mexican Sign Language, Brazilian Sign Language (LIBRAS), Catalan Sign Language, Ukrainian Sign Language, Austrian Sign Language (along with its twin Hungarian Sign Language and its offspring Czech Sign Language) and others.
- A subset of this group includes languages that have been heavily influenced by American Sign Language (ASL), or are regional varieties of ASL. Bolivian Sign Language is sometimes considered a dialect of ASL. Thai Sign Language is a mixed language derived from ASL and the native sign languages of Bangkok and Chiang Mai, and may be considered part of the ASL family. Others possibly influenced by ASL include Ugandan Sign Language, Kenyan Sign Language, Philippine Sign Language and Malaysian Sign Language.
- German Sign Language (DGS) gave rise to Polish Sign Language; it also at least strongly influenced Israeli Sign Language, though it is unclear whether the latter derives from DGS or from Austrian Sign Language, which is in the French family.
- Lyons Sign Language may be the source of Flemish Sign Language (VGT) though this is unclear.
- According to an SIL report, the sign languages of Russia, Moldova and Ukraine share a high degree of lexical similarity and may be dialects of one language, or distinct related languages. The same report suggested a "cluster" of sign languages centered around Czech Sign Language, Hungarian Sign Language and Slovak Sign Language. This group may also include Romanian, Bulgarian, and Polish sign languages.
- Sign languages of Jordan, Lebanon, Syria, Palestine, and Iraq (and possibly Saudi Arabia) may be part of a sprachbund, or may be one dialect of a larger Eastern Arabic Sign Language.
- Known isolates include Nicaraguan Sign Language, Turkish Sign Language, Kata Kolok, Al-Sayyid Bedouin Sign Language and Providence Island Sign Language.
The only comprehensive classification along these lines going beyond a simple listing of languages dates back to 1991.[66] The classification is based on the 69 sign languages from the 1988 edition of Ethnologue that were known at the time of the 1989 conference on sign languages in Montreal and 11 more languages the author added after the conference.[68]
|  | Primary language | Primary group | Auxiliary language | Auxiliary group |
| --- | --- | --- | --- | --- |
| Prototype-A[69] | 5 | 1 | 7 | 2 |
| Prototype-R[70] | 18 | 1 | 1 | – |
| BSL-derived | 8 | – | – | – |
| DGS-derived | 1 or 2 | – | – | – |
| JSL-derived | 2 | – | – | – |
| LSF-derived | 30 | – | – | – |
| LSG-derived | – | – | – |  |
In his classification, the author distinguishes between primary and auxiliary sign languages[71] as well as between single languages and names that are thought to refer to more than one language.[72] The prototype-A class of languages includes all those sign languages that seemingly cannot be derived from any other language.[69] Prototype-R languages are languages that are remotely modelled on a prototype-A language (in many cases thought to have been French Sign Language) by a process Kroeber (1940) called "stimulus diffusion".[70] The families of BSL, DGS, JSL, LSF (and possibly LSG) were the products of creolization and relexification of prototype languages.[73] Creolization is seen as enriching overt morphology in sign languages, as compared to reducing overt morphology in spoken languages.[74]
Typology
Linguistic typology (going back to Edward Sapir) is based on word structure and distinguishes morphological classes such as agglutinating/concatenating, inflectional, polysynthetic, incorporating, and isolating ones.
Sign languages vary in word-order typology. For example, Austrian Sign Language, Japanese Sign Language and Indo-Pakistani Sign Language are Subject-object-verb while ASL is Subject-verb-object. Influence from the surrounding spoken languages is not improbable.
Sign languages tend to be incorporating classifier languages, where a classifier handshape representing the object is incorporated into those transitive verbs which allow such modification. For a similar group of intransitive verbs (especially motion verbs), it is the subject which is incorporated. Only in a very few sign languages (for instance Japanese Sign Language) are agents ever incorporated. In this way, since subjects of intransitives are treated similarly to objects of transitives, incorporation in sign languages can be said to follow an ergative pattern.
Brentari[75][76] classifies sign languages as a whole group determined by the medium of communication (visual instead of auditory) as one group with the features monosyllabic and polymorphemic. That means that one syllable (i.e. one word, one sign) can express several morphemes, e.g., subject and object of a verb determine the direction of the verb's movement (inflection).
Another aspect of typology that has been studied in sign languages is their systems for cardinal numbers (Zeshan, Ulrike; Escobedo Delgado, Cesar Ernesto; Dikyuva, Hasan; Panda, Sibaji; de Vos, Connie (2013). "Cardinal numerals in rural sign languages: Approaching cross-modal typology". Linguistic Typology. 17 (3). doi:10.1515/lity-2013-0019). Typologically significant differences have been found between sign languages.
Acquisition
Children who are exposed to a sign language from birth will acquire it, just as hearing children acquire their native spoken language.[77]
The critical period hypothesis suggests that language, spoken or signed, is more easily acquired as a child at a young age versus an adult because of the plasticity of the child's brain. In a study done at McGill University, researchers found that American Sign Language users who acquired the language natively (from birth) performed better when asked to copy videos of ASL sentences than ASL users who acquired the language later in life. They also found that there are differences in the grammatical morphology of ASL sentences between the two groups, all suggesting that there is a very important critical period in learning signed languages.[78]
The acquisition of non-manual features follows an interesting pattern: When a word that always has a particular non-manual feature associated with it (such as a wh- question word) is learned, the non-manual aspects are attached to the word but don't have the flexibility associated with adult use. At a certain point, the non-manual features are dropped and the word is produced with no facial expression. After a few months, the non-manuals reappear, this time being used the way adult signers would use them.[79]
Written forms
Sign languages do not have a traditional or formal written form. Many deaf people do not see a need to write their own language.[80]
Several ways to represent sign languages in written form have been developed.
- Stokoe notation, devised by Dr. William Stokoe for his 1965 Dictionary of American Sign Language,[81] is an abstract phonemic notation system. Designed specifically for representing the use of the hands, it has no way of expressing facial expression or other non-manual features of sign languages. However, it was designed for research, particularly in a dictionary, not for general use.
- The Hamburg Notation System (HamNoSys), developed in the early 1990s, is a detailed phonetic system, not designed for any one sign language, and intended as a transcription system for researchers rather than as a practical script.
- David J. Peterson has attempted to create a phonetic transcription system for signing that is ASCII-friendly known as the Sign Language International Phonetic Alphabet (SLIPA).
- SignWriting, developed by Valerie Sutton in 1974, is a system for representing sign languages phonetically (including mouthing, facial expression and dynamics of movement). The script is sometimes used for detailed research, language documentation, as well as publishing texts and works in sign languages.
- si5s is another orthography which is largely phonemic. However, a few signs are logographs and/or ideographs due to regional variation in sign languages.
- ASL-phabet is a system designed by Dr. Sam Supalla primarily for the education of deaf children, which uses a minimalist collection of symbols in the order of Handshape-Location-Movement. Many signs can be written the same way (as homographs).
- The Alphabetic Writing System for sign languages (Sistema de escritura alfabética, SEA, by its Spanish name and acronym), developed by linguist Ángel Herrero Blanco and two deaf researchers, Juan José Alfaro and Inmaculada Cascales, was published as a book in 2003[82] and made accessible in Spanish Sign Language on-line.[83] This system makes use of the letters of the Latin alphabet with a few diacritics to represent signs through the morphemic sequence S L C Q D F (bimanual sign, place, contact, handshape, direction and internal form); a schematic sketch of this ordering follows this list. The resulting words are meant to be read by signing. The system is designed to be applicable to any sign language with minimal modification and to be usable through any medium without special equipment or software. Non-manual elements can be encoded to some extent, but the authors argue that the system does not need to represent all elements of a sign to be practical, just as written oral language does not. The system has seen some updates, which are kept publicly on a wiki page.[84] The Center for Linguistic Normalization of Spanish Sign Language has made use of SEA to transcribe all signs in its dictionary.[85]
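As a rough, purely illustrative sketch of the fixed S L C Q D F ordering described above (not of SEA's actual letters or diacritics), the following snippet concatenates whichever components of a sign are present in that order; the component tokens are invented placeholders.

```python
# Illustrative only: placeholder tokens standing in for SEA components,
# arranged in the fixed morphemic order S L C Q D F
# (bimanual sign, place, contact, handshape, direction, internal form).
def encode_sea_like(components: dict) -> str:
    """Concatenate the available components in the S-L-C-Q-D-F order."""
    order = ["S", "L", "C", "Q", "D", "F"]
    return "".join(components.get(slot, "") for slot in order)

# A hypothetical one-handed sign specified by place, handshape and direction.
example = {"L": "loc1", "Q": "hs3", "D": "fwd"}
print(encode_sea_like(example))  # -> "loc1hs3fwd"
```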
So far, there is no consensus regarding the written form of sign language. Except for SignWriting, none are widely used. Maria Galea writes that SignWriting "is becoming widespread, uncontainable and untraceable. In the same way that works written in and about a well developed writing system such as the Latin script, the time has arrived where SW is so widespread, that it is impossible in the same way to list all works that have been produced using this writing system and that have been written about this writing system."[86] In 2015, the Federal University of Santa Catarina accepted a dissertation written in Brazilian Sign Language using Sutton SignWriting for a master's degree in linguistics. The dissertation "The Writing of Grammatical Non-Manual Expressions in Sentences in LIBRAS Using the SignWriting System" by João Paulo Ampessan states that "the data indicate the need for [non-manual expressions] usage in writing sign language".
Sign perception
For a native signer, sign perception influences how the mind makes sense of their visual language experience. For example, a handshape may vary based on the other signs made before or after it, but these variations are arranged in perceptual categories during its development. The mind detects handshape contrasts but groups similar handshapes together in one category.[87][88][89] Different handshapes are stored in other categories. The mind ignores some of the similarities between different perceptual categories, at the same time preserving the visual information within each perceptual category of handshape variation.
In society
Deaf communities and deaf culture
When Deaf people constitute a relatively small proportion of the general population, Deaf communities often develop that are distinct from the surrounding hearing community.[90] These Deaf communities are very widespread in the world, associated especially with sign languages used in urban areas and throughout a nation, and the cultures they have developed are very rich.
One example of sign language variation in the Deaf community is Black ASL. This sign language was developed in the Black Deaf community as a variant during the American era of segregation and racism, when young Black Deaf students were forced to attend separate schools from their white Deaf peers.[91]
Use of sign languages in hearing communities
On occasion, where the prevalence of deaf people is high enough, a deaf sign language has been taken up by an entire local community, forming what is sometimes called a "village sign language"[92] or "shared signing community".[93] Typically this happens in small, tightly integrated communities with a closed gene pool. Famous examples include:
- Martha's Vineyard Sign Language, United States
- Al-Sayyid Bedouin Sign Language, Israel
- Kata Kolok, Bali
- Adamorobe Sign Language, Ghana
- Yucatec Maya Sign Language, Mexico
In such communities deaf people are generally well integrated in the general community and not socially disadvantaged, so much so that it is difficult to speak of a separate "Deaf" community.[90]
Many Australian Aboriginal sign languages arose in a context of extensive speech taboos, such as during mourning and initiation rites. They are or were especially highly developed among the Warlpiri, Warumungu, Dieri, Kaytetye, Arrernte, and Warlmanpa, and are based on their respective spoken languages.
A pidgin[citation needed] sign language arose among tribes of American Indians in the Great Plains region of North America (see Plains Indian Sign Language). It was used by hearing people to communicate among tribes with different spoken languages, as well as by deaf people. Today there are users especially among the Crow, Cheyenne, and Arapaho. Unlike Australian Aboriginal sign languages, it shares the spatial grammar of deaf sign languages. In the 1500s, the Spanish expeditionary Cabeza de Vaca observed natives in the western part of modern-day Florida using sign language,[citation needed] and in the mid-16th century Coronado mentioned that communication with the Tonkawa using signs was possible without a translator.[citation needed] Whether or not these gesture systems reached the stage at which they could properly be called languages is still up for debate. There are estimates indicating that as many as 2% of Native Americans are seriously or completely deaf, a rate more than twice the national average.[citation needed]
Sign language is also used as a form of augmentative and alternative communication by people who can hear but cannot use their voices to speak.
Legal recognition
Some sign languages have obtained some form of legal recognition, while others have no status at all. Sarah Batterbury has argued that sign languages should be recognized and supported not merely as an accommodation for the disabled, but as the communication medium of language communities.[94]
Telecommunications
One of the first demonstrations of the ability for telecommunications to help sign language users communicate with each other occurred when AT&T's videophone (trademarked as the "Picturephone") was introduced to the public at the 1964 New York World's Fair – two deaf users were able to freely communicate with each other between the fair and another city.[95] However, video communication did not become widely available until sufficient bandwidth for the high volume of video data became available in the early 2000s.
The Internet now allows deaf people to talk via a video link, either with a special-purpose videophone designed for use with sign language or with "off-the-shelf" video services designed for use with broadband and an ordinary computer webcam. The special videophones that are designed for sign language communication may provide better quality than 'off-the-shelf' services and may use data compression methods specifically designed to maximize the intelligibility of sign languages. Some advanced equipment enables a person to remotely control the other person's video camera, in order to zoom in and out or to point the camera better to understand the signing.
Interpretation
In order to facilitate communication between deaf and hearing people, sign language interpreters are often used. Such activities involve considerable effort on the part of the interpreter, since sign languages are distinct natural languages with their own syntax, different from any spoken language.
The interpretation flow is normally between a sign language and a spoken language that are customarily used in the same country, such as French Sign Language (LSF) and spoken French in France, Spanish Sign Language (LSE) to spoken Spanish in Spain, British Sign Language (BSL) and spoken English in the U.K., and American Sign Language (ASL) and spoken English in the US and most of anglophone Canada (since BSL and ASL are distinct sign languages both used in English-speaking countries), etc. Sign language interpreters who can translate between signed and spoken languages that are not normally paired (such as between LSE and English), are also available, albeit less frequently.
With recent developments in artificial intelligence, deep learning-based machine translation algorithms have been developed that automatically translate short videos containing sign language sentences (often simple sentences consisting of only one clause) directly into written language.[96]
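A minimal sketch of the encoder-decoder shape such systems commonly take is given below; it is an illustrative assumption rather than any published model. The per-frame features, layer sizes and vocabulary are placeholders, and real systems rely on pretrained visual backbones and large parallel corpora rather than the random tensors used here.

```python
# A sketch only: a generic video-to-text encoder-decoder, not a published
# sign language translation model. Feature dimension, vocabulary size and
# layer choices are placeholder assumptions.
import torch
import torch.nn as nn

class SignTranslator(nn.Module):
    def __init__(self, feat_dim: int = 512, hidden: int = 256, vocab: int = 1000):
        super().__init__()
        self.encoder = nn.GRU(feat_dim, hidden, batch_first=True)  # summarises frame features
        self.embed = nn.Embedding(vocab, hidden)                   # written-language tokens
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, frame_feats: torch.Tensor, tokens: torch.Tensor) -> torch.Tensor:
        # frame_feats: (batch, frames, feat_dim); tokens: (batch, words)
        _, video_state = self.encoder(frame_feats)       # condense the clip into one state
        dec_out, _ = self.decoder(self.embed(tokens), video_state)
        return self.out(dec_out)                         # (batch, words, vocab) logits

# Toy forward pass: random tensors stand in for extracted per-frame features.
model = SignTranslator()
feats = torch.randn(1, 60, 512)           # 60 video frames of features
tokens = torch.randint(0, 1000, (1, 8))   # 8 target word indices
print(model(feats, tokens).shape)         # torch.Size([1, 8, 1000])
```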
Remote interpreting
Interpreters may be physically present with both parties to the conversation but, since the technological advancements in the early 2000s, provision of interpreters in remote locations has become available. In video remote interpreting (VRI), the two clients (a sign language user and a hearing person who wish to communicate with each other) are in one location, and the interpreter is in another. The interpreter communicates with the sign language user via a video telecommunications link, and with the hearing person by an audio link. VRI can be used for situations in which no on-site interpreters are available.
However, VRI cannot be used for situations in which all parties are speaking via telephone alone. With video relay service (VRS), the sign language user, the interpreter, and the hearing person are in three separate locations, thus allowing the two clients to talk to each other on the phone through the interpreter.
Interpretation on television
Sign language is sometimes provided for television programmes that include speech. The signer usually appears in the bottom corner of the screen, with the programme being broadcast full size or slightly shrunk away from that corner. Typically for press conferences such as those given by the Mayor of New York City, the signer appears to stage left or right of the public official to allow both the speaker and signer to be in frame at the same time.
Paddy Ladd initiated deaf programming on British television in the 1980s and is credited with getting sign language on television and enabling deaf children to be educated in sign.[97]
In traditional analogue broadcasting, many programmes are repeated, often in the early hours of the morning, with the signer present, rather than having the signer appear at the main broadcast time.[98] This is due to the distraction they cause to those not wishing to see the signer.[citation needed] On the BBC, many programmes that broadcast late at night or early in the morning are signed. Some emerging television technologies allow the viewer to turn the signer on and off in a similar manner to subtitles and closed captioning.[98]
Legal requirements covering sign language on television vary from country to country. In the United Kingdom, the Broadcasting Act 1996 addressed the requirements for blind and deaf viewers,[99] but has since been replaced by the Communications Act 2003.
Language endangerment and extinction
As with any spoken language, sign languages are also vulnerable to becoming endangered.[100] For example, a sign language used by a small community may be endangered and even abandoned as users shift to a sign language used by a larger community, as has happened with Hawai'i Sign Language, which is almost extinct except for a few elderly signers.[101][102] Even nationally recognised sign languages can be endangered; for example, New Zealand Sign Language is losing users.[103] Methods are being developed to assess the language vitality of sign languages.[104]
- Endangered sign languages
- Adamorobe Sign Language (AdaSL)[105]
- Ban Khor Sign Language (BKSL)[105]
- Benkala Sign Language (KK)[105]
- Hawai'i Sign Language (HPSL)[105]
- Inuit Sign Language (IUR)[106]
- Jamaican Country Sign Language (KS)[107]
- Maritime Sign Language (MSL)[105]
- Old Bangkok Sign Language (OBSL)[105]
- Old Chiangmai Sign Language (OCSL)[105]
- Plains Indian Sign Language (PISL)[105]
- Providencia Sign Language (PSL)[105]
- Rennellese Sign Language (RSL)[105]
- Extinct sign languages
- Angami Naga Sign Language
- Belgian Sign Language (BGT)
- Henniker Sign Language
- Martha's Vineyard Sign Language (MVSL)
- Ngarrindjeri sign language
- Old French Sign Language (VLSF)
- Old Kentish Sign Language (OKSL)
- Pitta Pitta sign language
- Plateau Sign Language
- Sandy River Valley Sign Language
- Warluwarra sign language
Communication systems similar to sign language
There are a number of communication systems that are similar in some respects to sign languages, while not having all the characteristics of a full sign language, particularly its grammatical structure. Many of these are either precursors to natural sign languages or are derived from them.
Manual codes for spoken languages
When Deaf and Hearing people interact, signing systems may be developed that use signs drawn from a natural sign language but used according to the grammar of the spoken language. In particular, when people devise one-for-one sign-for-word correspondences between spoken words (or even morphemes) and signs that represent them, the system that results is a manual code for a spoken language, rather than a natural sign language. Such systems may be invented in an attempt to help teach Deaf children the spoken language, and generally are not used outside an educational context.
"Baby sign language" with hearing children
Some hearing parents teach signs to young hearing children. Since the muscles in babies' hands grow and develop more quickly than those of their mouths, signs are seen as a beneficial option for better communication.[108] Babies can usually produce signs before they can speak.[citation needed] This reduces the confusion for parents when trying to figure out what their child wants. When the child begins to speak, signing is usually abandoned, so the child does not progress to acquiring the grammar of the sign language.[citation needed]
This is in contrast to hearing children who grow up with Deaf parents, who generally acquire the full sign language natively, the same as Deaf children of Deaf parents.
Home sign
Informal, rudimentary sign systems are sometimes developed within a single family. For instance, when hearing parents with no sign language skills have a deaf child, the child may develop a system of signs naturally, unless repressed by the parents. The term for these mini-languages is home sign (sometimes "kitchen sign").[109]
Home sign arises due to the absence of any other way to communicate. Within the span of a single lifetime and without the support or feedback of a community, the child naturally invents signs to help meet his or her communication needs, and may even develop a few grammatical rules for combining short sequences of signs. Still, this kind of system is inadequate for the intellectual development of a child and it comes nowhere near meeting the standards linguists use to describe a complete language. No type of home sign is recognized as a full language.[110]
Primate use
There have been several notable examples of scientists teaching signs to non-human primates in order to communicate with humans,[111] such as chimpanzees,[112][113][114][115][116][117][118] gorillas[119] and orangutans.[120] However, linguists generally point out that this does not constitute knowledge of a human language as a complete system, but rather the learning of individual signs or words.[121][122][123][124][125] Notable examples of animals who have learned signs include:
- Chimpanzees: Washoe, Nim Chimpsky and Loulis
- Gorillas: Koko and Michael
Gestural theory of human language origins
One theory of the evolution of human language states that it developed first as a gestural system, which later shifted to speech.[126][127][128][63][129] An important question for this gestural theory is what caused the shift to vocalization.[130]
See also
- Animal language
- Body language
- Braille
- Fingerspelling
- Cherology
- Chinese number gestures
- Gang signal
- Gestures
- Intercultural competence
- International Sign
- Legal recognition of sign languages
- List of international common standards
- List of sign languages
- List of sign languages by number of native signers
- Manual communication
- Metacommunicative competence
- Modern Sign Language communication
- Origin of language
- Origin of speech
- Sign language glove
- Sign language in infants and toddlers
- Sign language media
- Sign Language Studies (journal)
- Sign name
- Sociolinguistics of sign languages
- Tactile signing
- Machine translation of sign languages
References
- ^ a b Sandler, Wendy; & Lillo-Martin, Diane. (2006). Sign Language and Linguistic Universals. Cambridge: Cambridge University Press.
- ^ "What is Sign Language?". Archived from the original on 13 February 2018. Retrieved 10 March 2018.
- ^ E.g.: Irit Meir, Wendy Sandler, Carol Padden, and Mark Aronoff (2010): Emerging Sign Languages. In: Marc Marschark and Patricia Elizabeth Spencer (eds.): The Oxford Handbook of Deaf Studies, Language, and Education, Vol. 2, pp. 267-280.
- ^ Eberhard, David M.; Simons, Gary F.; Fennig, Charles D., eds. (2021), "Sign language", Ethnologue: Languages of the World (24th ed.), SIL International, retrieved 2021-05-15
- ^ Hosemann, Jana; Steinbach, Markus, eds. (2021), Atlas of Sign Language Structures, SIGN-HUB, retrieved 2021-01-13
- ^ Wheatley, Mark & Annika Pabsch (2012). Sign Language Legislation in the European Union – Edition II. European Union of the Deaf.
- ^ Bauman, Dirksen (2008). Open your eyes: Deaf studies talking. University of Minnesota Press. ISBN 978-0-8166-4619-7.
- ^ Nielsen, K.E.. (2012). A Disability History of the United States. Beacon Press. ISBN 9780807022047.
- ^ Pablo Bonet, J. de (1620) Reduction de las letras y Arte para enseñar á ablar los Mudos. Ed. Abarca de Angulo, Madrid, ejemplar facsímil accesible en la "Reduction de las letras y arte para enseñar a ablar los mudos – Fondos Digitalizados de la Universidad de Sevilla". Archived from the original on 2011-07-18. Retrieved 2009-11-23., online (spanish) scan of book, held at University of Sevilla, Spain
- ^ Wilkins, John (1641). Mercury, the Swift and Silent Messenger. The book is a work on cryptography, and fingerspelling was referred to as one method of "secret discoursing, by signes and gestures". Wilkins gave an example of such a system: "Let the tops of the fingers signifie the five vowels; the middle parts, the first five consonants; the bottomes of them, the five next consonants; the spaces betwixt the fingers the foure next. One finger laid on the side of the hand may signifie T. Two fingers V the consonant; Three W. The little finger crossed X. The wrist Y. The middle of the hand Z." (1641:116–117)
- ^ John Bulwer's "Chirologia: or the natural language of the hand.", published in 1644, London, mentions that alphabets are in use by deaf people, although Bulwer presents a different system which is focused on public speaking.
- ^ Bulwer, J. (1648) Philocopus, or the Deaf and Dumbe Mans Friend, London: Humphrey and Moseley.
- ^ Dalgarno, George. Didascalocophus, or, The deaf and dumb mans tutor. Oxford: Halton, 1680.
- ^ See Wilkins (1641) above. Wilkins was aware that the systems he describes are old, and refers to Bede's account of Roman and Greek finger alphabets.
- ^ "Session 9". Bris.ac.uk. 2000-11-07. Archived from the original on 2010-06-02. Retrieved 2010-09-28.
- ^ Montgomery, G. (2002). "The Ancient Origins of Sign Handshapes" (PDF). Sign Language Studies. 2 (3): 322–334. doi:10.1353/sls.2002.0010. JSTOR 26204860. S2CID 144243540.
- ^ Moser, Henry M.; O’Neill, John J.; Oyer, Herbert J.; Wolfe, Susan M.; Abernathy, Edward A.; Schowe, Ben M. (1960). "Historical Aspects of Manual Communication". Journal of Speech and Hearing Disorders. 25 (2): 145–151. doi:10.1044/jshd.2502.145. PMID 14424535.
- ^ Hay, A. and Lee, R. (2004) A Pictorial History of the evolution of the British Manual Alphabet. British Deaf History Society Publications: Middlesex
- ^ Charles de La Fin (1692). Sermo mirabilis, or, The silent language whereby one may learn ... how to impart his mind to his friend, in any language ... being a wonderful art kept secret for several ages in Padua, and now published only to the wise and prudent ... London, Printed for Tho. Salusbury... and sold by Randal Taylor... 1692. OCLC 27245872
- ^ Daniel Defoe (1720). "The Life and Adventures of Mr. Duncan Campbell"
- ^ Canlas, Loida (2006). "Laurent Clerc: Apostle to the Deaf People of the New World". The Laurent Clerc National Deaf Education Center, Gallaudet University.
- ^ a b "How Sign Language Works". Stuff You Should Know. 2014-02-06. Retrieved 2019-03-26.
- ^ "Ethnologue report for language code: bfi". Ethnologue.com. Archived from the original on 2012-10-09. Retrieved 2012-09-30.
- ^ "SIL Electronic Survey Reports: Spanish Sign Language survey" (PDF). Sil.org. Archived (PDF) from the original on 2012-10-20. Retrieved 2012-09-30.
- ^ "SIL Electronic Survey Reports: Bolivia deaf community and sign language pre-survey report" (PDF). Sil.org. Archived (PDF) from the original on 2012-09-15. Retrieved 2012-09-30.
- ^ Lucas, Ceil, Robert Bayley and Clayton Valli. 2001. Sociolinguistic Variation in American Sign Language. Washington, DC: Gallaudet University Press.
- ^ Lucas, Ceil, Robert Bayley, and Clayton Valli (2003). What's Your Sign for PIZZA? An Introduction to Variation in American Sign Language. Washington, DC: Gallaudet University Press.
- ^ Cf. Supalla, Ted & Rebecca Webb (1995). "The grammar of international sign: A new look at pidgin languages." In: Emmorey, Karen & Judy Reilly (eds). Language, gesture, and space. (International Conference on Theoretical Issues in Sign Language Research) Hillsdale, N.J.: Erlbaum, pp. 333–352
- ^ a b McKee, Rachel; Napier, Jemina (2002). "Interpreting into International Sign Pidgin". Sign Language & Linguistics. 5 (1): 27–54. doi:10.1075/sll.5.1.04mck.
- ^ Rubino, F., Hayhurst, A., and Guejlman, J. (1975). Gestuno. International sign language of the deaf. Carlisle: British Deaf Association.
- ^ Bar-Tzur, David (2002). International gesture: Principles and gestures website
Moody, W. (1987). "International gesture." In J. V. Van Cleve (ed.), Gallaudet encyclopedia of deaf people and deafness, Vol. 3, S–Z, Index. New York: McGraw-Hill Book Company Inc.
- ^ a b Klima, Edward S.; Bellugi, Ursula (1979). The signs of language. Cambridge, Massachusetts: Harvard University Press. ISBN 0-674-80795-2.
- ^ Baker, Anne; Bogaerde, Beppie van den; Pfau, Roland; Schermer, G. M. (2016). The Linguistics of Sign Languages: An Introduction. John Benjamins Publishing Company. p. 2. ISBN 978-90-272-1230-6.
- ^ Stokoe, William C. 1960. Sign Language Structure: An Outline of the Visual Communication Systems of the American Deaf, Studies in linguistics: Occasional papers (No. 8). Buffalo: Dept. of Anthropology and Linguistics, University of Buffalo.
- ^ Bross, Fabian (2020). The clausal syntax of German Sign Language: A cartographic approach. Berlin: Language Science Press. pp. 37–38.
- ^ Johnston, Trevor A. (1989). Auslan: The Sign Language of the Australian Deaf community. The University of Sydney: unpublished Ph.D. dissertation.
- ^ a b c Taub, S. (2001). Language from the body. New York : Cambridge University Press.
- ^ Fabian Bross (2016). "Chereme" Archived 2018-03-17 at the Wayback Machine. In: Hall, T. A.; Pompino-Marschall, B. (eds.): Dictionaries of Linguistics and Communication Science. Volume: Phonetics and Phonology. Berlin, New York: Mouton de Gruyter.
- ^ Emmorey, K. (2002). Language, cognition and the brain: Insights from sign language research. Mahwah, NJ: Lawrence Erlbaum Associates.
- ^ Janse, Mark; Borkent, Hans; Tol, Sijmen, eds. (1990). Linguistic Bibliography for the Year 1988. Leiden, Netherlands: Brill. pp. 970–972. ISBN 978-0-7923-0936-9.
- ^ Perlmutter, David M. "What is Sign Language?" (PDF). LSA. Archived (PDF) from the original on 12 April 2014. Retrieved 4 November 2013.
- ^ Nakamura, Karen. (1995). "About American Sign Language." Deaf Resource Library, Yale University. [1]
- ^ Battison, Robbin (1978). Lexical Borrowing in American Sign Language. Silver Spring, MD: Linstok Press.
- ^ Liddell, Scott K. (2003). Grammar, Gesture, and Meaning in American Sign Language. Cambridge: Cambridge University Press.
- ^ Josep Quer i Carbonell; Carlo Cecchetto; Rannveig Sverrisdóttir, eds. (2017). SignGram blueprint: A guide to sign language grammar writing. De Gruyter Mouton. ISBN 9781501511806. OCLC 1012688117.
- ^ Bross, Fabian; Hole, Daniel. "Scope-taking strategies in German Sign Language". Glossa. 2 (1): 1–30. doi:10.5334/gjgl.106.
- ^ Boudreault, Patrick; Mayberry, Rachel I. (2006). "Grammatical processing in American Sign Language: Age of first-language acquisition effects in relation to syntactic structure". Language and Cognitive Processes. 21 (5): 608–635. doi:10.1080/01690960500139363. S2CID 13572435.
- ^ Baker, Charlotte, and Dennis Cokely (1980). American Sign Language: A teacher's resource text on grammar and culture. Silver Spring, MD: T.J. Publishers.
- ^ a b Sutton-Spence, Rachel, and Bencie Woll (1998). The linguistics of British Sign Language. Cambridge: Cambridge University Press.
- ^ Baker, Charlotte (1977). Regulators and turn-taking in American Sign Language discourse, in Lynn Friedman, On the other hand: New perspectives on American Sign Language. New York: Academic Press. ISBN 9780122678509
- ^ a b Frishberg, N. (1975). "Arbitrariness and Iconicity: Historical Change in American Sign Language". Language. 51 (3): 696–719. doi:10.2307/412894. JSTOR 412894.
- ^ Klima, Edward; Bellugi, Ursula (1989). "The Signs of Language". Sign Language Studies. 1062 (1): 11.
- ^ Brentari, Diane. "Introduction." Sign Languages, 2011, p. 12.
- ^ Brown, R (1978). "Why Are Signed Languages Easier to Learn than Spoken Languages? Part Two". Bulletin of the American Academy of Arts and Sciences. 32 (3): 25–44. doi:10.2307/3823113. JSTOR 3823113.
- ^ Newport, Elissa; Meier, Richard (1985). The crosslinguistic study of language acquisition. Lawrence Erlbaum Associates. pp. 881–938. ISBN 0898593670.
- ^ For the history of research on iconicity in sign languages see, for example: Vermeerbergen, Myriam (2006): Past and current trends in sign language research. In: Language & Communication, 26(2). 168-192.
- ^ Bross, Fabian (2020). The clausal syntax of German Sign Language: A cartographic approach. Berlin: Language Science Press. p. 25.
- ^ Wilcox, S (2004). "Conceptual spaces and embodied actions: Cognitive iconicity and signed languages". Cognitive Linguistics. 15 (2): 119–47. doi:10.1515/cogl.2004.005.
- ^ a b Wilcox, P. (2000). Metaphor in American Sign Language. Washington D.C.: Gallaudet University Press.
- ^ a b Meir, I (2010). "Iconicity and metaphor: Constraints on metaphorical extension of iconic forms". Language. 86 (4): 865–96. doi:10.1353/lan.2010.0044. S2CID 117619041.
- ^ Meir, Irit; Sandler, Wendy; Padden, Carol; Aronoff, Mark (2010). "Chapter 18: Emerging sign languages" (PDF). In Marschark, Marc; Spencer, Patricia Elizabeth (eds.). Oxford Handbook of Deaf Studies, Language, and Education. Vol. 2. New York: Oxford University Press. ISBN 978-0-19-539003-2. OCLC 779907637. Retrieved 2016-11-05.
- ^ Frishberg, Nancy (1987). "Ghanaian Sign Language." In: Cleve, J. Van (ed.), Gallaudet encyclopaedia of deaf people and deafness. New York: McGraw-Hill Book Company. ISBN 9780070792296
- ^ a b c Wittmann, Henri (1991). "Classification linguistique des langues signées non vocalement" (PDF). Revue québécoise de linguistique théorique et appliquée. 10 (1): 215–88.
- ^ See Gordon (2008), under nsr "Maritime Sign Language" (archived from the original on 2011-06-04, retrieved 2011-06-01) and sfs "South African Sign Language" (archived from the original on 2008-09-21, retrieved 2008-09-19).
- ^ Fischer, Susan D. et al. (2010). "Variation in East Asian Sign Language Structures", in Sign Languages, p. 499, at Google Books
- ^ Henri Wittmann (1991). The classification is said to be typological, satisfying Jakobson's condition of genetic interpretability.
- ^ Simons, Gary F.; Charles D. Fennig, eds. (2018). "Bibliography of Ethnologue Data Sources". Ethnologue: Languages of the World (21st ed.). SIL International. Archived from the original on 2008-07-25. Retrieved 2008-09-19.
- ^ Wittmann's classification went into Ethnologue's database where it is still cited.[67] The subsequent edition of Ethnologue in 1992 went up to 81 sign languages, ultimately adopting Wittmann's distinction between primary and alternate sign languages (going back ultimately to Stokoe 1974) and, more vaguely, some other traits from his analysis. The 2013 version (17th edition) of Ethnologue is now up to 137 sign languages.
- ^ a b These are Adamorobe Sign Language, Armenian Sign Language, Australian Aboriginal sign languages, Hindu mudra, the Monastic sign languages, Martha's Vineyard Sign Language, Plains Indian Sign Language, Urubú-Kaapor Sign Language, Chinese Sign Language, Indo-Pakistani Sign Language (Pakistani SL is said to be R, but Indian SL to be A, though they are the same language), Japanese Sign Language, and maybe the various Thai Hill-Country sign languages, French Sign Language, Lyons Sign Language, and Nohya Maya Sign Language. Wittmann also includes, bizarrely, Chinese characters and Egyptian hieroglyphs.
- ^ a b These are Providencia Island, Kod Tangan Bahasa Malaysia (manually signed Malay), German, Ecuadoran, Salvadoran, Gestuno, Indo-Pakistani (Pakistani SL is said to be R, but Indian SL to be A, though they are the same language), Kenyan, Brazilian, Spanish, Nepali (with possible admixture), Penang, Rennellese, Saudi, the various Sri Lankan sign languages, and perhaps BSL, Peruvian, Tijuana (spurious), Venezuelan, and Nicaraguan sign languages.
- ^ Wittmann adds that this taxonomic criterion is not really applicable with any scientific rigor: Auxiliary sign languages, to the extent that they are full-fledged natural languages (and therefore included in his survey) at all, are mostly used by the deaf as well, and some primary sign languages (such as ASL and Adamorobe Sign Language) have acquired auxiliary usages.
- ^ Wittmann includes in this class Australian Aboriginal sign languages (at least 14 different languages), Monastic sign language, Thai Hill-Country sign languages (possibly including languages in Vietnam and Laos), and Sri Lankan sign languages (14 deaf schools with different sign languages).
- ^ Wittmann's references on the subject, besides his own work on creolization and relexification in spoken languages, include papers such as Fischer (1974, 1978), Deuchar (1987) and Judy Kegl's pre-1991 work on creolization in sign languages.
- ^ Wittmann's explanation for this is that models of acquisition and transmission for sign languages are not based on any typical parent-child relation model of direct transmission, which is conducive to variation and change to a greater extent. He notes that sign creoles are much more common than vocal creoles and that it cannot be known how many successive creolizations prototype-A sign languages are based on prior to their attested history.
- ^ Brentari, Diane (1998) A prosodic model of sign language phonology. Cambridge, Massachusetts: MIT Press
- ^ Brentari, Diane (2002). "Modality differences in sign language phonology and morphophonemics". In P. Meier; Kearsy Cormier; David Quinto-Pozos (eds.). Modality and Structure in Signed and Spoken Languages. pp. 35–36. doi:10.1017/CBO9780511486777.003. ISBN 9780511486777.
- ^ Emmorey, Karen (2002). Language, Cognition, and the Brain. Mahwah, NJ: Lawrence Erlbaum Associates.
- ^ Mayberry, Rachel. "The Critical Period for Language Acquisition and The Deaf Child's Language Comprehension: A Psycholinguistic Approach" (PDF). ACFOS. Archived (PDF) from the original on 2017-12-01.
- ^ Reilly, Judy (2005). "How Faces Come to Serve Grammar: The Development of Nonmanual Morphology in American Sign Language". In Brenda Schick; Marc Marschack; Patricia Elizabeth Spencer (eds.). Advances in the Sign Language Development of Deaf Children. Cary, NC: Oxford University Press. pp. 262–290. ISBN 978-0-19-803996-9.
- ^ Hopkins, Jason (2008). "Choosing how to write sign language: a sociolinguistic perspective". International Journal of the Sociology of Language. 2008 (192): 75–90. doi:10.1515/ijsl.2008.036. S2CID 145429638.
- ^ Stokoe, William C.; Dorothy C. Casterline; Carl G. Croneberg. 1965. A dictionary of American sign language on linguistic principles. Washington, D.C.: Gallaudet College Press
- ^ Herrero Blanco, Ángel L. (2003). Escritura alfabética de la Lengua de Signos Española : once lecciones. Alfaro, Juan José,, Cascales, Inmaculada. San Vicente del Raspeig [Alicante]: Publicaciones de la Universidad de Alicante. ISBN 9781282574960. OCLC 643124997.
- ^ "Biblioteca de signos – Materiales". www.cervantesvirtual.com. Retrieved 2019-07-07.
- ^ "Traductor de español a LSE – Apertium". wiki.apertium.org. Retrieved 2019-07-07.
- ^ Servicio de Información sobre Discapacidad (SID). "Diccionario normativo de la lengua de signos española ... (SID)". sid.usal.es (in Spanish). Retrieved 2019-07-07.
- ^ Galea, Maria (2014). SignWriting (SW) of Maltese Sign Language (LSM) and its development into an orthography: Linguistic considerations (Ph.D. dissertation). Malta: University of Malta. Archived from the original on 13 May 2018. Retrieved 4 February 2015.
- ^ Morford, Jill P.; Staley, Joshua; Burns, Brian (Fall 2010). "Seeing Signs: Language Experience and Handshape Perception" (PDF). Deaf Studies Digital Journal (2). Videography by Jo Santiago and Brian Burns. Archived (PDF) from the original on 2012-01-11. Retrieved 2011-12-14.
- ^ Kuhl, P (1991). "Human adults and human infants show a 'perceptual magnet effect' for the prototypes of speech categories, monkeys do not". Perception and Psychophysics. 50 (2): 93–107. doi:10.3758/bf03212211. PMID 1945741.
- ^ Morford, J. P.; Grieve-Smith, A. B.; MacFarlane, J.; Staley, J.; Waters, G. S. (2008). "Effects of language experience on the perception of American Sign Language". Cognition. 109 (41–53): 41–53. doi:10.1016/j.cognition.2008.07.016. PMC 2639215. PMID 18834975.
- ^ a b Woll, Bencie; Ladd, Paddy (2003), "Deaf communities", in Marschark, Marc; Spencer, Patricia Elizabeth (eds.), Oxford handbook of deaf studies, language, and education, Oxford UK: Oxford University Press, ISBN 978-0-195-14997-5
- ^ McCaskill, C. (2011). The hidden treasure of Black ASL: its history and structure. Washington, D.C.: Gallaudet University Press.
- ^ Zeshan, Ulrike; de Vos, Connie (2012). Sign languages in village communities: Anthropological and linguistic insights. Berlin and Nijmegen: De Gruyter Mouton and Ishara Press.
- ^ Kisch, Shifra (2008). ""Deaf discourse": The social construction of deafness in a Bedouin community". Medical Anthropology. 27 (3): 283–313. doi:10.1080/01459740802222807. hdl:11245/1.345005. PMID 18663641. S2CID 1745792.
- ^ Sarah C. E. Batterbury. 2012. Language Policy 11:253–272.
- ^ Bell Laboratories RECORD (1969) A collection of several articles on the AT&T Picturephone Archived 2012-06-23 at the Wayback Machine (then about to be released) Bell Laboratories, Pg.134–153 & 160–187, Volume 47, No. 5, May/June 1969;
- ^ Huang, Jie; Zhou, Wengang; Zhang, Qilin; Li, Houqiang; Li, Weiping (2018-01-30). Video-based Sign Language Recognition without Temporal Segmentation (PDF). 32nd AAAI Conference on Artificial Intelligence (AAAI-18), Feb. 2–7, 2018, New Orleans, Louisiana, USA. arXiv:1801.10111. Archived (PDF) from the original on 2018-03-29.
- ^ Prasad, Raekha (2003-03-19). "Sound and Fury". Guardian Unlimited. Archived from the original on 2014-09-10. Retrieved 2008-01-30.
- ^ a b "Sign Language on Television". RNID. Archived from the original on 2009-04-17. Retrieved 2008-01-30.
- ^ "ITC Guidelines on Standards for Sign Language on Digital Terrestrial Television". Archived from the original on 2007-04-23. Retrieved 2008-01-30.
- ^ Bickford, J. Albert, and Melanie McKay-Cody (2018). "Endangerment and revitalization of sign languages", pp. 255–264 in The Routledge handbook of language revitalization.
- ^ "Did you know Hawai'i Sign Language is critically endangered?". Endangered Languages. Archived from the original on 2016-03-07. Retrieved 2016-02-28.
- ^ International Encyclopedia of Linguistics. Oxford University Press. 2003-01-01. ISBN 9780195139778. "The language is considered to be endangered. 9,600 deaf people in Hawaii now use American Sign Language with a few local signs for place-names and cultural items."
- ^ McKee, Rachel; McKee, David (2016), "Assessing the vitality of NZSL", 12th International Conference on Theoretical Issues in Sign Language Research (PDF), Melbourne, Australia, archived (PDF) from the original on 2016-11-01
- ^ Bickford, J. Albert; Lewis, M. Paul; Simons, Gary F. (2014). "Rating the vitality of sign languages". Journal of Multilingual and Multicultural Development. 36 (5): 1–15.
- ^ a b c d e f g h i j Velupillai, Viveka (2012). An Introduction to Linguistic Typology. Amsterdam, Philadelphia: John Benjamins Publishing. pp. 57–58. ISBN 9789027211989. Retrieved 16 April 2020.
- ^ MacDougall, Jamie (February 2001). "Access to justice for deaf Inuit in Nunavut: The role of "Inuit sign language"". Canadian Psychology. 41 (1): 61. doi:10.1037/h0086880.
- ^ Zeshan, Ulrike. (2007). The ethics of documenting sign languages in village communities. In Peter K. Austin, Oliver Bond & David Nathan (eds) Proceedings of Conference on Language Documentation and Linguistic Theory. London: SOAS. p. 271.
- ^ Taylor-DiLeva, Kim. Once Upon A Sign : Using American Sign Language To Engage, Entertain, And Teach All Children, p. 15. Libraries Unlimited, 2011. eBook Collection (EBSCOhost). Web. 29 Feb. 2012.
- ^ Susan Goldin-Meadow (Goldin-Meadow 2003, Van Deusen, Goldin-Meadow & Miller 2001) has done extensive work on home sign systems. Adam Kendon (1988) published a seminal study of the homesign system of a deaf Enga woman from the Papua New Guinea highlands, with special emphasis on iconicity.
- ^ The one possible exception to this is Rennellese Sign Language, which has the ISO 639-3 code [rsi]. It only ever had one deaf user, and thus appears to have been a home sign system that was mistakenly accepted into the ISO 639-3 standard. It has been proposed for deletion from the standard. ("Change Request Number: 2016-002" (PDF). ISO 639-3. SIL International. Archived (PDF) from the original on 2016-01-28. Retrieved 2016-07-05.)
- ^ Premack, David; Premack, Ann J. (1984). The Mind of an Ape (1st ed.). New York: W.W. Norton & Co. ISBN 978-0393015812.
- ^ Plooij, F.X. (1978) "Some basic traits of language in wild chimpanzees?" in A. Lock (ed.) Action, Gesture and Symbol New York: Academic Press.
- ^ Nishida, T (1968). "The social group of wild chimpanzees in the Mahali Mountains". Primates. 9 (3): 167–224. doi:10.1007/bf01730971. hdl:2433/213162. S2CID 28751730.
- ^ Premack, D (1985). "'Gavagai!' or the future of the animal language controversy". Cognition. 19 (3): 207–296. doi:10.1016/0010-0277(85)90036-8. PMID 4017517. S2CID 39292094.
- ^ Gardner, R.A.; Gardner, B.T. (1969). "Teaching Sign Language to a Chimpanzee". Science. 165 (3894): 664–672. Bibcode:1969Sci...165..664G. CiteSeerX 10.1.1.384.4164. doi:10.1126/science.165.3894.664. PMID 5793972.
- ^ Gardner, R.A., Gardner, B.T., and Van Cantfort, T.E. (1989), Teaching Sign Language to Chimpanzees, Albany: SUNY Press.
- ^ Terrace, H.S. (1979). Nim: A chimpanzee who learned Sign Language New York: Knopf.
- ^ Savage-Rumbaugh, E.S; Rumbaugh, D.M.; McDonald, K. (1985). "Language learning in two species of apes". Neuroscience and Biobehavioral Reviews. 9 (4): 653–665. doi:10.1016/0149-7634(85)90012-0. PMID 4080283. S2CID 579851.
- ^ Patterson, F.G. and Linden E. (1981), The education of Koko, New York: Holt, Rinehart and Winston
- ^ Miles, H.L. (1990) "The cognitive foundations for reference in a signing orangutan" in S.T. Parker and K.R. Gibson (eds.) "Language" and intelligence in monkeys and apes: Comparative Developmental Perspectives. Cambridge Univ. Press. pp. 511–539. doi:10.1017/CBO9780511665486.021. ISBN 9780511665486
- ^ Wallman, Joel (1992). Aping Language. Cambridge University Press. ISBN 978-0-521-40666-6.
- ^ "Animal Communication". Department of Linguistics, The Ohio State University. 1994. Archived from the original on 2008-02-07. Retrieved 2008-02-21.
- ^ Stewart, Thomas W.; Vaillette, Nathan (2001). Language Files: Materials for an Introduction to Language & Linguistics (8th ed.). Columbus: The Ohio State University Press. pp. 26–31. ISBN 978-0-8142-5076-1.
- ^ Anderson, Stephen R. (2004). Doctor Doolittle's Delusion. New Haven CT: Yale University Press. pp. 263–300. ISBN 978-0-300-10339-7.
- ^ Fromkin, Victoria; Rodman, Robert; Hyams, Nina (2007). An introduction to language (8th ed.). Boston: Thomson Wadsworth. pp. 352–356. ISBN 978-1-4130-1773-1.
- ^ Hewes, Gordon W. (1973). "Primate communication and the gestural origin of language". Current Anthropology. 14: 5–32. doi:10.1086/201401. S2CID 146288708.
- ^ Kimura, Doreen (1993). Neuromotor Mechanisms in Human Communication. Oxford: Oxford University Press.
- ^ Newman, A. J.; Bavelier, D; Corina, D; Jezzard, P; Neville, HJ (2002). "A Critical Period for Right Hemisphere Recruitment in American Sign Language Processing". Nature Neuroscience. 5 (1): 76–80. doi:10.1038/nn775. PMID 11753419. S2CID 2745545.
- ^ Wittmann, Henri (1980). "Intonation in glottogenesis", pp. 315–29 in The melody of language: Festschrift Dwight L. Bolinger, Linda R. Waugh & Cornelius H. van Schooneveld (eds.). Baltimore: University Park Press.
- ^ Kolb, Bryan, and Ian Q. Whishaw (2003). Fundamentals of Human Neuropsychology, 5th edition, Worth Publishers.
Bibliography
- Aronoff, Mark; Meir, Irit; Sandler, Wendy (2005). "The Paradox of Sign Language Morphology". Language. 81 (2): 301–44. doi:10.1353/lan.2005.0043. PMC 3250214. PMID 22223926.
- Branson, J., D. Miller, & I G. Marsaja. (1996). "Everyone here speaks sign language, too: a deaf village in Bali, Indonesia." In: C. Lucas (ed.): Multicultural aspects of sociolinguistics in deaf communities. Washington, Gallaudet University Press, pp. 39+
- Deuchar, Margaret (1987). "Sign languages as creoles and Chomsky's notion of Universal Grammar." Essays in honor of Noam Chomsky, 81–91. New York: Falmer.
- Emmorey, Karen; & Lane, Harlan L. (Eds.). (2000). The signs of language revisited: An anthology to honor Ursula Bellugi and Edward Klima. Mahwah, NJ: Lawrence Erlbaum Associates. ISBN 0-8058-3246-7.
- Fischer, Susan D. (1974). "Sign language and linguistic universals." Actes du Colloque franco-allemand de grammaire générative, 2.187–204. Tübingen: Niemeyer.
- Fischer, Susan D. (1978). "Sign languages and creoles". Siple. 1978: 309–31.
- Goldin-Meadow, Susan (2003). The Resilience of Language: What Gesture Creation in Deaf Children Can Tell Us About How All Children Learn Language. New York: Psychology Press, a subsidiary of Taylor & Francis.
- Gordon, Raymond, ed. (2008). Ethnologue: Languages of the World, 15th edition. SIL International, ISBN 978-1-55671-159-6, 1-55671-159-X. Archived January 13, 2013, at the Wayback Machine Sections for primary sign languages [2] and alternative ones [3].
- Groce, Nora E. (1988). Everyone here spoke sign language: Hereditary deafness on Martha's Vineyard. Cambridge, Massachusetts: Harvard University Press. ISBN 0-674-27041-X.
- Healy, Alice F. (1980). "Can Chimpanzees learn a phonemic language?" In: Sebeok, Thomas A. & Jean Umiker-Sebeok, eds, Speaking of apes: a critical anthology of two-way communication with man. New York: Plenum, 141–43.
- Kamei, Nobutaka (2004). The Sign Languages of Africa, "Journal of African Studies" (Japan Association for African Studies) Vol. 64, March, 2004. [NOTE: Kamei lists 23 African sign languages in this article].
- Kegl, Judy (1994). "The Nicaraguan Sign Language Project: An Overview". Signpost. 7 (1): 24–31.
- Kegl, Judy, Senghas A., Coppola M (1999). "Creation through contact: Sign language emergence and sign language change in Nicaragua." In: M. DeGraff (ed.), Comparative Grammatical Change: The Intersection of Language Acquisition, Creole Genesis, and Diachronic Syntax, pp. 179–237. Cambridge, Massachusetts: MIT Press.
- Kegl, Judy (2004). "Language Emergence in a Language-Ready Brain: Acquisition Issues." In: Jenkins, Lyle (ed.), Biolinguistics and the Evolution of Language. John Benjamins.
- Kendon, Adam. (1988). Sign Languages of Aboriginal Australia: Cultural, Semiotic and Communicative Perspectives. Cambridge: Cambridge University Press.
- Kroeber, Alfred L. (1940). "Stimulus diffusion". American Anthropologist. 42: 1–20. doi:10.1525/aa.1940.42.1.02a00020.
- Lane, Harlan L. (Ed.). (1984). The Deaf experience: Classics in language and education. Cambridge, Massachusetts: Harvard University Press. ISBN 0-674-19460-8.
- Lane, Harlan L. (1984). When the mind hears: A history of the deaf. New York: Random House. ISBN 0-394-50878-5.
- Madell, Samantha (1998). Warlpiri Sign Language and Auslan – A Comparison. M.A. Thesis, Macquarie University, Sydney, Australia. Archived June 8, 2011, at the Wayback Machine
- Madsen, Willard J. (1982), Intermediate Conversational Sign Language. Gallaudet University Press. ISBN 978-0-913580-79-0.
- O'Reilly, S. (2005). Indigenous Sign Language and Culture; the interpreting and access needs of Deaf people who are of Aboriginal and/or Torres Strait Islander in Far North Queensland. Sponsored by ASLIA, the Australian Sign Language Interpreters Association.
- Padden, Carol; & Humphries, Tom. (1988). Deaf in America: Voices from a culture. Cambridge, Massachusetts: Harvard University Press. ISBN 0-674-19423-3.
- Pfau, Roland, Markus Steinbach & Bencie Woll (eds.), Sign language. An international handbook (HSK – Handbooks of linguistics and communication science). Berlin: Mouton de Gruyter.
- Poizner, Howard; Klima, Edward S.; & Bellugi, Ursula. (1987). What the hands reveal about the brain. Cambridge, Massachusetts: MIT Press.
- Premack, David, & Ann J. Premack (1983). The mind of an ape. New York: Norton.
- Premack, David (1985). "'Gavagai!' or the future of the animal language controversy". Cognition. 19 (3): 207–96. doi:10.1016/0010-0277(85)90036-8. PMID 4017517. S2CID 39292094.
- Sacks, Oliver W. (1989). Seeing voices: A journey into the world of the deaf. Berkeley: University of California Press. ISBN 0-520-06083-0.
- Sandler, Wendy (2003). "Sign Language Phonology". In William Frawley (Ed.), The Oxford International Encyclopedia of Linguistics.[4]
- Sandler, Wendy & Lillo-Martin, Diane (2001). "Natural sign languages". In M. Aronoff & J. Rees-Miller (Eds.), Handbook of linguistics (pp. 533–562). Malden, MA: Blackwell Publishers. ISBN 0-631-20497-0.
- Stiles-Davis, Joan; Kritchevsky, Mark; & Bellugi, Ursula (Eds.). (1988). Spatial cognition: Brain bases and development. Hillsdale, NJ: L. Erlbaum Associates. ISBN 0-8058-0046-8; ISBN 0-8058-0078-6.
- Stokoe, William C. (1960, 1978). Sign language structure: An outline of the visual communication systems of the American deaf. Studies in linguistics, Occasional papers, No. 8, Dept. of Anthropology and Linguistics, University at Buffalo. 2nd ed., Silver Spring, MD: Linstok Press.
- Stokoe, William C. (1974). Classification and description of sign languages. Current Trends in Linguistics 12.345–71.
- Twilhaar, Jan Nijen, and Beppie van den Bogaerde. 2016. Concise Lexicon for Sign Linguistics. John Benjamins Publishing Company.
- Valli, Clayton, Ceil Lucas, and Kristin Mulrooney. (2005) Linguistics of American Sign Language: An Introduction, 4th Ed. Washington, DC: Gallaudet University Press.
- Van Deusen-Phillips S.B., Goldin-Meadow S., Miller P.J., 2001. Enacting Stories, Seeing Worlds: Similarities and Differences in the Cross-Cultural Narrative Development of Linguistically Isolated Deaf Children, Human Development, Vol. 44, No. 6.
- Wilbur, R.B. (1987). American Sign Language: Linguistic and applied dimensions. San Diego, CA: College-Hill.
Further reading
- Fox, Margalit (2007). Talking Hands: What Sign Language Reveals About the Mind. Simon & Schuster. ISBN 978-0-7432-4712-2.
- Quenqua, Douglas. Pushing Science’s Limits in Sign Language Lexicon, The New York Times, December 4, 2012, p. D1 and published online at NYTimes.com on December 3, 2012. Retrieved on December 7, 2012.
Academic journals related to sign languages
- American Annals of the Deaf, Gallaudet University Press
- Journal of American Sign Language and Literature, ASLized!
- Journal of Deaf Studies and Deaf Education, Oxford University Press
- Sign Language Studies, Gallaudet University Press
- Sign Language & Linguistics, John Benjamins Publishing Company
External links
Note: the articles for specific sign languages (e.g. ASL or BSL) may contain further external links, e.g. for learning those languages.
- Langue:Signes du Monde, a directory of online sign language dictionaries (in French and English)
- List Serv for Sign Language Linguistics
- The MUSSLAP Project, Multimodal Human Speech and Sign Language Processing for Human-Machine Communication
- Mallery, Garrick. 1879–1880. Sign language among North American Indians compared with that among other peoples and deaf-mutes. A first annual report of the Bureau of Ethnology to the Secretary of the Smithsonian Institution. Project Gutenberg.
- Pablo Bonet, J. de (1620) Reduction de las letras y Arte para enseñar á ablar los Mudos, Biblioteca Digital Hispánica (BNE).
- Watch the Bible and other video publications in 99 sign languages. Bibles and sign-language study material by Jehovah's Witnesses.
- Science in Sign (video, 3 min. 48 secs.), by Davis, Leslye & Huang, Jon & Xaquin, G.V.; interpreted by Callis, Lydia, on NYTimes.com website, December 4, 2012. Retrieved December 13, 2012. The video translates a shortened version of a N.Y. Times science article on how new signs are being developed to enhance communication in the sciences, extracted from:
- Quenqua, Douglas. Pushing Science’s Limits in Sign Language Lexicon, The New York Times, December 4, 2012, p.D1 and published online at NYTimes.com on December 3, 2012. Retrieved on December 7, 2012.
- signlangtv.org, a project documenting sign language television shows for the deaf around the world