Tag Archives: language

Better living through chemistry

3.20-3.04 billion years ago

Chemistry plays a big role once Earth forms. Different mineral species appear, with different chemical compositions. Magnesium-heavy olivine sinks to the lower mantle of the Earth. Aluminum-rich feldspars float to the top.

Chemistry is an example of what William Abler calls “the particulate principle of self-diversifying systems,” what you get when a collection of discrete units (atoms) can combine according to definite rules to create larger units (molecules) whose properties aren’t just intermediate between those of the constituents. Paint is not an example. Red paint plus white paint is just pink paint. But atoms and molecules are: two moles of hydrogen gas plus one mole of oxygen gas, compounded, make something very different: two moles of liquid water.
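
(Written out as a balanced equation, that’s 2 H₂ + O₂ → 2 H₂O: two molecules of hydrogen and one of oxygen combine to give two molecules of water.)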

A lot of important chemical principles are summed up in the periodic table.

[Image: the periodic table of the elements]

On the far right are atoms that have their electron shells filled, and don’t feel like combining with anyone. Just short of the far right are atoms with almost all their shells filled, looking for an extra electron or two. (Think oxygen, O, with slots for two extra electrons.) On the left are atoms with a few extra electrons they can share. (Think hydrogen, H, each atom with an extra electron it’s willing to share with, say, oxygen.) In the middle are atoms that could go either way: polymorphously perverse carbon, C, with four slots to fill and four electrons to share, and metals, which like to pool their electrons in a big cloud and conduct electricity and heat easily. (Think of Earth’s core of molten iron, Fe, a big electric dynamo.)

Another example of “the particulate principle of self-diversifying systems” is human language. Consider speech sounds, for example. You’ve got small discrete units (phonemes, the sounds we write b, p, s, k, ch, sh, and so on) that can combine according to rules to give syllables. Some syllables are possible, according to the rules of English, others not. Star and spiky, thole and plast are possible English words; tsar and psyche are not (at least if you pronounce all the consonants, the way Russians or Greeks do), nor are tlaps or bratz (if you actually try to pronounce the z). Thirty years ago app, blog, and twerk were not words in the English language, but they were possible words, according to English sound laws.
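
To make the rule-governed part concrete, here is a minimal sketch (a toy of my own, with a deliberately tiny list of permitted word-initial consonant clusters, nowhere near a full phonotactics of English):

```python
# Toy illustration: a handful of permitted English word-initial consonant
# clusters (onsets). Real English has many more; the point is only that
# syllables are built from discrete phonemes combined under definite rules.
PERMITTED_ONSETS = {"", "b", "p", "t", "s", "st", "sp", "pl", "bl", "tw", "th"}

VOWELS = set("aeiou")

def has_possible_onset(word: str) -> bool:
    """Check the letters before the first vowel against the toy onset list."""
    onset = ""
    for letter in word:
        if letter in VOWELS:
            break
        onset += letter
    return onset in PERMITTED_ONSETS

for w in ["star", "plast", "thole", "tsar", "tlaps"]:
    print(w, has_possible_onset(w))
# star True, plast True, thole True, tsar False, tlaps False
```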

You can make a periodic table of consonants.

[Image: a “periodic table” of consonant phonemes]

Across the top are the different places in the vocal tract where you block the flow of air. Along the left side are different ways of blocking the flow (stopping it completely, as with t; letting it leak out, as with s; etc.). The table can explain why, for example, we use in for intangible and indelicate, but switch to im for impossible and imbalance. (The table contains sounds we don’t use in English, and uses a special set of signs, the International Phonetic Alphabet, which assigns one letter per phoneme.) This is why a book title like The Atoms of Language makes sense (a good book, by the way).
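
Here is a toy version of that in/im pattern (my own much-simplified sketch; it looks only at the bilabial consonants p, b, and m, and ignores further variants like the ir- of irregular and the il- of illegal):

```python
# Toy sketch of nasal place assimilation for the English negative prefix:
# before a bilabial consonant (made with both lips), the nasal of "in-"
# shifts to the bilabial nasal "m".
BILABIALS = {"p", "b", "m"}

def negate(stem: str) -> str:
    """Attach the negative prefix, matching its nasal to the stem's first consonant."""
    prefix = "im" if stem[0] in BILABIALS else "in"
    return prefix + stem

for stem in ["tangible", "delicate", "possible", "balance"]:
    print(negate(stem))
# intangible, indelicate, impossible, imbalance
```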

So sometimes the universe gets more complex because already existing stuff organizes itself into complex new patterns – clumps and swirls and stripes. But sometimes the universe gets more complex because brand new kinds of stuff appear, because a new particulate system comes online: elementary particles combine to make atoms, atoms combine to make molecules, or one set of systems (nucleotides to make genes, amino acids to make proteins) combines to make life, or another set of systems (phonemes to make words, words to make phrases and sentences) combines to make language.

Talk Like a Neanderthal Day

Like “Talk Like a Pirate Day,” but more scientific!

Human language is probably more than One Weird Trick. It’s multiple weird tricks. We’ve already posted about phonemes, and how they can be strung together to make words. And then words are strung together to make phrases and sentences: but there are multiple weird tricks here as well. Consider this quotation from some language researchers:

Every human language sentence is composed of two layers of meaning: a lexical structure that contains the lexical meaning, and an expression structure that is composed of function elements that give shape to the expression. In the question, Did John eat pizza?, the lexical layer is composed of the words John, eat, pizza … The sentence also contains did, which has two functions: it marks tense, and by occurring at the head of the sentence, it also signifies a question. (Miyagawa et al.)

The lexical level of language includes content words: nouns, most verbs, adjectives. The expressive level contains function words (auxiliary verbs, conjunctions, articles, and so on), as well as tenses and other inflections, and even functional operations like moving around the parts of a phrase. We can think of a sentence as a piece of carpentry: a bookshelf, say. A typical bookshelf will consist of the parts that hold things up (shelves, sides, etc., analogous to lexical structure), and parts that fasten these parts together (dowels, screws, bolts, nuts, nails, glue, etc., analogous to expressive structure).
[Image: a bookshelf assembled with fasteners]
But there are other ways to build furniture. For example, here’s a desk with no fasteners. Instead, the load-bearing parts have slots and tabs that fit together. This is simpler but less flexible than having boards and fasteners that you can put together however you see fit.
[Image: a desk assembled with slots and tabs, no fasteners]

The analogy with language would be a protolanguage with nothing but content words – nouns, verbs, and adjectives, say – and lexical structure. The analogy works because verbs come with built-in slots that nouns can fit into, even without any extra “fasteners” to hold them together. Linguists call this the “argument structure” of a verb. (Think about functions and their arguments if you’re into math or computer science.) For example, fear and frighten are both transitive verbs, but they have different argument structures:

  • Carg fear thunder.
  • Thunder frighten Carg.

In one case the experiencer goes in the subject slot, and the stimulus (the thing doing the frightening) goes in the direct object slot. In the other case it’s the reverse. Some verbs, like burn, have more than one argument structure:

  • Carg burn meat.
  • Meat burn.

English verbs have some tens of different argument structures. (Note that I haven’t put any tense on the verbs. That would be part of expressive structure, which we’re leaving off here.)
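
Since the post already points to functions and their arguments, here is a minimal sketch of the idea in code (my own toy encoding; the role labels follow the discussion above, and the representation is not any linguist’s formalism):

```python
# Toy encoding of argument structure: each verb maps its syntactic slots
# (subject, object) to semantic roles. "fear" and "frighten" take the same
# two roles but assign them to opposite slots; "burn" allows two frames.
ARGUMENT_STRUCTURES = {
    "fear":     [{"subject": "experiencer", "object": "stimulus"}],
    "frighten": [{"subject": "stimulus", "object": "experiencer"}],
    "burn":     [{"subject": "agent", "object": "theme"},   # Carg burn meat
                 {"subject": "theme"}],                      # Meat burn
}

def describe(verb: str) -> None:
    """Print each slot-to-role frame the verb allows."""
    for frame in ARGUMENT_STRUCTURES[verb]:
        slots = ", ".join(f"{slot} = {role}" for slot, role in frame.items())
        print(f"{verb}: {slots}")

for verb in ["fear", "frighten", "burn"]:
    describe(verb)
```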

So a protolanguage, one step along the way to a full-blown language, could consist of a bunch of verbs and their argument structures, with nouns slotted in the appropriate spaces as needed, and adjectives added to convey additional information. Is this what Neanderthal language was like? We don’t know yet, but as we figure out the genetics of language, we’ll find out. For now though, let’s make today – just about the last day on Logarithmic History that Neanderthals are around – “Talk Like a Neanderthal Day.”

Carg publish blogpost now. Next Carg get Mother brunch. Carg and Mother eat brunch. Goodbye!

Hits, slides, and rings

Part of the challenge of language is coming up with some way to distinguish thousands or tens of thousands of words from one another. It would be hard to come up with that many unique sounds. What human languages do instead is to come up with phonemes and rules for stringing phonemes together into syllables, and then create words by arbitrarily pairing up one syllable, or a few, with a meaning. Phonemes are the individual sounds of a language, roughly comparable to individual letters. There are about forty phonemes in most dialects of English. (English spelling does a pretty sloppy job of matching up phonemes and letters. Finnish comes close to one phoneme per letter.)
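
A back-of-the-envelope count shows why this works (a sketch of my own; the split into roughly 24 consonants and 16 vowels is an approximation for a typical dialect, and real English phonotactics rules out many of the combinations):

```python
# Back-of-the-envelope combinatorics: with roughly 40 phonemes split into
# about 24 consonants and 16 vowels, even one simple syllable template
# (consonant + vowel + consonant) yields thousands of distinct syllables.
consonants = 24
vowels = 16

cv = consonants * vowels                 # syllables like "ba": 384
cvc = consonants * vowels * consonants   # syllables like "bat": 9,216

print(cv + cvc)  # nearly 10,000 one-syllable forms from ~40 phonemes
```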

Often in evolution organisms don’t solve new problems from scratch, but instead harness preexisting adaptations. I argued earlier that the abstract “space” of possession (“The Crampden estate went to Reginald.”) may have developed by harnessing preexisting concepts of physical space. And our abilities to recognize speech sounds may harness our preexisting capacities for recognizing the sounds of solid objects interacting. At least that’s the argument of a recent book by Mark Changizi, Harnessed: How Language and Music Mimicked Nature and Transformed Ape to Man.

Changizi notes that even though we’re mostly not aware of it, we’re very good at using our hearing to keep track of what’s going on in our physical surroundings. For example, people easily recognize the difference between someone going upstairs and someone going downstairs, and we’re pretty good at recognizing individuals by their treads. The sounds that solid objects make can be broadly categorized as hits, slides, and rings. Hits: one object collides with another and sends out a sharp burst of sound. Slides: an object scrapes against another and sends out a more extended sound. Rings: an object reverberates after a collision. Changizi argues that these correspond to the major categories of phonemes.

  • Hits = plosives, like p b t g k
  • Slides = fricatives, like s sh th f v z
  • Rings = sonorants, including sonorant consonants, like l r y w m n, and vowels

These are not the only sounds we can make with our mouths. We can do barks and pops and farts and so on. But our auditory systems are especially cued into solid object physics, so when we try to come up with easy-to-distinguish phonemes, that’s what we focus on. And a lot of rules about how phonemes hook up also follow from this principle – for example hits followed by rings are more common than the reverse.
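
Here is a toy version of Changizi’s three-way classification (my own sketch, using ordinary letters as rough stand-ins for phonemes rather than a real transcription):

```python
# Toy version of Changizi's mapping: classify each sound of a (roughly
# transcribed) word as a hit (plosive), slide (fricative), or ring
# (sonorant consonant or vowel).
HITS = set("pbtdkg")               # plosives
SLIDES = set("szfv")               # a few fricatives
RINGS = set("lrwymn" + "aeiou")    # sonorant consonants and vowels

def sound_pattern(word: str) -> str:
    """Return the word as a string of H (hit), S (slide), R (ring) events."""
    labels = []
    for letter in word:
        if letter in HITS:
            labels.append("H")
        elif letter in SLIDES:
            labels.append("S")
        elif letter in RINGS:
            labels.append("R")
    return "".join(labels)

print(sound_pattern("planet"))  # HRRRRH: hits setting off runs of ringing
```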

There’s surely more going on with speech sounds than Changizi allows for. But if imitating nature is not the whole story of phonemes, it may at least be where they got started.

Later on when we talk about writing systems, we’ll see there’s a similar argument about how these are tuned to tickle our primate visual systems.

Speech sounds

Below are some reflections on language. There will be plenty more in days to come. For a science-fictional take on language, try Octavia Butler’s account of a world where language has disappeared, Speech Sounds. It’s one of her best. It won science fiction’s Hugo Award for best short story in 1984.

We’re now six months through the year 2016 at Logarithmic History. We raced through time at the rate of 751 million years a day on January 1. December 31 we’ll cover just one year per day. Today, June 30, covers 29,815 years, from 547,500 to 517,686 years ago.
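
For the curious, the pacing can be reconstructed with a little arithmetic (a sketch of my own, not necessarily the blog’s exact formula: assume the span covered each day shrinks by a constant factor, from 751 million years on January 1 down to one year on December 31; it comes out close to, though not exactly at, the figures quoted above):

```python
# Back-of-the-envelope sketch of the pacing: if the years covered per day
# shrink by a constant factor, the daily spans form a geometric sequence
# running from 751 million years on Jan 1 to one year on Dec 31.
FIRST_DAY_SPAN = 751e6   # years of history covered on January 1
LAST_DAY_SPAN = 1.0      # years covered on December 31
DAYS_IN_YEAR = 366       # 2016 is a leap year

ratio = (LAST_DAY_SPAN / FIRST_DAY_SPAN) ** (1 / (DAYS_IN_YEAR - 1))

def years_covered(day_of_year: int) -> float:
    """Years of history covered on the given day (1 = January 1)."""
    return FIRST_DAY_SPAN * ratio ** (day_of_year - 1)

print(round(years_covered(182)))  # June 30 -> about 29,800 years (the post quotes 29,815)
```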

By today’s date, the universe is a lot more complicated than when we started. As we mentioned before, one of the major sources of complexity is the origin of new discrete combinatorial systems, made of small units that can be combined into larger units that have different properties than their constituents. Elementary particles are the first discrete combinatorial system to appear, already present in the early moments of the Big Bang. The different chemical elements are another major discrete combinatorial system. It took billions of years for enough heavy atoms, beyond hydrogen and helium, to accumulate from stellar explosions, allowing the complex chemistry and geology that we know on Earth. It may be that the paucity of heavy elements in the early Universe is what prevented earlier planetary systems from developing complex life.

With the origin of life comes another discrete combinatorial system, or rather two connected systems: nucleotides strung together to make genes, which code for amino acids strung together to make proteins.

For the second half of the Logarithmic History year, we’ll be spending a lot of time looking at the consequences of another discrete combinatorial system: language. Or maybe, as with genes-and-proteins, there are really two systems here: words strung into phrases and sentences, and concepts strung together into complex propositions in a Language of Thought.

The origin of modern humans is one of the major transitions in evolution, comparable to the origin of eukaryotic cells, or of social insects. Language is crucial here: ants organize high levels of cooperation by secreting pheromones. Humans organize by secreting cosmologies.

My handaxe

By today’s date, around 1.3 million years ago, Acheulean tools are well developed in Africa, and found in India too. Sophisticated tools like the Acheulean hand axe probably tell us something not just about cognition in relation to tool making, but also about social cognition. You wouldn’t make a hand axe, use it, and abandon it. Nor would you go to all the trouble if the biggest, baddest guy in the group was immediately going to grab it from you. So there is probably some notion of artifacts-as-personal-possessions by the time Acheulean appears.

Possession is a social relationship, a relationship between two or more individuals with respect to the thing possessed. Robinson Crusoe didn’t “own” anything on his island before Friday came along.

Linguists have noted something interesting about the language of possession that maybe tells us something about the psychology of possession: expressions for possession are often similar to expressions for spatial location. Compare spatial locations:

João went to Recife.
Chico stayed in Rio.
The gang kept Zezinho in Curitiba.

and the corresponding constructions for possession:

The Crampden estate went to Reginald.
The Hampden estate stayed with Lionel.
Thag kept axe.

Of course the Crampden estate didn’t go anywhere in physical space, but it still traveled in the abstract social space of possession. In some cases just switching from inanimate to animate will switch the meaning from locative to possessive. The Russian preposition y means at/near when applied to a place (People are at Nevsky street) but marks possession when applied to a person (Hat is “at” Ivan = Ivan has hat.)

What may be going on here: people (and many other creatures) have some mental machinery for thinking about physical space. That machinery gets retooled/borrowed/exapted for thinking about more abstract relationships. The cognitive psychology of space gets retooled for thinking about other abstract relationships too: close and distant kin, time ahead and behind. (You can find Steve Pinker making this argument in The Stuff of Thought.) In other words, we may be seeing a common evolutionary phenomenon of organs evolved for one purpose being put to another purpose – reptile jaw bones evolve into mammalian inner ear bones, dinosaur forelimbs evolve into bird wings. We’ll see other possible examples, involving e.g. the evolution of speech sounds, as we move along.
