Every time we speak, we're improvising:
"Humans possess a remarkable ability to talk about almost anything, sometimes putting words together into never-before-spoken or -written sentences," said Morten H. Christiansen, the William R. Kenan, Jr. Professor of Psychology in the College of Arts and Sciences.
We can improvise new sentences so readily, language scientists believe, because we have acquired mental representations of the patterns of language that allow us to combine words into sentences. The nature of those patterns and how they work, however, remains a puzzle in cognitive science, Christiansen said.
[...] For decades, scientists have believed we rely on a complex mental grammar to build sentences with hierarchically organized structure, like a branching tree. But Christiansen and Nielsen suggest that our mental representations might be more like snapping together pre-assembled LEGO pieces (such as a door frame or a wheel set) into a complete model. Instead of intricate hierarchies, they propose, we use small, linear chunks of word classes like nouns and verbs, including short sequences that don't correspond to grammatical constituents, such as "in the middle of the" or "wondered if you."
[...] The prevailing theory since at least the 1950s is based on hierarchical, tree-like mental representations, setting humans apart from other animals, Christiansen said. In this view, words and phrases combine according to the principles of grammar into larger units called constituents. For example, in the sentence "She ate the cake," "the" and "cake" combine into the noun phrase "the cake," which then combines with "ate" into the verb phrase "ate the cake," and finally with "she" to make the sentence.
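To make the contrast concrete, here is a minimal sketch (ours, not the researchers'; the node labels and chunk boundaries are illustrative assumptions) of the two kinds of representation for "She ate the cake":

```python
# Illustrative sketch: two ways to represent "She ate the cake".

# Hierarchical view: nested constituents, like a branching tree.
# Each node is (label, children); leaves are words.
hierarchical = (
    "S", [
        ("NP", ["She"]),
        ("VP", [
            ("V", ["ate"]),
            ("NP", [
                ("Det", ["the"]),
                ("N", ["cake"]),
            ]),
        ]),
    ],
)

# Flat "chunk" view: short linear sequences of word classes, no nesting.
# Nonconstituent chunks (e.g. spanning a verb plus the next determiner)
# are permitted here, since there is no tree to respect.
flat_chunks = [
    ("Pron",),            # "She"
    ("V", "Det", "N"),    # "ate the cake" as one pre-assembled piece
]

def tree_depth(node):
    """Depth of a constituent tree; a bare word counts as depth 0."""
    if isinstance(node, str):
        return 0
    _, children = node
    return 1 + max(tree_depth(c) for c in children)

print(tree_depth(hierarchical))  # the tree nests several levels deep
print(len(flat_chunks))          # the flat view is just a short sequence
```

The point of the toy `tree_depth` function is simply that the hierarchical representation carries recursive structure the flat chunk list does not.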
"But not all sequences of words form constituents," Christiansen and Nielsen wrote in a summary of their paper. "In fact, the most common three- or four-word sequences in language are often nonconstituents, such as 'can I have a' or 'it was in the.'"
Because they don't form constituents under traditional grammar, such sequences have largely been overlooked. But they do play a role in a speaker's knowledge of their language, the researchers found.
In experiments, including an eye-tracking study and an analysis of phone conversations, they discovered that linear sequences of word classes can be "primed": once we hear or read one, we process it faster the next time. That's compelling evidence, Christiansen said, that these sequences are part of our mental representation of language, a part that goes beyond the rules of grammar.
"I think the main contribution is showing that traditional rules of grammar cannot capture all of the mental representations of language structure," Nielsen said.
"It might even be possible to account for how we use language in general with flatter structure," Christiansen said. "Importantly, if you don't need the more complex machinery of hierarchical syntax, then this could mean that the gulf between human language and other animal communication systems is much smaller than previously thought."
Journal Reference: Nielsen, Y.A., Christiansen, M.H. Evidence for the representation of non-hierarchical structures in language. Nat Hum Behav (2026). https://doi.org/10.1038/s41562-025-02387-z
(Score: 2) by VLM on Monday February 02, @02:35PM (5 children)
Plausible. Interesting way to look at it. The correlation between verbal intelligence and visuospatial intelligence is certainly not zero (or 1.0 either).
(Score: 5, Interesting) by VLM on Monday February 02, @04:16PM (2 children)
Ooooh, I just thought of an interesting example of visuospatial intelligence crossover into verbal/language intelligence: The Lost Art of Sentence Diagramming. Turning lines of text into little maps. Those were a Big Thing for about one year when I was in, like, middle school, then completely forgotten, like cursive.
From the wikipedia
Yeah, so that's why I got one unit of it for a week or so and then it was forgotten. Apparently, for linguist types, the cool and trendy thing now is to go full-on computer science with their own unique interpretation of parse trees.
I wonder what would happen if someone rolled up on a formal grammar academic English Professor with some GNU Bison parser code. Or retro 1970s BSD YACC or modern ANTLR. I bet that would be an interesting collision of worlds.
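For a taste of that collision, the kind of grammar a tool like Bison encodes can be sketched as a toy recursive-descent parser (the rules and word lists below are invented for illustration, covering only the article's example sentence):

```python
# Toy recursive-descent parser for a miniature English grammar,
# roughly what a Bison/yacc grammar file would declare:
#   S  -> NP VP
#   VP -> V NP
#   NP -> Pron | Det N
# The lexicon is invented for illustration.
LEXICON = {
    "she": "Pron", "the": "Det", "cake": "N", "ate": "V",
}

def parse(words):
    pos = 0

    def expect(cls):
        nonlocal pos
        if pos < len(words) and LEXICON.get(words[pos].lower()) == cls:
            word = words[pos]
            pos += 1
            return word
        raise SyntaxError(f"expected {cls} at position {pos}")

    def np():
        # Try the pronoun rule first, else fall back to Det N.
        if pos < len(words) and LEXICON.get(words[pos].lower()) == "Pron":
            return ("NP", expect("Pron"))
        return ("NP", expect("Det"), expect("N"))

    def vp():
        return ("VP", expect("V"), np())

    tree = ("S", np(), vp())
    if pos != len(words):
        raise SyntaxError("trailing words")
    return tree

print(parse("She ate the cake".split()))
# -> ('S', ('NP', 'She'), ('VP', 'ate', ('NP', 'the', 'cake')))
```

Note that this parser, like any constituent grammar, has no place to put a nonconstituent chunk such as "ate the": the tree boundaries force "the" into the object NP.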
(Score: 1, Insightful) by Anonymous Coward on Monday February 02, @07:21PM
We got Sentence Diagramming in, iirc, 5th grade, before the jump to middle school in 6th grade. It was more than a couple of weeks, but never came back in later grades. This was the mid-1960s. From very vague memories, diagramming was often easy for those of us who were comfortable with what is now called STEM, and not so easy for the poets in the class.
(Score: 5, Interesting) by Reziac on Tuesday February 03, @02:47AM
I freakin' loved diagramming sentences. In the prologue of The Scarlet Letter there is a sentence that is three pages long in tiny print. I diagrammed the whole thing, just for fun. Grammar is algebra for words, and diagramming is the flowchart. I have wished for a computerized sentence diagrammer...
I have a conlang for my SF novels, and it has grown its own structure. Basically, one tacks together particles and indicators to achieve meaning. Frex, sa = other (as distinguished from self), -ik is a suffix meaning to cut off or end by separation, so -sik is literally "separated other" but in practice means corpse. -- Perhaps illustrating priorities, in this conlang I can curse fluently, but I cannot yet order breakfast.
(Score: 3, Interesting) by mcgrew on Tuesday February 03, @11:49PM (1 child)
might be more like snapping together pre-assembled LEGO pieces
Far more than that: it's like creating new Lego shapes that fit into the existing pattern. One example of many is "astronaut," coined by Neil R. Jones, a New York State bureaucrat who wrote science fiction on the side, in his short story The Death's Head Meteor [mcgrewbooks.com] (nothing but the text is uploaded right now). Isaac Asimov coined "robotics" thinking that it was already a word. James Blish [wikipedia.org] coined "gas giant" in his story Solar Plexus. And there's Rority's Stratodoober...
(Score: 2) by hendrikboom on Saturday February 07, @02:17AM
Thanks for the lovely old-time story.