Every time we speak, we're improvising:
"Humans possess a remarkable ability to talk about almost anything, sometimes putting words together into never-before-spoken or -written sentences," said Morten H. Christiansen, the William R. Kenan, Jr. Professor of Psychology in the College of Arts and Sciences.
We can improvise new sentences so readily, language scientists believe, because we have acquired mental representations of the patterns of language that allow us to combine words into sentences. The nature of those patterns and how they work, however, remains a puzzle in cognitive science, Christiansen said.
[...] For decades, scientists have believed we rely on a complex mental grammar to build sentences that have hierarchically organized structure – like a branching tree. But Christiansen and Nielsen suggest that our mental representations might be more like snapping together pre-assembled LEGO pieces (such as a door frame or a wheel set) into a complete model. Instead of intricate hierarchies, they propose, we use small, linear chunks of word classes like nouns and verbs – including short sequences that can't be formed by way of grammar, such as "in the middle of the" or "wondered if you."
[...] The prevailing theory since at least the 1950s is based on hierarchical, tree-like mental representations, setting humans apart from other animals, Christiansen said. In this view, words and phrases combine according to the principles of grammar into larger units called constituents. For example, in the sentence "She ate the cake," "the" and "cake" combine into a noun phrase "the cake", which then combines with "ate" into the verb phrase "ate the cake," and finally with "she" to make the sentence.
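The hierarchical view is easy to sketch in code. A minimal illustration (nested tuples standing in for constituents; the labels and representation are ours, not the paper's):

```python
# Hierarchical (constituency) view: the sentence is a nested tree.
# Each constituent is a (label, child, child, ...) tuple.
np = ("NP", "the", "cake")   # "the" + "cake" -> noun phrase
vp = ("VP", "ate", np)       # "ate" + NP -> verb phrase
s  = ("S", "she", vp)        # "she" + VP -> sentence

def leaves(node):
    """Flatten a tree back into its linear word sequence."""
    if isinstance(node, str):
        return [node]
    label, *children = node
    return [w for child in children for w in leaves(child)]

print(leaves(s))  # ['she', 'ate', 'the', 'cake']
```

Note that a string like "ate the" never appears as a unit anywhere in this tree; that is exactly the kind of nonconstituent sequence the researchers argue we nevertheless store.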
"But not all sequences of words form constituents," Christiansen and Nielsen wrote in a summary of their paper. "In fact, the most common three- or four-word sequences in language are often nonconstituents, such as 'can I have a' or 'it was in the.'"
Because they don't conform to grammar, nonconstituent sequences have been overlooked. But they do play a role in a speaker's knowledge of their language, the researchers found.
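The corpus observation behind "the most common three- or four-word sequences are often nonconstituents" can be sketched by simply counting n-grams and inspecting the top of the list. A toy version (the tiny corpus is invented for illustration; a real study would use millions of words of conversation):

```python
from collections import Counter

# Tiny made-up corpus, tokenized by whitespace.
corpus = (
    "can i have a cookie . can i have a minute . "
    "it was in the box . it was in the hall ."
).split()

def ngrams(words, n):
    """Yield every run of n consecutive words as a tuple."""
    return zip(*(words[i:] for i in range(n)))

counts = Counter(ngrams(corpus, 4))
print(counts.most_common(2))
```

Even in this toy corpus, the most frequent four-word sequences ("can i have a", "it was in the") are nonconstituents: they cut across phrase boundaries rather than forming a complete phrase.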
In experiments, an eye-tracking study and an analysis of phone conversations, they discovered that linear sequences of word classes can be "primed": once we hear or read them, we process them faster the next time. That's compelling evidence that they're part of our mental representation of language, Christiansen said, a part that goes beyond the rules of grammar.
"I think the main contribution is showing that traditional rules of grammar cannot capture all of the mental representations of language structure," Nielsen said.
"It might even be possible to account for how we use language in general with flatter structure," Christiansen said. "Importantly, if you don't need the more complex machinery of hierarchical syntax, then this could mean that the gulf between human language and other animal communication systems is much smaller than previously thought."
Journal Reference: Nielsen, Y.A., Christiansen, M.H. Evidence for the representation of non-hierarchical structures in language. Nat Hum Behav (2026). https://doi.org/10.1038/s41562-025-02387-z
(Score: 2) by VLM on Monday February 02, @02:35PM (5 children)
Plausible. Interesting way to look at it. The correlation between verbal intelligence and visuospatial intelligence is certainly not zero (or 1.0 either).
(Score: 5, Interesting) by VLM on Monday February 02, @04:16PM (2 children)
ooooh I just thought of an interesting example of visuospatial intelligence crossover into verbal/language intelligence: The Lost Art of Sentence Diagramming. Turning lines of text into little maps. Those were a Big Thing for about one year when I was in, like, middle school, then completely forgotten, like cursive.
From the wikipedia
Yeah, so that's why I got one unit of it for a week or so and then it was forgotten. Apparently, for linguist types, the cool and trendy thing now is to go full-on computer science with their own unique interpretation of parse trees.
I wonder what would happen if someone rolled up on a formal grammar academic English Professor with some GNU Bison parser code. Or retro 1970s BSD YACC or modern ANTLR. I bet that would be an interesting collision of worlds.
(Score: 1, Insightful) by Anonymous Coward on Monday February 02, @07:21PM
We got Sentence Diagramming in, iirc, 5th grade, before the jump to middle school in 6th grade. It was more than a couple of weeks, but never came back in later grades. This was mid-1960's. From very vague memories, diagramming was often easy for those of us who were comfortable with what is now called STEM, and not so easy for the poets in the class.
(Score: 5, Interesting) by Reziac on Tuesday February 03, @02:47AM
I freakin' loved diagramming sentences. In the prologue of The Scarlet Letter there is a sentence that is three pages long in tiny print. I diagrammed the whole thing, just for fun. Grammar is algebra for words, and diagramming is the flowchart. I have wished for a computerized sentence diagrammer...
I have a conlang for my SF novels, and it has grown its own structure. Basically, one tacks together particles and indicators to achieve meaning. Frex, sa = other (as distinguished from self), -ik is a suffix meaning to cut off or end by separation, so -sik is literally "separated other" but in practice means corpse. -- Perhaps illustrating priorities, in this conlang I can curse fluently, but I cannot yet order breakfast.
And there is no Alkibiades to come back and save us from ourselves.
(Score: 2) by mcgrew on Tuesday February 03, @11:49PM (1 child)
might be more like snapping together pre-assembled LEGO pieces
Far more than that, like creating new Lego shapes that fit into the existing pattern. One example of many is "astronaut". It was coined by New York State bureaucrat Neil R. Jones, who wrote science fiction on the side, in his short story The Death's Head Meteor [mcgrewbooks.com] (nothing but the text is uploaded right now). Isaac Asimov coined "robotics" thinking that it was already a word. James Blish [wikipedia.org] coined "gas giant" in his story Solar Plexus. And there's Rority's Stratodoober...
Mad at your neighbors? Join ICE, $50,000 signing bonus and a LICENSE TO MURDER!
(Score: 2) by hendrikboom on Saturday February 07, @02:17AM
Thanks for the lovely old-time story.
(Score: 1, Informative) by Anonymous Coward on Monday February 02, @04:04PM (2 children)
English is the most widely broken language in the world... 🤣
Structure? Yeah nah, nah yeah... same same but different. https://theconversation.com/beware-the-bad-big-wolf-why-you-need-to-put-your-adjectives-in-the-right-order-64982 [theconversation.com]
(Score: 0) by Anonymous Coward on Monday February 02, @04:06PM
(Score: 5, Funny) by HiThere on Monday February 02, @10:26PM
My favorite description of English is "English is the result of Norman men-at-arms trying to make dates with Saxon barmaids." -- H. Beam Piper
Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
(Score: 3, Informative) by pTamok on Monday February 02, @04:32PM (5 children)
Yet again, we discover that 'the map is not the territory'. In this case, our description of language, and especially grammar, is not the language itself. It should not come as a surprise that there is evidence that we think in ways not described by grammar textbooks.
Colorless green ideas sleep furiously. [wikipedia.org]
So she went into the garden to cut a cabbage-leaf to make an apple-pie; and at the same time a great she-bear, coming up the street, pops its head into the shop. “What! No soap?” So he died, and she very imprudently married the barber; and there were present the Picninnies, and the Joblillies, and the Garyulies, and the grand Panjandrum himself, with the little round button at top, and they all fell to playing the game of catch as catch can till the gunpowder ran out at the heels of their boots. [wikipedia.org]
They do in fact go back to Ethelrebbers Unready, King Albert's burnt capers where, you know, the toast fell in and the dear lady did get a very cross knit and smote him across the eardrome excallybold. The great sword which riseyhuff and Merlin forevermore was the beginning of the Great Constitution of the Englishspeaking peeploders of these islone, oh yes. [wikipedia.org]
I like the idea of language as 'Lego blocks', but it is also a large game of slow-moving Calvinball [wikipedia.org], where you can't break all the rules all of the time, but you can break some occasionally, and still be regarded as speaking 'perfectly good' {whatever}.
The rule about not putting prepositions at the end of a sentence is the kind of thing up with which I shall not put. [grammarly.com] What is the world coming to?
(Score: 0) by Anonymous Coward on Monday February 02, @07:37PM (1 child)
Then, for another level, try some Ella!
https://www.youtube.com/watch?v=OFclKdniaDk [youtube.com]
(Score: 1) by pTamok on Monday February 02, @08:03PM
If you enjoyed that, try some old recordings of Cleo Laine singing scat.
(Score: 2) by hendrikboom on Wednesday February 04, @04:15AM (2 children)
Prepositions can come at the end of a sentence in English.
What's more, in that last example, 'up' is not a preposition. "... with which I shall not put up" would be better.
(Score: 1) by pTamok on Wednesday February 04, @07:24AM (1 child)
I will guess that you are Dutch (the name is a clue), and in my experience, many Dutch people speak and write grammatically better English than the average British person.
The position of prepositions in sentences is much like the admonition against splitting infinitives - grammarians wanting to impose rules rather than describing how people actually use language to communicate. If you ask a native speaker about sentence structure, a common answer is that the word order they use just 'feels right', and many would not know what a preposition is, even though they use them all the time.
Learning Classical Greek has been described to me as first learning all the rules, and subsequently, maddeningly, spending a great deal more effort in learning all the exceptions to the rules. Exceptions demonstrate that rules are approximations, slavish adherence to which will lead one astray. The same is true for English, a language full of exceptions to trip up the unwary adult foreign learner. Informal English follows the 'rules' of English grammar less rigidly than formal English, and in many social situations speaking grammatically correct English would mark you as being slightly strange.
I try to write more formally than the average bear, and try to appreciate it when faults in my writing are pointed out. Improvement is a worthy goal.
(Score: 2) by hendrikboom on Friday February 06, @04:20AM
The venerable Fowler's Dictionary of Modern English Usage considers the restriction on prepositions at the end of a sentence to be a superstition.
(Score: 5, Interesting) by JoeMerchant on Monday February 02, @05:30PM (10 children)
There's an old saw about "working memory limits" typically demonstrated as: the average person starts to struggle to remember a sequence of more than 6 numbers, 7 is a typical upper limit... but, there are workarounds: https://en.wikipedia.org/wiki/Piphilology [wikipedia.org]
Without going overboard: if you can "chunk" a number sequence like 3 5 1 5 4 6 7 2 into recognizable entities like 351 54 6 7 2, now that's just a sequence of 5 to remember: the 351 cu. in. V8, Car 54, and 6 7 2.
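The trick is easy to demonstrate: same digits, regrouped, and the item count drops. A throwaway sketch (the groupings are the ones from the comment above, nothing canonical):

```python
digits = "35154672"

# Ungrouped: eight separate items to hold in working memory.
ungrouped = list(digits)

# Chunked: the 351 cu. in. V8, Car 54, then 6 7 2.
chunks = ["351", "54", "6", "7", "2"]

assert "".join(chunks) == digits  # same information, fewer units
print(len(ungrouped), "items vs", len(chunks), "chunks")  # 8 items vs 5 chunks
```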
My (profoundly autistic communication impaired) son typically speaks in 2 or 3 word sentences (utterances), but sometimes he will recite a whole phrase as a single chunk: "ride in the orange car now please daddy!" I'm pretty sure he's processing that as a single concept.
(Score: 3, Interesting) by pTamok on Monday February 02, @08:10PM (2 children)
Aye, when I was younger, I memorized pi to 50 decimal places. I recall it even now, and it is definitely chunked, and there is a rhythm. Same with memorizing some Shakespeare, Coleridge, and Shelley. I don't think it is implausible that language is 'chunked', and we can shuffle around the chunks.
(Score: 0) by Anonymous Coward on Tuesday February 03, @03:06PM (1 child)
Maybe you have a 50 decimal place pi neuron just like someone has a Halle Berry neuron: https://www.caltech.edu/about/news/single-cell-recognition-halle-berry-brain-cell-1013 [caltech.edu]
(Score: 1) by pTamok on Wednesday February 04, @07:05AM
Darn. I shall have to continue administering alcohol until it is eradicated and then I can lead a normal life.
(Score: 3, Interesting) by aafcac on Tuesday February 03, @12:31AM (6 children)
That sort of thing has been known for quite a while. Words don't normally show up at random; they tend to have collocations of other words that go with them. And IMHO, at the start of learning a language, it's better to think in terms of sentence frames where you're swapping out a word or two to generate a new meaning. That's more or less what you're doing most of the time anyway, since one of the goals of fluency is to be able to express your thoughts without fixating on the words you're using. You'll typically find that most languages front-load the irregular words into the stuff you encounter early on, since the less common phrases don't get used often enough to maintain their oddities.
When there are serious misunderstandings, it's usually not grammar doing it; it's far more likely to be an outright wrong word. Grammar is typically more about efficiency than actual communication, and you can get a lot further than people often realize with just 2- and 3-word sentences, provided the words are correct. Nobody really needs compound, complex, or compound-complex sentences in English. You can use just simple sentences, strip those down from there, and still be largely understood, assuming you don't need to communicate anything too complicated.
(Score: 3, Funny) by Reziac on Tuesday February 03, @02:49AM (5 children)
So, we are just bio-instances of a large language model....
(Score: 3, Insightful) by aafcac on Tuesday February 03, @03:13AM (1 child)
Pretty much. The stuff we do and think was, up until quite recently, primarily focused on what got us to live long enough to reproduce. A bunch of the stuff we do for fun is rooted in some sort of evolutionary need.
(Score: 0) by Anonymous Coward on Tuesday February 03, @07:47AM
For very loose definitions of "need".
For example, there's not a strong evolutionary need for music. Sure it can impress potential mates, but then those mates would have had to evolve that "need" to be impressed by music in the first place.
Perhaps it's like the peacock's tail.
In scenarios where not everything needs to be so close to the min-maxing optimums, there's a lot of room for other stuff.
(Score: 0) by Anonymous Coward on Tuesday February 03, @03:03PM
Once you've trained yourself to do stuff, you can do it without thinking too much about it. Ride a bicycle, type the correct letters for words. Use various words for various thoughts.
But how did we train that. How did we get all those vectors to be closer to useful?
Smarter dogs (and probably even crows) can figure out the difference between a bus and a car without thousands of samples. And definitely they won't mistake those for a traffic light.
(Score: 2, Interesting) by pTamok on Wednesday February 04, @08:02AM (1 child)
While amusing, that gets things somewhat reversed. Large Language Models were developed using a model of how people thought the brain works. It turns out that the model is a poor one, and while LLMs offer interesting results, it is clear that they are built according to a model that is an inadequate description of the brain and cognition.
A well-built orrery will give good predictions of where planets will be seen in the night sky seen from Earth. However, it is a model, not reality, and inspection of interplanetary space will not show a system of rods and gears. LLMs are a model of some of the workings of the brain and will give reasonably good imitations of what output from a real brain looks like. No-one expert in the field will claim that the brain is simply an instance of an LLM. Some of the physical features of the cerebellum (multi-layer neural networks) are implemented in software for LLMs, but scaling up something based on an inadequate description of the original does not give you the original, in the same way that putting together a lot of candles does not give you a sun. LLMs are candles, not small suns.
(Score: 2) by Reziac on Wednesday February 04, @02:50PM
Old Beetle Bailey comic:
Sarge (shaking his head over Beetle's myriad deficiencies): "Bailey, you are a model soldier."
Beetle, confused, consults a dictionary and reads: "Model: a small copy of the real thing."
Yeah, LLM to Brain is at best a spotty model. But it makes for an interesting extrapolation, and another way of looking at how we hang words together.