MIT presents the "Wearable Reasoner," a proof-of-concept wearable system capable of analyzing whether an argument is stated with supporting evidence, prompting people to question and reflect on the justification of their own beliefs and the arguments of others:
In an experimental study, we explored the impact of argumentation mining and explainability of the AI feedback on the user through a verbal statement evaluation task. The results demonstrate that the device with explainable feedback is effective in enhancing rationality by helping users differentiate between statements supported by evidence and those without. When assisted by an AI system with explainable feedback, users rated claims given with reasons or evidence as significantly more reasonable than those without. Qualitative interviews reveal users' internal processes of reflecting on and integrating the new information into their judgment and decision making: participants stated that they were happy to have a second opinion present, and emphasized the improved evaluation of presented arguments.
Based on recent advances in artificial intelligence (AI), argument mining, and computational linguistics, we envision the possibility of having an AI assistant as a symbiotic counterpart to the biological human brain. As a "second brain," the AI serves as an extended, rational reasoning organ that assists the individual and can teach them to become more rational over time by making them aware of biased and fallacious information through just-in-time feedback. To ensure the transparency of the AI system, and prevent it from becoming an AI "black box," it is important for the AI to be able to explain how it generates its classifications. This Explainable AI additionally allows the person to speculate, internalize and learn from the AI system, and prevents an over-reliance on the technology.
https://doi.org/10.1145/3384657.3384799
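For a rough sense of what the classification the summary describes looks like, here is a minimal sketch of my own, not the paper's model (which uses a trained argument-mining classifier): a keyword heuristic that flags discourse markers typically introducing supporting reasons.

```python
# Toy illustration of evidence detection, NOT the Wearable Reasoner's
# actual classifier: flag statements containing discourse markers that
# typically introduce supporting reasons or evidence.

EVIDENCE_MARKERS = ("because", "since", "according to", "studies show",
                    "for example", "as shown by", "due to")

def has_stated_evidence(statement: str) -> bool:
    """Return True if the statement contains a marker of stated evidence."""
    s = statement.lower()
    return any(marker in s for marker in EVIDENCE_MARKERS)

print(has_stated_evidence("Exercise is good because it lowers blood pressure."))  # True
print(has_stated_evidence("Exercise is good."))                                   # False
```

Note that, like the real system, this only checks whether support is *offered* at all, not whether the offered evidence is accurate.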
Will this help the fight against misinformation/disinformation? Originally spotted on The Eponymous Pickle.
Related Stories
Dick Clark's New Year's Rockin' Eve has become a woke, sanitized shell of its former self. The crowd of rowdy, inebriated locals and tourists is long gone. What you see now is bouncing and screaming for the latest flash-in-the-pan artists while industry veterans like Duran Duran barely elicit a cheer.
YouTuber and music industry veteran Rick Beato recently posted an interesting video on how Auto-Tune has destroyed popular music. Beato quotes from an interview he did with Smashing Pumpkins' Billy Corgan, where the latter stated, "AI systems will completely dominate music. The idea of an intuitive artist beating an AI system is going to be very difficult." AI is making inroads into visual art as well, and hackers, artists and others seem to be embracing it with enthusiasm.
AI seems to be everywhere lately, from retrofitting decades-old manufacturing operations to online help desk shenanigans to a wearable assistant to helping students cheat. Experts predict AI will usher in the next cybersecurity crisis and the end of programming as we know it.
Will there be a future where AI can and will do everything? Where artists are judged on their talents with a keyboard/mouse instead of a paintbrush or guitar? And what about those of us who will be developing the systems AI uses to produce its output? Will tomorrow's artist be the programming genius who devises a profound algorithm that produces work faster, or more pleasing to the eye and ear, in a process that is completely computerized and lacking any humanity? Beato makes a good point in his video on Auto-Tune: most people don't notice when something has been digitally altered, and quite frankly, they don't care either.
Will the "purists" among us be disparaged and become the new "Boomers"? What do you think?
(Score: 2) by Frosty Piss on Wednesday December 14 2022, @12:04PM (3 children)
Probably just makes a lot of Wikipedia queries. Seriously, the things they call "AI"...
(Score: 0) by Anonymous Coward on Wednesday December 14 2022, @12:21PM (1 child)
Seconded. I'll stick with, "Do your own thinking".
(Score: 2) by Immerman on Thursday December 15 2022, @02:09AM
I think the point is that the overwhelming majority of people *really suck* at thinking rationally. Our brains aren't designed for it; it's a skill that tends to take decades of practice to get good at, and most students are more interested in memorizing the answers and playing social games, and then never again seriously train their minds after graduating.
Hell, most people probably can't name even three common logical fallacies - and they've got names precisely because they're so easy to fall prey to that almost everyone not watching out for them does.
(Score: 2) by Immerman on Thursday December 15 2022, @02:17AM
As described, that would be useless.
It doesn't sound like they're making a fact checker - which a Wikipedia search could provide, if you trust Wikipedia.
It sounds from the summary like they're analyzing the structure of an argument to see if supporting evidence is provided at all, rather than whether it's accurate. Presumably as opposed to the much more common case where no supporting evidence is provided at all, and the argument is all bombast and empty rhetoric without even an attempt to provide evidence. See: almost everything that has ever come out of Trump's mouth, and a great many other politicians for that matter.
One of the great things about logic is that you can analyze the integrity of an argument entirely independently from the accuracy of any statements within it. If the logical integrity is flawed, then it doesn't matter how accurate the statements are, they can't carry you from premise to conclusion.
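That separation of validity from truth can even be checked mechanically. As a sketch of my own (not anything from the article): a brute-force truth-table check verifies whether a conclusion follows from premises in *every* possible assignment, regardless of which statements happen to be true in reality.

```python
from itertools import product

def is_valid(premises, conclusion, variables):
    """An argument form is valid iff in every truth assignment where all
    premises hold, the conclusion holds too. Real-world truth of the
    premises never enters into it."""
    for values in product([True, False], repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(p(env) for p in premises) and not conclusion(env):
            return False  # counterexample: premises true, conclusion false
    return True

# Modus ponens: "If P then Q; P; therefore Q" -- valid whatever P and Q mean.
premises = [lambda e: (not e["P"]) or e["Q"],  # P -> Q
            lambda e: e["P"]]
print(is_valid(premises, lambda e: e["Q"], ["P", "Q"]))  # True

# Affirming the consequent: "If P then Q; Q; therefore P" -- invalid.
bad_premises = [lambda e: (not e["P"]) or e["Q"],
                lambda e: e["Q"]]
print(is_valid(bad_premises, lambda e: e["P"], ["P", "Q"]))  # False
```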
(Score: 2) by bradley13 on Wednesday December 14 2022, @12:23PM (3 children)
There's some psychological principle (whose name I forget) that basically says: People assume that the information they possess is correct. People are not going to "reflect on the justification of their beliefs". At best, they will get irritated that this system is bombarding them with stupid questions about stuff that they know to be true.
Leaving this psychological principle aside: We are once again confronted with the question of "who decides what truth is". For some objective things like "the earth is round" that's not difficult. However, such clear facts make up a very small portion of daily discourse.
Everyone is somebody else's weirdo.
(Score: 2) by JoeMerchant on Wednesday December 14 2022, @03:25PM
>For some objective things like "the earth is round" that's not difficult
that depends entirely upon who you are asking. I have seen enough direct evidence of "round earth" for myself to go along with the preponderance of evidence also presented, and generally believe other things those people are saying too.
However... "Settled Science" is a religion unto itself. What's a "safe dose" of radiation? Well, that depends on when you ask "Settled Science" the question.
Rather than declaring "objective things" as true, or false, it might be more informative to give a summary score, starting with an estimate of current beliefs of the world's 8 billion+ population. Is the earth round? 99.9944% True according to popular opinion. Break that down by country, by economic class, by education level, etc. etc. etc.
Fun would be a search of this space to come up with "nuggets of truth" widely "known to be fact" by your own classifications, but regarded as untrue by a majority of other populations - with bonus points for deep dives into why others believe your "truths" to be false.
🌻🌻 [google.com]
(Score: 2) by Michael on Wednesday December 14 2022, @05:06PM
Have a look at the youtube channel "street epistemology" for counter-examples to your second assumption (if that's something you're interested in).
(Score: 2) by crafoo on Wednesday December 14 2022, @07:52PM
my thought is that the device would at the very least point out incorrect logical statements. As in, the AI understands axiomatic and predicate logic. then it can correct the user or anyone else in the conversation when they make logical mistakes. Seriously though, many people do not really understand compound AND, OR, NOT statements and then struggle when simply trying to draw logical conclusions.
It's like people's logic and reason is just broken, or was never really taught, or is being actively undermined and corrupted daily...
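As a concrete instance of the compound-logic confusion: negating an AND/OR condition is exactly where people slip up. De Morgan's laws cover it, and (purely as an illustration, not anything from the article) they can be verified exhaustively in a few lines:

```python
from itertools import product

# De Morgan's laws, checked over every truth assignment: the rules people
# need when negating compound AND/OR statements.
for a, b in product([True, False], repeat=2):
    assert (not (a and b)) == ((not a) or (not b))
    assert (not (a or b)) == ((not a) and (not b))
print("De Morgan's laws hold for all truth assignments")
```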
(Score: 2) by Thexalon on Wednesday December 14 2022, @01:18PM (3 children)
Let's say you have access to an all-knowing machine that can tell you what would be the wisest course of action available. And let's say, for the sake of argument, that this machine works perfectly every time.
Does that mean you'll always behave wisely? Not at all! For the same reason that the warnings to not smoke cigarettes, or the PSAs telling you not to drive drunk, or the doctor telling you to eat beans-and-greens rather than pizza and soda, or the friends warning you about the hot person you're about to bed don't work all the time or all that well: The smart part of human brains is wired to justify the decisions the stupid impulsive part already made.
The only thing that stops a bad guy with a compiler is a good guy with a compiler.
(Score: 1) by khallow on Wednesday December 14 2022, @03:25PM
(Score: 2) by tangomargarine on Wednesday December 14 2022, @08:26PM (1 child)
Sorry to be a wet blanket here...but I imagine this would violate some sort of law of the universe on a quantum level. You'd need to be able to see the future, because while there may be a *logical* best course of action, that always relies to a greater or lesser extent on *other people* also approaching the problem logically.
And, well...after the Trump presidency and COVID and everything the last 6 years, that's one belief of mine that has been shattered.
Logically, there's a pandemic happening, and thousands of people are dying. How do you solve this problem? Quarantine. Until a cure is developed, this is not in any way debatable.
But no, people whining about "my freedoms" and refusing to behave like adults, all over a silly minor little demand like putting a piece of fabric over your face.
And yes, even when we logically know something is a bad idea, sometimes we do it anyway.
"Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
(Score: 2) by Thexalon on Wednesday December 14 2022, @11:41PM
I'm well aware of that - the point I was making is that even if you've solved the technical problems perfectly, you still haven't solved the human problems.
The only thing that stops a bad guy with a compiler is a good guy with a compiler.
(Score: 0) by Anonymous Coward on Wednesday December 14 2022, @02:51PM (2 children)
"teach them to become more rational over time by making them aware of biased and fallacious information through just-in-time feedback."
Get back to me when you can find a working example of this that doesn't need to be Musked into my meatspace.
(Score: 2) by Michael on Wednesday December 14 2022, @05:12PM
I'll get back to you in the sixteenth century then, when what we'd recognise as the scientific method was formalised.
For a looser definition, there's also the Aristotelian methods of ancient Greece.
(Score: 2) by tangomargarine on Wednesday December 14 2022, @08:29PM
Nobody is talking about surgically implanting the device. A glance at the article shows that it's a headband that you wear.
Hell, the *headline* says Wearable Reasoner. Unless you consider a Pacemaker something you "wear"?
"Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
(Score: 2) by Snospar on Wednesday December 14 2022, @04:15PM (2 children)
We all carry around relatively powerful computers which should easily be capable of processing this type of task. Why bother making it an independent "wearable" device? Unless of course you want to be able to spot, from a distance, those people who need this type of assistance. I've never really thought of idiocy as one of those invisible disabilities but maybe I should, maybe this is just what those morons need to stand out and feel special.
(Sorry, that's come out more grumpy than normal. Bad day at work.)
Huge thanks to all the Soylent volunteers without whom this community (and this post) would not be possible.
(Score: 2) by looorg on Wednesday December 14 2022, @06:53PM (1 child)
Do you want to surgically implant your smartphone into your brain (or other body cavity)? In that regard I guess I would prefer to have it wearable; at least then I can take it off. I reckon people are somewhat turned off by the idea that you should insert things into your body, AI stuff or otherwise. But as noted, I guess they want to go all man-machine and enhance us, or whatever they want to call it, instead of just walking around with a fairly small computer in our pockets. Perhaps there is an interface and speed issue -- you can only type and read so fast, but if you can just hook it up to your brain it might operate at the speed of thought.
(Score: 2) by Snospar on Wednesday December 14 2022, @06:59PM
Of course not! And I don't think I alluded to anything of the sort. You can use most of the functions of a smartphone, especially this sort of "assistant", using a discreet Bluetooth headset.
Huge thanks to all the Soylent volunteers without whom this community (and this post) would not be possible.
(Score: 0) by Anonymous Coward on Wednesday December 14 2022, @04:54PM
Corporations, or the Ministry of Truthiness?
"We all carry around relatively powerful computers which should easily be capable of processing this type of task."
I don't.
When I don't want to be contacted, you can't.
Tune in, Turn on, TURN OFF!!
(Score: 2) by SomeRandomGeek on Wednesday December 14 2022, @05:04PM (3 children)
Irrational people are often irrational intentionally. To pick a famous example, consider Kim Jong-un, the leader of a country whose whole defense strategy rests on a willingness to use nuclear weapons. And he comes across as the kind of guy who would nuke Seoul just to see the pretty lights. The North Koreans constantly signal irrationality because it makes their threats credible. In everyday life, being "rational" means that if some smooth talker can sell you a rational argument, and you can't see the flaw in it, you have to let yourself be persuaded. That's great if you're smart, but what if you're not? What if you can't tell the difference between the truth and lies? Do you have to believe everyone who comes along and drops some big words on you? Lots of people are irrational as a defense mechanism. They know they can't win the "rational" game, so they just refuse to play. Those people do not want a wearable tool that points out when they are being irrational. Being irrational doesn't work unless you can pretend you are doing it in good faith.
(Score: 2) by JoeMerchant on Wednesday December 14 2022, @09:36PM (1 child)
Mod system broken. +1 Insightful ^^^
🌻🌻 [google.com]
(Score: 2) by acid andy on Thursday December 15 2022, @01:53AM
+1 Informative ^^^
Welcome to Edgeways. Words should apply in advance as spaces are highly limite—
(Score: 2) by coolgopher on Thursday December 15 2022, @12:52AM
Actually, in the case of Kim Jong-un, I'd argue he is acting quite rationally. He's achieving the primary objectives: 1) keeping himself in power, 2) maintaining scarcity so it can be used as a means to control people, and 3) keeping other countries from wiping the country out with military force.
Do I think he's morally right with such a priority order? Well, no. But I think he's acting well in line with his chosen* priorities.
*) Priorities assumed based on outside observation; no special knowledge available to me
(Score: 0) by Anonymous Coward on Thursday December 15 2022, @02:18AM
If I have an advanced wearable AI what I'd want it to do is to do facial and audio recognition etc AND then help remind me of people's names and other useful info (background info, when I last met them etc). Also to notify me if any bosses, notable people, etc are approaching/near.
Continuous high res recording in an X minute loop so when I want to start a high res recording, I can, without missing the past X minutes.
Military version could help identify potential military items of interest (mines, weapons, glint, etc.) and also work with other squad members' devices to detect "crack-thump" sounds and help highlight the area where the shooter might be.