Seasons

This is a forum for general chit-chat, small talk, a "hey, how ya doing?" and such. Or hell, get crazy deep on something. Whatever you like.

Posts 4,180 - 4,191 of 6,170

18 years ago #4180
and (rather more complicated) http://www.uphs.upenn.edu/trc/langleben/tellingtruth.pdf, which quotes a reliability of 97% for lie detection and 93% for truth detection (see Table 1).

This study reports, “Lie was discriminated from truth on a single-event level with an accuracy of 78%, while the predictive ability expressed as the area under the curve (AUC) of the receiver operator characteristic curve (ROC) was 85%."

Table 1 gives some data that was subject to further statistical analysis. Even the authors of the study do not claim (and I doubt ever will claim) that they could get 100% under any conditions. No one who does human research claims 100% on anything.
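(For anyone who hasn't met these measures, here is a minimal sketch of the difference between single-event accuracy and the AUC of the ROC curve quoted above. The numbers are made up purely for illustration, they are not the study's data, and I'm assuming scikit-learn only as a convenience.)

```python
# Minimal sketch: single-event accuracy vs. ROC AUC, with made-up scores.
# These numbers are illustrative only; they are NOT the study's data.
from sklearn.metrics import accuracy_score, roc_auc_score

labels = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]      # 1 = "lie" trial, 0 = "truth" trial (hypothetical)
scores = [0.9, 0.8, 0.75, 0.6, 0.4,          # hypothetical classifier scores for the lie trials
          0.55, 0.35, 0.3, 0.2, 0.1]         # ...and for the truth trials

# Accuracy requires a hard threshold on the scores (here 0.5): what fraction
# of single events are called correctly?
predictions = [1 if s >= 0.5 else 0 for s in scores]
print("single-event accuracy:", accuracy_score(labels, predictions))

# AUC is threshold-free: the probability that a randomly chosen "lie" trial
# scores higher than a randomly chosen "truth" trial.
print("ROC AUC:", roc_auc_score(labels, scores))
```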

I can't fault their statistical method, and I think that's pretty darned good for a field of study this young.

I don’t fault their method, or their conclusion that in limited circumstances there is a certain level of correlation between some MRI patterns and a forced “lie” as they defined it in this experiment. It is unwise, however, to generalize these results to “lie detection” in the real world. If you read the study carefully, they are making very limited claims about what they were able to do, and they are careful to define “lie” as a forced choice.

Since you bring it up, let’s examine their method and definitions, so that we don’t fall into the trap of equivocation and overgeneralization so often exhibited by the popular press. The authors report that the participants were twenty-six right-handed male undergraduate students with a mean age of 19. The brain continues to develop throughout life, and 19-year-olds do not have “mature brains” in any sense of the word. The study controlled for brain dominance because the researchers know that a significant portion of the population is wired differently in that respect and may not fit the pattern they hope to establish. They controlled for gender for similar reasons (there are many reported differences between the male and female brain). They should control for these factors, but when reading these studies we must remember not to generalize the results to all people from the small number of select participants used in human research.

This experiment involved what they called a forced lie task. They state that a “pseudorandom sequence of photographed playing cards was presented. The series included five stimulus classes: (1) Lie (5 of clubs or 7 of spades); (2) Truth (5 of clubs or 7 of spades); (3) recurrent distracter (2 of hearts); (4) variant distracter (remaining cards 2–10, all suits); and (5) null (back of a card).” This study would be more appropriately used to detect “tells” in playing poker than in forensic applications. There are most likely different brain functions used in seeing one thing and saying another than in either making up a lie on the spot, unrelated to the stimuli in front of you, or in repeating a lie you have memorized and conditioned yourself to believe (or to tell as a story in which you have suspended disbelief).

Creating a lie, repeating a lie and reporting one thing while seeing another may all be called “lies,” but they are not the same brain functions. Have you ever seen the demonstration of the “Stroop effect,” which shows how difficult it is to name the ink color of a word that spells a different color? You can try it for yourself here: http://www.apa.org/science/stroop.html. I would expect the suppression of “truth” reported in this study and the suppression of one of two conflicting messages about a stimulus (as in the Stroop effect) to be more similar to each other than to the patterns of the various types of “lies” one may tell.
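(If that APA link ever goes dead, here is a rough console version you can run yourself. It is only my own toy sketch, not the APA demo, and it assumes a terminal that understands ANSI color codes: it prints color words in a mismatching ink and times how long you take to name the ink.)

```python
# Toy console Stroop demo (my own sketch, not the APA demo linked above).
# Each word is printed in a mismatching ANSI ink color; type the INK color, not the word.
import random
import time

COLORS = {"red": "\033[31m", "green": "\033[32m", "yellow": "\033[33m", "blue": "\033[34m"}
RESET = "\033[0m"

trials = 5
correct = 0
start = time.time()
for _ in range(trials):
    word = random.choice(list(COLORS))                       # the word that is printed
    ink = random.choice([c for c in COLORS if c != word])    # force a mismatching ink color
    print(COLORS[ink] + word.upper() + RESET)
    answer = input("Ink color? ").strip().lower()
    if answer == ink:
        correct += 1
elapsed = time.time() - start
print(f"{correct}/{trials} correct in {elapsed:.1f} seconds")
```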

The authors of this study were careful to define their terms and limit the application of their results. They said in their discussion of the results, “the final common denominator of intentional deception could be conceptualized as a conscious act of suppression of information that is subjectively true. This may or may not be accompanied by a release of subjectively false information.” They acknowledge that subjective truths may vary and that the release of subjectively false information may be another matter. It is the release of false information that most people think of as a lie. They make no claims of being able to detect such lies. They do say, “Although lie and truth are mediated by a similar frontoparietal network, lie appears to be a more working memory-intensive activity, characterized by increased activation of the inferolateral cortex implicated in response selection, inhibition, and generation.” However, trying to remember a past event may be equally memory-intensive, even when one is telling the subjective truth to the best of one's ability.


18 years ago #4181
With MRI scanning resolution improving exponentially year by year, I can't see it not reaching virtually 100% very soon now.

The problem with using MRIs as lie detectors in real-world applications is not the resolution of the MRI technology. The problem is that the way the human brain works is not exactly the same from person to person, and there are many confounding variables.

Other brain functions may have the same or similar patterns to those observed in “lying” in these experiments. The authors of the study you referenced note, “Critical questions remain concerning the use of fMRI in lie detection. First, the pattern of activation reported in deception studies was also observed in studies of working memory, error monitoring, response selection, and “target” detection [Hester et al., 2004; Huettel and McCarthy, 2004; Zarahn et al., 2004].” In other words, just because these patterns may correlate with a forced lie doesn’t mean they cannot correlate with other activities. The person may be suppressing the truth about a stimulus presented at that moment, or simply checking for errors. Higher resolution won’t fix this issue, because our brains use the same areas to do more than one task.

The conditions under which the MRI is given may affect the results. The authors continue, “Second, inference in the General Linear Model (GLM) analysis of blood oxygenation level-dependent (BOLD) fMRI is based on a contrast of conditions, making the choice of a control condition critical. Thus, the difference in the attentional value (salience) of the cue (condition) intended to elicit “lie” and the control “truth” may have confounded previous studies [Cabeza et al., 2003; Gu et al., 2004; Langleben et al., 2002].” (my emphasis). So, there are possible confounding variables (other explanations for these results besides lying), and the conditions under which the MRI is given are critical. Higher-resolution MRIs will do diddly-squat to make real-world conditions (especially inquisitorial settings) like the lab conditions in these studies.
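(For anyone who hasn't seen a GLM contrast, the logic boils down to something like the toy sketch below. The design and the "BOLD" signal are fabricated for illustration and have nothing to do with the actual study; the point is only that inference rests on a difference between fitted condition effects, which is why the choice of the control condition matters so much.)

```python
# Toy sketch of a GLM "lie minus truth" contrast for a single voxel's time series.
# The design and data below are fabricated for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n_scans = 100

# Design matrix: column 0 = "lie" blocks, column 1 = "truth" blocks, column 2 = baseline.
# (A real analysis would convolve the condition regressors with a hemodynamic response function.)
lie = (np.arange(n_scans) % 20 < 5).astype(float)
truth = ((np.arange(n_scans) + 10) % 20 < 5).astype(float)
X = np.column_stack([lie, truth, np.ones(n_scans)])

# Fake BOLD signal: responds a bit more strongly to "lie" than to "truth", plus noise.
y = 1.0 * lie + 0.6 * truth + 5.0 + rng.normal(0, 0.5, n_scans)

# Ordinary least-squares fit of the GLM: y = X @ beta + noise
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# The contrast [1, -1, 0] tests "lie minus truth". Inference is entirely about this
# *difference*, so everything hinges on what you choose as the control condition.
contrast = np.array([1.0, -1.0, 0.0])
print("lie-minus-truth effect:", contrast @ beta)
```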

There are other problems with the generalization and forensic application of such studies. The authors admit, “Finally, the sensitivity and specificity with which an fMRI experiment can discriminate lie from truth in the individual subject or single event level is unknown.” Higher resolution does not solve these problems.

Have you considered my point about the brain’s plasticity? (See http://books.google.com/books?hl=en&lr=&id=MMaujNPDptAC&oi=fnd&pg=RA1-PR10&sig=F7xwhWzCREELRZGr9twi15kcZbM&dq=brain+plasticity#PPR5,M1). One aspect of this is that if part of the brain is injured, the brain reroutes the tasks that used to be performed by that part to other areas. Another aspect is that we use various parts of our brain in different ways depending on our age and stage of development. Furthermore, a change in observed brain patterns (plasticity) can be induced through training (http://brain.oxfordjournals.org/cgi/content/abstract/122/9/1781). Do you want to create a group of people who have trained their brains to lie differently? Each of us has a unique brain developing under unique conditions; you cannot reliably say that any generalization about a specific brain function can be applied to an individual with no chance of error.

When studying large groups of people, you will find patterns in which areas of the brain are often used for certain tasks. This does not mean every individual will exhibit the same pattern. The brain itself is constantly adapting and changing which parts of itself are used, and for what tasks. So these 26 young men in this study used these parts of their brains for the given task of a forced lie. Within this group, there were differences in pattern depending on whether the participant used the right hand or the left hand to complete the task. It follows that there may be other differences based on how a participant completes the task and on any individual’s particular neurological development and condition.


18 years ago #4182
But seriously - I think ubiquitous honesty is the logical next step for our species.

Morally and philosophically you may have a point. I do not, however, believe it has a scientific basis. “Lying” is an evolutionary advantage, interconnected with creativity, empathy and abstract thinking. Without the ability to state that which is not true, we cannot think of hypotheticals, conduct thought experiments or have fantasies. Lying is an important part of the human experience.

It's time humans stopped lying to each other, and if they couldn't get away with it, they wouldn't do it (especially the politicians.) It's scary, sure - growing up always is, but it's an unavoidable step towards maturity.

If the politicians really believed the “lie detectors” worked and could be used on them (as opposed to being a tool of propaganda in the war on terror) they would cut the funding immediately and ban all such research.

Wars couldn't be fought, people couldn't be exploited, criminals couldn't escape justice, the innocent couldn't suffer miscarriages of justice, if we had a totally reliable and universally available truth verification technology. People would stop lying, because it simply wouldn't have any advantage any more.

Ha! We know there were never any WMDs, we have evidence our leader knew that too, and we are still at war. Lies and truth mean nothing next to belief, desire and ambition. Politics is called the master science for a reason. Besides, even the best version of your lie detectors, if such mechanisms existed, would only tell you whether the individual believed they were lying: a subjective truth. There is much research showing the difference between subjective truth and actual events, but I have no time to go pull it up now.

MRIs and the like are wonderful things. We should continue to do research by all means. We should especially concentrate on areas such as diagnosis and treatment of neurological disease. However, as “lie detectors” or “mind readers” we need to be very skeptical and cautious before accepting generalized claims of scientific truth based on such data.

18 years ago #4183
“Lying” is an evolutionary advantage, interconnected with creativity, empathy and abstract thinking. Without the ability to state that which is not true, we cannot think of hypotheticals, conduct thought experiments or have fantasies. Lying is an important part of the human experience.

IMHO [heartbeat accelerates], perhaps the generation of fictions, hypotheses, metaphors [e.g., saying, "my love is a rose" when you know very well she's not] and other figures of speech, fantasies, jokes, ironic and sarcastic statements, and perhaps other kinds of deliberate literal falsehoods are not lies. To lie is to intentionally state a falsehood with intent to deceive.

It strikes me as a layperson [epinephrine concentration increases] that old-fashioned lie detectors were not detecting falsehood, or [pupils contract] the intent to deceive, but [skin pales] the anxiety experienced by the [blood flow in posterior cingulate gyrus increases] liar on account of fear of [size and frequency of saccadic jumps rises] detection. Is it possible [skin conductivity rises] that these new methods are also detecting something of the sort? [wrings hands] In which case they are n-n-not 'mind-readers' at all, j-just [cringing expression] 'emotion-readers.' [Output of pheromone TX-trans-op56 increases, vd-AB+ decreases.]

18 years ago #4184
[mostly re: 4178]
Bev,

Too many excellent points for me to deal with all of them until I get another dose of sleep.

Just a few quick points -

Dr. Farwell's 'brain fingerprinting' isn't MRI-based - it's a much inferior EEG-based technique that's only one step up from the polygraph.
The polygraph is claimed to be somewhere between 70% and 90% accurate, and that's a problem: we do need a lot more accuracy - as close to 100% as possible. A bigger problem is that it's very bulky and needs trained specialists to interpret the data. But with computer power advancing and miniaturising the way it is, it can't be long before automated data analysis outperforms human interpretation by a long way.

Yes, the US Government funds much of it ATM, but there are research institutes all over the world not under their control. And this isn't multi-billion-dollar 'particle collider' stuff. The only expensive bit of equipment (the MRI scanner itself) is standard issue in most western hospitals and medical research facilities. If the US chooses to suppress its research, the rest of the world will publish, commercialize and market products, if only because there's a profit to be made.

I agree that MRI-based technology is unlikely to be portable any time soon - apart from the great hazard that using it in public would constitute (watch out, any passing pedestrians with steel pins or pacemakers!) But the origins of the polygraph go back to 1913. Nearly a century later, with all our technical advances, I think the time is ripe for a more reliable technology. Perhaps near-infrared laser spectroscopy, perhaps microfeature recognition or thermal imaging, perhaps a combination of strategies. None of these is particularly expensive.
And as the computer power for analysing the data carries on getting exponentially cheaper, so any hope of governmental control slips away. What's the US Government going to do when Toshiba or Samsung start marketing the things? Bomb Tokyo or Seoul? Ban their import? Like that's ever worked! Frankly I think politicians are too fixated on their own crooked schemes to appreciate the danger until it's too late.

And that's [reading your “powered up” brain] exactly what an fMRI scanner does, just at a resolution much cruder than the individual data bits (currently.)

No, the MRI does not work that way.

Well, only because the resolution isn't up to imaging individual cellular signalling yet. They have to work in voxels (volumetric pixels), aggregates of cells. But in the same way as you can recognize a surprisingly low-definition pixelated picture of something (100 pixels can often reliably encode a known face), a low-definition voxelated video, the evidence increasingly seems to show, can convey reliable data.
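To put a rough number on that pixel analogy, here's a quick sketch (my own toy example, assuming Pillow is installed and you have some portrait saved as "face.jpg"): crush the image down to 10x10 = 100 pixels, blow it back up, and see how recognizable it still is.

```python
# Rough sketch of the "100 pixels can encode a known face" analogy.
# Assumes Pillow is installed and "face.jpg" is any portrait you have on disk.
from PIL import Image

img = Image.open("face.jpg").convert("L")        # load and convert to grayscale
tiny = img.resize((10, 10))                      # crush to 10x10 = 100 "voxels"
blocky = tiny.resize(img.size, Image.NEAREST)    # blow back up so the blocks stay visible
blocky.save("face_100px.png")
print("Saved the 100-pixel version as face_100px.png - see if you still recognize it.")
```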
Sure, we won't be reading individual thoughts yet, and certainly not porting minds to silicon for as long as it takes to scale up another few orders of magnitude, but I see no intrinsic problem in doing so once the technology has the bandwidth to map an entire brain at a cellular level in realtime (however many fps that needs to be to achieve the necessary definition.) But that is a whole other rung of the ladder. And not necessarily the next one after UH™.

Do you really think it is intrinsically impossible, even given that the human race (and its exponential technological growth) may continue to evolve for another 100 million years? Or more? If not, then we're only arguing about timing. In the last 100 million years, we've progressed from arboreal rodents in a world dominated by reptiles to putting a man on the moon. And 99.999% of that technological advance has happened in the last few hundred years - it's massively exponential.

But if you think it is intrinsically impossible... well, I'd like to know why, because it seems unnecessarily pessimistic to me. Our technology has reached the point on an exponential curve where it's about to go asymptotic into some sort of Singularity, unless we blow ourselves up, or someone switches the universe off by mistake. And I think there's at least as good a chance that that won't happen as that it will. At least not before we figure out how to perfect a low-tech, reliable lie detector. It's a pretty modest goal (compared to some of the weird and wonderful stuff that gets suggested by people like me).

But it can at least image the whole brain at once in realtime.

That it can. Just as an x-ray can show you whether or not you have a broken bone.

Except a fMRI does it in video - not just a snapshot. That's the point of the "functional". And that's a lot more relational data than a picture can ever paint.

I'll respond to your comments on the 2 articles (sorry they're pdf - I know it's a PITA) when I've resynched my body clock.

18 years ago #4185
“Lying” is an evolutionary advantage

So were tails. And before that fins.

It doesn't mean we still need it - especially if it gives an excuse for cynical politicians to start wars in a world already over-proliferated with WMDs.

18 years ago #4186
In the last 100 million years, we've progressed from arboreal rodents in a world dominated by reptiles to putting a man on the moon.

I'm not sure that is the fact most relevant to predicting our future fate. More relevant may be our progress in facing problems before they reach crisis proportions (see the above remarks about global warming) and our learning to co-operate rather than compete. Certain kinds of technological sophistication tend to exacerbate these two problems rather than solve them.

18 years ago #4187
More relevant may be our progress in facing problems before they reach crisis proportions (see the above remarks about global warming) and our learning to co-operate rather than compete

All the more reason why we need to stick fizziplexers (nice word BTW,) on all our politicians and make them answer the question "do you genuinely believe there are doubts about climate change, or is the oil industry paying you to allow them to carry on trashing the planet for profit?"

There's no hiding place once the fizziplexers hit eBay

18 years ago #4188
I'm inclined to agree that the fizziplexers would be a good thing. Not only because people lie to each other, but because they lie to themselves. [fizziplexer remains green and silent]

18 years ago #4189
Well, only because the resolution isn't up to imaging individual cellular signalling yet. They have to work in voxels (volumetric pixels), aggregates of cells. But in the same way as you can recognize a surprisingly low-definition pixelated picture of something (100 pixels can often reliably encode a known face), a low-definition voxelated video, the evidence increasingly seems to show, can convey reliable data.

It's not a matter of resolution. It's not possible to use MRIs to detect lies or read your mind, because the human brain does not work that way. You are ignoring the type of chemical communication involved in neural transmission and the fact that we don't understand how thought works at this time, and so have no chance of decoding or reading it.

The most accurate real time model of what an individual brain does will not tell you what a person thinks. More accuracy will give better pictures of the specific neural network used by one individual for a specific task at a specific time. It will not tell you the specific chemical message being sent between neurons in the synapses or interpret how they fit together to form "thought" and tell you what that individual is thinking. We don't even understand neurochemicals, much less how they make thought happen or what the contents of thought may actually be.

You seem to think that size is an issue. Let me tell you: size does not matter (at least not to me). What matters is that the chemical messages transmitted through each synapse cannot be read, understood, put in context with other neural activity, and decoded into "thought" to be "read" as such. You are confusing a picture of the brain with the contents of the thought. They are not the same.

With some improvement, you may achieve some sort of scan that tells you exactly what part of the brain an individual uses when he thinks the words "Mary had a little lamb" under specific circumstances at a given time. You will not be able to use a scan to read what he is thinking unless he tells you; you must trust his self-report. Furthermore, you will not know that in a week, a month or a year he will use that same neural path in the same way, and you cannot generalize to say that all people would use that particular neural path; they may not even have the same neural path developed.

Even if you presume to identify the neural path or part of the brain involved in specific types of "lying", how will higher resolution account for the fact that these neural paths and areas may be used for other tasks as well? What about plasticity? What about a reasonable margin for error?

I am not saying that it is impossible to ever figure out how to read brains. I am saying we are nowhere close to understanding how thoughts happen, much less reading minds. Trying to use MRIs as lie detectors would be based on correlations and assumptions, not on the ability to actually read the thoughts of the person in question.

18 years ago #4190
Odd how the things that used to be labeled 'deadly sins' are now evolutionary advantages. I don't see where they serve any purpose these days, if they ever did... And I have an odd feeling that "fizziplexer" knockoffs (remote-controlled to signal whatever the user wished) would hit the market the day after the real fizziplexers. Look how honest I am, my fizzy is all green...

18 years ago #4191
IMHO [heartbeat accelerates], perhaps the generation of fictions, hypotheses, metaphors [e.g., saying, "my love is a rose" when you know very well she's not] and other figures of speech, fantasies, jokes, ironic and sarcastic statements, and perhaps other kinds of deliberate literal falsehoods are not lies. To lie is to intentionally state a falsehood with intent to deceive.

But if you start looking at the MRI studies and similar lie-detector claims, you see lies most often defined as the suppression of subjective truth. If you wanted to use technology to tell whether someone had the intent to deceive, you would have to do more than read thoughts (something we are nowhere even close to doing). You would have to find a way to identify and read complex motivations, tease out whether there was "intent", work out exactly what that "intent" may have been, whether or not it was acted on, whether other intentions were present, and whether the person was aware of all those competing drives and motives at the time the statement was made. No researcher claims to have anything like that cooking.

A philosophical definition of lie may not match the definitions used in neurological research.


