Tuesday, September 16, 2008

Brain Scans Used to Convict Woman of Murder in India

I have a serious problem with this -- it seems like a perfect example of a little information being dangerous. We are not at a point where the science is reliable enough to be used in this way -- that woman could easily be innocent.

Reason Magazine posted this:

Brain Scans Used to Convict Woman of Murder in India

The New York Times is reporting that an Indian criminal court accepted a brain scan as evidence of guilt in a murder trial in India earlier this year. The developer of the Brain Electrical Oscillation Signature (BEOS) test claims that it uses electrodes to detect when regions of the brain "light up" with guilty knowledge.

[Image: illustration from The New Yorker -- http://www.newyorker.com/images/2007/07/02/p233/070702_r16377_p233.jpg]

According to the Times:

The woman, Aditi Sharma, was accused of killing her former fiancé, Udit Bharati. They were living in Pune when Ms. Sharma met another man and eloped with him to Delhi. Later Ms. Sharma returned to Pune and, according to prosecutors, asked Mr. Bharati to meet her at a McDonald’s. She was accused of poisoning him with arsenic-laced food.

Ms. Sharma, 24, agreed to take a BEOS test in Mumbai, the capital of Maharashtra. (Suspects may be tested only with their consent, but forensic investigators say many agree because they assume it will spare them an aggressive police interrogation.)

After placing 32 electrodes on Ms. Sharma’s head, investigators said, they read aloud their version of events, speaking in the first person (“I bought arsenic”; “I met Udit at McDonald’s”), along with neutral statements like “The sky is blue,” which help the software distinguish memories from normal cognition.

For an hour, Ms. Sharma said nothing. But the relevant nooks of her brain where memories are thought to be stored buzzed when the crime was recounted, according to Mr. Joseph, the state investigator. The judge endorsed Mr. Joseph’s assertion that the scans were proof of “experiential knowledge” of having committed the murder, rather than just having heard about it...

Ms. Sharma insists that she is innocent.

As the Times points out, most U.S. experts doubt that the BEOS technology has been properly validated. However, neuroscience researchers are working toward creating such a "truth machine." As the Times notes:

“As we enter more fully into the era of mapping and understanding the brain, society will face an increasing number of important ethical, legal and social issues raised by these new technologies,” Mr. [Hank] Greely, the Stanford bioethicist, and his colleague Judy Illes wrote last year in the American Journal of Law & Medicine.

If brain scans are widely adopted, they said, “the legal issues alone are enormous, implicating at least the First, Fourth, Fifth, Sixth, Seventh and 14th Amendments to the U.S. Constitution.”

“At the same time,” they continued, “the potential benefits to society of such a technology, if used well, could be at least equally large.”
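The BEOS algorithm itself has not been published, so there is no way to describe exactly what the software did with Ms. Sharma's brain waves. But the general logic of EEG-based "guilty knowledge" tests -- the better-studied P300 concealed-information paradigm works roughly this way -- is easy to illustrate: compare the averaged brain response to crime-related "probe" statements against the response to neutral statements, and ask whether the difference is statistically reliable. Here is a rough, purely illustrative sketch in Python on synthetic data; none of the numbers or thresholds come from the actual BEOS test.

# Illustrative sketch only -- NOT the (unpublished) BEOS algorithm.
# Shows the general logic of an EEG "guilty knowledge" comparison:
# average the brain response to crime-related "probe" statements and
# to neutral statements, then test whether the probe response is
# reliably larger. All data here is synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

N_TRIALS = 40        # statements of each type read to the subject
N_SAMPLES = 256      # EEG samples per statement epoch (e.g. 1 s at 256 Hz)

# Synthetic single-channel EEG epochs (trials x samples), in microvolts.
neutral = rng.normal(0.0, 5.0, size=(N_TRIALS, N_SAMPLES))
probe = rng.normal(0.0, 5.0, size=(N_TRIALS, N_SAMPLES))
# Hypothetical "recognition" bump added to probe trials around 300-500 ms.
probe[:, 77:128] += 3.0

def mean_window_amplitude(epochs, start, stop):
    """Mean amplitude per trial inside a post-stimulus sample window."""
    return epochs[:, start:stop].mean(axis=1)

probe_amp = mean_window_amplitude(probe, 77, 128)
neutral_amp = mean_window_amplitude(neutral, 77, 128)

# Paired comparison: is the response to crime-related statements
# reliably larger than the response to neutral ones?
t, p = stats.ttest_rel(probe_amp, neutral_amp)
print(f"probe mean = {probe_amp.mean():.2f} uV, "
      f"neutral mean = {neutral_amp.mean():.2f} uV, "
      f"t = {t:.2f}, p = {p:.4f}")

Even granting this toy version, the gap between a statistically reliable difference in averaged responses and courtroom proof of "experiential knowledge" of a murder is exactly where the validation problem lives.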

Back in 2001, I looked at the status of brain scanning technologies, pointed to some of the possibilities for government abuse that fully validated versions would offer, and noted some implications for the future of privacy:

...James Halperin, author of the 1996 science fiction novel The Truth Machine..., notes an interesting convergence in current fMRI and brainwave research since his fictional "Cerebral Image Processor" measured a combination of electrical activity and blood flow. In The Truth Machine, Halperin illustrates the benefits and problems that the pervasive availability of an infallible lie detector would cause society. It is easy to see some of the benefits -- detecting would-be terrorists, finding politicians who tell the truth during campaigns, detecting honesty in meeting contractual obligations. But what about those areas of life we would like to keep private, say, one's sexual orientation, or unusual religious beliefs, or drug habits, or taste in pornography? Halperin suggests that right now, many of us tolerate laws and regulations on many of these private activities because we know that we are not likely to be caught when we violate them. In a world where the truth can be known absolutely, Halperin thinks laws regulating many private activities would be repealed and there would be areas of life in which the use of a truth machine itself would be banned.

Whole column here.

