These may sound like scenarios from the latest hit video game, but according to a report by the Royal Society, Britain's national academy of science, rapid advances in neuroscience could make them a realistic vision of the future of warfare and law enforcement. The report - titled Brain Waves Module 3: Neuroscience, Conflict and Security - concludes neuroscience could be used to boost the performance of soldiers and policemen, choose the most suitable individuals for particular tasks and enable soldiers to control weapons through a direct mind-machine interface, as well as to create a new generation of chemical weapons.
Don't believe it? US researchers have already used transcranial direct current stimulation (tDCS), a weak electrical current passed through the skull, to help soldiers spot targets more effectively.
Biochemical pharmacologist Rod Flower, chairman of the report's working group, admits most such technology is in its infancy, but says "all leaps forward start out this way. You have a groundswell of ideas and suddenly you get a step change."
Yes and no, says Robert Sparrow, a philosopher with Melbourne's Monash University. "Given how primitive treatments for mental illness remain, the idea that we're going to be able to control complex machinery through brain-machine interfaces any time soon strikes me as very optimistic indeed."
Still, Sparrow says if such advances are made they would throw up a tangled web of ethical and legal issues such as so-called dual use. A case in point is tDCS. It may produce important new therapies for dementia and mental illness, as well as military applications. As the Royal Society report notes, scientists must be made aware early in their careers that purely medical research may not stay that way.
Sydney-based Wendy Rogers agrees: "I think there are good reasons for scientists in general to be exposed to both historical examples and discussion of the ethical dilemmas that can arise from the uses of any new technologies," says the Macquarie University bioethicist. "Not with the aim of stifling research, but (so) that scientists will recognise social and moral responsibilities as well as scientific ones."
But dual use is a numbingly complex issue. Should researchers shy away from medically important research if it has an obvious illegitimate application? What is and isn't legitimate anyway? And isn't it true that much research could be put to dubious use if it got into the wrong hands?
Blaming scientists for harmful use of their work is akin to blaming knife manufacturers for an upturn in muggings, says philosopher and neurolaw expert Nicole Vincent, also at Macquarie. "(But) this does not absolve scientists of responsibility to not work on projects which they know, or have reasonable grounds to suspect, will yield morally or legally objectionable uses."
Clearly, morality is in the mind of the beholder. Still, there's little argument that it's unacceptable to redirect expertise and resources from medical research to military neuroscience.