Kicking off the search for an answer to whether there are truths we can learn from Science Fiction about the development of technology, I turn to one of my favourite episodes of Star Trek: The Next Generation (TNG), "The Measure of a Man", first aired on February 13th 1989. The story centres on the character Data, an android (an automaton made to resemble a human), and on whether he is the property of Starfleet, the organisation in which he serves as a crew member.
Data has served aboard the starship Enterprise and has, we are told, an exemplary record of service. A scientist, Dr Maddox, has been studying Data with a view to reverse-engineering the technology that enables his 'positronic' brain to function, in order to create hundreds or thousands of replicas that can perform dangerous tasks in the place of humans.
Initially Data is intrigued by the notion, but on inquiry he comes to the view that Maddox lacks the necessary expertise to study him and that Maddox's experiments could in fact cause him irreparable harm. His commanding officer, Captain Picard, tries to convince Data of the potential benefits of the experiments to their organisation; Data counters that it is precisely because he is a machine (i.e. not human) that he is considered dispensable. Picard, realising Data ought to be treated with the same rights as any other member of his crew, looks for ways to stop Data being transferred and forced to submit to the experiments.
The central argument Maddox puts forward to the Judge Advocate General (JAG) officer is that, as Data is a machine, he does not have any rights. The JAG officer finds precedent supporting this position (from the 21st Century, no less), and Picard challenges the ruling, forcing a hearing to be convened to decide whether Data is property and, in doing so, whether he can be forced to submit to Maddox's experiments.
What follows is a trial proceeding, starting with the prosecution making the case for Data being property. The prosecution's case breaks into two areas of argument: firstly, that Data was built by a man and is therefore the property of that man, and of any subsequent people or organisations to whom his ownership is transferred; secondly, that his physical and mental attributes, being orders of magnitude greater than a human's, were created for the purpose of serving human needs and interests. The prosecutor continues with an explanation of how Data operates through an elaborate and complicated interplay between his physical hardware, his neural networks, and his 'heuristic algorithms', severing his forearm to show how it is constructed, and finally activating Data's “Emergency Manual Control” – i.e. switching him off – with the words “Pinocchio is broken, his strings have been cut”.
After a recess, Picard begins his defence. He does not deny that Data is a machine, nor that Data was created by a man, arguing that these are not relevant issues to the debate. He calls Data to the stand and demonstrates his 'softer side', i.e. his ability to form relationships with people and his own (albeit rough and synthetic) sentimentality.
The second aspect of the defence is Picard calling Maddox to the stand and questioning him on the definition of 'sentience'. Having argued that Data is not sentient and therefore does not accrue the same rights as 'living' beings, Maddox defines sentience as comprising intelligence, self-awareness, and consciousness. Once Maddox concedes that Data is in fact intelligent, the debate centres on the degree to which he is self-aware – the degree to which he has an ego.
Thirdly, Picard interrogates Maddox on his intentions for Data. Maddox reveals that he intends to study him and then replicate the technology, creating 'hundreds, thousands if necessary'. Picard responds: “a single Data… is a curiosity. A wonder, even. But thousands of Datas. Isn’t that becoming a race? And won’t we be judged by how we treat that race?” and finally: “You see, he’s met two of your three criteria for sentience, so what if he meets the third: consciousness – in even the smallest degree? What is he then?... sooner or later, this man or others like him will succeed in replicating Commander Data. And the decision you reach here today will determine how we will regard this creation of our genius. It will reveal the kind of people we are, what is destined to be. It will reach far beyond this courtroom and this one android. It could significantly redefine the boundaries of personal liberty and freedom, expanding them for some, savagely curtailing them for others. Are you prepared to condemn him and all who come after him to servitude and slavery? Your Honour, Starfleet was founded to seek out new life. Well, there it sits”.
Finally, the JAG officer gives her ruling: while Data is indeed a machine, he is not the property of Starfleet. “This case has dealt with metaphysics, with questions best left to Saints and philosophers… I’m neither competent nor qualified to answer those. I’ve got to make a ruling, to try and speak to the future. Is Data a machine? Yes. Is he the property of Starfleet? No. We have all been dancing around the basic issue. Does Data have a soul? I don’t know that he has. I don’t know that I have. But I have got to give him the freedom to explore that question himself. It is the ruling of this court that Lieutenant Commander Data has the freedom to choose”. The episode ends with Data formally refusing to undergo Maddox’s procedure and returning to his normal life and duties with his colleagues and friends.
For me, the two most interesting areas of technology research today lie in Artificial Intelligence and Robotics. When I think of the purpose behind our investment in these sectors, it can only be to satisfy two needs: the first, to better understand ourselves; the second, to continue the tradition of devising labour-saving tools that give us the time and capacity to pursue other goals in art and in science.
Tool making is all about leverage. By creating a hammer we can exert enough force on a nail, using our muscles, to drive it into a wall – something impossible with our fingers alone, regardless of our strength. The next step is time efficiency: a nail gun allows us to drive many more nails into walls in a minute than we could with a hammer alone. Tools are, however, single-purpose: a quick browse through my garden shed reveals an array of labour-saving devices, each good only for the purpose for which it was designed.
Perhaps it was the aviation industry that first caused us to study the natural world with a view to using this knowledge to build better tools. Our increased appreciation of how a bird achieves flight, combined with our scientific capacity for aerodynamic computation and our engineering ability to create new materials and assemble them in new ways, has yielded incredible results. We have reached the point where our flying machines are larger and faster than anything in the natural world, albeit perhaps less efficient and less agile. Looking at prototype drawings of futuristic planes, the resemblance to the natural world is striking.
We are also limited by our own abilities in aviation. Even the smallest plane has to accommodate a human pilot, which puts a limit on how small and agile it can be made. The speed of a human's reflexes limits how manoeuvrable a plane can be, and our physical tolerance for G-force limits our ability to create flying machines with the agility we see in the natural world. This is where electronic communication and computation have revolutionised the field. Planes can be flown by wire, negating the need for an in-situ human pilot. Computers can process many more calculations than a person ever could and, increasingly, make mathematical predictions better than the most experienced pilot. Watching a formation display at an air show is an impressive demonstration of skill, but watching a swarm of starlings fly in formation is a marvel – one we can only achieve with computers in the cockpit, not humans.
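Starling-like formation flight is, in fact, one of the things computation makes tractable: Craig Reynolds' classic 'boids' model shows that flocking emerges when each individual follows just three local steering rules. The sketch below is purely illustrative – the weights, neighbourhood radius, and flock size are arbitrary choices of mine, not values from any real flight system.

```python
import random

def step(boids, radius=10.0, w_coh=0.01, w_sep=0.05, w_ali=0.05):
    """Advance every boid one tick. A boid is (x, y, vx, vy)."""
    new = []
    for x, y, vx, vy in boids:
        # Only nearby flockmates influence a boid's steering.
        neighbours = [b for b in boids
                      if b[:2] != (x, y)
                      and (b[0] - x) ** 2 + (b[1] - y) ** 2 < radius ** 2]
        if neighbours:
            n = len(neighbours)
            # Cohesion: steer towards the local centre of mass.
            cx = sum(b[0] for b in neighbours) / n
            cy = sum(b[1] for b in neighbours) / n
            vx += w_coh * (cx - x)
            vy += w_coh * (cy - y)
            # Separation: steer away from neighbours that are too close.
            vx += w_sep * sum(x - b[0] for b in neighbours)
            vy += w_sep * sum(y - b[1] for b in neighbours)
            # Alignment: match the average heading of the neighbours.
            vx += w_ali * (sum(b[2] for b in neighbours) / n - vx)
            vy += w_ali * (sum(b[3] for b in neighbours) / n - vy)
        new.append((x + vx, y + vy, vx, vy))
    return new

random.seed(1)
flock = [(random.uniform(0, 20), random.uniform(0, 20),
          random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(10)]
for _ in range(50):
    flock = step(flock)
```

No bird computes the shape of the murmuration; the formation is an emergent property of many simple, local decisions made fast – exactly the kind of tireless, high-frequency control loop a computer handles better than a human pilot.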
In the last half-century of the computer revolution we have built machines that can perform calculations reliably at a speed and intensity unimaginable to the early pioneers. However, we have discovered in the process that regardless of a computer's ability to store, recall, and process data at speed, tasks we find trivial are beyond our technology's capability. Again, we have turned to the natural world for help. As we have studied the workings of our own brains and those of animals, we have experimented with new approaches in computer science with remarkable results. The creation of 'neural networks' enables us to 'teach' computers activities, such as pattern recognition, that were previously impossible. The accuracy of applications such as speech recognition has improved dramatically compared with previous approaches, whose gains had plateaued.
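The idea of 'teaching' a computer a pattern can be seen in miniature with the oldest artificial neuron of all, the perceptron: rather than writing rules by hand, we show it labelled examples and let it adjust its own weights after each mistake. The toy task below (light up when at least two of three 'pixels' are lit) is an invented illustration, a long way from speech recognition, but the learning principle is the same.

```python
def predict(weights, bias, inputs):
    """Fire (1) if the weighted sum of inputs crosses the threshold."""
    activation = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1 if activation > 0 else 0

def train(samples, labels, epochs=100, lr=0.1):
    """Perceptron rule: nudge the weights a little after every mistake."""
    weights = [0.0] * len(samples[0])
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in zip(samples, labels):
            error = target - predict(weights, bias, inputs)
            bias += lr * error
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
    return weights, bias

# Toy pattern: label 1 when at least two of the three 'pixels' are lit.
samples = [(0, 0, 0), (0, 0, 1), (0, 1, 0), (1, 0, 0),
           (0, 1, 1), (1, 0, 1), (1, 1, 0), (1, 1, 1)]
labels = [0, 0, 0, 0, 1, 1, 1, 1]

weights, bias = train(samples, labels)
print([predict(weights, bias, s) for s in samples])  # matches labels
```

A single neuron can only learn patterns a straight line (or plane) can separate; modern networks stack many such units in layers, which is what finally pushed tasks like speech recognition past their plateau.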
In short, studying the workings of our own minds and the attributes of intelligence allows us to build better machines that mimic our own abilities and perform useful tasks. Combine this with advanced robotics (developed through the same process of mimicking the natural world) and some imagination, and we arrive in the future at the creation of such a machine as Commander Data.
The story referenced here is set hundreds of years into the future, at a point where a breakthrough has been made and Data so closely resembles and mimics human attributes that he should receive the same rights as humans. This breakthrough may indeed occur in the 24th Century, or sooner – or never at all – but it is my view that the issues we will face when we reach that point are sufficiently critical that they ought not just to be contemplated long in advance, but incorporated into the design and development process of the machines we build.
In the end, Data is considered to have the right to decide his own future, not for any legal or constitutional reason but simply because he wants to. His character is portrayed as naïve and in no way manipulative, but imagine if he had been programmed with psychopathic qualities. What if your computer refused to undergo a hardware upgrade – would you see this as a programmer's practical joke or an expression of self-determination? Would you care? I have personally spent much of this week upgrading my PC to Windows 10, to no avail – it amuses me to consider whether the resistance to the procedure was in some way motivated by a nascent consciousness rather than a technical 'glitch'.
Many of us feel squeamish at the thought of a physical upgrade, say in the form of plastic surgery. In some cases it might be medically necessary, but in most cases we tend to believe in the freedom of the individual to choose what is in their own best interests. For those who cannot make that choice, we make it for them. At what point, I wonder, do we stop considering performing tasks, upgrades, and procedures on the machines we create for our needs, and start considering what might be in their best interests?