By Michael LaBossiere
While science fiction has long included speculation about robot-human sex and romance, the current technology offers little more than sex dolls. In terms of the physical aspects of sexual activity, the development of more “active” sexbots is an engineering problem—getting the machinery to perform properly and in ways that are safe for the user (or unsafe, if that is what one wants). Regarding cheating, while a suitably advanced sexbot could actively engage in sexual activity with a human, the sexbot would not be a person and hence the standard definition of cheating (as discussed in the previous essays) would not be met. Put another way, sexual activity with such a sexbot would be analogous to the use of any other sex toy (such as a simple “blow up doll” or vibrator). Since a person cannot cheat with an object, such activity would not be cheating. Naturally enough, some people might take issue with their partner sexing it up with a sexbot and forbid such activity. While a person who broke such an agreement about robot sex would be acting wrongly, they would not be cheating. Unless, of course, the sexbot was close enough to being a person for cheating to occur.
While many people would just be interested in sexbots that perform mechanical sexual functions, there are already efforts to make sexbots like people in terms of their “mental” functions—for example, programming that creates the illusion of conversation. As such efforts progress and sexbots act more and more like people, the philosophical question of whether they really are people will become rather important. While the main moral concerns would be about the ethics of how sexbots are treated, there is also the matter of cheating.
Obviously enough, if a sexbot were a person, then it would be possible to cheat with that sexbot—just as one could cheat with an organic person. The fact that a sexbot might be purely mechanical would not be relevant to the ethics of the cheating; what would matter is that a person was engaging in sexual activity with another person when their relationship with someone else forbids such behavior.
It could be objected that the mechanical nature of the sexbot would matter—that sex requires organic parts of the right sort and thus a human cannot really have sex with a sexbot—no matter how the parts of the robot are shaped.
One counter to this is to use a functional argument. Drawing an analogy to the philosophy of mind known as functionalism, it could be argued that the composition of the relevant parts does not matter; what matters is their functional role. As such, a human could have sex with a sexbot that had the right parts.
Another counter is to argue that the composition of the parts does not matter, rather it is the sexual activity with a person that matters. To use an analogy, a human could cheat on another human even if their only sexual contact with the other human involved sex toys. In this case, what matters is that the activity is sexual and involves people, not that objects rather than body parts are used. As such, sex with a sexbot person could be cheating if the human was breaking their commitment.
While knowing whether a sexbot was a person would largely settle the cheating issue, there remains the epistemic problem of other minds. In this case, the problem is determining whether a sexbot has a mind that qualifies them as a person. There can, of course, be varying degrees of confidence in the determination, and there could also be degrees of personhood. Or, rather, degrees of how person-like a sexbot might be.
Thanks to Descartes and Turing, there is a language test for having a mind—roughly put, if a sexbot can engage in conversation that is indistinguishable from conversation with a human, then it would be reasonable to regard the sexbot as a person. That said, there might be good reasons for having a more extensive testing system for personhood which might include such things as testing for emotions and self-awareness. But, from a practical standpoint, if a sexbot can engage in a level of behavior that would qualify them for person status if they were a human, then it would be just as reasonable to regard the sexbot as a person as it would be to regard an analogous human as a person. To do otherwise would seem to be mere prejudice. As such, a human person could cheat with a sexbot that could pass this test.
Since it will be a long time (if ever) before such a sexbot is constructed, what will be of more immediate concern are sexbots that are person-like. That is, sexbots that are not able to meet the standards that would qualify a human as a person, yet whose behavior is sophisticated enough that they seem to be more than mere objects. One might consider an analogy here to animals: they do not qualify as human-level people, but their behavior does qualify them for a moral status above that of objects (at least for most moral philosophers and all decent people). In this case, the question about cheating becomes a question of whether the sexbot is person-like enough for cheating to take place.
One approach is to consider the matter from the perspective of the human: if the human engaging in sexual activity with the sexbot regards it as person-like enough, then the activity can be seen as cheating. An objection to this is that it does not matter what the human thinks about the sexbot; what matters is its actual status. After all, if a human regards a human they are cheating with as a mere object, this does not mean they are not cheating. Likewise, if a human feels like they are cheating, it does not mean they really are.
This can be countered by arguing that how the human feels does matter. After all, if the human thinks they are cheating and engages in the behavior anyway, they are still acting wrongly. To use an analogy, if a person thinks they are stealing something and takes it anyway, they have still acted wrongly even if it turns out that they were not stealing (that the thing they took was actually being given away). The obvious objection to this line of reasoning is that while a person who thinks they are stealing did act wrongly by engaging in what they thought was theft, they did not actually commit a theft. Likewise, a person who thinks they are cheating, but is not, would be acting wrongly, but not cheating.
Another approach is to consider the matter objectively—the degree of cheating would be proportional to the degree to which the sexbot is person-like. On this view, cheating with a person-like sexbot would not be as bad as cheating with a full person. The obvious objection is that one is either cheating or not; there are no degrees of cheating. The obvious counter is to appeal to intuition by analogy: just as there can be degrees of cheating in terms of the sexual activity engaged in (a kiss versus an affair, for example), there can also be degrees of cheating in terms of how person-like the sexbot is.
While person-like sexbots are still the stuff of science fiction, I suspect the future will see some interesting divorce cases in which this matter is debated in court.