There was a time when technology would advance first and the debates over the ethical concerns it raised would follow. Lately, however, ethical debate seems to run (I hope) ahead of the actual technology. Maybe that’s a good thing.
Case in point: A paper at Ethicomp 2015 from De Montfort University warns that having sex with robots may have negative effects on par with prostitution. You might think that this is an isolated academic concept, but apparently there is a conference titled The International Congress on Love and Sex with Robots. There’s even a 2008 book titled Love and Sex with Robots that is neither science fiction nor pornography.
Second case: Softbank has created a robot called [Pepper] that supposedly can understand human emotions. You know the license agreements you get with everything you buy that you don’t really read? Here’s a translation of part of the one that comes with [Pepper]: “…owner must not perform any sexual act or other indecent behavior.”
That’s right. If you perform a sexual act with your [Pepper], you’ve violated the license agreement. Honestly, despite Rule 34, it is hard to imagine who wants a tryst with the 47-inch plastic robot (see the video below). Then again, people have been having some kind of sex with machinery since at least the 1700s.
However, this isn’t a debate about the pros and cons of having sex with robots. My guess is that if you are going to, I’m not going to stop you, and if you aren’t, then you won’t be even a little curious. What I think is interesting is the parallel between this and the perennial question that occupies hackers’ minds: what can you stop me from doing with something I own?
There’s an old saying that if you can’t hack it (or open it, or fix it), then you don’t own it. While [Pepper] is not my type, if she is yours and you paid the price, maybe it should be your business. I can understand, perhaps, that Softbank may not want [Pepper] associated with pornography. After all, supposedly 1920s pornography hurt sales of Hamilton Beach’s patented electric massage machine. But if a relaxing evening in private with [Pepper] is your idea of a good time, maybe that’s not really for Softbank or De Montfort University to say.
I have said before that things are easier to reason about logically if you take the sex out of them. Imagine if Microsoft’s license agreement forbade you from writing murder novels on machines running their software because, obviously, murder is wrong. That’s probably unenforceable and would be, in any event, kind of strange. What if Ford made you agree not to drive to fast food restaurants, because we all know fast food is bad for you?
If [Pepper’s] manufacturer prohibits me from reverse engineering its firmware (it might; I don’t know), I’d bristle, but I’d get it, especially if the prohibition only covers disseminating the results. But if I own it, then how I use it in private ought to be my business.
Consider this: What will it look like for them to attempt to enforce their license, and how will the courts respond if they do?
In a larger sense, though, as designers of things we should think about ethics. Discussion and debate about it is healthy, but ultimately almost every ethical situation requires some personal interpretation. Your ethics may prevent you, for example, from working on a weapons system. Clearly, not everyone agrees with that. While there are clear-cut cases, I doubt that robosex is going to fall into that unambiguous category.