With robots and artificial intelligence getting more sophisticated, are our current laws enough? Imagine having a private conversation with someone, for instance at a party, and later finding out that the "person" you spoke with was a robot. Or imagine going on a date not knowing that you are actually meeting a very sophisticated AI-powered humanoid. Or imagine finding out that your shrink is a robot.
I would act differently and say different things to a robot than to a real human. So I want to know whether the person I'm talking to on the phone is actually a software program, and whether the people I meet are human at all.
Today, humanoid robots and digital assistants aren't good enough to fool most people. That will change. They will be capable of convincing anyone that the voice at the other end of the phone belongs to a real person, or that the shop clerk you are flirting with is genuinely interested in you rather than a robot doing everything it can to sell you something.
Is it okay for software and hardware to mimic people without us knowing which is which? Do you think we need to adapt our laws to a future where humans and robots live side by side, looking and acting the same? Just as it is a crime to pose as someone else, shouldn't it be illegal to pose as a human when you are not? In my opinion, letting a machine imitate a human where there is a risk of confusion would amount to fraud.
What do you think? Do we need to adjust our laws? Is this covered by the laws we already have?
(Photo by Parker Johnson on Unsplash)