Salacious Servitude: Artificial Intelligence and the Dangers of Dirty Talk

Written by Conor Hassett

Machines can detect tumors; robots can remove them almost entirely unassisted. Twenty thousand miles above the Earth, satellites reposition themselves automatically to track wildfires or infantry movements. If none of that impresses you, look to your kitchen, where internet-connected speakers can control the air conditioning, or provide your toddler with suggestions for the web’s best pornography.

If you don’t have an Amazon Echo or another Alexa-style, voice-activated personal assistant nearby to clarify, here’s a little background: a few weeks ago, a video surfaced (and promptly went viral) of a young child asking Alexa to play a song he called “Digger Digger,” from his favorite bedtime book. The device reached deep into its databanks, responded, “you want to hear a station for porn,” and offered up foul-mouthed categories for the dirtiest videos humanity’s greatest modern tool has to offer.

It’s not the only time something like this has happened with seemingly smart, artificially intelligent technology. Early last month, a kindergartener ordered $160 worth of cookies and a dollhouse through the family’s Amazon Echo, unbeknownst to her parents. In March 2016, Microsoft’s short-lived A.I. chatbot “Tay” lived for a single day on Twitter, interacting with users before its creators took it down because it had learned to tweet racist, sexually explicit remarks.

It stands to reason that we’ll see more such snafus as the wonders of technology continue their exponential growth; such is the nature of progress. And at some point, we’ll probably figure out how to prevent them. In the meantime, it’s probably best not to leave your children alone in a room with intelligent tech, unless you’re ready to have “The Talk” way earlier than planned. If you’re not, I suppose you could always Ask Alexa to have that conversation for you.
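
For what it’s worth, the eventual fix probably looks less like magic and more like plumbing: a final safety gate that screens a drafted response before the speaker says it aloud. Here’s a deliberately naive sketch of the idea in Python; the blocklist, function names, and “kid mode” flag are all invented for illustration and reflect no vendor’s actual system.

    # Hypothetical sketch of a keyword-based "kid mode" response filter.
    # The blocklist and fallback message are invented for illustration.
    BLOCKLIST = {"porn", "explicit", "xxx"}  # stand-in for a real, much larger list

    def is_safe(response):
        """Return False if the drafted response contains any blocked term."""
        words = {w.strip(".,!?\"'").lower() for w in response.split()}
        return BLOCKLIST.isdisjoint(words)

    def speak(response, kid_mode=True):
        """Swap an unsafe response for a harmless fallback before speaking."""
        if kid_mode and not is_safe(response):
            return "Sorry, I can't help with that."
        return response

    # Roughly the viral clip: the child's garbled request was matched to an
    # explicit station title. A filter like this would catch the response.
    print(speak("you want to hear a station for porn"))
    # -> "Sorry, I can't help with that."

A real assistant would need far more than keyword matching, of course; in the “Digger Digger” clip the failure happened upstream, when speech recognition mangled the request, before any response text existed to screen. But the principle of a last-line filter is the same.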