Is Amazon Echo Preparing Children for Predatory Robots?

Any child who has ever experienced the excitement of calling out a request to a voice-activated device like the Amazon Echo or Echo Dot might be excited to hear this: a wide range of technology manufacturers are now working on voice-activated products of their own.

It was with this in mind that The Telegraph decided to dig into the new wave of voice-activated products and ask how safe they are for children. Here is what the article found:

“More than 2,400 voice-enabled devices are now in use across Europe, and they have grown exponentially since August last year when the EU introduced a single, ‘net-neutral’ set of rules ensuring that any device – be it a sofa-mounted [speaker] connected to a TV or a smart oven – is ‘compatible’ with any other device.”

Competition in the market is welcome, but there is something unnerving about a device that can be switched on and off by voice alone.

“Having voice control could quickly go from a novelty to a big draw for young children if it’s done right,” said The Telegraph, while pointing out that Amazon, Samsung, Philips, Sonos, and Google are some of the biggest players in the space.

There’s no need to worry about the EU’s “inclusive” move, though. Brand new laws across the bloc mean that the new generation of products starts from a very good position on safety.

Indeed, Amazon has built a slew of helpful software into its Alexa technology, including the ability to flag concerning situations such as parents being out shopping: the app can monitor family members’ locations and alert them immediately if a child is being put at risk.

But there is one potentially tricky situation that can be difficult to foresee: can such products be used for surveillance?

The answer is yes, if they’re hooked up to a camera, as the Amazon Echo or Echo Dot can be. Face-recognition technology makes that possible, and of course, children’s imaginations often take them places they shouldn’t go – on the other side of which you’ll find a screen.

Speaking of video, is it wrong for machines to expose children to so much information at such a young age?

“There is a potential for this product to be used for surveillance purposes,” The Telegraph points out. “But while some products with video screens may pass this off as entertainment, parents and teachers are best placed to decide whether a video screen is preferable to an older screen or to a smartphone.”

Consider this: today’s devices don’t have much of a “radar” effect the way social media does, where a child’s first words or first posts on a site will most likely come from their own choice. If they’re watching videos of a toddler, they’re just as likely to be buying a Tickle Me Elmo.

There’s no danger in children finding out what these devices are saying. If anything, they may find it hard to recognise that it’s simply a cute toy with a little screen rather than something exciting and involving.

Privacy, though, is a different story. It is the very same subject that prompted a backlash against the hit game Minecraft. At least, that’s what some have said.

The truth is, privacy isn’t a switch that has to be left in the “off” position. Many technology manufacturers, like Apple, build privacy filters into their own platforms, and their devices can also be easily shut down, with an equally easy override.

There’s already the fear that robots will emerge from the factory to exploit human beings, along with the familiar “AI is coming” warning. And therein lies the potential danger: not so much that robots will kill humans, but that machines with minds of their own will prey on children, or bring them up to their level.

The key will be how children react. The new technology has real advantages, but parents should pay attention to the software and hardware they choose, and make sure that what children are being exposed to is age-appropriate and presented in the right way.

Tech companies need to step up their security and safety measures to keep their products and services safe for children. Otherwise, a real community problem could emerge.

This article is satire.
