Does it make sense to put a voice assistant in your microwave? Amazon, the leader in smart speakers and voice assistants, seemed certain that it does when it introduced the $60 AmazonBasics Microwave in September.

This week, the first reviews of the device came out, and while I’m not an avid reader of gadget reviews, I was curious to see what impression Amazon’s new venture would make. I found TechCrunch’s profile of the Amazon microwave, aptly written by Sarah Perez, an interesting read.

Beyond exploring the pros and cons of the device itself, Perez’s experience with the Alexa-powered microwave shows that the developers of AI assistants have yet to understand the limits and capabilities of voice assistants and the right way to integrate digital assistants into physical devices.

This is important because Amazon has designed the microwave oven as a demo of the hardware and API that will enable other manufacturers to pair Alexa with their appliances.

The difference between smart speakers and smart home appliances

When confined to the Echo smart speaker, there’s no ambiguity over how to interact with Alexa. Users know that they should call the assistant’s name followed by the command. This works for setting timers, turning the lights on and off, playing music and other functionality that doesn’t require complex, multi-step interactions.

But what happens when you try to accomplish a task that requires multiple steps, some of which you must perform yourself? The microwave provides a very interesting example. Everyone knows that Alexa won’t put the food in the oven or take it out for you, two clearly important steps in using the microwave.

However, we unconsciously expect Alexa to know that we are interacting with the microwave oven and to process our commands in that context. After all, isn’t it powered by artificial intelligence?

But Alexa has no idea that your commands are related to the microwave, and it requires you to use keywords such as “microwave” or “reheat” or “defrost” to make it understand that you want to make a change to the microwave.

That’s why, as Perez shows, we might confuse the assistant by giving commands such as “Alexa, two minutes” (instead of “Alexa, microwave for two minutes”) or “Alexa, add 30 seconds” (instead of “Alexa, add 30 seconds to the microwave”).
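To make the failure mode concrete, here is a minimal sketch of keyword-gated routing in Python. Everything here is hypothetical (Amazon’s actual dispatch logic is not public); it only illustrates why an utterance without a device keyword never reaches the microwave’s handler.

```python
# Hypothetical sketch: an utterance is routed to the microwave only if
# it contains one of the device keywords the article mentions.
MICROWAVE_KEYWORDS = {"microwave", "reheat", "defrost"}

def route(utterance):
    """Return the handler an utterance would be dispatched to."""
    words = set(utterance.lower().replace(",", "").split())
    if words & MICROWAVE_KEYWORDS:
        return "microwave_skill"
    return "general_fallback"  # the assistant is left guessing

print(route("Alexa, microwave for two minutes"))  # microwave_skill
print(route("Alexa, two minutes"))                # general_fallback
```

In this toy model, “Alexa, two minutes” falls through to the fallback exactly as Perez observed.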

Amazon has tried to work around this problem with an “Ask Alexa” button on the front panel of the microwave oven. When pressing the button, you don’t need to include “Alexa” or other keywords in your command. It will assume that whatever command you give is related to the functions of the microwave oven. So, for instance, you can just press the button and say “two minutes” instead of saying “Alexa, add two minutes to the microwave.”
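In effect, the button scopes the session to a single device. A toy illustration of that idea (hypothetical, assuming the button simply sets a session flag):

```python
# Hypothetical sketch: the "Ask Alexa" button pins the conversational
# context to the microwave, so no wake word or keyword is needed.
def route_with_button(utterance, ask_alexa_pressed=False):
    """Dispatch an utterance, optionally scoped by the 'Ask Alexa' button."""
    if ask_alexa_pressed:
        return "microwave_skill"  # every word goes straight to the appliance
    words = set(utterance.lower().replace(",", "").split())
    if words & {"microwave", "reheat", "defrost"}:
        return "microwave_skill"
    return "general_fallback"

print(route_with_button("two minutes", ask_alexa_pressed=True))  # microwave_skill
print(route_with_button("two minutes"))                          # general_fallback
```

The same words succeed or fail depending on a physical button state, which is precisely the inconsistency the next paragraph describes.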

But the button can break the consistency of the commands you give to Alexa. Consequently, you can easily become confused and unwittingly include the keywords when pressing the button, or forget them when not pressing it. Perez shows this very well in her review of Amazon’s microwave oven.

What’s the problem?


Unfortunately (or fortunately), Alexa doesn’t follow you around the house (yet), and unless you explicitly state your intentions, it can’t figure out the hidden meaning behind your words. But why do we expect it to have a general understanding of the context of our commands?

Alexa and other AI assistants are able to process voice commands thanks to advances in natural language processing (NLP), the branch of AI that enables computing devices to make sense of human-written and -spoken language.

While it might seem commonplace to most of us today, the advance in NLP is a huge step for the human-computer interaction space, which was previously defined by rigid and predictable user interfaces.

So we unconsciously expect a technology that is sophisticated enough to understand the language of humans to also be able to understand the context of those commands, especially when it has a human name. After all, if Alexa were a real human, she would clearly know what we meant when we said “Add two minutes” immediately after we told her to fire up the microwave.

Arthur C. Clarke famously said, “Any sufficiently advanced technology is indistinguishable from magic.”

But as we have discussed in these pages, there’s nothing magical about AI assistants. The artificial intelligence that powers Alexa and other digital assistants has a very shallow understanding of human language.

Likewise, it has no understanding of the environment it’s in, including the rooms in your house, the objects in those rooms, or the people who live in the house. No matter how natural or smart Alexa may sound, it is powered by narrow AI, which has distinct limits. It can’t relate facts to each other and fill the gaps where there’s missing information.

How do you fix Alexa?

A very simple fix would be to give the Amazon microwave an assistant of its own, with a unique name, say “Micro.” Then users wouldn’t have to remember long commands. And for the device, there would be no question as to what you mean when you say “Micro, add two minutes.” There would be no need to add an “Ask Alexa” button and require users to learn two types of commands.
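In dispatch terms, per-device wake words make routing trivial, as this hypothetical sketch shows (“Micro” is an invented name used only for illustration):

```python
# Hypothetical sketch: each appliance has its own wake word, so the
# first word of the utterance unambiguously identifies the target device.
WAKE_WORDS = {"alexa": "general_assistant", "micro": "microwave"}

def dispatch(utterance):
    """Split 'Name, command' and route by the wake word alone."""
    name, _, command = utterance.partition(",")
    device = WAKE_WORDS.get(name.strip().lower(), "unknown")
    return device, command.strip()

print(dispatch("Micro, add two minutes"))   # ('microwave', 'add two minutes')
print(dispatch("Alexa, play some jazz"))    # ('general_assistant', 'play some jazz')
```

The cost, as the next paragraph notes, is one wake word per appliance.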

The problem with this approach is that you’d soon have another assistant for your oven, another for your dishwasher, your washing machine, coffee maker and a dozen other devices in your home. I think this is something we could adapt to, but at first glance, it might sound a bit silly.

But calling a computer by a human-sounding name was the stuff of sci-fi movies a few decades ago. It’s now an inherent part of our daily lives.

Another solution is to give Alexa more context about your home. While limited at thinking in the abstract or forming a general understanding of their surrounding world, AI agents are still very good at processing sensor information. As a rule of thumb, the more data points you give to an AI system, the better it will become at predicting and correlating information.

For instance, if you equipped the Alexa microwave oven with a motion sensor and eye-tracking technology, Alexa would be able to understand that when you’re staring at the microwave and saying “Alexa, add two minutes,” it should probably add two minutes to the microwave oven’s timer.
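A sensor-fusion resolver of this kind could be sketched as follows. This is purely hypothetical (real gaze-tracking and speech pipelines are far more involved); it only shows how a gaze signal could stand in for the missing keyword:

```python
# Hypothetical sketch: infer which device a command targets by combining
# the utterance with gaze and motion signals from the room.
def resolve_target(utterance, gaze_target, motion_nearby):
    """Return the device a command is most likely aimed at."""
    if "microwave" in utterance.lower():
        return "microwave"  # an explicit keyword always wins
    if motion_nearby and gaze_target == "microwave":
        return "microwave"  # "add two minutes" while staring at the oven
    return "unknown"        # no context: ask the user to clarify

print(resolve_target("add two minutes", "microwave", True))  # microwave
print(resolve_target("add two minutes", None, False))        # unknown
```

With the extra sensor data, the ambiguous command from earlier resolves correctly without any button or keyword.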

This would provide a much more natural experience compared to the “Ask Alexa” button. When you’re staring at the microwave, you expect the person you’re talking with to understand that your instructions are directed at the microwave. When you’re looking away or are in another room, you will explicitly tell the person that you’re giving an instruction about the microwave.

The advantage of this approach is that it would enable users to stick to Alexa for all their tasks while also simplifying commands. One of the downsides would be the cost: packing the microwave with motion and eye-tracking sensors would probably double its price. But the greater disadvantage would be the privacy risk you would incur by giving Alexa even more information about your home and living habits.

The security rules and laws surrounding smart homes and the internet of things are still being developed, and IoT devices have accounted for their fair share of security and privacy disasters in past years.

So, we’ll have to ask ourselves: Do we want Alexa to be smarter, or do we like it just as dumb as it is? That’s something that both developers and consumers will have to figure out as the IoT space develops and finds its way into more domains.
