The ‘artificial intelligence’ in your new smart gadget may not be what you think


Walk the halls of the Consumer Electronics Show, or browse gadgets online, and you might hear that a gizmo has, or uses, AI. Artificial intelligence is a broad, catchall term, so it can be hard to know what it actually means for a product to have AI. Does it mean you can talk to it, and it talks back? That it can make decisions on its own? That it's going to lead a robot army to harvest the organs of everyone you know?

The powerful technology is also becoming ubiquitous enough that it's common to see it employed, and touted, by small companies you haven't heard of, not just big players like Amazon or Google. Plus, companies that make gadgets that connect to a voice assistant, like Alexa, may use that as reason enough to call their product “smart.”

A key point to understand is that artificial intelligence isn’t just synonymous with a voice assistant. Those voices, like Alexa, make use of AI, to be sure—but there’s much more going on in the world of artificial intelligence.

Under the umbrella of AI is the large, dynamic field of machine learning. Frequently, when you encounter artificial intelligence in a product, it’s because it’s employing machine learning under the hood to do something, make a decision, or both. At its simplest, machine learning involves engineers feeding data into software, which then learns from it. The resulting algorithms can accomplish different tasks.
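To make that a little more concrete, here's a minimal sketch of the “feed in data, let the software learn” idea, written in Python with the scikit-learn library. The tiny fruit dataset and the features are invented purely for illustration; real products train on far larger, messier data.

```python
# A toy illustration of machine learning: feed labeled examples into
# software, let it find the pattern, then ask it about new cases.
# The fruit data below is invented purely for illustration.
from sklearn.tree import DecisionTreeClassifier

# Each row: [weight in grams, surface bumpiness on a 0-10 scale]
data = [[150, 1], [170, 2], [130, 8], [140, 9]]
labels = ["apple", "apple", "orange", "orange"]

# "Learning" happens in the fit step: the algorithm works out which
# features separate the labels.
model = DecisionTreeClassifier()
model.fit(data, labels)

# The trained model can now make a call about a fruit it has never seen.
print(model.predict([[160, 1]]))  # -> ['apple']
```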

Listen up

Here’s an example: at CES, Danish company Jabra announced its latest headphones, the Elite 85h. Jabra advertises the new $299 ‘phones as using “AI technology,” and they do, in the form of machine learning. They’re not “artificially intelligent” in the sense that they can read your mind and start talking to you, but the way they make use of AI is indeed smart.

Perhaps predictably, they call the feature in question SmartSound. “It listens to the environment the user is in,” says Fred Lilliehook, a senior product marketing manager at Jabra. “It automatically adapts the audio experience.”

If you’re on a bus, it can recognize that sound signature and then put the headphones in their “Commute” mode, meaning that active noise canceling kicks in. In a public space, like a sidewalk, the headphones switch into a mode called “In Public,” which triggers a feature called “HearThrough” that uses the mics (they have eight in total) to amplify the sidewalk sounds.
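To give a rough sense of how that kind of switching logic might look, here's a short Python sketch; it's the general idea, not Jabra's actual code. The function name, the environment labels, and the settings fields are all made up for illustration, and the environment label is assumed to come from a sound classifier like the toy one above.

```python
# A rough sketch of context-aware mode switching, not Jabra's actual
# implementation. Assume a classifier has already labeled the ambient
# sound signature as "commute", "in_public", etc.

def choose_mode(environment: str) -> dict:
    """Map a detected environment to hypothetical headphone settings."""
    if environment == "commute":
        # On a bus or train: block the noise out.
        return {"mode": "Commute",
                "active_noise_canceling": True,
                "hear_through": False}
    if environment == "in_public":
        # On a sidewalk: pipe outside sound in through the mics.
        return {"mode": "In Public",
                "active_noise_canceling": False,
                "hear_through": True}
    # Anything else: leave the audio alone.
    return {"mode": "Default",
            "active_noise_canceling": False,
            "hear_through": False}

print(choose_mode("commute"))
# -> {'mode': 'Commute', 'active_noise_canceling': True, 'hear_through': False}
```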

Those factors are the mass distribution and use of Internet-connected devices, which generate massive quantities of data, and cloud computing and software algorithms that can recognize patterns within data, he says.