Artificial intelligence might one day be used to power genuinely humanlike cyborgs or other figments of humanity’s fertile imagination. For now, Ingo Stork is using the technology to help restaurant chains waste less food and do more with fewer workers.

Dr. Stork is co-founder of PreciTaste, a startup that uses AI-based sensors and algorithms to accomplish one fairly specific task: predict how much food people will order at any given moment, and make sure that it’s being prepared in a timely fashion.

The idea—to reduce waste—came in part from a visit Dr. Stork made to a quick-service kitchen one afternoon a few years ago, where he watched a cook fire up 30 burger patties, and then throw them all away when no one showed up to eat them. Why, he wondered, should this cook have to follow that day’s schedule, written in anticipation of a normal day at the restaurant, instead of the slow one it turned out to be?

A Phuc Labs engineer prepares a sample for processing in the startup’s AI-based filtration system, which uses a machine-vision algorithm to identify valuable metal particles in e-waste. (Photo: Justin Salem Meyer/Phuc Labs)

“Each of those burgers is a 50-mile car ride in terms of CO2 emissions,” he says, referring to the energy required to raise the cows and eventually transform them into burgers. “Think of all the logistics just to get them there, all just to go to waste and be discarded.”

Using AI tools to reduce waste and increase productivity in fast-food joints is hardly the stuff of science fiction. It isn’t as flashy as some of the artificial intelligences that have been getting wider attention lately, such as DALL-E, which can create clever images based on text suggestions, or GPT-3, text-generation software good enough to write scientific papers about itself. And it’s not as likely to make headlines as Google’s LaMDA chatbot, which can produce such humanlike conversation that one of the company’s engineers declared it to be sentient—a notion the company flatly rejected.

But, with a few exceptions, these headline-grabbing systems aren’t having a material impact on anyone’s bottom line yet.

The AI systems that currently matter the most to companies tend to be far more humble. Were they human, they would probably be wearing hard hats and making cameos on the reality show “Dirty Jobs.”

Ruthlessly simplify

When entrepreneur Phuc Vinh Truong found himself holed up in his Massachusetts home because of Covid-19 lockdowns, he hit on a simple idea. What if you could see contaminants in a stream of liquid, and suck them out one by one?

That led to Phuc Labs, a startup working on a new way to use AI to make recycling electronic waste more efficient.

The system starts with the chopped-up debris left after recyclers of batteries and other e-waste crush old electronics. Typically, this waste is processed with a variety of techniques, including chemical separation. Instead, Phuc Labs suspends the particles in water, then channels the resulting slurry through tiny tubes, where a camera captures its passage at 100 frames per second.

Each frame is analyzed by a computer running a machine-vision algorithm that has been trained to tell the difference between the metal particles valuable to recyclers, and everything else. When a particle travels to the end of the tube, a tiny, powerful jet of air fires at the stream, redirecting the “slice” of water containing the particle into a reservoir. The water is recirculated through the system repeatedly until nearly all the valuable bits of metal have been separated out.
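The control loop described above can be pictured in a few lines of code. This is a minimal sketch under stated assumptions: the frame format, the classifier stub and the single-pass logic are invented for illustration, since Phuc Labs hasn’t published its implementation.

```python
# Toy sketch of the "vision valve" loop: classify each frame, divert
# metal-bearing slices to a reservoir, recirculate everything else.
from dataclasses import dataclass

@dataclass
class Frame:
    particle_id: int
    is_metal: bool  # stand-in for the machine-vision model's verdict

def classify(frame: Frame) -> bool:
    """Stand-in for the trained vision model: metal vs. everything else."""
    return frame.is_metal

def run_pass(stream):
    """One pass of slurry through the tube."""
    metal_reservoir, recirculate = [], []
    for frame in stream:
        if classify(frame):
            metal_reservoir.append(frame.particle_id)  # air jet fires
        else:
            recirculate.append(frame)  # water loops back for another pass
    return metal_reservoir, recirculate

stream = [Frame(1, True), Frame(2, False), Frame(3, True), Frame(4, False)]
metal, leftover = run_pass(stream)
print(metal)          # particles diverted to the reservoir
print(len(leftover))  # particles recirculated for the next pass
```

In the real system the loop runs repeatedly over the recirculated water, which is why near-total separation is possible even if the classifier misses a particle on any single pass.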

Phuc Labs’ “vision valve” technology is still in its early stages, but the company is working on a pilot program with IRI, one of the biggest recyclers of e-waste in the Philippines, says IRI President Lee Echiverri.

This novel kind of filtration would be impossible without AI, but it’s not fancy AI. Machine-vision systems are probably the best-studied flavor of AI, and have been refined for decades. They’re used in everything from the face-recognizing camera in your phone to autonomous-driving systems to the missiles taking out Russian tanks in Ukraine.

Identifying tiny metal particles in shredded e-waste like this is akin to a simple game for an artificial-intelligence system. Extracting them is a bigger challenge. (Photo: Justin Salem Meyer/Phuc Labs)

Mr. Truong’s team was able to build one of the first versions of their system using an off-the-shelf computer-vision system called Roboflow. They trained it by manually identifying a few hundred images of particles—drawing boxes around particles and labeling them accordingly—and Roboflow’s software did the rest.
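The manual labeling step amounts to producing annotations like the one below. The exact schema isn’t described in the article, so this COCO-style structure is purely illustrative.

```python
# Hypothetical annotation for one training image: a human draws a box
# around each particle and tags it "metal" or "other". A few hundred
# such labeled images were enough for off-the-shelf tooling to train on.
annotation = {
    "image": "slurry_frame_0042.png",  # invented filename
    "boxes": [
        {"label": "metal", "x": 118, "y": 64, "width": 12, "height": 11},
        {"label": "other", "x": 201, "y": 90, "width": 15, "height": 14},
    ],
}

metal_boxes = [b for b in annotation["boxes"] if b["label"] == "metal"]
print(len(metal_boxes))  # 1
```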

While AI is a unique enabler of Phuc Labs’ filtration system, it works because the system asks so little of the AI at its heart—just “is this a piece of metal or not?” In essence, Mr. Truong says, his engineers are creating a simple game for their AI to learn, and games like chess and Go are things AI has already proved to be excellent at.

In many other real-world applications of AI, engineers have found that trying to do less with AI is what ultimately leads to success. A prime example of this is autonomous driving systems, which have consistently failed to deliver on earlier promises of full autonomy, but are finding success in navigating some vehicles in more limited and forgiving environments, such as the ones traversed by trains, oceangoing ships and long-haul trucks.

Specialization trumps flexibility

Every fast-food restaurant chain that Dr. Stork’s New York City-based company, PreciTaste, works with presents a new set of challenges for his engineers and the AI-powered restaurant-management systems they build.

“Each food chain has its own menu, operations, equipment and way of handling things,” he says. The array of wall-mounted cameras equipped with machine vision that can track an order from the moment its raw ingredients leave a refrigerator until it’s ready to be handed to a customer may have to be laid out differently, for example. And the number of preparation steps can vary greatly by restaurant.


PreciTaste says it can’t disclose which chains are considering its technology. But it’s working with the commercial-kitchen fabrication giant Franke to pilot its tech in a handful of national fast-food and fast-casual restaurants, says Greg Richards, vice president of business development at the company.



To make its system work, depth-sensing cameras must be trained to recognize how much of an ingredient—say, rice—remains in a prep tray. Knowing when to replenish it depends on what will happen to demand, which in turn depends on factors including weather and local holidays that might determine whether people will go out to eat and what they’ll order. All of this and more is fed into the same kind of prediction algorithms that help large retailers manage their logistics networks.
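The forecasting inputs described above can be pictured with a toy model. The weights and factors below are invented for illustration; they are not PreciTaste’s actual algorithm, which would be a trained model rather than hand-set rules.

```python
# Toy demand forecast: scale a baseline by illustrative factors for
# weather and local holidays, then decide whether to refill a prep tray.

def predict_rice_demand(base_portions: int, raining: bool, local_holiday: bool) -> int:
    """Invented weights, purely to show the shape of the inputs."""
    demand = float(base_portions)
    if raining:
        demand *= 0.8   # assumed: fewer walk-ins in bad weather
    if local_holiday:
        demand *= 1.3   # assumed: holidays lift foot traffic
    return round(demand)

def should_replenish(tray_portions: int, forecast: int) -> bool:
    """Refill when what's left in the tray won't cover the forecast."""
    return tray_portions < forecast

forecast = predict_rice_demand(base_portions=50, raining=True, local_holiday=False)
print(forecast)                        # 40
print(should_replenish(25, forecast))  # True
```

A production system would learn those factors from historical sales data instead of hard-coding them, but the decision it drives—top up the tray or not—is exactly this narrow.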

Today’s AI systems lack common sense, can behave erratically when faced with unexpected events, and have minimal ability to transfer knowledge “learned” from one task to analogous situations. In this way, it could be said that today’s artificial intelligence possesses no intelligence at all—it is, as one AI pioneer put it, just “complex information processing.”

The result is that engineers and data scientists have to do a lot of hand-holding for their fragile AIs, including planning, hardware engineering and writing software, all to build a scaffolding within which an AI can be trained to accomplish a set of tasks that have been defined as narrowly as possible.

It might not always be like this

AIs like DALL-E, GPT-3, and LaMDA are known as “foundation models,” says Oren Etzioni, chief executive of the Allen Institute for AI. For now, they are mostly research projects. But someday systems like these might be flexible enough to throw at problems that today remain solely the domain of human intelligence, he adds.

Already, these AIs are starting to diversify and take on a wider range of tasks. One way this is happening is that foundation models have so much data stuffed into them that they are equally capable of, say, crafting an essay or writing code. For example, genre-fiction writers are using software based on GPT-3 to help them churn out straight-to-Kindle novels faster. And programmers who use GitHub’s Copilot system can become more productive when it autocompletes lines of code they are writing. Copilot has a shared lineage with GPT-3, and like its cousin that writes marketing copy, fiction and essays, it’s far from perfect.

While we wait for these foundation models to find more applications outside R&D labs, related systems that get us part of the way there are already proving useful.

Gong, a cloud-based system from a San Francisco-based startup of the same name, records every channel of communication used by a sales team, including phone calls, Zoom meetings, emails and chat transcripts. It then analyzes all of that communication and makes suggestions to help salespeople close more deals. These suggestions range from words and phrases that tend to come up in successful sales pitches to how much to talk during a pitch—usually less.
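One of those suggestions—how much a salesperson talks—reduces to a simple measurement. The transcript format and coaching threshold below are assumptions for illustration, not Gong’s published internals.

```python
# Toy version of a talk-time signal: what fraction of a call did the
# sales rep spend speaking, based on per-turn durations in seconds?

def talk_ratio(transcript, speaker="rep"):
    """Fraction of total speaking time attributed to one speaker."""
    total = sum(seconds for _, seconds in transcript)
    spoken = sum(seconds for who, seconds in transcript if who == speaker)
    return spoken / total

call = [("rep", 300), ("customer", 120), ("rep", 180), ("customer", 150)]
ratio = talk_ratio(call)
print(round(ratio, 2))  # 0.64
if ratio > 0.55:        # assumed coaching threshold
    print("Suggestion: talk less, listen more.")
```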



Gong works in dozens of languages. For years this meant that every time the company wanted to update any of its AI models to make them better at transcribing or analyzing speech, it had to do it separately for each language, and sometimes even dialect. It was an enormous task, says Gong CEO Amit Bendov.

Then, in 2019, Meta AI, a research division of Facebook parent Meta Platforms, released a system called Wav2vec that uses a novel algorithm to quickly teach itself any language. Using this open-source code allowed Gong’s engineers to build a single system able to parse all of the languages and dialects Gong supports, says Mr. Bendov. Gong now uses a single, constantly updated polyglot AI model to understand everything its system processes.

Even with this leg up from the researchers at Meta, Gong still uses a custom-built speech-recognition system trained on thousands of hours of recorded audio and human-written transcripts. (This includes recordings of customer phone calls, “Seinfeld” episodes and fan-transcribed scripts for them.)

Gong’s use of AI for relatively narrow tasks, like speech recognition, and the way its engineers built custom systems to accomplish it, embody the same principles of workaday AI as Phuc Labs’ waste-filtering tech and PreciTaste’s restaurant-management systems.

Someday, the big, fancy models that garner attention might apply to the work of this company and others—but not yet. Getting there may take big leaps, such as giving AI common sense, including knowledge about the real world, so that it can derive meaning from all the data it ingests.

“The funny thing is, Gong doesn’t know what an iPad is or anything about our customers’ business,” says Mr. Bendov. “It just knows ‘this is what is said when you are successful.’”


Write to Christopher Mims.

Copyright ©2022 Dow Jones & Company, Inc. All Rights Reserved.
