SynTouch Is Giving Robots the Ability to Feel Textures Like Humans Do
There’s just nothing like holding a new product in your hands. You can look at a thousand photos, watch a million videos and still not get a sense of the texture and feel of, say, a pair of raw denim jeans. But for the companies making such products, ensuring a consistency of feel can be a hassle. That’s where robots come in.
If you’re a textile vendor, you can ask a manufacturer to dye a set of sheets a particular color, using a standard such as the Pantone Color Matching System to specify exactly the color you want. But if you try describing how you want those sheets to feel, well, that’s another matter entirely. The process of describing and evaluating textures is often subjective. A vendor will send samples to a customer, who might pass them around to several different people to feel and, eventually, try to come to a consensus about the best ones, or the ones that come closest to the desired texture.
Organizations like the International Organization for Standardization offer some industrial standards for textures, but Matt Borzage, co-founder of the robotics company SynTouch, says these standards often fall short. “We know this because most companies revert to shipping physical samples to customers or flying their in-house experts from factory to factory instead of communicating using their standard measurements,” he says.
SynTouch has another solution: a haptic sensor that provides robots with a sense of touch. The company used this sensor to develop the SynTouch Standard, a taxonomy of more than 500 materials ranging from synthetic fabrics to natural materials like stone. The standard is based on 15 factors, including coarseness, smoothness, friction and thermal properties. The idea is to create a standardized process to measure and classify the texture of any flat surface, taking the subjectivity out of the question of whether two objects feel the same.
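The idea of measuring texture along 15 factors amounts to treating each surface as a point in a feature space, where “feels the same” becomes a distance check rather than a judgment call. A minimal sketch of that idea, with made-up dimension values and tolerance (SynTouch’s actual factors, scales, and thresholds are not described in the article):

```python
import math

DIMENSIONS = 15  # coarseness, smoothness, friction, thermal properties, etc.

def texture_distance(a, b):
    """Euclidean distance between two texture profiles in feature space."""
    if len(a) != DIMENSIONS or len(b) != DIMENSIONS:
        raise ValueError("expected %d-dimensional profiles" % DIMENSIONS)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def feels_the_same(a, b, tolerance=0.5):
    """Objective match: two surfaces 'feel the same' if their measured
    profiles fall within a tolerance of each other."""
    return texture_distance(a, b) <= tolerance

# A sample that differs slightly from the reference on one factor:
reference = [0.2] * DIMENSIONS
sample = [0.2] * (DIMENSIONS - 1) + [0.4]
print(feels_the_same(sample, reference))
```

The point is not the particular metric but that a number replaces a roomful of people passing sheets around.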
SynTouch is a spin-off of the Medical Device Development Facility of the University of Southern California, where the team initially focused on prosthetics. And one of its core insights is this: When you touch something, you are doing more than sensing the surface of that object. You’re also changing it, however subtly. Your finger emits heat, and no matter how gentle you are, you exert an almost imperceptible amount of pressure. In other words, you aren’t just feeling the material, you’re feeling its reaction to your touch. SynTouch’s BioTac sensor tries to emulate this by radiating heat and exerting pressure so the surface it measures changes in much the same way it would if a person were touching it.
The company is still working in the prosthetics industry and is focusing on giving artificial hands “reflexes” by making them respond to different haptic sensations. But SynTouch is exploring other areas. Borzage says the company’s customers for the SynTouch Standard taxonomy include automakers, consumer electronics firms, and apparel companies. Some want to standardize a product, while others want to figure out if a synthetic material—an artificial leather, for example—feels like the real thing. It’s one more example of robots doing a job that once only humans could do. But now it’s also robots feeling what once only humans could feel.
[Caption: With four sensing elements, a single chip can detect up to four compounds at the same time, e.g. ethylene for fruit freshness, biogenic amines for meat, fish, and poultry freshness, and perhaps humidity and carbon dioxide.]
Cameras gave computers eyes. Microphones gave them ears. Touchscreens gave them tactile perception. Now the Massachusetts-based company C2Sense has invented a tiny chip that gives computers a sense of smell.
The first goal of the company, says co-founder and CTO Jan Schnorr, is to use machines to sniff out spoiling food. And that could have a bigger impact than you might think.
Food spoilage can be contagious. You know the saying “one bad apple can spoil the whole batch”? It’s true. As fruit ripens, it releases a musky gas called ethylene. When fruits are exposed to ethylene, they ripen more quickly and give off more ethylene themselves, creating a domino effect that speeds up the ripening process for every piece of fruit nearby.
C2Sense’s technology can detect ethylene even in trace amounts that a human wouldn’t be able to smell, enabling food sellers to spot ripening food before it spreads. A wholesaler might use these sensors to monitor crates of fruit and move those that are starting to ripen before they spread ethylene to every other crate in the warehouse, while a restaurant might use a handheld device to pinpoint individual pieces of fruit before they spoil their neighbors.
The Smell-o-Meter in Every Home
Many of us already have rudimentary smell-o-meters in our homes. We call them smoke detectors and carbon monoxide alarms. Everything we can smell, from buttery popcorn to pine trees, radiates particles that trigger chemical reactions in the cells in our nasal cavity. Depending on the specific reaction, the cells then send signals to our brains. Carbon monoxide alarms and ionization smoke detectors work in much the same way: specific particles cause chemical reactions that change an electrical current in the device, which in turn triggers an alarm.
Sensors that are able to detect ethylene have been around for years, Schnorr says, but they’ve generally either been too expensive or unable to accurately detect ethylene outside the lab, where they’re exposed to numerous other, similar gases. What C2Sense has done, he explains, is create an affordable sensor that’s also sensitive enough to detect low levels of the gas without setting off false positives.
The secret is a brand-new material that Schnorr and his research team invented while he was a chemistry PhD student at the Massachusetts Institute of Technology. The new material, which is cheap to synthesize, chemically reacts to ethylene. Schnorr’s team uses this material as a resistor in a tiny electrical circuit. As the number of ethylene molecules increases, the material’s conductivity changes and the electrical current changes accordingly. They can then measure the current to gauge the level of ethylene in the sensor’s vicinity. The team has since modified the material to detect other gases, such as the amines released by spoiling meat, or ammonia. Their current prototype is able to detect up to four different types of gas on a single chip.
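The readout principle described above is that of a chemiresistor: gas molecules react with the sensing material, its resistance shifts, and the relative shift is mapped back to a concentration. A hedged sketch of that conversion, where the baseline resistance, sensitivity, and assumption of a linear response are all illustrative rather than C2Sense specifications:

```python
# Sketch of a chemiresistive gas-sensor readout. All numbers here are
# assumptions for illustration, not C2Sense's actual sensor parameters.

def ppm_from_resistance(r_measured, r_baseline, sensitivity=0.002):
    """Estimate gas concentration (ppm) from a resistance change.

    Assumes a linear response: each ppm of gas changes resistance by
    `sensitivity` (fractional change per ppm) relative to the baseline
    resistance measured in clean air.
    """
    fractional_change = (r_measured - r_baseline) / r_baseline
    return fractional_change / sensitivity

# Example: a sensor with a 10 kOhm clean-air baseline reads 10.4 kOhm,
# suggesting roughly 20 ppm of the target gas under these assumptions.
print(ppm_from_resistance(10_400.0, 10_000.0))
```

A real sensor would calibrate against known concentrations and correct for temperature and humidity; the sketch shows only the core resistance-to-concentration step.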
Last year Schnorr and his PhD supervisor Timothy Swager decided to spin the company out of MIT. “At first I thought I’d follow the typical path of working for a big company,” he says. “But then towards the end of my PhD we came across this idea that seemed worth commercializing.” Last month the company received a $350,000 grant from Breakout Labs, a philanthropic fund started by PayPal co-founder and early Facebook investor Peter Thiel that helps scientists turn their research into companies.
Schnorr says the company’s goal is to make wireless sensor chips so cheap that they could be built into a product’s packaging, or incorporated into produce bags at the grocery store, without adding any noticeable cost at the register. Customers could then scan these chips with their phones to get a freshness reading. Now we just need a machine that can taste our food for us before we buy it.
About four years ago software developer Gal Shaul boarded a flight from Tel Aviv, Israel, to Delhi, India. Shaul worked for a medical device startup, and he’d been dispatched to troubleshoot an overheating product for one of the company’s clients. But as soon as he arrived on the scene, he knew it wasn’t a software problem at all: he could hear that the machine’s fan was clogged from across the room.
The sounds machines make reveal quite a bit about whether they’re working properly and what’s wrong with them if not. That’s why the first thing a mechanic does when you bring your car into the shop is pop the hood and listen to the engine. Shaul’s 11-hour flight to India could have been avoided if someone had thought to put a phone up to the device and let a support technician listen to it. But to Shaul, the experience revealed a more fundamental problem: the software running on the device didn’t have any idea what was going on with the hardware. The machine had no way of listening to its own sound, and therefore no way to alert its owner or its developers that something was wrong.
So he called a college friend, Saar Yoskovitz, an expert in analog signal processing, the complex mathematics involved in processing non-digital signals such as sound. Together the pair founded Augury, a company dedicated to giving machines a sense of hearing. They like to call it “Shazam for machines,” after the popular app that can listen to and recognize songs.
Augury makes a gadget that customers can attach to equipment such as commercial refrigerators or industrial-scale heaters. The gadget records vibrations and ultrasonic sound and uploads the data to Augury’s cloud service, where it’s analyzed to make predictions about the health of the machine being monitored. Technicians can then use the company’s mobile app to view the status of a machine and any alerts that might indicate something is going wrong with it.
That might sound like a privacy and security nightmare, but Yoskovitz says Augury isn’t recording the full audio of the entire space in which its hardware is installed, just the vibration patterns produced by the monitored machine, along with various inaudible frequencies. A snooper would have a hard time making out anything even if some of the sound waves from a conversation did manage to find their way into the device’s contact microphones. “The noise levels in a mechanical room are so loud that people have a hard time hearing each other, so it will be hard to filter the conversation from the background noise,” he says.
All of this audio data is analyzed and stored so that the sound of one customer’s machine can be compared with the sound of all others. The idea is that Augury won’t need to customize its software for each type of appliance its customers want to monitor. Instead, it will be possible to simply install the sensors, listen to the device to establish a baseline of what it sounds like when it’s functioning normally, and alert owners to abnormalities. Over time the system will also learn which specific sounds precede specific types of failure.
For example, if Augury’s software had never heard the sound of a clogged vacuum hose, it would first alert a machine’s owners or technicians that the machine was making an unusual sound so they could check for a problem. Then, after the software hears a few clogged hoses precede device failures at different customer sites and someone labels the sound as such, it can learn the signature of a clogged hose and send more specific alerts to its customers, including those who have never had a clogged hose problem before. And since a clogged hose makes similar sounds whether it’s part of a commercial refrigerator, an oil pump, or a car, the software can generalize that sound across many different types of equipment.
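The two-stage loop described above, flag anything that deviates from a machine's learned baseline, then match labeled failure signatures against new anomalies, can be sketched in a few lines. The feature (a single vibration amplitude), the three-sigma threshold, and the signature-matching tolerance are all assumptions for illustration, not Augury's actual pipeline:

```python
import statistics

def learn_baseline(normal_readings):
    """Mean and standard deviation of a vibration feature (e.g. RMS
    amplitude) recorded while the machine runs normally."""
    return statistics.mean(normal_readings), statistics.stdev(normal_readings)

def is_anomalous(reading, baseline, n_sigma=3.0):
    """Stage 1: flag any reading far outside the machine's own baseline."""
    mean, stdev = baseline
    return abs(reading - mean) > n_sigma * stdev

def match_failure(reading, labeled_signatures, tolerance=0.1):
    """Stage 2: return the label of the closest known failure signature,
    or None for an unknown anomaly a human should investigate."""
    for label, signature in labeled_signatures.items():
        if abs(reading - signature) <= tolerance:
            return label
    return None

baseline = learn_baseline([1.0, 1.1, 0.9, 1.05, 0.95])
signatures = {"clogged hose": 2.4}  # learned from labeled incidents elsewhere
reading = 2.45
if is_anomalous(reading, baseline):
    print(match_failure(reading, signatures) or "unknown anomaly")
```

Because the signatures are shared across the fleet, a customer who has never seen a clogged hose still benefits from another site's labeled failures, which is the generalization the article describes.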
Yoskovitz thinks this could end up doing a lot more than just saving technicians from making unnecessary plane trips. By giving manufacturers a deeper understanding of the often complex reasons that their products fail, Augury could help companies build better products.