

According to Facebook AI Research, the next generation of robots should be much better at feeling - not emotions, of course, but using the sense of touch. And to advance the ball in this relatively new area of AI and robotics research, the company and its partners have built a new kind of electronic skin and fingertip that are inexpensive, durable and provide a basic, reliable tactile sense to our mechanical friends.

The question of why exactly Facebook is looking into robot skin is obvious enough that AI head Yann LeCun took it on preemptively on a media call showing off the new projects. Funnily enough, he recalled, it started with Zuckerberg noting that the company seemed to have no good reason to be looking into robotics. LeCun seems to have taken this as a challenge and started looking into it, and a clear answer emerged in time: if Facebook was to be in the business of providing intelligent agents - and what self-respecting tech corporation isn't? - then those agents need to understand the world beyond the output of a camera or microphone.

The sense of touch isn't much good at telling whether something is a picture of a cat or a dog, or who in a room is speaking, but if robots or AIs plan to interact with the real world, they need more than sight and sound.

"What we've become good at is understanding pixels and appearances," said FAIR research scientist Roberto Calandra. "But understanding the world goes beyond that. We need to go towards a physical understanding of objects to ground this."

While cameras and microphones are cheap and there are lots of tools for efficiently processing that data, the same can't be said for touch. Sophisticated pressure sensors simply aren't popular consumer products, and so any useful ones tend to stay in labs and industrial settings.

DIGIT was released in 2020 as an open-source design; it uses a tiny camera pointed at the pad to produce a detailed image of the item being touched. The project has roots dating back to 2009 - we wrote about the MIT project called GelSight in 2014, then again in 2020 - and the company has since spun out and is now the manufacturing partner for this well-documented approach to touch. The fingertip is quite sensitive, as you can see from the detailed maps it's able to create when touching various items:

[Image: objects shown above the signal maps produced by the robotic fingertip; the fingertips themselves are visible in the image at top.]
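To make the camera-based idea concrete, here's a minimal sketch of reading a GelSight-style fingertip as an ordinary webcam and turning gel deformation into a rough contact map. Everything here is an assumption for illustration: the device index is a placeholder, and real DIGIT/GelSight pipelines reconstruct geometry far more carefully (for instance via photometric stereo under colored illumination) rather than the naive frame differencing shown below.

```python
import cv2
import numpy as np

# Assumption: the sensor's camera enumerates as a normal webcam.
# Device index 0 is a placeholder - substitute the index it appears at.
cap = cv2.VideoCapture(0)

# Capture a no-contact reference frame: the gel pad at rest.
ok, baseline = cap.read()
if not ok:
    raise RuntimeError("could not read from the sensor's camera")
baseline = cv2.cvtColor(baseline, cv2.COLOR_BGR2GRAY).astype(np.float32)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)

    # Wherever the image differs from the resting pad, the gel is being
    # deformed; the magnitude is a crude proxy for contact pressure.
    contact = np.abs(gray - baseline)
    contact = cv2.GaussianBlur(contact, (5, 5), 0)

    # Normalize for display so the contact map is visible on screen.
    peak = max(float(contact.max()), 1e-6)
    cv2.imshow("contact map", (contact / peak * 255).astype(np.uint8))
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```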

(Update: I initially mixed up the names and methods of these systems and have recast this section to be clear: DIGIT and GelSight are related and use a camera-based method, while ReSkin uses the magnetic-particle approach described below.)

The ReSkin system uses a different approach. Magnetic particles are suspended in a soft gel surface, and a magnetometer beneath it senses the displacement of those particles, translating those movements into accurate force maps of the pressures causing them.
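As a rough illustration of that magnetometer pipeline, here's a minimal sketch in Python. The five-magnetometer layout, the read_magnetometers() placeholder, and the randomly generated training arrays are all assumptions standing in for a real driver and a real calibration dataset (in practice, readings logged while pressing a probe at known positions and forces). Only the general recipe - subtract a resting baseline, then learn a mapping from field deviations to contact position and force - reflects the approach described above.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Assumed layout: several 3-axis magnetometers under the gel, so one
# sample is a flat vector of field components (Bx, By, Bz per chip).
N_MAGNETOMETERS = 5
N_FEATURES = N_MAGNETOMETERS * 3

def read_magnetometers():
    """Placeholder for the real hardware driver; returns one reading."""
    return np.random.randn(N_FEATURES)

# Calibrate: average readings with nothing touching the skin, so later
# samples can be expressed as deviations caused by gel deformation.
baseline = np.mean([read_magnetometers() for _ in range(100)], axis=0)

# Placeholder training data; real labels would come from pressing a
# probe at known (x, y) positions with known normal forces.
X_train = np.stack([read_magnetometers() - baseline for _ in range(1000)])
y_train = np.random.rand(1000, 3)  # columns: contact x, contact y, force

# A small neural network maps field deviations to (x, y, force).
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500)
model.fit(X_train, y_train)

sample = (read_magnetometers() - baseline).reshape(1, -1)
x, y, force = model.predict(sample)[0]
print(f"contact at ({x:.2f}, {y:.2f}), force ~ {force:.2f}")
```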
