MIT, Autodesk develop AI that can figure out confusing Lego instructions
Stumped by a Lego set? A new machine learning framework can interpret those instructions for you.
Researchers at Stanford University, MIT’s Computer Science and Artificial Intelligence Lab, and the Autodesk AI Lab have collaborated to develop a novel learning-based framework that can interpret 2D instructions to build 3D objects.
The Manual-to-Executable-Plan Network, or MEPNet, was tested on computer-generated Lego sets, real Lego set instructions, and Minecraft-style voxel building plans, and the researchers said it outperformed existing methods across the board.
In a separate project, MIT researchers working on analog deep-learning hardware used a practical inorganic material in the fabrication process that enables their devices to run about 1 million times faster than previous versions, which is also roughly 1 million times faster than the synapses in the human brain.
Analog deep learning is faster and more energy-efficient than its digital counterpart for two main reasons. First, computation is performed in memory, so enormous loads of data are not transferred back and forth between memory and a processor. Second, analog processors conduct operations in parallel: if the matrix size expands, an analog processor doesn't need more time to complete new operations, because all of the computation occurs simultaneously.
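The parallel, in-memory idea above can be sketched with a toy crossbar model (a hypothetical illustration, not the researchers' actual hardware): weights are stored as conductances in a resistive array, input voltages are applied to the columns, and each row's output current is the weighted sum given by Ohm's and Kirchhoff's laws, so a whole matrix-vector multiply happens in one physical read-out.

```python
import numpy as np

def crossbar_matvec(conductances, voltages):
    """Model an ideal resistive crossbar: output currents I = G @ V.

    Each row current is I[i] = sum_j G[i][j] * V[j]. In hardware this
    sum forms physically and simultaneously for every row, so the
    "latency" does not grow with the matrix size.
    """
    return conductances @ voltages

rng = np.random.default_rng(0)
G = rng.uniform(0.0, 1.0, size=(4, 3))   # conductances encode the weight matrix
V = rng.uniform(-1.0, 1.0, size=3)       # input voltages encode the vector

I = crossbar_matvec(G, V)

# A digital processor computes the same result with rows * cols
# sequential multiply-accumulates; the analog array produces all
# row currents at once, with no data shuttled to a separate memory.
assert np.allclose(I, G @ V)
```

This is only a numerical stand-in for the physics; real devices also contend with noise, nonlinearity, and limited conductance precision.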
Seems less useful than an AI that can take existing Lego instructions and output more understandable ones (or be fed a set of bricks and a prompt, and generate understandable plans for building models that match the prompt using the available parts…)