Many products in the modern world are fabricated in some way using computer numerical control (CNC) machines, which use computers to automate machine operations in manufacturing. While simple in concept, instructing these machines is often complex in practice. A team of researchers, including some from the University of Tokyo, devised a system to mitigate some of this complexity. Draw2Cut lets users draw desired designs directly onto the material to be cut or milled: color-coded lines drawn with standard marker pens instruct the Draw2Cut system to mill designs into wood, without requiring any prior knowledge of CNC machines or their typical workflows.
Various technologies can be said to democratize some skill or ability that was previously accessible only to those with time, money, luck, or all three. Plows, tractors, the printing press, the internet: the list goes on. In recent years, technologies like 3D printing were touted as bringing bespoke, high-quality manufacturing into the home. Though it remains to be seen how realistic that is, it highlights the real desire many people have to exert greater control over the things they own and use. 3D printing is, of course, just one mode of digital fabrication; in many more cases, fabricated items are still made using more established techniques employing molds or CNC machines. Yet despite being well established, using CNC machines is far from trivial.
“Operating CNC milling machines can be difficult because it usually requires users to first create 3D models using computer-aided design (CAD) software,” said Project Assistant Professor Maria Larsson at the University of Tokyo’s User Interface Research Group. “Our latest research explores the idea that, in several situations, it would be nice if the user could just draw directly onto materials they want the CNC machine to mill and cut, without modeling anything in CAD. We were inspired by the way in which carpenters mark wood for cutting, and thought, why can’t we do a similar system for personal fabrication?”
To this end, Larsson and her team created Draw2Cut, essentially a novel vision system coupled with an intuitive workflow and a simple set of steps to follow in order to create CAD plans for CNC machines. Assuming someone has an idea for the item they want to create, they can use a specific set of colors to draw their design directly onto some material. Draw2Cut then images the material and its sketches, and interprets the vision data to create 3D CAD plans to export to a CNC machine. The machine can then be fed the actual piece of material the user drew on, and will cut and mill it accordingly. Though the team has only experimented with wood, different CNC machines can work on different materials, even metal if need be.
“The most challenging part of this project was how to implement this workflow in practice. We found the key ingredient was to develop a drawing language where symbols and colors are assigned various meanings, in order to produce unambiguous machine instructions,” said Larsson. “In this case, purple lines mark the general shape of a path to mill, red and green marks and lines then provide instructions to cut straight down into the material or instead produce gradients, respectively. Though, as with any project interfacing the real and virtual worlds, we also faced the challenge of getting our camera set up right and calibrating things before achieving an acceptable precision of cutting, within roughly 1 millimeter.”
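The color language described above can be pictured as a simple lookup from pen color to machine instruction. The following Python sketch is purely illustrative, based only on the colors mentioned in the quote; the names, structure, and error handling are assumptions, not the actual Draw2Cut implementation.

```python
# Illustrative sketch of a color-coded drawing language like Draw2Cut's.
# The mapping follows the description above; everything else is hypothetical.

# Pen color -> milling instruction, per the quoted description.
COLOR_MEANING = {
    "purple": "path",      # marks the general shape of a path to mill
    "red": "plunge",       # cut straight down into the material
    "green": "gradient",   # produce a gradient (sloped) cut
}

def interpret_stroke(color: str) -> str:
    """Translate a detected stroke color into a milling operation."""
    try:
        return COLOR_MEANING[color]
    except KeyError:
        raise ValueError(f"Unrecognized pen color: {color}")

print(interpret_stroke("purple"))  # path
```

A fixed, unambiguous mapping like this is what lets hand-drawn marks be turned into deterministic machine instructions, which is the key idea Larsson describes.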
While Draw2Cut is not quite able to create items to the degree of quality a seasoned professional could produce, its main aim is not to replace people but to open up this mode of manufacture to more people, one of the broader themes of the User Interface Research Group.
“We involved a range of participants in designing and refining Draw2Cut. In particular, we found that Draw2Cut lowers the entry barrier for novice users, even children,” said Larsson. “Expert users, too, might benefit from being able to more swiftly express their design intent. And we aim to expand the possibilities with a broader range of stroke patterns and symbols in the future. It’s also possible to customize the color language for different needs: our source code is open, so developers can adapt it accordingly.”
###
Journal article:
Xinyue Gui, Ding Xia, Wang Gao, Mustafa Doga Dogan, Maria Larsson, Takeo Igarashi, “Draw2Cut: Direct On-Material Annotations for CNC Milling”, Association for Computing Machinery (ACM) CHI Conference on Human Factors in Computing Systems, https://doi.org/10.1145/3706598.3714281
Funding: This work was supported by JST ACT-X Grant Number JPMJAX210P, JSPS KAKENHI Grant Number JP23K19994, JST AdCORP Grant Number JPMJKB2302 and a collaborative research fund between Mercari Inc. R4D and RIISE, The University of Tokyo.
Useful links:
User Interface Research Group
https://www-ui.is.s.u-tokyo.ac.jp/en/
Department of Creative Informatics
https://www.i.u-tokyo.ac.jp/edu/course/ci/index_e.shtml
Graduate School of Information Science and Technology
https://www.i.u-tokyo.ac.jp/index_e.shtml
Research contact:
Professor Takeo Igarashi
Graduate School of Information Science and Technology, The University of Tokyo
7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, JAPAN
takeo@acm.org
Press contact:
Mr. Rohan Mehra
Public Relations Group, The University of Tokyo
7-3-1 Hongo, Bunkyo-ku, Tokyo, 113-8656, Japan
press-releases.adm@gs.mail.u-tokyo.ac.jp
About The University of Tokyo:
The University of Tokyo is Japan's leading university and one of the world's top research universities. The vast research output of some 6,000 researchers is published in the world's top journals across the arts and sciences. Our vibrant student body of around 15,000 undergraduate and 15,000 graduate students includes over 4,000 international students. Find out more at www.u-tokyo.ac.jp/en/ or follow us on X (formerly Twitter) at @UTokyo_News_en.
END