【A Robot Arm Helps Your 3D Printing Pen, Letting You Watch Masterpieces Take Shape】

A robot arm can act as a "guiding hand" for a 3D printing pen and help you 3D print. Because the arm has a haptic interface, it is easy for you to take part in the printing yourself; in sculptural work in particular, it can offer the user potential ways of drawing and fine-grained drawing guidance, making your 3D printed pieces much more refined.

Robot Arm Helps You 3D Print By "Guided Hand"


As cool as those handheld 3D printing pens are, you have to have some amount of talent (or at least practice) in order to make anything that’s much more recognizable than a mangled three-dimensional squiggle. A proper 3D printer is basically one of those 3D printing pens stapled to a robot that can move it in three axes and do a much better job making things that look nice and function well, but it doesn’t allow for much artistic participation from you. For some people, that’s the point, but if you’d like to be more directly involved, Yeliz Karadayi’s thesis project, called “Guided Hand,” is a 3D printing pen with a haptic interface that helps keep you from screwing things up too badly.

These haptic interfaces are basically little robot arms (although you can produce the same effect with robot arms of any size). It’s hard to explain how it feels to use one of these things, and the experience doesn’t come through very well on video, but basically, the end of the arm (being a robot) knows exactly where it is in 3D space, which means it can tell whether it is about to intersect a virtual 3D object or not. Most of the time, the arm is in gravity compensation mode, but if you try to move it into a virtual object, it can kick in its motors and stop you. In practice, this results in a very convincing I’m-poking-an-invisible-object kind of feeling, a lot like what you get when you move a pair of strong magnets around each other. However, the robot arm can also duplicate textures, tactile sensations, and even 3D objects that are moving around or (sometimes) trying to bite you.
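To make the mechanics concrete, here is a minimal sketch (in Python, with an assumed sphere-shaped virtual object and made-up stiffness values) of what one tick of that kind of haptic loop might look like: do nothing while the pen tip is free, and push back with a stiff virtual spring the moment it penetrates the object. This is not Guided Hand's actual code, just an illustration of the idea.

```python
import numpy as np

# Hypothetical sketch of the haptic loop described above: the arm reports its
# tip position, and if the tip penetrates a virtual object we push back with a
# stiff spring force; otherwise the arm simply floats in gravity compensation.
# The sphere primitive, stiffness value, and function names are illustrative
# assumptions, not details from the Guided Hand implementation.

STIFFNESS = 800.0  # N/m, a proxy for "kick in its motors and stop you"

def wall_force(tip_pos, sphere_center, sphere_radius):
    """Return the force (N) that keeps the pen tip out of a virtual sphere."""
    offset = tip_pos - sphere_center
    dist = np.linalg.norm(offset)
    penetration = sphere_radius - dist
    if penetration <= 0.0:            # tip is outside the virtual object
        return np.zeros(3)            # arm stays in gravity compensation
    normal = offset / (dist + 1e-9)   # direction pointing out of the surface
    return STIFFNESS * penetration * normal  # spring pushes the tip back out

# One tick of a (hypothetical) 1 kHz haptic servo loop:
tip = np.array([0.02, 0.00, 0.01])             # metres, from the arm encoders
force = wall_force(tip, np.zeros(3), 0.05)     # 5 cm virtual sphere at origin
print(force)                                   # force command sent to the arm
```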

Karadayi’s implementation of this robotic technology uses several different techniques to help guide the user. There’s boundary exclusion, which prevents you from drawing inside an area, as well as containment, which prevents you from drawing outside an area. Attraction helps guide the user by providing some physical feedback, making it easier to follow specific paths. Variations on this include snapping and path following, which bias the pen to help you trace lines and curves.
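As a rough illustration of how guidance modes like these can be expressed as forces on the pen tip, here is a hedged Python sketch: containment as a spring that only activates when the tip leaves a working volume, and attraction/path following as a weak spring toward the nearest point on a drawing path. The box, segment, and gain values are assumptions made for illustration, not details from the thesis.

```python
import numpy as np

# Hypothetical force-field sketches of the guidance modes named above.
# The axis-aligned box, the line-segment path, and the gains are illustrative
# assumptions; Guided Hand's actual force models are not spelled out here.

K_CONTAIN = 600.0   # N/m, pushes the tip back inside the working volume
K_ATTRACT = 150.0   # N/m, gently pulls the tip toward a drawing path

def containment_force(tip, box_min, box_max):
    """Keep the tip inside an axis-aligned box (the 'containment' mode)."""
    clamped = np.clip(tip, box_min, box_max)   # nearest point inside the box
    return K_CONTAIN * (clamped - tip)         # zero while the tip is inside

def attraction_force(tip, a, b):
    """Pull the tip toward the line segment a-b (the 'path following' mode)."""
    ab = b - a
    t = np.clip(np.dot(tip - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    closest = a + t * ab                       # nearest point on the path
    return K_ATTRACT * (closest - tip)         # weak spring: easy to break away

tip = np.array([0.12, 0.03, 0.00])
print(containment_force(tip, np.array([-0.1, -0.1, 0.0]), np.array([0.1, 0.1, 0.2])))
print(attraction_force(tip, np.array([0.0, 0.0, 0.0]), np.array([0.2, 0.0, 0.0])))
```

Keeping the attraction gain deliberately weak is what lets the user overpower the guidance and break away from the template, which matters for the prototyping workflow described below.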

Mixing in simulated tactile sensations, like vibration, friction, and damping, offers another, slightly less forceful way of guiding the user. These sensations also have effects on the output of the 3D printing pen itself, leading to (and this is a technical term) increasing squigglyness. The squigglitude also depends on the settings of the 3D printing pen itself, and whether you’re using a medium that’s air-cured or UV-cured.
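For these gentler cues, a sketch along the same lines (again with made-up gains and frequencies, not Karadayi's actual parameters) might render damping as a force opposing the tip's velocity and vibration as a small superimposed oscillation:

```python
import time
import numpy as np

# Hypothetical sketch of the softer, texture-like cues mentioned above: damping
# resists motion in proportion to tip velocity, and vibration adds a small
# oscillating force on top. The gains and frequency are illustrative
# assumptions rather than Guided Hand's actual parameters.

DAMPING = 5.0       # N*s/m, makes the pen feel like it moves through gel
VIB_AMP = 0.3       # N, amplitude of the vibration cue
VIB_FREQ = 150.0    # Hz

def texture_force(tip_velocity, t):
    damping = -DAMPING * tip_velocity                     # opposes motion
    vibration = np.array([0.0, 0.0,
                          VIB_AMP * np.sin(2 * np.pi * VIB_FREQ * t)])
    return damping + vibration

print(texture_force(np.array([0.05, 0.0, 0.0]), time.monotonic()))
```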

The really cool thing about Guided Hand is that it can be tuned to be as invasive or non-invasive as you want it to be, as Karadayi explains, allowing you to start with a template, but make changes to it as you go:

As shown in the sculpture application, Guided Hand provides the user with plenty of potential to learn new crafty ways to print, but it is not absolutely necessary to print within the confines of a model or to print all of the model shown in the digital space, or to even have a model at all. By allowing some tolerance and freedom within the constraints, and because the user is stronger than the bounding forces and can break away from them, the user is thus able to make changes to a model in real time as it is being printed. Essentially what this means is that the model in the digital space, rather than being the target output print, could be thought of instead as more of a template. With this in mind the opportunity for prototyping arises, where with each print the user can make slight or subtle changes.

Robotic haptic interfaces like these tend to be most effective when coupled with a virtual environment that the user can experience, whether it’s in an augmented reality context (like a visual overlay), or in virtual reality:

An exciting concept in a visual overlay is that the overlay can provide more information than just the object in the haptic space, such as vector maps, or a heat map of what is left to be printed versus what is not, or advice on where in the volume to extrude ink faster and where to extrude slower. Imagine a gradient, such that as the color shifts from white to black, a number of variables can change. For example, the vibration of the haptic device could intensify, or perhaps the resistance or tolerance level intensifies, or perhaps the extrusion speed increases. Perhaps all of these things happen at once, all while the print is still being actuated and completely controlled by the user.
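Read as an engineering problem, that overlay boils down to one scalar field that several output channels sample at once. A toy Python mapping, with value ranges invented purely for illustration, might look like this:

```python
# Hypothetical sketch of the gradient idea in the quote above: one scalar value
# (0.0 = white, 1.0 = black) sampled from the overlay drives several output
# channels at once. The ranges and channel names are made up for illustration;
# they are not specified in the article.

def overlay_channels(g):
    """Map a 0-1 overlay gradient value to haptic and extrusion settings."""
    g = min(max(g, 0.0), 1.0)
    return {
        "vibration_amplitude_N":  0.1 + 0.5 * g,     # vibration intensifies
        "resistance_N_per_m":     200.0 + 600.0 * g, # guidance stiffens
        "extrusion_speed_mm_s":   2.0 + 6.0 * g,     # pen extrudes faster
    }

print(overlay_channels(0.25))
```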

Sounds very cool, at least for you artistic types, but personally, I’m going to leave the 3D printing completely up to the robots.

Original article: http://spectrum.ieee.org/automaton/robotics/diy/robot-arm-helps-you-3d-print-by-guided-hand




