Forum Replies Created

Viewing 10 replies - 1 through 10 (of 10 total)
08/05/2019 at 12:14

And one more gif of behind the scenes

08/05/2019 at 12:12

Just learned that my email notifications for this thread weren’t making it to my inbox! Oh no! Thanks to @alromh87 for staying on top of things.

Time for my update! I came back to the Netherlands for April and May to give the robot a vision system. Aaaaand, it’s currently working!

Two in-development videos here:

I have to leave again in two weeks and am working on a more in-depth video that explains everything. Looking forward to sharing it!

05/01/2019 at 00:35

Theoretically possible if you can get a good view of every angle, but unfortunately there are a TON of plastic objects without this symbol.

I do think this would be a strong solution for a conveyor line narrowed down to one item at a time. Non-container plastic waste has a huge variety of shapes and sizes, which I think would be the biggest challenge.

I think neural networks would be perfect at picking out things like clear bottles from a waste stream. I recently spoke about an object detection system with someone with decades of experience in the American recycling industry. Regarding a neural network solution, he had concerns about how rapidly packaging changes and how difficult it might be to keep a model up to date. Regional differences would also be huge; in the Netherlands I had trouble guessing what different plastic waste items came from. Everything had different packaging compared to America!

Anyway, since we’re aiming for a modular system, any identification system could work with a pneumatic system too. The object detection will just pass the location (or predicted future location) of a target to the picking system. That could be an air blast system, a linear actuator, or a delivery drone 😛
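To make that modular hand-off concrete, here's a minimal pure-Python sketch (no ROS; `Target`, `BELT_SPEED`, and `AirBlastPicker` are all hypothetical names invented for illustration): the detector publishes a timestamped position, and whichever picking backend is plugged in extrapolates along the belt before acting.

```python
from dataclasses import dataclass

BELT_SPEED = 0.15  # m/s along +x; assumed constant conveyor speed

@dataclass
class Target:
    x: float  # metres, in the conveyor frame
    y: float
    z: float
    t: float  # timestamp of the detection, seconds

def predict(target, pick_time):
    """Extrapolate where the item will be when the picker acts."""
    dt = pick_time - target.t
    return (target.x + BELT_SPEED * dt, target.y, target.z)

class AirBlastPicker:
    """One interchangeable picking backend; a vacuum gripper or even a
    drone could expose the same pick(xyz) interface."""
    def pick(self, xyz):
        return f"blast at x={xyz[0]:.2f}"

picker = AirBlastPicker()
tgt = Target(x=0.50, y=0.10, z=0.02, t=0.0)
print(picker.pick(predict(tgt, pick_time=2.0)))  # item has drifted 0.30 m downstream
```

Any backend that accepts an (x, y, z) target can be swapped in without touching the detection side, which is the point of keeping the system modular.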

There’s a ROS library for using JavaScript in a web browser, but I’m more experienced with Python in general. I’ve done some basic things in OpenCV in the past and am planning on diving deeper very soon!

Cartesian 3D printers are definitely cheaper. In this use case they’d struggle some with speed and strength. Delta style on the other hand…

GitHub is on my todo list! Current code is just a few Python scripts paired with some ROS-Industrial files with slight modifications.

At the moment, the cheaper near infrared systems effectively require contact with the material. Matoha’s solution passes the measured spectrum through their model, and it works well if the object is placed properly on the sensor. That’s why I’m interested in integrating a sensor into a gripper.

Thanks so much for that grasping link! Will be taking a deeper look. The vacuum gripper we designed does a decent job of conforming to contours, but in the long run the ability to target surfaces at a near perpendicular angle is important.

Maybe @matoha is open to offering some more insight into their sensor?

31/12/2018 at 18:57

Thanks all for the responses! I’ve been taking a few days off now that I’m back home, but will offer some thoughts on everyone’s suggestions in the new year. Keep ’em coming!

29/12/2018 at 00:46

Hello again! Lots of news to update!

Key accomplishments in the past month:
-Wired our vacuum generator and built macros for picking up and ejecting plastic
-Programmed more fun robot dances
-Tested Matoha’s Plastell device
-Designed a proof-of-concept pickup game
-Installed and configured ROS on FANUC’s controller
-Used RViz’s MoveIt planner to send movement plans to the arm
-Built a basic Python script to send the FANUC gripper to a point in Cartesian space
-Returned to the USA because my 90 days in Europe were ending 🙁
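For anyone curious, the core of a Cartesian move boils down to something like this plain-Python sketch (no ROS here; with MoveIt the waypoints would be handed to a Cartesian path planner, and the name `cartesian_waypoints` is made up for illustration):

```python
import math

def cartesian_waypoints(start, goal, step=0.01):
    """Generate evenly spaced waypoints on a straight line from start to
    goal (x, y, z in metres). A Cartesian planner does essentially this
    and then solves inverse kinematics for each waypoint; here we only
    generate the positions."""
    dist = math.dist(start, goal)  # Python 3.8+
    n = max(1, math.ceil(dist / step))
    return [
        tuple(s + (g - s) * i / n for s, g in zip(start, goal))
        for i in range(n + 1)
    ]

# lower the tool 10 cm in steps of at most 3 cm
wps = cartesian_waypoints((0.4, 0.0, 0.3), (0.4, 0.0, 0.2), step=0.03)
```

Keeping the step small is what makes the tool trace a straight line instead of whatever arc the joint-space interpolation would produce.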

Moving forward we’ll be building a vision system, most likely using a Kinect, although a 2D system is also possible. We’ve already started interviewing candidates to join the team and take over the project; go to if you’re interested!

Keep an eye out for some more video in the next monthly news…

29/11/2018 at 19:59

Moving forward with some gripper testing!


Not the worst vacuum for a test, but today I did confirm our more automatic vacuum generator works. Will be wiring it up and integrating it into the controller in the coming days!

27/11/2018 at 19:12

And here’s a beautiful picture of One Arm!

27/11/2018 at 17:43

Hello again! Sorry for the long delay here, but we had to wait on our grant funding to arrive before moving forward with such a notable purchase.

Big news though! We have a robot! To be more precise, we have a FANUC M-10iA paired with their R-30iA controller. Manufactured in 2011, this bad boy has sleek curves, a sizable reach, and a 10 kg payload capacity.

Now what?

Well, a project this large requires baby steps. Mounting and getting familiar with some basic programming came first. @rockethammer happened to drop by our workspace on just the right day to start making programs using the teach pendant, and we taught the robot how to give high fives!

There are a lot of components to this whole project, so let’s break it down:

Keeping things open source is the general mantra of Precious Plastic, which means we’re going with ROS to link everything together. In practice, the robot arm is just one modular part of the entire system, so other arms would be able to step in as long as they run ROS.

Since ROS is a pretty big aspect, getting it installed on the controller is rather important. A couple of days ago I found out there’s a small required software feature called User Socket Messaging that appeared to be standard but apparently doesn’t come preinstalled on every controller. I’m working as quickly as possible to get this added to our controller. Once ROS is running, we’ll be able to send position data from a computer to the robot.
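Once socket messaging works, “sending position data” just means writing binary frames over TCP. Here's a toy sketch of the idea; to be clear, this is NOT the real ROS-Industrial simple_message layout, and the frame fields are invented purely for illustration:

```python
import struct

def pack_joint_target(seq, joints):
    """Pack a sequence number and six joint angles (radians) into a
    length-prefixed little-endian frame. Illustrative only: the actual
    ROS-Industrial simple_message protocol defines its own header and
    payload layout."""
    payload = struct.pack("<i6f", seq, *joints)
    return struct.pack("<i", len(payload)) + payload

frame = pack_joint_target(1, [0.0, 0.5, -0.5, 0.0, 1.0, 0.0])
# the frame would then go out via socket.sendall() to the controller side
```

The ROS-Industrial driver handles all of this framing for you; the sketch is only to show why the controller needs the socket messaging option enabled at all.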

Near infrared sensing
Identifying plastic type is fairly central to this project, and we’re still thinking near infrared (NIR) sensors are best. Unfortunately the ones used in industry are expensive. Building one is out of my abilities currently, but we’re in touch with several teams across the world working on cheaper NIR options that we could potentially integrate. These less industrial solutions generally require near contact with the plastic to get a reading, and if things get light enough there’s potential to design a vacuum gripper with the sensor attached.
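As a toy illustration of why a measured spectrum is enough to separate polymers, here's a nearest-reference classifier in plain Python. The reflectance values below are invented, and real devices like Matoha's sample far more wavelength bands and run a trained model rather than a lookup:

```python
import math

# Hypothetical reference spectra: reflectance at a handful of NIR
# wavelengths per polymer type (values are made up for illustration).
REFERENCES = {
    "PET":  [0.82, 0.41, 0.65, 0.30],
    "HDPE": [0.75, 0.68, 0.35, 0.55],
    "PP":   [0.60, 0.72, 0.40, 0.58],
}

def classify(spectrum):
    """Return the polymer whose reference spectrum is closest
    (Euclidean distance) to the measured one."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(REFERENCES, key=lambda k: dist(REFERENCES[k], spectrum))

print(classify([0.80, 0.43, 0.63, 0.31]))  # prints "PET"
```

The hard part in practice isn't the classification step; it's getting a clean spectrum without pressing the sensor against the object, which is exactly why gripper integration is interesting.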

I’m also exploring using a Kinect to generate point clouds and using those for object detection. Will be using an older project of mine as a base for this.
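A first pass at point-cloud object detection can be as simple as cutting away the table plane and taking the centroid of what's left. Here's a minimal sketch with a hand-made four-point “cloud” (a real Kinect cloud has hundreds of thousands of points, and the plane would be fit, e.g. with RANSAC, rather than assumed flat):

```python
def segment_above_plane(points, plane_z=0.0, min_height=0.01):
    """Keep points more than min_height metres above an assumed flat,
    level table at height plane_z."""
    return [p for p in points if p[2] - plane_z > min_height]

def centroid(points):
    """Average position of a cluster of (x, y, z) points."""
    n = len(points)
    return tuple(sum(c) / n for c in zip(*points))

cloud = [(0.10, 0.10, 0.001),  # table noise
         (0.10, 0.20, 0.000),  # table
         (0.30, 0.30, 0.050),  # object
         (0.31, 0.30, 0.060)]  # object
obj = segment_above_plane(cloud)
target = centroid(obj)  # roughly (0.305, 0.30, 0.055): a candidate pick point
```

With multiple objects on the table you'd cluster the surviving points first and take one centroid per cluster, but the plane cut alone already turns a depth image into pick targets for stationary waste.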

There are also 2D solutions that can work just as well, but I’ll only be diving into this side of things once we have ROS control.

While getting ROS control is highest on my priority list at the moment, I’ve had time to start work on gripper design. Our robot vendor helped out by giving us some spare parts to assemble a vacuum gripper, and we’re currently milling an aluminum attachment mount. Next up is wiring up our vacuum generator. Pictures to come.

The end goal of course is grabbing objects as they move by on a conveyor belt, but this is much further down the line. We’ll start with stationary waste.

That’s all for now! This is a huge project for one engineer, and I’ve been fortunate to get support here and there from other members of our V4 Army. If anyone else has skills to contribute from afar, feel free to post here or message me directly.

29/10/2018 at 18:11


Thanks for the ping, especially because it looks like my long response to you was swallowed into the void. Had some longer thoughts on why we would go in each direction. Also I’d love to hear more about what your contact worked on.

Short update: we selected a refurbished FANUC 6DOF a week ago, and are working on finalizing purchase! Yes it will be slower than a delta, but there are a few limiting aspects of using a delta we wanted to avoid.

Longer update later this week; last week everyone’s work slowed to a crawl due to Dutch Design Week.


06/10/2018 at 19:11


Hi Arthur! I’d love to talk. Can you send an email to my address in the first post?
