

Sorting plastic with robotics (V4)


This topic contains 37 replies, has 17 voices, and was last updated by  Nicolas 6 months ago.

Bradford hbird

Sorting plastic with robotics (V4)

04/10/2018 at 18:05

Hello to the community! My name is Bradford and I've arrived in Eindhoven for robotics research. The loose plan at the moment is to provide a framework for robotic sorting of plastics by material. It's a huge goal, and we're still laying the groundwork to take it step by step. I'll post updates in this thread as we go!

But first, some background:

The most common sorting method seen in industry uses compressed air to blast target objects out of a fast moving stream of waste. Check out an example here: https://youtu.be/zIPGUv35A5E?t=100

It's incredibly impressive to watch and well suited to large-scale plants. There are even versions that can do it with plastic that has already been shredded into 2 mm flakes! Machines commonly use near infrared sensors to identify plastics, although some lines use standard cameras and sort by color. These systems are also typically binary by design: each machine can only split the incoming stream into two groups, as configured by the machine operator. The cost and space required for these industrial sorters is significant, meaning they are rarely found outside of large facilities.

On the opposite end of the spectrum, manual labor is typical in countries with lower wages. People all over the world sort through the trash of others and collect bottles and other recyclable materials. https://youtu.be/dxsJVaL5vLc?t=483

There are also sorting options that exploit the differences in plastic densities. There's some discussion on that here: https://davehakkens.nl/community/forums/topic/plastic-sorting-a-personal-project/

With robotic arms getting more common and processors becoming cheaper, we feel there’s potential for a robotic solution in between industrial pneumatic air blasting and manual sorting.

We want to advance to physical testing as soon as possible as I only have 3 months here in the Netherlands (Visas are complicated :P). That means the first step is choosing hardware! I’ve spent the past few days researching a huge variety of arms. Here are some quick thoughts:

SCARA vs Delta vs 6DOF:

These are the three most common robot types used for pick and place operations. 6DOF arms are the most common and are highly adaptable for different tasks. SCARAs are a little more specialized for picking up and placing, but tend to have more limited reach. Delta robots are typically known for their speed and resemblance to spiders. A more detailed breakdown is here: https://www.crossco.com/blog/6-axis-vs-scara-vs-delta-vs-collaborative-robot-cage-match

A few options that look ideal for this research:

Franka Emika Panda: Feature packed, safe to work around, and appears easy to interface with. ROS integration is huge as I have experience there.

Dobot M1: Reasonably priced and seems easy to interface with, but the company has had some troubles shipping working products and I have yet to find any evidence of someone using it in the wild.

Buying an industrial machine secondhand: Still looking into other robots that could fall under this option. The industrial options I've seen so far are likely out of our price range if bought new. There are a few websites I've looked at that specialize in secondhand sales across Europe.

This is a deep project and there are still a ton of challenges moving forward! For instance:
-Choosing a suitable gripper
-Interfacing with a near infrared sensor
-General programming

If you have some knowledge to offer or want to help, feel free to shoot me an email at [email protected]. Always open to suggestions or ideas.

31/12/2018 at 01:44

Hi, don't get me wrong, I love robotic arms, but they seem expensive and overengineered to me. Why not let the plastic parts travel along a conveyor?
First the plastic passes the sensor, then it passes a series of pneumatic actuators: pistons, or jets of compressed air that blow each piece into its bin.
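The conveyor-plus-ejector idea above comes down to a timing problem: once the sensor classifies a piece, the controller waits for it to travel to the matching ejector before firing. A minimal Python sketch of that timing, with made-up belt speed and ejector distances (none of these numbers come from the thread):

```python
# Hedged sketch (not the poster's design): once the sensor classifies a
# piece, the controller waits for it to reach the matching ejector
# before firing. Belt speed and distances below are made-up examples.

def ejector_delay(distance_m: float, belt_speed_mps: float) -> float:
    """Seconds between the sensor seeing a piece and firing an ejector
    located distance_m downstream of the sensor."""
    if belt_speed_mps <= 0:
        raise ValueError("belt must be moving")
    return distance_m / belt_speed_mps

# One ejector per plastic type, at increasing distances from the sensor.
EJECTOR_POSITIONS_M = {"PET": 0.30, "HDPE": 0.55, "PP": 0.80}

def plan_ejection(plastic_type: str, belt_speed_mps: float = 0.5) -> float:
    """Delay (s) before firing the ejector for the classified piece."""
    return ejector_delay(EJECTOR_POSITIONS_M[plastic_type], belt_speed_mps)
```

The real trick, of course, is that the sensor reading and the valve actuation both have latency, which would have to be subtracted from these delays on a physical rig.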

12/04/2019 at 22:27

@s2019 Right now we are aiming at a wider spectral response. Since NIR is already being developed, maybe we could work on both and compare them to see what makes more sense in terms of cost/performance, in order to make the technology available open source worldwide. Thanks for the info, will dive into it 😎

@pporg Brad got us an Xbox 360 Kinect, so we are now using PCL (Point Cloud Library) inside ROS. Maybe later we could look into monocular vision and deep learning.

Now some news:
I was lucky enough to have an optics-expert friend at Twente University who gave me access to some labs, so we now have the sensor board ready for some tests. Stay tuned!

29/12/2018 at 00:46

Hello again! Lots of news to update!

Key accomplishments in the past month:
-Wired our vacuum generator and built macros for picking up and ejecting plastic
-Programmed more fun robot dances
-Tested Matoha's Plastell device
-Designed a proof of concept pickup game
-Installed and configured ROS on FANUC’s controller
-Used Rviz’s MoveIt planner to send movement plans to the arm
-Built a basic Python script to send the FANUC gripper to a point in Cartesian space
-Returned to the USA because my 90 days in Europe were ending 🙁
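One of the bullets above mentions a Python script that sends the gripper to a Cartesian point. As a rough, hypothetical sketch of the kind of helper such a script might contain (all poses and the step size are illustrative; the real script talks to ROS/MoveIt, which is not shown here):

```python
# Hypothetical sketch: straight-line Cartesian waypoints between the
# current gripper position and a target, of the kind a script might
# hand to a planner such as MoveIt. Poses and step size are illustrative.

def interpolate_waypoints(start, goal, step=0.05):
    """Return (x, y, z) points from start to goal, spaced <= step metres."""
    dist = sum((g - s) ** 2 for s, g in zip(start, goal)) ** 0.5
    n = max(1, int(dist / step))  # number of straight-line segments
    return [
        tuple(s + (g - s) * i / n for s, g in zip(start, goal))
        for i in range(n + 1)
    ]

# Example: move 30 cm sideways and 30 cm down from a hypothetical pose.
waypoints = interpolate_waypoints((0.4, 0.0, 0.5), (0.4, 0.3, 0.2))
```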

Moving forward we'll be building a vision system, most likely using a Kinect, although a 2D system is also possible. We've already started interviewing candidates to join the team and take over the project; go to next.preciousplastic.com if you're interested!

Keep an eye out for some more video in the next monthly news…

29/11/2018 at 19:59

Moving forward with some gripper testing!


Not the worst vacuum for a test, but today I did confirm our more automatic vacuum generator works. I'll be wiring it up and integrating it into the controller in the coming days!

30/07/2019 at 23:43

I’m curious if there was any success in building the Raman effect instrument.

10/04/2019 at 22:13

“Two heads are better than one”, or so they say….

So Brad and I have been working hard on pushing robotic plastic sorting forward.

Brad is working on making the robot "look" at things and pick them up. We also received the electronics for our new sensor, so tomorrow we will be visiting an optics laboratory to build it, talk through our proposed sensor design, and maybe even run some tests.


Hopefully I'll be back to you with good news soon, stay tuned!! 😉

PS. We will have to find a new use for all that electronics packaging

03/04/2019 at 22:55

Hey there, I'm Alejandro from Mexico and I have joined Bradford's @v4 team to get OneArm sorting plastic. We are setting up a git repo and planning to build an advanced optical sensor and backend system to identify plastics, so stay tuned!!!

05/03/2019 at 23:53

@hbird, I would be interested to help out when you have the github repo set up. I have a background in ROS development (not so much control, more in drivers) and am looking to get more into Computer Vision.

It might be an interesting direction to use a Delta printer style robot to pick up the object and have the sensor mounted to the gripper to perform the check en route to the sorting bins.

09/01/2019 at 22:50

I am very interested in finding a low-cost sorting solution for recycling plastic. I am working on a concept for a home-based shredding machine that will sort plastic intelligently prior to shredding. The shredded plastic will be washed (with solvent or water) and packed so that it retains its purity during storage. A labeller (built into the machine) can print the type and quantity of plastic for easy reference. I am also exploring the option of giving away this plastic "for free".

I know it sounds preposterous to give away something that takes time and money to create. But think about it for a minute. Won’t you love your recycled plastic to go back into plastic products rather than end up at a landfill? Free delivery of good-quality recycled plastic will be a no-brainer for any moulder and designer to adopt. What do you think?

15/10/2018 at 09:49

Hey Bradford,

Thinking about what you would be doing here (probably picking largish chunks of plastic off of a conveyor belt?) the best robot would probably be a Delta. I worked with a guy recently who if I remember correctly was doing exactly this but on an industrial scale. I’ll see if I can ask him for more details tomorrow but I think it was a Delta robot picking bottles from a line of mixed plastic running under a camera.

Second to the Delta would be the SCARA. I've heard of instances where a SCARA can outperform a Delta in speed, because you can make it more rigid by leaving out certain degrees of freedom and really truck along one plane. This would also open you up to designing your own custom mechanism if you wanted to get crazy, since SCARA kinematics are so simple. That would really open you up to making some inexpensive mechanisms. Picture a custom SCARA with some H-bridges and potentiometer position feedback 😂🤣 You could even make some molds to injection mold the frame. Now I might be on to something.

I think a full 6 DOF robot would be massive overkill and also really slow. In my experience I've never seen one move anywhere close to the speeds of comparatively simpler and certainly cheaper Deltas and SCARAs, which are also usually much stronger for their size. Although my company did just develop a little guy that might work for this: check out the Yaskawa MotoMini. It might be a bit small, although I suppose you could develop some sort of diverting system so the robot would only ever have to place the pieces in one spot at the top of a slide, and then diverting paddles would guide the plastic into the right bin.

That's another idea (this post is getting very stream of consciousness): if you found a way to make all the plastic come down single file, we could sort it with a series of diverters instead of trying to pick it with a robot. That takes a huge expense out of the equation. It might not be as fast, and creating the initial separator might be interesting, but then you wouldn't need a camera to track the position of the plastic on the belt either. Eliminate machine vision and this project starts to look more achievable. Hook up a few laser diodes with different emission frequencies and you've got yourself a plastic sorter.
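The single-file diverter idea is easy to sketch in code: with one gate per bin plus a final pass-through bin, routing a classified piece just means choosing which gate to open. A toy illustration in Python (the gate layout and indexing are hypothetical, not from any real sorter):

```python
# Toy illustration of the diverter idea: pieces travel single file past
# one gate per bin; opening gate i sends the piece into bin i, and a
# piece that passes every closed gate lands in the final catch-all bin.
# Gate count and indexing are hypothetical.

def diverter_states(bin_index: int, n_gates: int) -> list:
    """Which gates to open (True) to route a piece into bin_index.
    bin_index == n_gates means the final pass-through bin."""
    if not 0 <= bin_index <= n_gates:
        raise ValueError("no such bin")
    return [i == bin_index for i in range(n_gates)]
```

Note that n gates give n + 1 bins, since the end of the line acts as a free extra bin.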

If you do choose to stick with a strictly robotic route: I was at a packaging expo today and saw some flexible grippers which looked like they could handle picking really varied sizes and shapes. The company was called Soft Robotics. Could definitely be useful for sorting through a bunch of different-sized plastic chunks. It would be much easier to just be able to say 'close' instead of 'how much do I need to close to grip this bottle but not make it explode'.

08/05/2019 at 12:14

And one more gif of behind the scenes

08/05/2019 at 12:12

Just learned that my email notifications for this thread weren’t making it to my inbox! Oh no! Thanks to @alromh87 for staying on top of things.

Time for my update! I came back to the Netherlands for April and May to give the robot a vision system. Aaaaand, it’s currently working!

Two in-development videos here:

I have to leave again in two weeks and am working on a more in depth video that explains everything. Looking forward to sharing it!

07/05/2019 at 21:39

Thank you for the update. If you have made a schematic layout of your instrument and don't mind posting it, it would be very helpful for following along.

I was curious how the sensitivity of the detectors you chose compares to the typical SLR used in the third link you posted. They were doing a transmission measurement and commented on needing the sensitivity. Do you expect the reflected-signal Raman measurement to also need that high sensitivity?

Great work.

07/05/2019 at 19:47

Some progress on the sensor development: I have finished the design for the optical component placement. Now I'm finishing manufacturing and starting to build a clean room, which is required for optics handling (the final prototype should have a dust-tight case). I will start a new forum thread as soon as I get some spectra.

On the robot side of things, Brad has achieved improvements in object segmentation and path planning, and has a functional demo. He is now preparing the git repo and video documentation.

More coming soon!!

24/04/2019 at 13:59

I am glad you're interested in the topic. We will be exploring the Raman effect at both 532 nm and 785 nm, so in case photoluminescence in the visible turns out to be an issue we have the 785 nm option, and we will also have a platform to study different plastics at different wavelengths.

As for benchmarking and driving the robot, we have tested Matoha's system, and we hope that open-sourcing the project could be of interest to them.

We strive to provide an open-source, low-cost solution capable of classifying plastic, in the Precious Plastic spirit of enabling plastic recycling for everyone, for free.

We are working on the first prototype and will upload more info as we get our first spectra, meanwhile we are looking at using the robot for some other cool stuff to boost plastic recycling.

In case someone wants to learn a lot about the supporting technology while we figure it out, here are some links. Just be warned: there is some advanced optics involved and lots of hacking needed:


14/04/2019 at 21:22

With a monochromatic source in the visible, are you still expecting to do sorting based on plastic type?
I don’t know anything about these units https://nirvascan.alliedscientificpro.com/, but maybe get one as a benchmark for what you are developing and to drive the arm until your sensor is operational.

14/04/2019 at 09:16

We will try a TCD1304 CCD as the sensor and a 532 nm laser as the light source.

13/04/2019 at 23:31

Interesting, which sensor and light source did you choose?

11/04/2019 at 19:10

Have you selected an algorithm for processing the spectral response? I found this interesting description. https://waset.org/publications/11237/identification-and-classification-of-plastic-resins-using-near-infrared-reflectance-spectroscopy
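One of the simplest algorithms in that family of spectral-identification papers is to mean-center and normalize each reflectance spectrum, then assign the reference material with the highest correlation. A hedged Python sketch of that idea (the reference spectra below are synthetic stand-ins, not real plastic measurements, and a deployed system would need a proper spectral library):

```python
import math

# Hedged sketch of one simple approach from the spectral-sorting
# literature: mean-center and normalize each spectrum, then pick the
# reference spectrum with the highest correlation. Nothing here is a
# real plastic spectrum; a real system needs measured references.

def normalize(spectrum):
    """Mean-center a spectrum and scale it to unit length."""
    mean = sum(spectrum) / len(spectrum)
    centered = [x - mean for x in spectrum]
    norm = math.sqrt(sum(x * x for x in centered))
    return [x / norm for x in centered]  # assumes a non-flat spectrum

def classify(spectrum, references):
    """references: dict mapping plastic name -> raw reference spectrum."""
    s = normalize(spectrum)
    def score(name):
        return sum(a * b for a, b in zip(normalize(references[name]), s))
    return max(references, key=score)
```

Normalizing first makes the match insensitive to overall brightness, which matters when the distance between sensor and plastic varies from piece to piece.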

31/12/2018 at 18:46

Like it was mentioned before, I think a Delta robot would be the best bet here. They are very inexpensive to fabricate, they are super fast, and the configuration makes it easy to mount additional hardware on the head. Suction system? No problem. Overhead vision system? Piece of cake to mount. Plastic categorization tool, whatever that might be: just mount it on the head. Makes for a much more modular and upgradeable design.

As far as machine vision goes, detecting objects on a conveyor belt is not difficult at all. This information can easily be converted to G-code to be executed by the Delta robot.
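One concrete piece of that detection-to-motion pipeline is converting a blob's pixel centroid into belt coordinates the robot can use. A rough illustration (in Python rather than JavaScript, and with placeholder calibration values one would have to measure on a real rig):

```python
# Rough illustration of the pixel-to-robot-coordinates step: map a
# detected blob's pixel centroid to belt-plane metres with a simple
# scale-and-offset calibration. The numbers are placeholders one
# would measure on a real rig.

PX_PER_M = 1000.0           # camera pixels per metre on the belt plane
ORIGIN_PX = (320.0, 240.0)  # pixel that maps to the belt origin

def pixel_to_belt(px: float, py: float):
    """Image pixel (px, py) -> belt-plane coordinates (x, y) in metres."""
    return ((px - ORIGIN_PX[0]) / PX_PER_M,
            (py - ORIGIN_PX[1]) / PX_PER_M)
```

A real setup with lens distortion or a tilted camera would use a full homography instead of this scale-and-offset shortcut, but the idea is the same.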

OpenCV can be used with JavaScript to speed up the development of the machine vision part of the equation.

To expand a little bit, you could use JavaScript as the main interface for the whole thing, which would make it very easy to distribute and use later down the road. I currently have a Delta 3D printer I could repurpose and a webcam I can use to prototype a proof of concept.

31/12/2018 at 08:32

I am also exploring a solution, but looking at machine learning and neural networks to sort the plastic, with a conveyor belt instead of a robotic arm.

A cheaper setup would be to use a Raspberry Pi, or buy an AWS DeepLens, and use pneumatic actuators.




31/12/2018 at 00:45

What about the little symbol that gets printed on the base of most plastics to identify the type? Is it worth developing a system with a few cameras at different angles to spot the symbol and use it to identify the type? You'd need a decent-sized data set of classified pictures to train a neural network with, and it means going down the conveyor belt route with a robot sending the identified plastics to the correct destination.

27/11/2018 at 19:12

And here’s a beautiful picture of One Arm!

06/10/2018 at 19:11


Hi Arthur! I’d love to talk. Can you send an email to my address in the first post?

06/10/2018 at 12:36

Hi BradFord !

I'm Arthur and I'm 23 years old. I'm currently working as an engineer for a small company in France that builds prototypes for industry. I work with robots, especially ABB's. My company was one of ABB's first clients, which is why we get good prices on their robots; for example, we can buy secondhand robots for a ridiculously low price. If you want, I can show you some machines we developed with robots.

Focusing on Precious Plastic: can you elaborate on how the robot would sort plastic? Because I think the crucial point is the sensor, which has to distinguish the different plastics.

I'm very enthusiastic about creating a robotic solution for Precious Plastic. I hope I can help you!!

Have a good day !


31/12/2018 at 05:35

I really need a Matoha system for my factory!! 🙁 Please gift me one…!! 🙂

27/11/2018 at 17:43

Hello again! Sorry for the long delay here, but we had to wait on our grant funding to arrive before moving forward with such a notable purchase.

Big news though! We have a robot! To be more precise, we have a FANUC M-10iA paired with their R-30iA controller. Manufactured in 2011, this bad boy has sleek curves, a sizable reach, and a 10 kg payload capacity.

Now what?

Well, a project this large requires baby steps. Mounting and getting familiar with some basic programming came first. @rockethammer happened to drop by our workspace on just the right day to start making programs using the teach pendant, and we taught the robot how to give high fives!

There are a lot of components to this whole project, so let's break it down:

ROS
Keeping things open source is the general mantra of Precious Plastic, which means we're going with ROS to link everything together. Really this just means the robot arm is a modular part of the entire system, so other arms would be able to step in as long as they run ROS.

Since ROS is a pretty big aspect, getting it installed on the controller is rather important. A couple of days ago I found out there's a small required software feature called User Socket Messaging that appeared to be standard but apparently doesn't come preinstalled on every controller. I'm working as quickly as possible to get this added to our controller. After ROS is running, we'll be able to send position data from a computer to the robot.

Near infrared sensing
Identifying plastic type is fairly central to this project, and we still think near infrared (NIR) sensors are best. Unfortunately the ones used in industry are expensive. Building one is out of my abilities currently, but we're in touch with several teams across the world working on cheaper NIR options that we could potentially integrate. These less industrial solutions generally require near contact with the plastic to get a reading, and if things get light enough there's potential to design a vacuum gripper with the sensor attached.

I'm also exploring using a Kinect to generate point clouds for object detection. I'll be using an older project of mine as a base for this.

There are also 2D solutions that can work just as well, but I’ll only be diving into this side of things once we have ROS control.

While getting ROS control is highest on my priority list at the moment, I've had time to start work on the gripper design. Our robot vendor helped out by giving us some spare parts to assemble a vacuum gripper, and we're currently milling an aluminum attachment mount. Next up is wiring the vacuum generator. Pictures to come.

The end goal of course is grabbing objects as they move by on a conveyor belt, but this is much further down the line. We’ll start with stationary waste.

That’s all for now! This is a huge project for one engineer, and I’ve been fortunate to get support here and there from other members of our V4 Army. If anyone else has skills to contribute from afar, feel free to leave it here or message me directly.

11/04/2019 at 19:27

I’d also love to have more details on this progress. I could add my 2 cents here and there about AI, image processing (OpenCV and friends) 🙂

05/01/2019 at 00:35

Theoretically possible if you can get a good view of every angle, but unfortunately there are a TON of plastic objects without this symbol.

I do think this would be a strong solution for a conveyor line narrowed down to one item at a time. Non-container plastic waste has a huge variety of shapes and sizes, which I think would be the biggest challenge.

I think neural networks would be perfect at picking out things like clear bottles from a waste stream. I recently spoke about an object detection type of system with someone with decades of experience in the American recycling system. On a neural network type solution he had concerns with how rapidly packaging changes and how difficult it might be to keep up to date. Regional differences would also be huge; in the Netherlands I had trouble guessing what different plastic waste items came from. Everything had different packaging compared to America!

Anyway, as we’re aiming for a modular system, any identification system could work with a pneumatic system too. The object detection will just pass the location (or future moving location) of a target to the picking system. That could be an air blast system, linear actuator, or delivery drone 😛
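The modular handoff described above can be sketched as a tiny message type: the detector publishes a target's position and the belt speed, and any picker (arm, air blast, or otherwise) predicts where the piece will be when it acts. A hedged Python sketch; the field names are illustrative, not from the project's actual codebase:

```python
from dataclasses import dataclass

# Hedged sketch of the detector-to-picker handoff described above.
# The detector fills in a target's position and belt speed; any picker
# (arm, air blast, ...) predicts where the piece will be when it acts.
# Field names are illustrative, not from the real codebase.

@dataclass
class PickTarget:
    plastic_type: str
    x: float            # belt coordinates at detection time, metres
    y: float
    belt_speed: float   # metres/second along +x

    def position_at(self, dt: float):
        """Predicted (x, y) after dt seconds of belt travel."""
        return (self.x + self.belt_speed * dt, self.y)
```

Keeping the prediction inside the message means every picking backend, fast or slow, can compensate for its own actuation latency the same way.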

There’s a ROS library for using javascript in a web browser, but I’m more experienced with Python in general. I’ve done some basic things in OpenCV in the past, and am planning on diving deeper very soon!

Cartesian 3D printers are definitely cheaper. In this use case they’d struggle some with speed and strength. Delta style on the other hand…

GitHub is on my to-do list! The current code is just a few Python scripts paired with some ROS-Industrial files with slight modifications.

At the moment, the cheaper near infrared systems effectively require contact with the material. Matoha's solution passes the measured spectrum through their model, and it works well if the object is placed properly on the sensor. That's why I'm interested in integrating a sensor into a gripper.

Thanks so much for that grasping link! Will be taking a deeper look. The vacuum gripper we designed does a decent job of conforming to contours, but in the long run the ability to target surfaces at a near perpendicular angle is important.

Maybe @matoha is open to offering some more insight into their sensor?


