

Sorting plastic with robotics (V4)


This topic contains 37 replies, has 17 voices, and was last updated by Nicolas 10 months ago.

Bradford hbird

Sorting plastic with robotics (V4)

04/10/2018 at 18:05

Hello to the community! My name is Bradford and I've arrived in Eindhoven for robotics research. The loose plan at the moment is to provide a framework for robotic sorting of plastics by material. It's a huge goal and we're still laying out the groundwork to take it step by step. I'll provide updates in this thread as we go!

But first, some background:

The most common sorting method seen in industry uses compressed air to blast target objects out of a fast-moving stream of waste. Check out an example here: https://youtu.be/zIPGUv35A5E?t=100

It's incredibly impressive to watch and well suited to large-scale plants. There are even versions that can sort plastic that has already been shredded into 2 mm flakes! Machines commonly use near infrared (NIR) sensors to identify plastics, although some systems use standard cameras that may sort by color. These systems are also typically binary by design: each machine can only sort the incoming stream into two groups, as configured by the machine operator. The cost and space required for these industrial sorters is significant, meaning they are rarely found outside of large facilities.
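To make the binary point concrete: separating N materials with binary machines means chaining stages, one per extracted fraction. A toy sketch (the material names and input stream are invented for illustration):

```python
# Toy model of cascaded binary sorters: each stage ejects one target
# material from the stream and passes everything else downstream.

def binary_sorter(stream, target):
    """One machine: split a stream into (ejected target, remainder)."""
    ejected = [item for item in stream if item == target]
    remainder = [item for item in stream if item != target]
    return ejected, remainder

def cascade(stream, targets):
    """Chain one binary sorter per target material."""
    bins = {}
    for target in targets:
        bins[target], stream = binary_sorter(stream, target)
    bins["reject"] = stream  # whatever no stage claimed
    return bins

mixed = ["PET", "HDPE", "PP", "PET", "PVC", "HDPE"]
bins = cascade(mixed, ["PET", "HDPE", "PP"])  # 3 machines, 4 output bins
```

So pulling three clean fractions out of a mixed stream already means three machines in series, which is part of why these plants get so large.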

On the opposite end of the spectrum, manual labor is common in countries with lower wages. People all over the world sort through the trash of others and collect bottles and other recyclable materials. https://youtu.be/dxsJVaL5vLc?t=483

There are also sorting options that utilize the differences in plastic densities. There's some discussion on that here: https://davehakkens.nl/community/forums/topic/plastic-sorting-a-personal-project/

With robotic arms getting more common and processors becoming cheaper, we feel there’s potential for a robotic solution in between industrial pneumatic air blasting and manual sorting.

We want to advance to physical testing as soon as possible as I only have 3 months here in the Netherlands (Visas are complicated :P). That means the first step is choosing hardware! I’ve spent the past few days researching a huge variety of arms. Here are some quick thoughts:

SCARA vs Delta vs 6DOF:

These are the three most common robot types used for pick and place operations. 6DOF arms are the most common and are highly adaptable for different tasks. SCARAs are a little more specialized for picking up and placing, but tend to have more limited reach. Delta robots are typically known for their speed and resemblance to spiders. A more detailed breakdown is here: https://www.crossco.com/blog/6-axis-vs-scara-vs-delta-vs-collaborative-robot-cage-match

A few options that look ideal for this research:

Franka Emika Panda: Feature packed, safe to work around, and appears easy to interface with. ROS integration is huge as I have experience there.

Dobot M1: Reasonably priced and seems easy to interface with, but the company has had some troubles shipping working products and I have yet to find any evidence of someone using it in the wild.

Buying an industrial machine secondhand: Still diving into the robots that could fall under this option. The industrial options I've looked at so far are likely out of our price range if bought new. There are a few websites I've looked at that specialize in secondhand sales throughout Europe.

This is a deep project and there's still a ton of challenges moving forward! For instance:
-Choosing a suitable gripper
-Interfacing with a near infrared sensor
-General programming

If you have some knowledge to offer or want to help, feel free to shoot me an email at [email protected]. Always open to suggestions or ideas.

06/10/2018 at 12:36

Hi Bradford!

I'm Arthur and I'm 23 years old. I'm currently working as an engineer for a small company in France that builds prototypes for industry. I mostly work on robots, especially ABB's. My company is one of ABB's first clients, which is why we get good prices on their robots. For example, we can buy secondhand robots for a ridiculously low price. If you want, I can show you some machines we've developed with robots.

Focusing on Precious Plastic, can you elaborate on what the robot's goal would be for sorting plastic? I think the crucial point is the sensor, which has to distinguish the different plastics.

I'm very enthusiastic about the idea of a robotic solution for Precious Plastic. I hope I can help you!

Have a good day!


06/10/2018 at 19:11


Hi Arthur! I’d love to talk. Can you send an email to my address in the first post?

15/10/2018 at 09:49

Hey Bradford,

Thinking about what you would be doing here (probably picking largish chunks of plastic off of a conveyor belt?) the best robot would probably be a Delta. I worked with a guy recently who if I remember correctly was doing exactly this but on an industrial scale. I’ll see if I can ask him for more details tomorrow but I think it was a Delta robot picking bottles from a line of mixed plastic running under a camera.

Second to the Delta would be the SCARA. I've heard of instances where a SCARA can outperform a Delta in speed, because you can make it more rigid by leaving out certain degrees of freedom and just really truck along one plane. This would also open you up to potentially designing your own custom mechanism if you wanted to get crazy, since SCARA kinematics are so easy. That could lead to some really inexpensive mechanisms. Picture making a custom SCARA with some H-bridges and potentiometer position feedback😂🤣 Could even make some molds to injection mold the frame. Now I might be on to something.
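To show just how easy SCARA kinematics are: for a two-link planar SCARA, inverse kinematics is basically one application of the law of cosines. A rough sketch (the link lengths are made up):

```python
import math

def scara_ik(x, y, l1=0.25, l2=0.20):
    """Inverse kinematics for a 2-link planar SCARA (one elbow branch).
    Returns (shoulder, elbow) angles in radians for target (x, y) in meters."""
    r2 = x * x + y * y
    # Law of cosines gives the elbow angle directly.
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(c2) > 1:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

def scara_fk(theta1, theta2, l1=0.25, l2=0.20):
    """Forward kinematics, handy for sanity-checking the IK."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y
```

Two lines of trig per axis is the kind of thing you could run on the cheapest microcontroller, which is what makes the DIY version plausible.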

I think a full 6 DOF robot would be massive overkill and also really slow. In my experience I've never seen one move anywhere close to the speeds of comparatively simpler and certainly cheaper Deltas and SCARAs. Plus they're usually much stronger than you'd need. Although my company did just develop a little guy that might work for this. Check out the Yaskawa MotoMINI. It might be a bit small, although I suppose you could develop some sort of diverting system so the robot would only ever have to place the cubes in one place at the top of a slide, and then by having diverting paddles the plastic would find its way into the right bin.

That's another idea (this post is getting very stream of consciousness): if you found a way to make all the plastic come down single file, we could sort it with a series of diverters instead of trying to pick it with a robot. This takes a huge expense out of the equation. It might not be as fast, and creating the initial separator might be interesting, but then you wouldn't need a camera to track the position of the plastic on the belt either. Eliminate machine vision and this project starts to look more achievable. Hook up a few laser diodes with different emission frequencies and you've got yourself a plastic sorter.
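The diverter timing itself would be trivial to control: with a known belt speed and known sensor-to-diverter distances, you just fire each paddle after a fixed delay. A toy sketch (all the numbers are invented):

```python
# Toy scheduler for a single-file belt with one sensor upstream of a
# row of diverter paddles. Distances in meters, belt speed in m/s.
BELT_SPEED = 0.5
DIVERTER_POS = {"PET": 1.0, "HDPE": 1.5, "PP": 2.0}  # distance from sensor

def fire_time(material, t_detect):
    """When (seconds) to fire the paddle for an item detected at t_detect.
    Returns None for materials with no diverter (end-of-belt reject bin)."""
    pos = DIVERTER_POS.get(material)
    if pos is None:
        return None
    return t_detect + pos / BELT_SPEED
```

No vision, no kinematics; the hard part shifts entirely to the separator and the sensor.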

If you do choose to stick with a strict robotic route: I was at a packaging expo today and saw some flexible grippers which looked like they could handle picking really varied sizes and shapes. The company was called Soft Robotics. Could definitely be useful for sorting through a bunch of different sized plastic chunks. It would be much easier to just be able to say 'close' instead of 'how much do I need to close to grip this bottle but not make it explode'.

29/10/2018 at 17:23

Hey Bradford!

Any updates for us?? Any problems you’re stuck on and want to bounce off the crowd?? What’s the current state of your solution?

29/10/2018 at 18:11


Thanks for the ping, especially because it looks like my long response to you was swallowed into the void. Had some longer thoughts on why we would go in each direction. Also I’d love to hear more about what your contact worked on.

Short update: we selected a refurbished FANUC 6DOF arm a week ago, and are working on finalizing the purchase! Yes, it will be slower than a Delta, but there are a few limiting aspects of using a Delta we wanted to avoid.

Longer update later this week; last week everyone’s work slowed to a crawl due to Dutch Design Week.


27/11/2018 at 17:43

Hello again! Sorry for the long delay here, but we had to wait on our grant funding to arrive before moving forward with such a notable purchase.

Big news though! We have a robot! To be more precise, we have a FANUC M-10iA paired with their R-30iA controller. Manufactured in 2011, this bad boy has sleek curves, a sizable reach, and a 10 kg payload capacity.

Now what?

Well, a project this large requires baby steps. Mounting and getting familiar with some basic programming came first. @rockethammer happened to drop by our workspace on just the right day to start making programs using the teach pendant, and we taught the robot how to give high fives!

There are a lot of components to this whole project, so let's break it down:

Keeping things open source is the general mantra of Precious Plastic, which means we're going with ROS to link everything together. Really this means the robot arm is just a modular part of the entire system, so other arms would be able to step in as long as they run ROS.

Since ROS is a pretty big aspect, getting it installed on the controller is rather important. A couple of days ago I found out there's a small required software option called User Socket Messaging that appeared to be standard but apparently doesn't come preinstalled on every controller. We're working as quickly as possible to get it added to ours. Once ROS is running, we'll be able to send position data from a computer to the robot.

Near infrared sensing
Identifying plastic type is fairly central to this project, and we're still thinking near infrared (NIR) sensors are best. Unfortunately the ones used in industry are expensive. Building one is out of my abilities currently, but we're in touch with several teams across the world working on cheaper NIR options that we could potentially integrate. These less industrial solutions generally require near contact with the plastic to get a reading, and if things get light enough there's potential to design a vacuum gripper with the sensor attached.

I'm also exploring using a Kinect to generate point clouds and using those for object detection. I'll be using an older project of mine as a base for this.

There are also 2D solutions that can work just as well, but I’ll only be diving into this side of things once we have ROS control.

While getting ROS control is highest on my priority list at the moment, I've had time to start work on gripper design. Our robot vendor helped out by giving us some spare parts to assemble a vacuum gripper, and we're currently milling an aluminum attachment mount. Next up is wiring up our vacuum generator. Pictures to come.

The end goal of course is grabbing objects as they move by on a conveyor belt, but this is much further down the line. We’ll start with stationary waste.

That’s all for now! This is a huge project for one engineer, and I’ve been fortunate to get support here and there from other members of our V4 Army. If anyone else has skills to contribute from afar, feel free to leave it here or message me directly.

27/11/2018 at 19:12

And here’s a beautiful picture of One Arm!

28/11/2018 at 21:30

I will follow you carefully… Thanks

29/11/2018 at 19:59

Moving forward with some gripper testing!


Not the worst vacuum for a test, but today I did confirm our more automatic vacuum generator works. Will be wiring it up and integrating into the controller in the coming days!

29/12/2018 at 00:46

Hello again! Lots of news to update!

Key accomplishments in the past month:
-Wired our vacuum generator and built macros for picking up and ejecting plastic
-Programmed more fun robot dances
-Tested Matoha's Plastell device
-Designed a proof of concept pickup game
-Installed and configured ROS on FANUC’s controller
-Used RViz's MoveIt planner to send movement plans to the arm
-Built a basic Python script to send the FANUC gripper to a point in Cartesian space
-Returned to the USA because my 90 days in Europe were ending 🙁

Moving forward we'll be building a vision system, most likely using a Kinect, although a 2D system is also possible. We've already started interviewing candidates to join the team and take over the project; go to next.preciousplastic.com if you're interested!

Keep an eye out for some more video in the next monthly news…

31/12/2018 at 00:45

What about the little symbol that gets printed on the base of most plastics to identify the type? Is it worth developing a system with a few cameras at different angles to spot the symbol and use it to identify the type? You'd need a decent-sized data set of classified pictures to train a neural network with, and it means going down the conveyor belt route with a robot to send the identified plastics to the correct destination.

31/12/2018 at 01:44

Hi, don't get me wrong, I love robotic arms, but they seem expensive and overengineered to me. Why not let the plastic parts travel along a conveyor?
First the plastic passes the sensor, then it passes a series of pneumatic actuators: pistons, or jets of compressed air that blow each piece into its bin.

31/12/2018 at 05:35

I really need a Matoha system for my factory!! 🙁 Please gift me one!! 🙂

31/12/2018 at 08:32

I am also exploring a solution, but using machine learning and neural networks to sort the plastic, with a conveyor belt instead of a robotic arm.

A cheaper setup would be a Raspberry Pi, or an AWS DeepLens, plus pneumatics.




31/12/2018 at 18:46

As was mentioned before, I think a Delta robot would be the best bet here. They are very inexpensive to fabricate, they are super fast, and the configuration makes it easy to mount additional hardware on the head. Suction system? No problem. Overhead vision system? Piece of cake to mount. Plastic categorization tool, whatever that might be? Just mount it on the head. Makes for a much more modular and upgradeable design.

As far as machine vision goes, detecting objects on a conveyor belt is not difficult at all. That information can easily be converted to G-code to be executed by the Delta robot.
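To sketch the detection-to-coordinates step (in Python for brevity; all the calibration points are invented): with a fixed overhead camera and mild lens distortion, pixel positions can be mapped to belt coordinates with an affine fit from a few measured reference points.

```python
import numpy as np

# Fit an affine map from camera pixels (u, v) to belt coordinates (x, y)
# using a few hand-measured reference points (these values are made up).
pixels = np.array([[100, 100], [500, 100], [100, 400], [500, 400]], float)
world  = np.array([[0.00, 0.00], [0.40, 0.00], [0.00, 0.30], [0.40, 0.30]])

A = np.hstack([pixels, np.ones((len(pixels), 1))])   # rows are [u, v, 1]
coeffs, *_ = np.linalg.lstsq(A, world, rcond=None)   # 3x2 affine matrix

def pixel_to_belt(u, v):
    """Map a detected pixel to belt (x, y) in meters."""
    return np.array([u, v, 1.0]) @ coeffs
```

The resulting (x, y) is what you'd feed into the motion command for the robot, whatever format it speaks.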

OpenCV can be used with JavaScript to speed up the development of the machine vision part of the equation.

To expand a little bit, you could use JavaScript as the main interface for the whole thing, which would make it very easy to distribute and use later down the road. I currently have a delta 3D printer I could repurpose and a webcam I can use to prototype a proof of concept.

31/12/2018 at 18:57

Thanks all for the responses! I’ve been taking a few days off now that I’m back home, but will offer some thoughts on everyone’s suggestions in the new year. Keep em coming!

31/12/2018 at 19:44

Hi everyone.

Just a short note, an idea. I am wondering if it wouldn't be easier and much cheaper to use a 3D-printer-like mechanism instead of a robotic arm to sort the plastic.

Just a thought. Cheers George

01/01/2019 at 23:58

Happy New Year everybody. Just saw this on the last Monthly Update video and gotta say it's an amazing project. First of all, I want to praise the choice of the ROS+MoveIt combo, because it makes the discussion about which type of robot to use a moot point. This really allows the user to use whatever robot they can get their hands on, not a specific model or geometry. The challenge lies mainly in the sensory/perception side of the system, as well as the grasping. A few ideas:

-Would a near infrared image allow distinguishing between plastic types? The NoIR camera for the Pi might be able to capture part of the spectrum; I wonder if it could help separate at least some of the most common types.

-I recently became aware of the StereoPi project, and wondered about using the setup not as a stereo camera but as a regular Pi camera + NoIR camera combo. It might be interesting to train two separate DNN models to do inference based on the two input images. Such a setup would be light enough to be mounted directly on the end effector, which would save some cycle time compared to the current setup. What's more, it's possible to use the Movidius compute stick to accelerate inference enough to run on Raspberry Pis and the like.

-Because of ROS and MoveIt, this opens the possibility of having the system mounted on a mobile platform. A mobile manipulation system like that would really be an intermediate step between manual collecting of recyclables and the big industrial-scale automated systems.

-For grasping, this might be worth checking out:


Kudos for the initiative. I’ll keep a close eye on this. Is there a github repo or something like that where one could see an opportunity to contribute?


03/01/2019 at 12:54

Hey everyone,

I think I can be of some help on the electronics side. It should not be too much of a problem for me to make a low-cost but still very decent sensor with carefully picked components.

Can anyone provide me with raw data we may already have gathered?
– Light response curves for known plastic types? Can we extract the data from the device designed by Matoha, or can they provide this?
– Cross-comparison results of wavelengths within the NIR range with a reference plastic type (preferably done with all types of plastic ^^)?
– Penetration depth of the light. Responsivity comparison of different thicknesses of identical material.
– Impact of colour?

The more data, the better ^^. No worries if we don’t have all but I think these are some factors we can use to improve and possibly simplify the design down the line.

With some of this I should be able to make a cheap, versatile sensor design that already has some intelligence built in. We can keep the analog part small and the digital part at the highest resolution possible (versus cost and accessibility of the components). Like this we can minimize interference on the signal from the start (nearby electric motors and such). We can also pick whatever interface signal we want for the readout.

If the sensor performs well, we can try cutting down the processing power needed to control the sorting mechanism and keep only the components we actually need.

Another idea: if the sensor works well, we should be able to make a connected sensor for people at home, so they can scan their known waste plastics, press a number from 1-7 depending on the plastic type before trashing it, and help build an ever-growing database to improve the sorting results.

Cheers, Jasper

05/01/2019 at 00:35

Theoretically possible if you can get a good view of every angle, but unfortunately there are a TON of plastic objects without this symbol.

I do think this would be a strong solution for a conveyor line narrowed down to one item at a time. Non-container plastic waste has a huge variety of shapes and sizes, which I think would be the biggest challenge.

I think neural networks would be perfect at picking out things like clear bottles from a waste stream. I recently spoke about an object detection system with someone with decades of experience in the American recycling industry. About a neural network solution, he had concerns with how rapidly packaging changes and how difficult it might be to keep a model up to date. Regional differences would also be huge; in the Netherlands I had trouble guessing what different plastic waste items came from. Everything had different packaging compared to America!

Anyway, as we’re aiming for a modular system, any identification system could work with a pneumatic system too. The object detection will just pass the location (or future moving location) of a target to the picking system. That could be an air blast system, linear actuator, or delivery drone 😛
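The "future moving location" part can start as simple dead reckoning on a constant-speed belt, something like this sketch (the belt speed and coordinates are made up):

```python
def predicted_position(x0, y0, t_detect, t_pick, belt_speed=0.3):
    """Where an item detected at (x0, y0) at time t_detect will be
    at time t_pick, assuming the belt moves at belt_speed m/s along +x."""
    return x0 + belt_speed * (t_pick - t_detect), y0
```

The picking system then only needs a timestamped detection plus the belt speed; whether an arm, an air blast, or a paddle acts on it is interchangeable.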

There's a ROS library for using JavaScript in a web browser, but I'm more experienced with Python in general. I've done some basic things in OpenCV in the past, and am planning on diving deeper very soon!

Cartesian 3D printers are definitely cheaper. In this use case they’d struggle some with speed and strength. Delta style on the other hand…

A GitHub repo is on my todo list! Current code is just a few Python scripts paired with some ROS-Industrial files with slight modifications.

At the moment, the cheaper near infrared systems effectively require contact with the material. Matoha's solution passes the measured spectrum through their model, and it works well if the object is placed properly on the sensor. That's why I'm interested in integrating a sensor into a gripper.

Thanks so much for that grasping link! Will be taking a deeper look. The vacuum gripper we designed does a decent job of conforming to contours, but in the long run the ability to target surfaces at a near perpendicular angle is important.

Maybe @matoha is open to offering some more insight into their sensor?

09/01/2019 at 22:50

I am very interested in finding a low-cost sorting solution for recycling plastic. I am working on a concept for a home-based shredding machine that will sort plastic intelligently prior to shredding. The shredded plastic will be washed (with solvent or water) and packed so that it retains its purity during storage. A labeller (built into the machine) can print the type and quantity of plastic for easy reference. I am also exploring the option of giving away this plastic "for free".

I know it sounds preposterous to give away something that takes time and money to create. But think about it for a minute. Wouldn't you love for your recycled plastic to go back into plastic products rather than end up in a landfill? Free delivery of good-quality recycled plastic would be a no-brainer for any moulder and designer to adopt. What do you think?

05/03/2019 at 23:53

@hbird, I would be interested to help out when you have the github repo set up. I have a background in ROS development (not so much control, more in drivers) and am looking to get more into Computer Vision.

It might be an interesting direction to use a delta-printer-style robot to pick up the object, with the sensor mounted on the gripper to perform the check en route to the sorting bins.

03/04/2019 at 22:55

Hey there, I'm Alejandro from Mexico and I have joined Bradford's V4 team to get OneArm sorting plastic. We are setting up a git repo and planning to build an advanced optical sensor and backend system to identify plastic, so stay tuned!!!

10/04/2019 at 22:13

“Two heads are better than one”, or so they say….

So Brad and I have been working hard on pushing robotic plastic sorting forward.

Brad is working on making the robot "look" at things and pick them up. Also, we received the electronics for our new sensor, so tomorrow we will be visiting an optics laboratory to build it, talk about our proposed sensor, and maybe even run some tests.


Hopefully we will be back to you with good news soon, stay tuned!! 😉

PS. We will have to find a new use for all that electronics packaging

11/04/2019 at 19:10

Have you selected an algorithm for processing the spectral response? I found this interesting description. https://waset.org/publications/11237/identification-and-classification-of-plastic-resins-using-near-infrared-reflectance-spectroscopy
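As a baseline from that kind of literature, even simple nearest-neighbor matching against normalized reference reflectance curves can work as a first pass. A toy sketch (the three-band spectra here are completely made up):

```python
import math

# Made-up reference reflectance values at three NIR wavelength bands.
REFERENCE = {
    "PET":  [0.9, 0.3, 0.5],
    "HDPE": [0.4, 0.8, 0.6],
    "PP":   [0.5, 0.7, 0.2],
}

def normalize(spectrum):
    """Scale a spectrum to unit length, removing overall brightness."""
    norm = math.sqrt(sum(v * v for v in spectrum))
    return [v / norm for v in spectrum]

def classify(spectrum):
    """Return the resin whose reference spectrum is closest (Euclidean)."""
    s = normalize(spectrum)
    def dist(name):
        ref = normalize(REFERENCE[name])
        return sum((a - b) ** 2 for a, b in zip(s, ref))
    return min(REFERENCE, key=dist)
```

Real systems use far more bands and fancier models, but this shows why the reference curves Jasper asked for above are the key ingredient.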

11/04/2019 at 19:27

I’d also love to have more details on this progress. I could add my 2 cents here and there about AI, image processing (OpenCV and friends) 🙂

12/04/2019 at 22:27

@s2019 right now we are aiming at getting a wider spectral response, since NIR is being developed already. Maybe we could work on both and compare, to see what makes more sense in terms of cost/performance, in order to make the technology available open source worldwide. Thanks for the info, will dive into it 😎

@pporg Brad got us an Xbox 360 Kinect, so we are now using PCL (the Point Cloud Library) inside ROS. Maybe later we could look into monocular vision and deep learning.

Now some news:
I was lucky enough to have an optics expert friend at Twente University, so I could use some of their labs. We now have the sensor board ready for some tests. Stay tuned!

13/04/2019 at 23:31

Interesting, which sensor and light source did you choose?

14/04/2019 at 09:16

We will try a TCD1304 CCD as the sensor and a 532 nm laser as the light source.

14/04/2019 at 21:22

With a monochromatic source in the visible, are you still expecting to do sorting based on plastic type?
I don’t know anything about these units https://nirvascan.alliedscientificpro.com/, but maybe get one as a benchmark for what you are developing and to drive the arm until your sensor is operational.


