The Computer Vision Waste Challenge


Only 13% of the valuable materials (plastics, cans, paper, etc.) present in the world's urban waste are recycled or recovered, meaning that millions of tons (and euros) are lost, incinerated or dumped each year.

In many cases, that is because costly sorting technologies or expensive manual sorting make higher recovery rates uneconomical. Automation is a major trend in an activity like waste sorting, which is clearly 3D (Dirty, Dull and Dangerous). In the long run, everyone in the industry expects plants where humans no longer touch waste, leaving the hard and hazardous tasks to machines.

Also, the lack of real-time information about the waste flows being processed means that plants operate almost blind, losing many opportunities for design optimization and day-to-day adjustments.

Generally speaking, the industry strongly needs technological contributions that open up genuinely sustainable ways to recover more material and to meet government waste regulations that become stricter every year.

Technical challenge and its solution: AI

To automate waste sorting, we need machines able to see and handle waste. But waste streams are extremely complex, both in terms of detection and manipulation. There is a virtually infinite range of possible objects, sizes, shapes, colors, brightness levels, etc. Items arrive dirty, broken, crushed, overlapping…

From a software standpoint, we cannot write explicit code to recognize and handle such a variety of items and conditions. That is partly because a large part of our knowledge is tacit: we cannot fully explain to another human, or to a machine, how to distinguish, for example, a squashed fragment of a plastic bottle in a mountain of garbage. Traditional computer vision techniques are not sophisticated enough for this task either.

So we need a different strategy: to develop algorithms that allow computers to learn what they need to know. This way, machines learn how to solve their own problems (from a huge number of examples and using structured feedback) rather than being explicitly programmed by humans for a particular outcome.
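The contrast between hand-coded rules and learning from labeled examples can be sketched in a few lines. This is only a toy illustration, not Sadako's method: a nearest-neighbour classifier over invented colour-histogram feature vectors, where the "program" is nothing more than a set of labeled examples.

```python
# Toy sketch of "learning from labeled examples" instead of hand-coded
# rules: a nearest-neighbour classifier over invented colour-histogram
# feature vectors. Real systems use deep networks and millions of images.

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def predict(examples, features):
    """Return the label of the closest training example."""
    label, _ = min(
        ((lbl, euclidean(feat, features)) for feat, lbl in examples),
        key=lambda pair: pair[1],
    )
    return label

# Hypothetical training set: (feature vector, material label) pairs.
training = [
    ([0.9, 0.1, 0.2], "plastic"),
    ([0.2, 0.8, 0.7], "paper"),
    ([0.1, 0.2, 0.9], "can"),
]

print(predict(training, [0.85, 0.15, 0.25]))  # → plastic
```

Adding a new material here means adding examples, not writing new rules; the same shift in workflow, at vastly larger scale, is what deep learning brings to waste recognition.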

This is Artificial Intelligence, the approach that Sadako Technologies, a participant in the HR-RECYCLER project, has applied to garbage detection over the last 7 years of technology development for the recycling industry.

With AI algorithms based on latest-generation Deep Learning techniques (multi-layer convolutional neural networks) and a proprietary database of millions of segmented, labeled waste images, Sadako's technology replicates the visual recognition skills of a person, making it possible for a simple camera plus a computer to "see" waste as humans do.
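The building block of such convolutional networks is the convolution: a small kernel slid over the image, producing a feature map that responds to local patterns such as edges. A minimal pure-Python sketch of one such layer (image, kernel and values all invented for illustration):

```python
def conv2d(image, kernel):
    """Valid-mode 2-D convolution (really cross-correlation, as in most
    deep-learning frameworks) of a single-channel image."""
    kh, kw = len(kernel), len(kernel[0])
    ih, iw = len(image), len(image[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            row.append(sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh)
                for dj in range(kw)
            ))
        out.append(row)
    return out

def relu(feature_map):
    """Non-linearity applied after each convolution."""
    return [[max(0.0, v) for v in row] for row in feature_map]

# A vertical-edge kernel applied to a toy 4x4 "image" with a dark
# left half and a bright right half.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
edge_kernel = [
    [-1, 1],
    [-1, 1],
]
print(relu(conv2d(image, edge_kernel)))  # fires only along the edge
```

A deep network stacks many such layers, with kernels learned from labeled data rather than hand-designed, which is what lets it pick out a crushed bottle in a cluttered waste stream.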

That would be impossible without GPU-accelerated, highly parallel computing. Falling hardware prices make this kind of recognition much cheaper than conventional methods such as NIR cameras or other sensors.

AI-infused vision applications in the waste field

Today, Sadako AI is in operation inside the robotic waste sorter Max-AI, a product of the US company Bulk Handling Systems (BHS). Dozens of robots on four continents are recycling with Sadako algorithms as their eyes and brains.

Beyond boosting robotic sorting, Sadako has developed RUBSEE, a waste flow monitoring system for waste treatment plants, to achieve smart plants that are "aware" of what they are processing, so that they can optimize their design and operation. This work has received the financial support of the European Commission via an SME Instrument phase 2 grant of the Horizon 2020 Programme.

RUBSEE is a disruptive real-time monitoring system that uses advanced Artificial Intelligence and Computer Vision to determine, at every moment, the composition (kind and quantity) of material present at a number of locations in the plant. It aggregates and presents this information so that it can be easily analyzed and acted upon, and generates automatic alerts that help managers and technicians detect and resolve undesirable events.

For the HR-Recycler European Project, meanwhile, Sadako is developing the vision capabilities the robots need to see and manipulate WEEE (Waste Electric and Electronic Equipment) objects in the targeted human-robot collaborative recycling process.


A growing number of gadgets – and more people who can afford to buy them – has led to a significant increase in electrical and electronic equipment waste in most countries. The global quantity of e-waste generated annually is estimated at 44.7 Mt, of which only 20% is documented as collected and recycled. In addition, e-waste contains precious and rare metals and valuable bulk materials, along with plastics that can be recycled.

Until now, established practices for WEEE recycling have required very expensive, extensive and time-consuming manual effort. As a result, vast amounts of WEEE materials remain unprocessed in recycling plants, often wind up in landfills, or are not processed in a safe or legal way.

HR-Recycler targets the development of a 'hybrid human-robot recycling plant for electrical and electronic equipment' operating in an indoor environment. SADAKO is working on environment analysis and registration (developing novel object detection methods to identify the different WEEE object types and their constituent parts) and on human motion analysis and prediction.
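Object detection methods like the ones mentioned are conventionally evaluated with Intersection-over-Union (IoU): how well a predicted bounding box overlaps the ground-truth box of, say, a screen or a capacitor. A minimal sketch (box coordinates invented, the standard metric itself):

```python
def iou(a, b):
    """Intersection-over-Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# Two partially overlapping 10x10 boxes share a 5x5 corner.
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 ≈ 0.143
```

A prediction is typically counted as correct when its IoU with a ground-truth box exceeds a threshold such as 0.5, which is how detectors for WEEE parts would commonly be benchmarked.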

With HR-RECYCLER, Sadako is excited to extend the impact of its AI technology to the e-waste field, one of the fastest-growing and potentially most valuable waste streams in the world.


CERTH visits BIANATT factory for two recording sessions



In two sessions held at the BIANATT Aspropyrgos installations on July 8-9 and September 23-25, the procedures for dismantling FPD screens, desktop towers, emergency lighting and microwave ovens, as well as the procedure for classifying WEEE appliances, were recorded on a dedicated high-power server using high-performance cameras and other peripherals. Both sessions were coordinated and recorded by CERTH researchers.

BIANATT provided all the equipment and installed a customized structure to allow image recording from many different angles. The attached photos show the installation used for this purpose.

The results of the trials, and the prospective use of the recorded material in the development of the relevant software, will be presented during the 4th plenary meeting of the project partners, scheduled to take place on 12-14/11/2019 in Athens.