Australian National University PhD student Zheyu Zhuang worked with 27 members of the Australian Centre for Robotic Vision (ACRV) team to win the Amazon Robotics Challenge.
Designed to encourage and inspire robotics research, Amazon's Robotics Challenge sought solutions to a core problem in warehouse automation: item identification and picking. Whilst sophisticated robotic technology exists to move goods within a warehouse, retailers are now seeking greater robotic accuracy and efficiency in identifying, handling and packing items.
Now in its third year, the Amazon Robotics Challenge brings together shared and open solutions to common challenges in robotic automation.
Zheyu Zhuang tells us about the challenge:
What was your team’s unique approach to the challenge?
Our robot, named Cartman, is similar to the claw-crane machines you find in arcades that pick up toys. However, instead of a claw that disappoints many people, Cartman has a suction cup and a two-fingered parallel gripper on either side of a rotating mechanism. This design offers the flexibility of grasping an object at the right pose with the appropriate tool.
What makes Cartman unique is not just the physical design but also the vision system. Our vision system worked off a very small amount of hand-annotated training data. We needed only seven images of each unseen item to correctly identify it during the competition, and this was crucial because we had just thirty minutes to train the robot to recognise those items.
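The core idea, recognising an item from only a handful of labelled examples, can be sketched with a toy nearest-centroid classifier. This is purely an illustration with made-up feature vectors, not the team's actual deep-learning vision system; the `build_centroids` and `identify` functions and the sample items are hypothetical.

```python
from math import dist  # Euclidean distance, Python 3.8+

def build_centroids(examples):
    """Average the few feature vectors available for each item.

    `examples` maps an item name to a handful of feature vectors.
    In a real system these vectors would come from a pretrained
    network; here they are invented numbers for illustration.
    """
    centroids = {}
    for item, vectors in examples.items():
        n = len(vectors)
        centroids[item] = tuple(
            sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))
        )
    return centroids

def identify(query, centroids):
    """Return the item whose centroid is closest to the query vector."""
    return min(centroids, key=lambda item: dist(query, centroids[item]))

# Toy data: a few 2-D "features" per item stand in for the seven
# annotated images per unseen item mentioned above.
examples = {
    "toothbrush": [(0.9, 0.1), (0.8, 0.2), (0.85, 0.15)],
    "sponge":     [(0.1, 0.9), (0.2, 0.8), (0.15, 0.85)],
}
centroids = build_centroids(examples)
print(identify((0.82, 0.18), centroids))  # a new view of a toothbrush
```

Even this crude scheme shows why few examples can suffice when the features themselves are discriminative, which is what a pretrained vision network provides.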
How did the ACRV robot compare to some of the other solutions put forward?
We were the only team to use a custom-built Cartesian robot. The benefit of a custom-built robot is that we had full control of both the software and the hardware. The mechanical design could iterate alongside software updates, which made it easier for us to implement our unique vision system.
What was the biggest hurdle for your team to overcome?
As one of my teammates said, "The difficulty of real-world robotics is that there is always the one percent of situations we've never thought about". The biggest hurdle was preparing our robot for the real-world scenario. Cartman performed well in the lab environment; however, during the practice run things didn't go to plan. Rather than freezing features, we decided to take the risk of rewriting some code and adding new logic to the state machine to improve its performance just one night before the competition. It turned out the risk was worth taking.
How close are we to seeing pick and pack technology introduced to the market?
There are still many technical challenges to resolve before we'll see such systems on the market. For example, one challenge for the vision system is occlusion. Items are likely to be stored in a cluttered environment, which means the target item could be hidden behind others, and correctly identifying it from a small visible segment is difficult. Determining an appropriate grasp based on that small segment makes the problem even harder.
Recent research could potentially solve these problems, and since we live in a fast-paced technological era, a solution could be available within just a few years.
What are your thoughts on robotic technology replacing the role of humans in this space?
One of society's main concerns about robotic technology is whether it will cause unemployment and its attendant consequences. In my opinion, it is highly likely that automation will reduce jobs. However, this phenomenon has been common throughout human history whenever we have pushed productivity to the next level. I was told that, for this picking job, Amazon's employee turnover is quite high, as high-intensity repetitive tasks are not very pleasant.
What's next for you? Where do you see yourself in five years' time?
Now that the competition is over, I will focus on my PhD research, which is also quite interesting. In five years' time, I hope to have the capability to solve complex technical problems, and it would be great if I could participate in the next big innovation in my research field.
You can learn more about Zheyu Zhuang and his ANU journey here.