Wednesday, March 7, 2018

Rubik's Solver Software

Recently, Ben Katz and I collaborated on a Rubik's Cube solving robot to try to beat the world record time of 0.637 seconds, set by some engineers at Infineon.  We noticed that all of the fast Rubik's Cube solvers were using stepper motors, and thought that we could do better if we used better motors.  So we did:

Our solve time of 0.38 seconds includes acquiring the image from the webcam, detecting colors, finding a solution, and actually rotating the faces of the cube.  In the video, the machine is solving a "YJ Yulong Smooth Stickerless Speed Cube Puzzle", available on Amazon for $4.55.  We used the cheapest cube we could find on Amazon Prime because we thought we'd end up destroying many of them, but somehow ended up only going through 4 cubes and hundreds of solves.

Ben made a blog post that describes the hardware and build as well as the insane nonlinear minimum-time sliding mode controller which let us do 90 degree moves in around 10 ms.  We used Kollmorgen ServoDisc motors, which have a very high torque-to-inertia ratio.  The motor is coreless, so there are no heavy steel laminations on the rotor, and there's no steel to saturate, so it can accelerate insanely fast.  In a 10 ms quarter-turn move, the motor reaches over 1000 rpm. 
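As a rough sanity check on those numbers (this is not the actual sliding mode controller, which Ben's post covers), here is a back-of-the-envelope sketch assuming an idealized symmetric bang-bang move: accelerate flat-out for half the move, decelerate for the other half. The function name and the idealized profile are my own illustration, not code from the project.

```python
import math

def bang_bang_quarter_turn(move_time_s=0.010, angle_rad=math.pi / 2):
    """Idealized minimum-time 90 degree move with a triangular velocity
    profile: constant acceleration for the first half, constant
    deceleration for the second half."""
    # Half the angle is covered in half the time: theta/2 = a*(t/2)^2 / 2
    accel = 4 * angle_rad / move_time_s**2      # rad/s^2
    peak_speed = accel * move_time_s / 2        # rad/s, hit at the midpoint
    peak_rpm = peak_speed * 60 / (2 * math.pi)
    return accel, peak_rpm

accel, peak_rpm = bang_bang_quarter_turn()
print(f"required acceleration: {accel:.0f} rad/s^2")  # ~62832 rad/s^2
print(f"peak speed: {peak_rpm:.0f} rpm")              # 3000 rpm
```

An ideal triangular profile peaks at 3000 rpm for a 10 ms quarter turn, so the "over 1000 rpm" figure from the real (non-ideal) moves is well within what the math requires.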

On the software side, I used OpenCV for the color detection and this fantastic implementation of Kociemba's Two-Phase algorithm called 'min2phase'.  We used Playstation 3 Eye webcams, which are only $7 on Amazon Prime, and work at 187 fps under Linux.  The software identifies all the colors, builds a description of the cube, and passes it to the min2phase solver.  The resulting solve string is converted to a compact cube sequence message, and is sent to all motor controllers simultaneously using a USB to serial adapter connected to a differential serial IC.  This whole process takes around 45 ms.  Most of the time is spent waiting for the webcam driver and detecting colors.  All our software is on GitHub here:
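The color-detection step can be sketched as nearest-reference classification in LAB space (the colorspace discussed in the comments below). The reference values and the helper below are made-up illustrations tuned to nothing in particular; in the real pipeline the LAB pixels would come from something like `cv2.cvtColor(frame, cv2.COLOR_BGR2LAB)` on the webcam frame.

```python
# Hypothetical per-face reference colors as 8-bit (L, a, b) triples.
# In practice these would be tuned to the specific cube and lighting.
REFERENCE_LAB = {
    "white":  (240, 128, 128),
    "yellow": (220, 120, 200),
    "green":  (150,  80, 170),
    "blue":   (100, 140,  60),
    "red":    (120, 190, 160),
    "orange": (160, 175, 180),
}

def classify_sticker(lab_pixel):
    """Return the face color whose LAB reference is nearest
    (squared Euclidean distance, no need for the sqrt)."""
    def dist2(ref):
        return sum((p - r) ** 2 for p, r in zip(lab_pixel, ref))
    return min(REFERENCE_LAB, key=lambda name: dist2(REFERENCE_LAB[name]))

print(classify_sticker((235, 130, 125)))  # -> white
```

Doing this for all 54 sticker samples yields the cube description handed to min2phase. As the comments note, red and orange can land on nearly identical values under some lighting, which is exactly where this simple nearest-reference scheme breaks down.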

The motor controllers step through the moves one by one and remain synchronized with the AND board, which tells all the motor controllers when the current move is finished.
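A toy model of that synchronization scheme, as I understand it: each controller drives a "done" line, the board ANDs them all together, and no controller starts the next move until the combined signal goes high. The timing numbers here are invented for illustration.

```python
def run_move(move_times_ms):
    """Toy simulation of one synchronized move. Each controller takes a
    slightly different time to finish its part; the move is over only
    when the AND of all 'done' lines goes high, i.e. when the slowest
    controller finishes."""
    t = 0
    done = [False] * len(move_times_ms)
    while not all(done):                    # all(...) plays the AND board
        t += 1                              # advance one 1 ms tick
        done = [t >= mt for mt in move_times_ms]
    return t                                # move duration = slowest controller

# e.g. six face motors finishing their parts between 9 and 12 ms
print(run_move([10, 9, 12, 11, 10, 9]))  # -> 12
```

The upshot is that the sequence can never get out of step: every controller waits on the same hardware-ANDed signal before advancing to the next move in the message.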


  1. Have you ever considered trying to design a robot solver that more closely approximates human conditions? I think it would be great fun to assume no cube modifications or tethering, and see how fast you can get robot "hands/fingers" to make the 20-30 turns.

  2. Hey, Jared. My name is Jack Williams and I am a reporter for Caters, a newswire service based in New York. Great work on this project! We would love to write a story about it and send the piece out over our wire to the mainstream media outlets across North America and Europe. Would you or Ben mind dropping me an email to discuss, please? Thanks.


  4. What is your reasoning for using the LAB colorspace? Is it faster than OpenCV's HSL/HSV conversion? Doesn't fine-tuning the threshold bounds in HSV space help with the issue of having to sharpie the orange side?

    1. LAB and HSV were both around the same speed, but it was much easier to tune in LAB for some reason. With our lighting setup and the PlayStation camera, it was impossible to tell the red and orange apart - the RGB values are sometimes exactly the same.


  6. Hello friend, good evening. I have a project similar to yours, but simpler. Can you help me? I would like the code so I can implement it on my robot.