The Big Late Pi Wars Write Up

So, it’s been 5 months since Pi Wars 2018 and I have not made any blog posts at all. Why is that? Did we fail so miserably that I was ashamed to talk about it? No, the real reason is that I just haven’t had time to write any blog posts. Either that or, when I have had time, I’ve procrastinated or done other things instead. So, how did Medway Makers do in Pi Wars? Did we win any awards? How did we rank in our class? Well, I am pleased to announce (finally, although most of you know already) that we WON!! Yay! Yes, we were the overall winner of our class (Beginner) and also came first in two individual challenges.

So I will go through the results for each scored challenge and comment on them as I go. I shall then wrap up with an overview of the build and design process and how I felt the day went.

I cannot remember the exact order of challenges on the day, so I shall present the results as per the Pi Wars 2018 results page, which can be found HERE.

First, we have Pi Noon. The results page only shows 1st, 2nd and 3rd, so I have no idea where exactly we ranked, but it was pretty low down. We won our first round and lost the second, and as the rounds are knockouts that was it. Pi Noon for 2018 introduced a central tower in the arena covered in spikes. This made getting behind your opponent very difficult and time-consuming, and also left a very tight space to fight in. Our robot was very fast and maneuverable, but also a little top-heavy, so it almost tipped over on a few occasions, losing traction and time in the process. We were beaten fair and square by a better robot and driver.

Technical Merit – Joint 15th – This challenge involved taking your robot to a team of judges, explaining how it was designed and built, and being scored on that. We came joint 15th, which was very disappointing considering many of the robots that placed above us were nowhere near as technically advanced as ours. I have no idea how this challenge was scored. Our robot was almost entirely built from scratch; we used no pre-built parts. Every single plastic part was designed in Fusion 360 and 3D printed. Considering that before Pi Wars I had never used Fusion 360 or even owned a 3D printer, and so had to learn these skills as we went along with the build, this was a great achievement. The Nerf cannon alone had 13 separate 3D printed parts, a rotating barrel (the only one in the entire competition) powered by a stepper motor, a servo-controlled plunger, plus two flywheels (making four motors in total). The entire bodywork came apart in seconds thanks to a clever system of magnetically attached body panels. The robot was very technically advanced compared to others and many hundreds of hours went into the design. We should have scored more points in this challenge and did not deserve 15th place.

Artistic Merit – Joint 3rd – Again, this involved a team of judges inspecting your robot. We came joint 3rd, and quite rightly too. A lot of work went into the design. The Nerf cannon alone was a work of art. The side body panels had scratch-built designs and the interior was fitted with throbbing heart LED lights, giving it a menacing appearance. The top had a rotating Cylon (or Knight Rider) style cylindrical set of red LEDs, giving the appearance of a scanner. This was not the design we had originally intended, as the original plan was to make the robot actually look like a robot (i.e. rotating head, eyes, arms, etc.), but we ran out of time.

Blogging – Joint 11th – Again, this was BS. I followed all of the blogs, so I knew how well we were doing compared to others. I did not expect to win this, as the Glitterator team did an excellent blog, but we should have come in the top 5. There were teams that did hardly any blogging at all and came higher than us. How on earth was this judged and scored?

The Obstacle Course – Joint 2nd – I knew we did well in this. We had a great run and completed it very quickly. The robot was very maneuverable, and a lot of time had been spent fine-tuning its dynamics and practising the driving. We flew through the course in super quick time and deserved the 2nd place spot.

Slightly Deranged Golf – 2nd Place – We spent a lot of time designing the attachment for the golf challenge. It was very simple but it worked well. It was basically a horizontal ring with a large split in it to enable the robot to drive up to and capture the golf ball. A second, vertical arc (only about a third of a full ring) was attached to a servo and acted as both a capture device to hold the ball and a flicker to push it forward. It wasn’t very powerful, but it didn’t need to be, as you only have to propel the ball about 15cm into the hole. It clearly worked very well, as we got all of our balls in the hole and did so quickly. A 2nd place spot deservedly won.
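For anyone curious how simple the flicker control can be, here is a minimal sketch of the capture-and-flick action using gpiozero. The GPIO pin, angles and timings are illustrative assumptions, not the values from the real robot.

```python
from gpiozero import AngularServo
from time import sleep

# Servo driving the vertical capture/flick arc (pin and angle range assumed).
flicker = AngularServo(18, min_angle=0, max_angle=90)

def capture():
    """Hold the arc down so the ball stays inside the horizontal ring."""
    flicker.angle = 10
    sleep(0.3)

def flick():
    """Swing the arc forward to nudge the ball the short distance into the hole."""
    flicker.angle = 80
    sleep(0.3)
    capture()  # return to the capture position ready for the next ball

capture()  # start in the capture position
```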

Minimal Maze – 1st Place – This was an award-winning challenge. I knew we would do well here, as a lot of time had been spent on the hardware, coding and testing for this one challenge. In our test runs we completed the maze in a time faster than the fastest winning run from the 2017 competition, so I knew we should do well on the day. The only concern was sunlight landing on the maze, as we were relying on IR sensors alone. We used an array of IR sensors (a 'Protractor' sensor) that we bought from the USA.

This sensor is typically used in sumo-bot competitions. It has a 180-degree arc covered in IR sensors, giving a complete 180-degree view. We positioned it right at the front of the robot, ahead of the wheels, so we could see everything in front of and around us. The code for this sensor was designed to quickly scan across the field of view looking for obstacles, work out their positions and distances, and then return the most optimal path through them. The robot then turns towards that path and the scan continues. The field of view is scanned many times per second to give an obstacle-free path through the maze. The robot navigated the maze super quickly and with ease, blowing away the competition.
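To illustrate the idea (this is a simplified sketch, not our actual competition code), the scan-and-steer loop can be reduced to something like the following. The clear-distance threshold and the read_protractor() / steer_towards() helpers are hypothetical stand-ins for the sensor and motor code.

```python
CLEAR_DISTANCE_CM = 30  # readings further than this are treated as open space (assumed value)

def best_heading(readings):
    """readings: list of (angle_deg, distance_cm) pairs across the 180-degree arc.
    Returns the angle at the centre of the widest clear gap, or None if blocked."""
    best_gap, current = [], []
    for angle, dist in readings:
        if dist > CLEAR_DISTANCE_CM:
            current.append(angle)        # this direction looks clear
        else:
            if len(current) > len(best_gap):
                best_gap = current       # remember the widest gap so far
            current = []
    if len(current) > len(best_gap):
        best_gap = current
    return best_gap[len(best_gap) // 2] if best_gap else None

# Main loop, run many times per second:
# while True:
#     heading = best_heading(read_protractor())   # read_protractor() is hypothetical
#     if heading is not None:
#         steer_towards(heading)                  # steer_towards() is hypothetical
```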

Straight Line Speed Test – Unfinished – This challenge was a major disappointment. In testing, the robot, again using the Protractor IR sensor, did very well and was able to keep between the two walls. It wasn’t going to be the fastest robot, but we were confident it would complete the course. On the day, though, it behaved very strangely. The robot set off down the course, got about halfway down, then started spinning around in a circle and hitting the wall. It did this twice. There were two possible causes. One was that the walls of the course were matt black while our test course was raw wood and therefore light coloured, so perhaps the IR beams were being absorbed by the walls. The other was that a strong source of IR nearby was interfering.

The Duck Shoot – Joint 6th – A lot of time was spent designing and building the Nerf cannon for this challenge. Too much time, in the end. We must have spent about two months alone going through multiple iterations of the gun. It was a very compact design, probably the most compact gun in the competition, with a rotating barrel like a real pistol and two flywheels. It was a technical marvel and the part I am most proud of. However, it did not perform as well as expected on the day: a few targets were missed by only millimetres, or were hit but did not fall over. Still, 6th is a respectable score, as we were up against guns that worked better (mostly hacked Nerf guns rather than scratch builds like ours).
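To give an idea of how a cannon like this is driven (a hedged sketch only, with assumed pin numbers, step counts and timings rather than the real ones), the firing sequence amounts to: spin up the flywheels, push a dart in with the plunger servo, then index the barrel one chamber with the stepper.

```python
from gpiozero import OutputDevice, AngularServo
from time import sleep

flywheels = OutputDevice(22)   # flywheel motor driver enable (assumed pin)
plunger = AngularServo(23)     # plunger servo (assumed pin, default -90..90 range)
step_pin = OutputDevice(24)    # stepper driver STEP input (assumed pin)
dir_pin = OutputDevice(25)     # stepper driver DIR input (assumed pin)

STEPS_PER_CHAMBER = 50         # illustrative value; depends on gearing and microstepping

def advance_barrel():
    """Index the barrel one chamber so a fresh dart lines up with the plunger."""
    dir_pin.on()
    for _ in range(STEPS_PER_CHAMBER):
        step_pin.on()
        sleep(0.001)
        step_pin.off()
        sleep(0.001)

def fire():
    """Spin up the flywheels, push the dart in, retract, then index the barrel."""
    flywheels.on()
    sleep(1.0)                 # give the flywheels time to reach speed
    plunger.angle = 60         # push the dart into the flywheels
    sleep(0.3)
    plunger.angle = -60        # retract the plunger
    flywheels.off()
    advance_barrel()
```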

Somewhere Over The Rainbow – 1st Place – Another award-winning challenge, but also the most controversial (more on this later). I knew we would do well here. A lot of time and effort had been put into getting the OpenCV code to work efficiently using a finite state machine model. In testing the robot worked flawlessly. The biggest concern on the day was that the course was floodlit totally differently from our test course, with the lights placed centrally instead of near the corners, meaning shadows fell on the balls. However, we did get a brief five minutes in the morning to test and calibrate the robot, and this was enough to get it working perfectly. On the day the robot gave a flawless and fast run, completing the course exactly as specified. We were all cheering it on throughout and were very pleased to see it do so well. We knew we had won this challenge even before the prize giving, as the judge told us nobody had come even close to our score, and we checked with him again right at the end of the competition to be told we were still unbeaten. More on this when I talk about the prize giving ceremony later.
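As a rough illustration of the finite state machine approach (a simplified, single-colour sketch, not our competition code; the HSV ranges and the get_frame(), turn(), drive() and at_ball() helpers are hypothetical), the search-align-approach loop looked something like this:

```python
import cv2
import numpy as np

# HSV range for one target colour (red here); values are assumptions to be
# tuned for the lighting, not the calibration used on the day.
LOWER = np.array([0, 120, 70])
UPPER = np.array([10, 255, 255])

def ball_offset(frame):
    """Return the horizontal pixel offset of the largest blob of the target
    colour from the image centre, or None if nothing is found."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    x, _, w, _ = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return (x + w / 2) - frame.shape[1] / 2

# Simple state machine: search for the ball, align with it, drive to it.
state = "SEARCH"
while state != "DONE":
    offset = ball_offset(get_frame())        # get_frame() is a hypothetical camera helper
    if state == "SEARCH":
        if offset is not None:
            state = "ALIGN"
        else:
            turn(0.2)                        # rotate on the spot until a ball is seen
    elif state == "ALIGN":
        if offset is None:
            state = "SEARCH"
        elif abs(offset) < 20:
            state = "APPROACH"
        else:
            turn(0.1 if offset > 0 else -0.1)
    elif state == "APPROACH":
        drive(0.5)                           # drive towards the ball
        if at_ball():                        # hypothetical proximity check
            state = "DONE"
```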

Overall Position – 1st Place – Yep, we won it. As you can see above, we scored very well in most of the challenges, winning top place in two of them and the overall 1st place award too. The picture above shows the winning robot with the two challenge awards and the overall 1st place award. The lineup of ranks in order of merit was as follows:

  • Overall – WINNER!
  • Somewhere Over The Rainbow – 1st Place
  • Minimal Maze – 1st Place
  • Obstacle Course – 2nd Place
  • Slightly Deranged Golf – 2nd Place
  • Artistic Merit – 3rd Place
  • Duck Shoot – 6th Place
  • Blogging – 11th Place
  • Technical Merit – 15th Place
  • Pi Noon – Unfinished
  • Straight Line Speed Test – Unfinished

So we came in the top three in five of the individual challenges. No wonder we ended up as the overall winner.

 

The prize giving ceremony – Unfortunately, our memory of this was tarnished. What should have been a very happy occasion with our three awards turned sour when there was a major cock-up with the scores. We knew we had won the Somewhere Over The Rainbow challenge. Our robot performed flawlessly and quickly, and the judge told us that nobody had yet succeeded in completing the course using the OpenCV method.

We checked again with the same judge at the end of the competition, before we went into the prize giving, to see if anybody had beaten us and we were told that we were the outright winner. He even checked the score sheets, which he showed us, and we could see we had beaten everyone else without any doubt at all. However, when it came to giving out the prizes for this challenge, the judges announced us in 2nd place and Paranoid Android (Paul Hodgskin) in 1st place.

We knew right away that this was total nonsense, for two reasons. One, the judge had shown us the final score sheets and we could clearly see we were the winner by a clear margin. Two, the Paranoid Android team were in the same pit room as us and we had been discussing this challenge with them earlier in the day. In that conversation we learnt their robot had no camera and was simply going to follow the left-hand wall. We knew this robot could not beat us even if it tried, as there were far more points for completing the course using computer vision than for simply following the walls blindly.

Immediately we put our hands up and I told Mike Horne (one of the organisers) that this was not correct and that we knew we had won this. The response was that we had not even competed in this challenge! I said that this was incorrect and that not only had we competed, but we had video evidence and we knew we had WON it. They then tried to find our score sheet to check and could not. It turned out they didn’t even have our score sheet and our score had not been taken into account at all! How on earth can you a) lose a score sheet, b) not pick up in the scoring process that one team’s scores had not even been checked, and c) mark a team as not having competed? How can you do that if you didn’t even have a score sheet to check? This was clearly a major flaw in the scoring system and I seriously hope it is fixed for 2019. Also, the judge who scored us was nowhere to be found. Where was he? All judges should be present in the room at the end in case of situations such as this. If he had been, he would have been able to simply say that yes, we were the outright winners. But he had scarpered.

To add insult to injury, despite the fact we were able to prove we had won it by showing them the video (luckily we filmed the entire thing), we were then given a ‘special award’ (not the 1st place award) and told we were JOINT 1st with the other robot, even though it had not used OpenCV and couldn’t possibly have won. I was fuming about this and so was my teammate Tom. To put in all those hundreds of hours of designing, printing and testing, and spend many hundreds of pounds on the robot, only to have this done to us, was out of order. They should not have issued ANY awards for this challenge until they could find the sheet or view the evidence. It took WEEKS of arguing with the organisers to get them to a) change the scores and award 1st place to us and b) get another award laser cut and sent out to us. They were VERY reluctant to do so, and the email and Twitter exchanges over this show that it was only by me applying pressure and talking about it publicly that they changed the scores and awards. If I had kept quiet they would have left it as it was. I later found out that the 2nd place entrant was a friend of theirs and connected to Cambridge Makerspace. Was this decision biased? Who knows. However, they finally relented and gave us the 1st place score, as the official results show. It was a real shame that the prize giving, and the end to the weekend we had built up to over 12 months, was ruined by this. I think it also showed everyone else in the room that there were clear flaws in the scoring system. I wonder how many other teams should have been scored differently?

A point to note here: the organisers are members of Cambridge Makerspace. Several of the contestants are also members of Cambridge Makerspace. One of those contestants designed parts of the courses. How is this fair? These people are allowed to see, and practise on, the courses for weeks or months in advance, whereas the rest of us only get to see most of the courses on the day. For example, the guy who designed most of the obstacle course came first in that challenge. What a surprise! How is this fair? Either the organisation of the competition and the building of the courses has to be kept separate and secret from everyone else in Cambridge Makerspace, or members of Cambridge Makerspace should not be permitted to enter the competition. There is a clear and unfair bias here. It is due to this unfair bias and the very shady way in which the awards were scored and given that we decided not to enter again. It is not right to put so much effort and money into a competition, even a fun one, if it is not taken seriously by the organisers. I am sorry if that upsets anyone, but it is my own personal opinion.

The design and build process – The original robot design was based around a chassis design by Tom Sparrow. In fact, the final robot used one of Tom’s perspex chassis bases, and the rest of the robot was built up from this. Our original plan was to build a robot that actually looked like a robot should look, i.e. semi-humanoid or at least having some human attributes such as a head, eyes, etc. I envisaged a kind of R2-D2/Wall-E/ASIMO mash-up. This was the intention right up until a month or so before the competition.

However, we had wasted so much time with design problems, especially with the gun, that we just didn’t have time to spend on the aesthetics. We burnt through several motor driver boards, ESCs, Raspberry Pis and a load of other stuff in testing, mainly due to bad wiring. It was in the early stages of the build that I acquired a 3D printer: no less than a Prusa i3 MK2. Clearly, owning a 3D printer was going to be a game changer for the Pi Wars robot. However, I had never used one before in my life, nor had I ever designed a 3D part using CAD software (I have used Cinema 4D and POV-Ray in the past for 3D raytracing, so I at least understood the basics).

So now came many hundreds, if not thousands, of hours of learning how to use the printer and how to fine-tune it to get excellent quality prints, as well as learning Fusion 360 and how to design functional parts I could print on the Prusa. Partway through the year I also stripped down and rebuilt the Prusa, using 3030 aluminium extrusions to turn it into a ‘Haribo 3030’ version. This made it much stronger, sturdier and less prone to vibration.

This was the best thing I have taken away from Pi Wars. I can now class myself as a competent CAD designer using Fusion 360, and the prints I have produced with the modded Prusa are top quality. Apart from the robot base designed by Tom, the rest of the robot was entirely designed in Fusion 360 and printed on the Prusa. The cannon alone was a technical marvel, with 13 separate moving parts and four motors. I am very glad I had the chance to use Pi Wars as a way to pressure myself into using these pieces of software and equipment and to become competent with them in a relatively short space of time. These skills will be invaluable for future projects.

So, although the takeaway from Pi Wars was a mixture of negative and positive experiences, overall I am glad I did it. I have learnt a lot along the way and enriched myself in the process. A huge thank you to Tom Sparrow, whose design input, many hours of course building, enthusiasm and positive attitude all contributed to our success. Medway Makers is unlikely to enter Pi Wars again, at least for a long time yet. However, Maidstone Hackspace is entering the 2019 competition and I am helping out with that team. I look forward to the 2019 competition and hope the organisation, especially the judging, has had its problems ironed out. We will see.

 
