Saturday was the Abington firefighting competition. I had entered this contest last year with very little time to prepare, and was looking forward to trying again. I made considerable progress (as blogged here over the past few months), and showed up with a robot that was stronger and simpler. Most importantly, it was smarter.
Tests had gone very well at home. Audio start, full navigation of the maze, blowing out the candle, and the crowd-pleasing return to start all functioned reliably. So our drive to Philadelphia Friday night was relaxed and I slept well. The kids had a robot also, a LEGO Mindstorms creation named Dark Lord who used a sponge to extinguish the candle. (Dark Lord is white in color so I don’t understand the name.)
On Saturday I showed up as early as possible to calibrate, get ready, and talk with the other roboticists. First step: attach the battery and power on the robot. Except…it wasn’t powering on. The battery was fully charged, as was the spare, and neither worked. After some increasingly frantic work with the voltmeter, I realized that one of the wires on my main power switch had snapped. The switch had worked well for over a year. Fortunately I had a soldering iron and extra wire with me; unfortunately, I did not have a third-hand tool. Paul Boxmeyer of Lycoming Robotics was kind enough to loan me a clamp, but it was still very difficult to attach a wire to a small switch without shorting the terminals. Not fun. Eventually, though, I fixed it. And then discovered that the other wire had snapped.
I did fix the switch, but this consumed a tremendous amount of time. Next step, calibrate the sensors. IR was basically the same as at home since I had white walls in my maze. The candle, however, was another matter. When I came in Paul took a look at my robot and expressed surprise that I didn’t have an adjustable potentiometer. That was a prescient remark. I found that I had trouble seeing the candle in the maze room, no doubt due to all the fluorescent lights. Still, boosting the threshold level in software allowed me to see my practice candles from the necessary distance.
I finally had time to do a practice run. The official candles were much thinner than mine, but I succeeded in the quick single-room test and hoped for the best. Alas, it was not to be. In the first official run, the robot went smoothly to the room, looked for the candle, and left it burning there. It simply couldn’t see it. Later Paul explained to me why my circuit had failed: I would have been better off with only one photodiode rather than four in parallel, since photodiodes really act like transistors. And a pot would have allowed for better tuning.
For the second run, I knew I would miss the candle, so I asked the judges to put the candle in the fourth room so that I could show off the navigation. My proudest moment was watching the robot pirouette through the first three rooms. Its movement was so beautiful. The carpets were far thinner than I had remembered and the robot barely seemed to notice them; the ones I had at home were much more difficult. In the fourth room, I discovered that my home maze construction had an error: I was 4″ off spec, and Saturday’s maze was another 3″ off. With that, the robot was too far over and scraped a wall, forcing a turn. The recovery code kicked in and kept FireCheetah moving, but with bad information about its location the robot could not do anything useful. This run was quite frustrating. If I’d had time for one full practice run, I would have seen the problem and fixed it with a simple edit. But the switch chose a bad moment to fail.
This year I had more time to watch the competition itself, and it was quite enjoyable. You really empathized with these little creatures and wanted them to succeed. Some were more interesting than others. As I feared, the no-carpet option meant that a lot of robots went for dead reckoning, and that is a bit dull in my humble opinion. Others bumbled around in a charming way. Many of them had trouble, as I did, finding the candle. Or in putting it out. Once again I am convinced that a fan is the way to go. So many sponges, balloons, and other exotic methods failed to work. Almost all other robots there were kits: Vex, LEGO, Boebot, and Paul’s students used his standard base. One woman came up to me later and said she thought my robot was the cutest, which was quite nice of her. They do begin to feel like pets after a while.
The kids’ robot did very well, which was no surprise. Oh the reliability of remote control! They had some competition this year with another girl who also had a LEGO robot. Since their division was untimed, everyone got a trophy anyway. Paul won the Senior Division, along with several other awards for innovation including the multi-robot division where he was unchallenged. One of his students won the Junior Division as well. Note, she used a fan. The other students used sponges, most of which failed.
Steve Rhoads and Foster Schucker from the Vex robotics club were there also, and this time we successfully exchanged contact information. They are cheerful guys and I enjoyed seeing their new Arduino-based robot. They are trying to use Arduino while still retaining kit-level reliability — no home soldering here — and it’s an interesting challenge. They have found a very nice IR grid sensor and I look forward to seeing that at work. Best of all, they are planning to hold some firefighting competitions this summer. I am happy to hear this because I would like to succeed next time, without waiting an entire year, and cross it off the list.
Obviously, I need a better way to detect the candle. Paul suggested that my circuit could work with some tweaks, and that’s probably true, but fiddling with pots is not what I want to do. Since I don’t have the same type of lighting at home to test with, I want something more reliable, and that means spending more money on a better sensor. One option is the UV Tron, at least for detection. At the moment I am considering the MLX90620 thermopile. Interfacing with it is something of a software challenge, but it could provide continuous directional feedback as the robot approaches the candle.
Preparing for a competition is very intense, and after it is over there is always a letdown. What to do next? In firefighting, I could go well beyond the thermistor; it would be very easy now to shrink the robot significantly, always good when navigating a maze. I still have a ton of I/O on my Mega clone and could add more sensors, which would allow better obstacle detection. My second run failed because of a hard-coded number that was wrong; I didn’t have many of those, but there were a few. A great robot would not need any of that: it could center itself as it approached a new hallway. Heck, it would even be possible to slap an Asus (Kinect clone) on the robot; it fits the size limitation. An exotic candle extinguisher (with a fan as backup!) gets extra points. Or maybe it would be fun to make a walking robot, something completely different.
Beyond firefighting, there are a number of challenges at RoboGames that look fun. The space elevator robot or table bot would both be very new. I like the creative aspect of these, and they are less pass/fail than firefighting — which is another contest option at RoboGames.
I’ve ended up with some nice C++ code for any differential drive robot. It’s already on GitHub, but I need to separate it out from the firefighting class to make it generally useful. There is a local robotics hackathon on Sunday, so maybe I will have it ready by then. The link will be announced here when it’s done.
Here is a video of the one successful run I did in the official maze. Go FireCheetah!
Finally the time has come — the Abington Firefighting Competition is two days away. FireCheetah is tested and ready to go! The kids have their own robot this year, a LEGO Mindstorms creation named Dark Lord, and he is looking good too.
All about sensors
With navigation mostly solved early on, there was more time to look at the sensor side of things. There is more to it than I initially realized. Take candle detection, for instance. I thought my homemade UV sensor was flaky due to varying values in the same location. Turns out that was due to changes in the abundant sunlight in my testing area. I added a sensor hood to reduce the effect, but even so the ambient UV level caused by the sun is significant. I need to change my threshold for fire detection by a factor of two depending on whether I am testing in the day or at night. (This caused some serious hair pulling until I figured it out.)
The sonar, also, has its quirks. Every now and then a wrong value creeps in. This is particularly bad for the side sensors that I use to line up with the wall, since both values need to be accurate. In the end I was able to resolve this by making sure that two consecutive readings from each sonar are consistent and within a valid range. If the result implies an alignment change of over 45 degrees I discard it completely, and the resulting turns are capped at 30 degrees. It’s worth noting that common sensor library functions like “take an average” or “use the median” don’t solve this problem.
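For the curious, the filtering boils down to a couple of small checks. The constants below are illustrative placeholders, not my exact tuned values:

```cpp
#include <cmath>

// Illustrative constants -- not FireCheetah's actual tuned values.
const float SONAR_MIN_CM = 2.0f;    // below this the sensor is unreliable
const float SONAR_MAX_CM = 300.0f;  // beyond this we treat it as "no echo"
const float AGREE_CM     = 3.0f;    // consecutive pings must agree this closely
const float REJECT_DEG   = 45.0f;   // an implied change this big is a glitch
const float CAP_DEG      = 30.0f;   // never turn more than this per correction

// Accept a sonar reading only if two consecutive pings are in range and agree.
bool sonarReadingValid(float ping1, float ping2) {
    if (ping1 < SONAR_MIN_CM || ping1 > SONAR_MAX_CM) return false;
    if (ping2 < SONAR_MIN_CM || ping2 > SONAR_MAX_CM) return false;
    return std::fabs(ping1 - ping2) <= AGREE_CM;
}

// Given an alignment angle computed from the side sonars, discard
// impossible values and cap the correction turn.
// Returns the turn to execute, or 0 if the reading was rejected.
float filterAlignmentTurn(float angleDeg) {
    if (std::fabs(angleDeg) > REJECT_DEG) return 0.0f;  // glitch: discard
    if (angleDeg >  CAP_DEG) return  CAP_DEG;           // cap the correction
    if (angleDeg < -CAP_DEG) return -CAP_DEG;
    return angleDeg;
}
```

The key point is that both checks reject, rather than average, the bad data; an average still lets one wild ping pull the robot off course.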
IR sensors are famous for variability but strangely enough I have had few problems with mine. Still, the precise values for the maze walls on Saturday may be different.
I plan to spend most of my morning practice time in the arena calibrating the sensors, especially for the candle. I won’t line up for the maze practice until later. Last year many robots navigated the course successfully and failed to find the candle, which could certainly happen if I don’t calibrate properly.
And many things depend on battery level. I plan to run with a full battery, so that is what I try to test with.
The sensor lessons confirmed the value of copious logging combined with repetitive testing. It’s a little boring — the kids told me many times, “Mom, your robot is done!” — but it’s the best way to drive out bugs. Below is a picture of the maze I used:
Starting the robot with a 3-4 kHz sound, instead of a button, is good for a 20% time bonus. I figured this would be fairly simple, and I tried to build a circuit with an LM567 tone detector chip. Everyone says this is easy…well, after trying all the capacitors I had, and laying in a stock supply of new ones with all the right values, I never got it working. Plus, it was pretty darn ugly. Circuits with lots of components are just asking for trouble.
After two days of frustration I found a better way. With an electret microphone, a resistor, and an Arduino DTMF decoder library, the problem was solved. Honestly, I hate RC circuits and analog electronics, so it’s not surprising that software worked better. This trigger has been flawless: never a false start, and the robot responds to the tone before my ears do! I use the TrueTone app on my phone to provide the sound.
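The decoder library does the heavy lifting, but for the curious, the classic software approach to spotting a single tone is the Goertzel algorithm, which measures the energy at one frequency without a full FFT. This is a sketch of the concept, not the library’s actual code; the 3.5 kHz target and the 10x threshold are placeholder values:

```cpp
#include <cmath>
#include <vector>

const float kPi = 3.14159265358979f;

// Goertzel single-tone detector: returns the squared magnitude at the
// target frequency over one block of samples.
float goertzelPower(const std::vector<float>& samples,
                    float targetHz, float sampleRateHz) {
    const float k = 2.0f * std::cos(2.0f * kPi * targetHz / sampleRateHz);
    float s1 = 0.0f, s2 = 0.0f;
    for (float x : samples) {
        float s0 = x + k * s1 - s2;  // the Goertzel recurrence
        s2 = s1;
        s1 = s0;
    }
    return s1 * s1 + s2 * s2 - k * s1 * s2;
}

// Declare a start signal when the target band clearly dominates a
// reference band (made-up example frequencies and ratio).
bool toneDetected(const std::vector<float>& samples, float sampleRateHz) {
    float target  = goertzelPower(samples, 3500.0f, sampleRateHz);
    float offband = goertzelPower(samples, 1000.0f, sampleRateHz);
    return target > 10.0f * (offband + 1e-6f);
}
```

Comparing against an off-band reference rather than a fixed level makes the trigger less sensitive to overall room noise.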
Putting out the candle
You get a 25% bonus for using something other than air to extinguish the candle. I am sticking with air, specifically a fan, because you get to try more than once if you don’t succeed the first time!
Returning to the start
Another 20% bonus is available if your robot returns to the start after putting out the fire. Also, it looks very cool. This turned out to be pretty easy. I already have a planner, so I simply needed to add a list of return path nodes from each room. When the robot enters a room, it saves the entry point coordinates. It moves around to get the fire out. Then it uses dead reckoning to find its way back to the entry point, facing the opposite direction.
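The bookkeeping really is as simple as it sounds. In rough C++ (simplified stand-ins here, not my real planner types):

```cpp
#include <cmath>

// Simplified stand-ins for the real pose and planner types.
struct Pose {
    float x, y;      // position in maze coordinates (cm)
    float thetaDeg;  // heading
};

struct RoomVisit {
    Pose entry;      // saved the moment the robot crosses the doorway
};

// On entering a room, record where we came in.
RoomVisit enterRoom(const Pose& current) {
    return RoomVisit{current};
}

// After the candle is out, the goal is the entry point, facing back out:
// same position, heading rotated 180 degrees.
Pose returnGoal(const RoomVisit& visit) {
    Pose goal = visit.entry;
    goal.thetaDeg = std::fmod(goal.thetaDeg + 180.0f, 360.0f);
    return goal;
}
```

From the entry point, the saved return-path nodes take over and lead the robot home.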
This is the only time I use dead reckoning. Although I’d use it elsewhere in principle, the sensor methods proved to be more reliable. They are less sensitive to slip from carpets or variable maze dimensions.
Homemade robots are delicate things; parts can shake loose or shift over time. I’ve done what I can to make everything as solid as possible, but I will use a checklist before each run just to be sure everything is right. The left hub setscrew can come loose; it’s rare, but when it happens it’s a disaster.
The drive train can definitely get over the carpet now, unless the set screw is loose. Of course my carpets may not be the same as what is present at the competition, but from my old videos they look similar. And speaking of carpets…
A last-minute rules change
Yesterday all the contestants received an e-mail notice that carpets would now be optional in the high school and senior division, rather than required, due to “confusion in the rules”. What???
Last year, the carpets were harbingers of doom for many robots. So much so that any robot that made it through twice was guaranteed a prize; and my design reflects that, with lots of sensor checks and — therefore — a slower speed. A carpet-free robot can use dead reckoning and potentially be very fast. Of course, this is not the cutthroat world of Trinity. Nevertheless I suspect many teams will take this option, and that means more competition.
Using the carpets does come with a significant time bonus: 30% off. That is not as great as the multi-room mode bonus (not a bonus exactly, you just avoid a 3 minute penalty). Unless I show up for practice and find the carpets far worse than my test ones, I will use them. After all, that is what I designed for and want to conquer, and what is life without some risk? Audio start, carpets, and return home combine to a 0.45 factor applied to total time: (1 − 0.20) × (1 − 0.30) × (1 − 0.20) ≈ 0.45.
What’s a robot blog post without some video? Here is FireCheetah navigating the full mockup of the Abington/Trinity maze, putting out the candle, and returning home. One take, I swear.
Whatever happens on Saturday, preparing for the competition was a wonderful experience and I learned a great deal. I’m looking forward to seeing some of the people I spoke with last year. This time I hope to relax and enjoy and watch more of the competition.
The Abington firefighting competition is one month away. I’ve been working hard on the controller software, and at this point FireCheetah reliably navigates on continuous surfaces (floor or rug). That is well ahead of where things were last year! Along the way I came across a few things that may be useful to other roboticists.
A better ping library for ultrasonic sensors
There are quite a few Arduino libraries out there for ultrasonic sensors. They are important for this robot, as a key loop involves driving forward until we are a certain distance away from a wall. Ultrasonic sensors work via echo-response, sending out a ping and waiting to see how long it takes to come back. Usually that happens fairly quickly, but if there are no obstacles within the range of the sensor, the code will keep listening until it times out. It turns out that the library I was using before had a fixed timeout value of one second(!), from the Arduino function pulseIn. One does not want a robot running a full second without any feedback. Pretty poor library design, as pulseIn allows you to customize the timeout value.
I stumbled across a far better library, NewPing. The home page lists its advantages:
- Works with many different ultrasonic sensor models: SR04, SRF05, SRF06, DYP-ME007 & Parallax PING)))™.
- Option to interface with all but the SRF06 sensor using only one Arduino pin.
- Doesn’t lag for a full second if no ping echo is received like all other ultrasonic libraries.
- Ping sensors consistently and reliably at up to 30 times per second.
- Timer interrupt method for event-driven sketches.
- Built-in digital filter method ping_median() for easy error correction.
- Uses port registers when accessing pins for faster execution and smaller code size.
- Allows setting of a maximum distance where pings beyond that distance are read as no ping “clear”.
- Ease of using multiple sensors (example sketch that pings 15 sensors).
- More accurate distance calculation (cm, inches & microseconds).
- Doesn’t use pulseIn, which is slow and gives incorrect results with some ultrasonic sensor models.
- Actively developed with features being added and bugs/issues addressed.
I immediately implemented the setting of a maximum distance, and the lag time went away.
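Where the savings come from is worth spelling out: the echo timeout follows directly from the farthest distance you care about. A rough sketch (the ~58 µs/cm round-trip constant is approximate; NewPing’s exact value may differ slightly):

```cpp
// Sound covers a centimeter in roughly 29 microseconds, and the ping
// travels out and back, so about 58 us per cm of range.
const unsigned long kMicrosPerCmRoundTrip = 58;

// The longest we ever need to wait for an echo at a given max distance.
unsigned long echoTimeoutMicros(unsigned long maxDistanceCm) {
    return maxDistanceCm * kMicrosPerCmRoundTrip;
}
```

Even a generous 100 cm cap gives a worst case of 5.8 ms, versus the full second pulseIn waits by default.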
The author had some impressive thoughts on why one should avoid using delay() in code.
Simple example “hello world” sketches work fine using delays. But, once you try to do a complex project, using delays will often result in a project that just doesn’t do what you want. Consider controlling a motorized robot with remote control that balances and using ping sensors to avoid collisions. Any delay at all, probably even 1 ms, would cause the balancing to fail and therefore your project would never work. In other words, it’s a good idea to start not using delays at all, or you’re going to have a really hard time getting your project off the ground (literally with the balancing robot example).
The NewPing library offers an event-driven mode, which I might try, though it is arguably more than I need and complexity runs the risk of adding bugs. The thinking here did inspire me to get rid of delays in all my PID loops — which get called over and over again from different pieces of code. Instead they do a check to see if sufficient time has elapsed before resampling and correcting. I have to think that once one goes down the road of balancing ball robots (which I’d love to do one day) a multi-threaded operating system is the way to go.
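The elapsed-time check is a tiny pattern but worth showing. A sketch, with the clock passed in so it runs off-robot (on the Arduino it would be millis()):

```cpp
// Delay-free pacing for a control loop: each call checks whether the
// sample interval has elapsed and otherwise does nothing.
struct RateLimiter {
    unsigned long intervalMs;  // desired time between runs
    unsigned long lastRunMs;   // timestamp of the last run

    // Returns true (and records the time) only when a full interval
    // has passed since the last run.
    bool due(unsigned long nowMs) {
        if (nowMs - lastRunMs < intervalMs) return false;
        lastRunMs = nowMs;
        return true;
    }
};
```

Each PID update then starts with something like `if (!limiter.due(millis())) return;`, so the caller is never blocked.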
Of course, with all this, the biggest improvement I made with the sensors was discovering and fixing a bug from last year’s code: I had incorrectly used a tangent instead of sine. Now the wall alignment works every time.
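For the record, here is the geometry as I understand it, with the sine fix. I am assuming two readings from one side sonar taken a known distance apart along the path; the variable names are made up:

```cpp
#include <cmath>

// If the heading is off by theta, driving forward a distance d changes
// the perpendicular distance to the wall by d * sin(theta) -- not
// d * tan(theta). So the angle comes from asin, where atan is wrong.
float wallAngleRad(float firstCm, float secondCm, float travelCm) {
    float ratio = (firstCm - secondCm) / travelCm;
    if (ratio > 1.0f)  ratio = 1.0f;   // guard asin's domain against noise
    if (ratio < -1.0f) ratio = -1.0f;
    return std::asin(ratio);
}
```

At small angles sine and tangent nearly agree, which is exactly why a bug like this can hide for a year and only bite when the robot is knocked well off course.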
Upgrading the Arduino avr-gcc compiler
When compiling my project over and over again in Eclipse, I always saw a strange warning: “only initialized variables can be placed into program memory area”. This was a little disturbing and eventually I decided to track it down. Turns out that the avr-gcc that ships with Arduino 1.0.3 is not the latest version, and it has bugs. Although I could have lived with the warning, I decided to upgrade after reading this post about the bugginess of old avr-gcc with Arduino Megas. I had experienced the difficulties with global constructors that the author mentions. And so, I followed his directions to upgrade to avr-gcc 4.7.0 and avr-libc 1.8.0. I did not upgrade the Arduino IDE as I had a newer version than he did, but I did have to add
to HardwareSerial.cpp so that it would compile correctly.
As I’ve mentioned before, developing on microcontrollers is not for the faint of heart. I don’t think I’ve ever had to upgrade a compiler in all my years of commercial software development. Ugh.
The carpets in the competition are a challenge, as hitting one at a key point can cause the robot to lose track of odometry. This year I bought some carpet samples in the hopes of doing more realistic testing. As discussed in my previous post, I redid the drive train with no gearboxes and moved the wheel base to the rear. Things are looking very stable, though I am not ready to declare 100% success yet. It is particularly challenging to go from a polished floor onto a raised carpet. The Abington maze last year had flat rubber/carpet flooring, which was much easier with the extra traction, but I don’t want to take any chances. Much more testing is required.
Once again software improvements are key. I now assume the carpet will toss the robot around and it may lose sight of the wall temporarily, but the robot is smart enough now to look around for a while with its sensors before giving up. As long as it can mount the carpet successfully, all’s well. Right now the left wheel is having trouble doing this from a smooth floor; that motor is slightly loose due to a missing retaining ring, so I will replace it and see what happens.
If you create a mobile robot with a job to do, at some point you will be faced with the need to get it to a specific location. Firefighting competitions require the robot to navigate a known maze. Much of my limited time last year was spent trying to get FireCheetah to do the supposedly simple task of driving in a straight line. Sounds easy, right? For a differential drive robot with two wheels, just deliver the same torque to each wheel.
As it turns out, this obvious method doesn’t work very well. My robot did fine when I could hug a wall using sensors, which is most of the maze. Eventually, though, there are spots where it has to manage on its own. And there my robot had two big problems:
- It drifted left substantially, and
- On smooth floors it would skid to a stop, meaning that the final position was unpredictable
It was a maddening problem. Sometimes the robot would go where I wanted…and sometimes not. I tried a few things. I added a fudge factor to the motor voltages so the left side would go a bit faster. I tried a PID controller to even out encoder ticks — distance travelled — on the right and left side. I reduced the motor voltage, and therefore velocity, to minimize the skidding. But I couldn’t go too slow because I knew there would be carpets and the robot would not be able to move if the torque was too low. Heaven help me if I stalled the motors, because then the robot would be really unpredictable for the next hour. All in all not a situation to inspire confidence, and I knew going into the competition that I was unlikely to succeed in the autonomous task. (We were ultimately done in by an unrelated software bug introduced at the last minute.)
This year I was determined to solve the navigation problem early on. One step was upgrading the motors. These also came with better encoders, devices that tell you how much the motor shaft has rotated, and therefore how much the wheel has rotated. Ignoring slip, this gives you distance traveled. However, this did not solve the problem by itself — the robot still drifted left.
Time to put in some better software! My original code was loosely based on Mike Ferguson’s code for Crater, a winner at Trinity. He was dealing with far more limited processing power — he had an ATmega168, I have a Mega — and so he had no PID of any kind. And indeed, in a fancier version of Crater he noted the problems with keeping the robot straight. I thought the answer might lie here.
As luck would have it, Coursera’s Control of Mobile Robots started up, and one of their examples gave me the solution. I am going to summarize it here, along with some extra details, for anyone with a differential drive robot. I won’t use their slides though, as they used a horribly confusing notation (sometimes v is angular velocity, sometimes linear, on the same page). Instead I recommend the following tutorial on differential drive kinematics. From there we get
Δθ = (sr − sl) / b

where sr and sl give the displacement (distance traveled) for the right and left wheels respectively, b is the distance between the wheels (center-to-center along the length of the axle), and theta is the angle of the robot’s heading with respect to the x-axis.
With this equation in hand, we can rewrite our PID controller with the aim of controlling theta, the robot’s heading. In my case I used only the proportional term, and chose a reasonable sampling rate based on the expected period of encoder ticks. Almost instantly, the robot could drive straight! It was magical.
Along with this, I used PID for the overall linear velocity. That way I can move at the same moderate speed on all surfaces. The robot moves smoothly on a hard floor — and bumps up the torque to maintain that speed when it hits the carpet.
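Putting the two pieces together, here is a minimal sketch of the scheme: P control on heading, layered on a commanded forward speed. Gains and units are illustrative, not my tuned values:

```cpp
#include <cmath>

struct WheelSpeeds { float left, right; };

// Heading change from wheel displacements: dTheta = (sr - sl) / b,
// where b is the wheel separation.
float headingDelta(float srCm, float slCm, float wheelBaseCm) {
    return (srCm - slCm) / wheelBaseCm;
}

// Turn a desired forward speed v and a heading error into wheel speeds.
// w = kp * error is the proportional term; the +/- w*b/2 split is the
// inverse differential-drive kinematics.
WheelSpeeds drive(float v, float headingErrorRad, float kp, float b) {
    float w = kp * headingErrorRad;       // commanded angular velocity
    return WheelSpeeds{ v - w * b / 2.0f,
                        v + w * b / 2.0f };
}
```

With zero heading error both wheels get the same speed; any drift produces an error and an immediate, proportional correction.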
In the spirit of Mike Ferguson and other people who have shared their code, I am putting my libraries on GitHub so others can use them. I am not going to post the link just yet however, as they are under “active development” to put it mildly…
Physical reality intrudes
After delighting in these new capabilities for a few days, suddenly the robot stopped moving as reliably. This was a terrible time. I thought I might have introduced a software bug. Eventually I traced it back to my Faulhaber motors. These come with right-angle gearboxes, and some of the set screws had gotten loose. This meant the motor shaft was turning but the wheels weren’t, or not always. Once the set screws were tightened, the robot behaved well again. This did point out one disadvantage of integrated encoders — I love them, but external ones look at the wheel rather than the motor shaft, and wheel rotation is really what we care about.
The gearboxes effectively extend the motor shaft, but are otherwise useless and add potential mechanical issues, such as play in the gears. I removed them from a pair of motors, which left only 3mm of shaft extending. However, I found some narrow Pololu universal hubs that fit, and drilled holes in a pair of Lite-Flite wheels to attach them. The final product looks like this:
These have passed some simple tests but are not yet attached to the base. The base needs some reworking as well. I am using a platform from Budget Robotics which puts the wheels in the middle, and that means two casters for front and back. This platform was originally designed for a robot with skids and few motion requirements. Experimenting with some carpet samples has reaffirmed the principle that “three points make a plane” and it would be better to have only one caster — otherwise, it is easy for a wheel to get lifted off the floor and once that happens you are utterly dead. The robot thinks that side has moved when it hasn’t and the odometry is completely wrong. So I am going to attach the wheels further back on the robot, and cut off part of the base to do it.
Although navigation is much improved, I do not plan to rely on dead reckoning. There are the carpets to worry about, and more fundamentally, a robot should use its sensors to adapt to the environment. FireCheetah has quite a few sensors and used properly should be capable of some robust recovery behavior. I will go through and confirm that “normal” navigation through the maze works — then time to have fun knocking the robot sideways and let’s see if it can get back on track.
The Heterogeneous Parallel Programming course at Coursera is more or less wrapped up. There were so many problems with the grading system that no final resolution has been made on grades, and it is a little anti-climactic. I did very well on the assignments, but don’t know exactly what we get as far as certification, if anything. The final lectures were on a less interesting topic, and I skipped most of them. Overall though it was a great learning experience.
I did find some odd bugs. While on vacation, I was unable to access the Amazon cloud instance. No one else seemed to have this problem. I eventually discovered that if I VNCed into our corporate servers back in Maryland, I was indeed able to reach the instance. Apparently there is some sort of geographic filtering going on, perhaps to prevent multiple people from using the same Coursera or Amazon account. At any rate it did spur me to finish setting up our corporate VPN.
There were some errors in the course material. This is inevitable and not a big deal. However there is so much noise in the discussion forums that it was hard to communicate with the staff. For obvious reasons the people with misgraded assignments or stuck on a problem were the loudest.
To a certain extent I am seeing similar greatness and problems in the new course I am taking, Control of Mobile Robots. The professor is engaging and this is wonderful material. However, one of the key summary slides has a serious error. For part of it, v is used as a linear velocity; then vr and vl are introduced, but they are angular velocities! This is never stated explicitly and does not make much sense. Since this is the reference page I’d normally use once the class is over, it’s a very annoying bug. Other students noticed it as well. However, it turns out that the TA lecture walking people through the quiz also had an error, far more people were screaming about that, and long story short I don’t think the slide will ever be fixed. Yet another misleading document to live on forever online…
That said, the material is marvelous and perfectly timed. This week included a broad overview of a robot behavior called “Go To Goal”, aka dead reckoning. This is exactly what I need to get FireCheetah going in a straight line when there is no wall to follow.
I am also taking Grow to Greatness: Smart Growth for Private Businesses, Part I which arguably is the most relevant to my real life. That, however, will be audit only. It is interesting but extremely verbose. One difficulty with streamed video lectures is that one can’t easily fast forward through the fluff. And this one has quite a bit of fluff.
It turns out that as a Coursera student (free enrollment), you can get a student version of MATLAB for only $99, including several toolboxes. This is an amazing bargain, as this software package can easily run over $1,000. MATLAB can solve math equations, plot data, and do many other useful tasks.
I have used Mathematica in the past, but the learning curve is quite steep and it’s easy to forget, so I never made much use of it once my thesis was done. With MATLAB installed, I see why it has become the default in the corporate engineering world. It has an easy to use GUI with lots of help, and they have some excellent introductory tutorials on their site. The command-line is there with all of its power, but you are not forced to use it all the time. This makes playing with data very, very easy.
For more details, see this page. (You may have to register for the course first.)
One of the biggest headaches on the hardware side of robotics is finding parts that fit perfectly. If you are not building from a kit, you generally wind up with special orders or hand-machining. I’ve acquired a few tools over the years for this, like the Dremel, a drill press, and vertical band saw; but I don’t have a mill or a lathe (they are heavy and a little dangerous with kids around). I recently took a course to use the Nova Labs laser cutter and that is a great tool for cutting out flat parts. But my machining in general is time-consuming and limited.
A few weeks ago I attended a DC Robotics meetup on hardware prototyping. And I was captivated by their 3D printer. This is a machine that extrudes melted plastic in a precise way to build a computer-designed model. If you can design it, you can (within limits) print it out. I have seen many 3D printers before, but this was the first time I could see the practical applications. Gears, shaft holders, brackets, all easy to create at home.
My initial intent was to go out and buy one. However, after talking with more knowledgeable people and researching online, I learned that MakerBot has a poor reputation — partly because they have portrayed their machines as easy to set up and use. They are not. 3D printing is still experimental and requires extensive initial configuration, MakerBot more than most.
However, the latest generation of machines are at least stable once you have them set up, and then the printing is straightforward. It turns out that Nova Labs is doing a group build of the Mendel Max 1.5, and with the bulk purchase it will only be $750. Low price and expert help: who could ask for more? So I have signed up. No date yet, but at the end I’ll own something like this:
The new drivetrain is working very well. I’ve increased the clearance and FireCheetah is happily running on thick carpets all over the house, at only half the available power. Eclipse with the Arduino plug-in is working extremely well and certainly beats the old Notepad method.
I found a simple solution to the XBee crosstalk during Arduino updates. Since I don’t need any remote control for the robot this time, I simply hook up the Arduino transmission to the XBee reception — but not the reverse. No need for reconnecting wires or using a switch.
But I am the most excited about the software changes. Improved navigation is the key to success. If it works, that will be a post of its own, because I have found little useful information online.
Last year I participated in the Abington firefighting robotics competition, in which a robot must navigate a maze, find a candle, and put it out. We entered two divisions: K-6, where kids are allowed to remote control the robot, and the senior division, where all actions must be autonomous. The kids won their division, but I did not do so well in mine. Ever since then I have been thinking about a rematch. Now the time has come.
For the previous contest I only had two months to build a robot from scratch, order all the parts, learn the firefighting paradigm, figure out a wiring system, get a software environment set up, and ensure that the robot could perform in both autonomous and non-autonomous mode. Two months is not a long time for all of this, and after a lot of sleep deprivation I was proud to get something together that nearly worked. Now I have the luxury of more time, and only need to make improvements instead of starting from the beginning.
A better drivetrain
One of my biggest problems last year was the motors. I used Solarbotics GM8s with matching encoders, wheels, and mounting brackets. Although they were supposed to be a set, they never fit together very well. The encoder connectors were so frail I had to make my own. The motor did not sit square in the bracket (Solarbotics suggests drilling out the case a bit…rather annoying for a custom part…and it didn’t work perfectly either). The wheels were too thin to handle the carpet. And the motors were extremely sensitive: even a short stall rendered them unreliable for hours.
So I decided to start with a high-quality motor, then try several wheel options connected to the shaft with a hub. At the competition it was suggested that I use a Faulhaber motor, and I found these on Goldmine for only $12 a pair. Each has a high-quality encoder built in and integrated with the main motor connector.
These are right-angle output gear motors that come with a gear attached to the output shaft. My hex keys were unable to remove it, and the one that I special ordered also failed. I finally used my Dremel to carefully cut into the set screw and get that gear off. In one case I unfortunately nicked the output shaft, rendering that motor incapable of fitting tightly with the wheel hub, but I had spares and the others were successful.
I bought a number of different wheel systems to test but have started with the Lynxmotion hub. This has a set screw to keep it on the output shaft. I learned that most motor output shafts are not perfectly round, but D-shaped, and it is very important to put the hub set screw on that flat area, otherwise it won’t grip properly and the wheel spin will not be true. For wheels, again there are many options. I am starting with a neoprene set that is thicker and slightly smaller than the ones from last year.
Finally, I had to figure out how to attach the motors to the platform. With the right-angle gearbox in place I cannot access any of the screw holes built into the motor. After some failed experiments I used a technique from The Robot Builder’s Bonanza: I drilled some holes in a plastic strap and screwed it down snugly over the circular motors. Normal screws did only a so-so job, so I ordered longer ones from McMaster-Carr — a great hardware resource, as their shipping charges are modest and they get items to you fast.
The big honking battery in the middle works very well but is also heavy and probably more voltage than I need. I have some much smaller 7.4V LiPos to try.
Early results are very promising. Here is a video of the robot doing circles on my office rug. With the Solarbotics setup it was never able to move here. And this is without any casters installed — the platform is dragging on the carpet!
Using the Arduino Eclipse plugin
Aside from the motors, software bugs were the other large problem. Indeed, a last-minute error there killed FireCheetah’s navigation in the competition. Navigating to the right software file to make changes was also extremely time consuming. The default Arduino IDE is designed to have everything in one file. I had 20 files, not including the Robot Operating System code.
On my previous laptop I had Microsoft Visual Studio and after the competition last year tested out the Visual Micro Arduino plugin with some success. If you have Visual Studio, I’d try that first. However, my new laptop does not have Visual Studio and I no longer qualify for the academic version. It seemed rather wasteful to buy Visual Studio for this one project. The best-known open source alternative is Eclipse, so I thought I would give it a try.
There are some rather involved and outdated instructions on integrating Eclipse with WinAVR on the Arduino site. There is also an Arduino Eclipse plug-in. It was pretty clear that this wouldn’t be as polished as Visual Micro, but I figured it was worth a shot.
Installing the Arduino Eclipse plug-in is not for the faint of heart. A couple of tips if you decide to try it:
- Put Eclipse in a directory with no spaces — NOT Program Files.
- Put the Arduino environment in a directory with no spaces too. Confirm that you can use the Arduino IDE to compile and upload a simple program.
- Eclipse suggests a workspace directory. ACCEPT IT. If you go elsewhere, Windows 7 in particular may not let you write there and this causes all sorts of problems. This is especially true as the first time you use a particular board combination the plug-in will try to create a separate project for that board so that you can link to its static library.
- If you use an Arduino library, you must Import it, not just add a reference to it.
- The plugin instructs you to set certain Arduino files to be indexed up front. This is not possible in the latest version of Eclipse. As a result, symbols like Serial will all be highlighted in red as “undefined” even though the build may work. To fix this problem, index manually and/or add the path to Arduino.h manually.
- The very first time you create an Arduino sketch, you must have the board plugged in, as otherwise you won’t have a COM port to choose from and the configuration cannot be completed.
- If you have to start over and re-install Eclipse — which I had to do about 5 times, not knowing everything here — you must completely delete Eclipse, the workspace directory, and any .eclipse or Eclipse directory in the Documents or Users area.
Painful as this was, the results were worth it. Eclipse is quite a nice IDE: being able to see the first lines of a function simply by hovering over a reference to it, navigate to an object’s source file with one click, and find all uses of a particular variable or function is a tremendous productivity boost. The build process also found a serious bug right away (don’t ignore those warnings!). I have successfully compiled and uploaded code to the robot with this. The only key tool I haven’t tried yet is the serial monitor, because the plugin website does not have written instructions for it. After downloading a 32MB video file, it appears you pull it up via Window -> Views -> Other -> Arduino -> Serial monitor. I do prefer written instructions, but hey, this is free software.
Easier physical connection to the Arduino board
To update the robot’s software, it’s necessary to connect a USB cable from the Arduino to the computer. The Arduino sits under the motor shield and behind an IR sensor and quite a few wires, so snaking the USB cable in was a bit of a pain. At Abington someone suggested leaving a very short USB cable connected to the Arduino, then using a USB extender as needed to connect that to the computer. This works nicely.
Another hassle was the XBee. We need this to be connected for teleoperation and for Serial debugging when the robot is running disconnected from the computer. However, to upload new Arduino code it is necessary to unplug the XBee from the Arduino or the upload won’t complete. I intend to put in a switch, instead of pulling the wires each time.
Instead of using my home-built high-side switch to turn on the fan, I will use the ULN2803s I ordered when writing my post on adding power and pins to the Arduino. It’s conceptually simpler, and safer (my other version did not have kickback diodes). I’m not sure why I didn’t do this before; I simply didn’t remember it. That’s sleep deprivation in action.
I never built a sound activation circuit last year, but I had ordered individual components and started wiring them to a breadboard — quite a few components, and messy. Since then I’ve found a simpler audio circuit that uses an LM567, which I now have and will use instead.
It is nice to be able to work at a more relaxed pace and hopefully do more things right this time.
This was a short but sweet project — very empowering.
For years I have owned a Dell Precision M4300. It is not a sexy computer by today’s standards, but it has a very large screen, comfortable keyboard, runs fast, and has all the ports I need. Therefore, no reason to replace it. I figured at some point something would die on it — as it has for all my previous laptops — and then I’d upgrade. But this guy has kept on ticking.
A few weeks ago, the laptop finally developed a problem. The lid would no longer stay propped up at the correct angle. It either fell all the way back, or closed. Not a comfortable way to work! I could prop the lid up with pillows but clearly this was going to need to be fixed. I thought about upgrading now, but it seemed a little wasteful to do it for what was clearly a mechanical issue. Also, I am pretty lazy. This is my development computer for work and as such has quite a lot of customized programs installed on it, the kind that don’t come with Windows Installers. Even with a migration utility it would be a real hassle to set everything up on a new machine, plus I’d have to get a new operating system and no one had anything good to say about Windows 8. (Yes, I need Windows for work…I do have a USB drive with Linux that plugs in also.) I also have an old, matching docking station which would need to be swapped out too.
So next option, repair. I have no objection to paying for skilled labor, but losing the machine for a couple of weeks wasn’t appealing either. I decided to do a little research. After Googling around I found quite a few YouTube videos about laptop lid issues. These were not directly applicable — my hinges were completely shot so tightening elsewhere would not help — but one anonymous genius suggested using the service manual to learn how to open up your laptop. And indeed, Dell has a very nice service manual for the Precision M4300 which is freely available. Opening up a laptop is not for the faint of heart but suddenly this seemed quite manageable.
There is a thriving market for Dell parts and I was able to find replacement hinges for $20 a pair. I propped up the lid for the next few days until they arrived, then got to work. There were lots of little screws to take out, a few places where some force was needed (not too much), and where necessary I took photos to make sure I would be able to reassemble things. I had to go to the point where the LCD panel was almost removable from the lid, but eventually got to the hinge:
I replaced the hinge, put everything back together, and held my breath while the computer rebooted. Success! The laptop was fine — with a nice steady screen angle. Total time to tear down / reassemble was well under an hour.
I am very impressed with Dell for making the service manuals so accessible. The fact that Dell is so well known made it easy to find replacement parts, also. This will certainly be an incentive to keep buying from them in the future.
I’ve posted before about Stanford and Udacity free online courses. Tomorrow I am going to be starting another one, offered by Coursera. The topic is Heterogeneous Parallel Programming. From the course description:
All computing systems, from mobile to supercomputers, are becoming heterogeneous parallel computers using both multi-core CPUs and many-thread GPUs for higher power efficiency and computation throughput. While the computing community is racing to build tools and libraries to ease the use of these heterogeneous parallel computing systems, effective and confident use of these systems will always require knowledge about the low-level programming interfaces in these systems. This course is designed for students in all disciplines to learn the essence of these programming interfaces (CUDA/OpenCL, OpenMP, and MPI) and how they should orchestrate the use of these interfaces to achieve application goals.
Putting this in plain English: the graphics engine sitting on your computer’s graphics card is extremely good at processing large amounts of data in parallel, and quickly. We’ve all seen the increase in realism in computer graphics. It turns out that these techniques can be used elsewhere, for example with high-volume scientific data. The catch is that you have to use a radically different programming paradigm, otherwise you get no performance benefit. I brushed up against this in a computer graphics course, but we had little time to go in depth. Hopefully this will be a chance to gain a deeper understanding.
It will also be good to get back into some maker projects. The firefighting robot burned me out a bit, frankly. Of all the challenges I’ve undertaken in the past decade, it was the most demanding both physically and mentally (yes, it took more of a physical toll than Mountains of Misery; lack of sleep will do that). A good deal of creative energy has gone into learning the art of cooking, which culminated in a successful Thanksgiving. At some point I will post about all of that, as I’ve managed to do some interesting things beyond the sous-vide experiments. For now I am looking forward to this programming course, and an upcoming Nova Labs series that will ultimately grant me access to their new laser cutter.
After two months of work, the big day was finally here. Friday night we drove up to Philadelphia with lots of tools and the robot nestled in a box at my feet. The kids were very excited and we all brought matching robot t-shirts to wear. Saturday morning we showed up bright and early for the Abington firefighting robot competition. I was not sure what to expect and wanted to be absolutely certain of getting a spot with power and a place to test the robot. As it turned out, the organizers had some very nice lab classrooms for us and I didn’t need my power strip and long extension cord. I tested all the sensors on the robot and everything seemed in order.
According to the website there were 54 robots registered for the competition. There were a large number of Vex robots and even more Lego NXT robots. Only a handful of robots were non-kit ones like mine. Most of the Vex robots were part of a local club. I had some very good conversations with the guys who run the club. They were very interested in my bot because of its use of expanded rigid PVC and the Arduino platform, a switch that they are considering. It turns out that Vex robots are extremely expensive. They showed me one of their entries, which had a fairly large metal frame and large wheels, along with a number of plug-in sensors, I am guessing about 5 in total. They told me the cost of the bot was $850! In contrast I had encoders, far more sensors, and wireless communication, at a parts cost under $150, and — had I had the time — the ability to add more things like line sensors for less than a dollar. Now it is certainly true that I spent more than $850 on this project as part of equipping my workshop with loads of parts “for the next project”, the power saw, and replacing Parts That Burned Out as I learned how to use my power supply correctly. (The list of fallen includes a motor, two encoders, a switching voltage regulator, and an XBee with breakout board.) Still, that is quite a price difference.
When I made it over to the practice arena, there were some bad surprises in store. Most of the maze floor was hardwood; well and good, as that is what I had at home. They also had carpet. I knew this was coming, but had expected relatively thin carpet. Not true. This was thick, and worse, it was uneven in height with patterns. On my first practice run, the robot stopped dead. Since I was not running at top speed, I changed the code to boost the speed greatly (knowing that I might pay a price in accuracy in the hardwood portion of the maze).
I knew that narrow wheels were not going to work well on that carpet; they would get stuck in the grooves. As luck would have it, that rug was on the path to the single room I had chosen. And so, while I had wanted to avoid any last-minute code changes, I decided to go for another room instead, where there was no carpet. The first thing that happened after that was the robot started running backwards continuously, claiming it was stalled. It turned out I had overwritten an array that had nothing to do with the stall code, another example of how sensitive the system was to any coding error. Fixed that, but lost more time.
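That overrun was invisible precisely because C++ gives no warning when a write lands past the end of an array; it just silently clobbers whatever the compiler placed next in memory. Here is a minimal sketch of the kind of bounds guard that would have caught it immediately (all the names below are made up, not from FireCheetah’s actual code):

```cpp
#include <cstddef>

// Hypothetical guard: refuse out-of-range writes instead of silently
// corrupting whatever variable happens to sit after the array.
const size_t NUM_SONAR = 4;
int sonarReadings[NUM_SONAR] = {0};

bool storeReading(size_t index, int value) {
    if (index >= NUM_SONAR) {
        return false;  // out of range: reject rather than clobber memory
    }
    sonarReadings[index] = value;
    return true;
}
```

A guard like this costs one comparison per write, and it turns a mystery (the stall code misbehaving) into an immediate, local failure.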
Another weird thing, when I would start my practice runs, the robot would instantly swerve left into the wall. Since this was practice, I could simply hit the reset button and then it ran fine. This happened on both practice runs. Not a good feeling.
The competition started with the high school students. Another surprise, as I saw with a sinking heart that there were now two carpets, and no way to avoid them. We watched as 11 robots in a row failed to put out the candle. Robot 12 succeeded…then there was another string of failures. That robot was the only one in the division to make it, and later in the day it was not able to do it again in the second round.
It was surprising how many robots made it into the room but were unable to put out the candle. Some of them were thrown off by candle reflections on the white walls — I had uncovered this potential problem in testing at home. The biggest problem, though, was that so many of them avoided using a powerful fan. You get a 25% time reduction for using non-air methods. So there were lots of balloons and a few sponges. Only problem is, if you aren’t exactly in the right position, those methods don’t work well. Some folks used fans, but wimpy ones; I am guessing those are the ones that worked most easily with the kits. By contrast, I had a fan designed for providing thrust on airplanes and a dedicated servo to sweep it rapidly back and forth. It was loud as hell, but if I got anywhere near the candle, it was going out.
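The sweep itself reduces to a few lines of code: on each update, the servo target bounces between two limits. This is only an illustrative sketch; the angle limits and step size are invented, not the real values from the robot.

```cpp
// Triangle-wave sweep for the fan servo (limits and step are illustrative).
int sweepAngle = 60;
int sweepStep = 5;
const int SWEEP_MIN = 60;
const int SWEEP_MAX = 120;

// Advance one step and reverse direction at either limit; the returned
// angle would be handed to the servo library on each tick.
int nextSweepAngle() {
    sweepAngle += sweepStep;
    if (sweepAngle >= SWEEP_MAX || sweepAngle <= SWEEP_MIN) {
        sweepStep = -sweepStep;
    }
    return sweepAngle;
}
```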
The interesting thing is that there was a much easier way to get a time reduction — 20% — by having your robot start with tone activation instead of a button press. This was a no-brainer for score improvement, and I had bought all the parts (a few dollars) but didn’t have time to build it. Yet almost none of the contestants used it. I realized that this is probably because the kits don’t offer a plug-in version of a tone detector.
Next was the junior division. They did not have the carpets to deal with, and so, there were a decent number of successes. About halfway through this, I found out that Laurel and Holly’s division had been moved up so that they were now going to be next, not the original schedule. I ran back to reprogram the robot for remote control, plug in the laptop XBee and haul everything back. All in all, quite a bit of running around.
It was worth it though, because the kids did a great job of driving the robot and putting out the candle, even though we’d had zero time to practice. Here is the video:
Then came the first run for the senior division. Once again, I put my robot in and wham — straight into the left wall. I later figured out that this was a software bug introduced the morning before, when I hastily added the required button switch. That put a delay in the code, and I put it in the wrong place, so the sensors were calibrating while the robot was still outside the maze.
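The fix, in retrospect, was purely a matter of ordering: block on the start button first, and only calibrate once the robot is sitting inside the maze. A toy sketch of the corrected start sequence (the function names are hypothetical, and the log vector exists only so the ordering can be checked):

```cpp
#include <string>
#include <vector>

// Records the order in which the start-up steps run.
std::vector<std::string> callLog;

void waitForStartButton() { callLog.push_back("button"); }
void calibrateSensors()   { callLog.push_back("calibrate"); }

// Corrected ordering: the button press means the robot is in the maze,
// so calibration now sees in-maze lighting and wall distances.
void startSequence() {
    waitForStartButton();
    calibrateSensors();
}
```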
At the time, though, I had no idea what was going on. So, with dark thoughts about my motors, I quickly put in an edit: if the robot hit a wall, it would back up and then turn right.
Surprisingly enough, this worked decently for the second run and the robot at least made it down the hallway heading straight. But, then it hit the dreaded carpet, and a wheel got stuck in the grooves. And that was that!
Only one robot in the senior division successfully completed two runs (I think). A man named Paul B. had brought a beautiful pair of robots in for a swarm demonstration. Two robots working in coordination searched each room for the candle and put it out, choreographing their moves, and returning home to nestle back together at the start. Needless to say they were also well capable of putting out the candle acting alone. They moved wonderfully smoothly, with no carpet issues.
After the show I talked with Paul a bit. The first thing I asked about was the motors, and these were indeed high-grade medical motors that cost over $100 each new. (Mine were $8!) Like me he had encoders, and interestingly enough they had about the same counts per revolution. Since his were integrated into the motors, everything was much cleaner. I had 10 wires going into my breadboard from my encoders, and while they never came loose, it was ugly. His robots didn’t seem to have any wires anywhere, though there must have been some. Lovely PCBs with surface-mount components made for a spare, elegant design.
Although I would have liked to do better, we had a great time at the event. The kids got two very nice trophies, and I met a lot of cool people, too many to mention here. The organizers were friendly, not to mention that we got free food, and the entire event was free! I’m very glad I came here instead of Trinity.
Goals for next year:
- Have more time to get everything done and tested!
- Use different wheels. Those were my biggest limitation. Smaller, wider, and with better traction.
- The wheels were part of the motor/encoder package. Since I was not at all thrilled with the motors, I will upgrade to a higher-quality pair with integrated encoders.
- Even with good motors, relying on a robot to go straight is never robust design. I will spend much more time building in recovery behavior. With all the sensors I have it is possible to choose a reasonable course of action. Most robots in the show hit a wall at some point and then they were doomed.
- My battery was a little big and heavy. Might change it. On the other hand I have to say it performed well.
- Designing PCB boards would be fun, though of course this assumes you have a fixed plan for that component!
I’m happy to say I’ve crossed off a big item on the list already, namely moving to a real development environment for the Arduino instead of the toy one that comes with it. Turns out Visual Studio has an Arduino plug-in that works great and is effortless to install.
And now, time to catch up on real life.
As I write, we are on our way to the Abington Firefighting Competition. Up until last night, I was not certain that we would make it. In the previous post I mentioned various gremlins with the motor control system. Running at a lower voltage more or less made them go away. It worked so well that I could no longer use current surges for stall detection, because they weren’t large enough. I did add the heatsink too when it arrived. Nevertheless, there were enough little problems that it was hard to get an unbroken block of time to debug. And I had many new features to add, including the fire extinguisher.
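For reference, the current-surge stall check was essentially a debounced threshold: declare a stall only after several consecutive high current readings. The sketch below uses made-up numbers; the real thresholds depend on the motors and supply voltage, and at the lower voltage the stall current simply never rose far enough above normal for a check like this to fire.

```cpp
// Debounced current-threshold stall detector (numbers are illustrative).
const int STALL_CURRENT_MA = 900;  // assumed stall threshold, milliamps
const int STALL_SAMPLES = 5;       // consecutive samples before declaring a stall
int stallStreak = 0;

// Feed one current reading per loop; returns true once a stall is detected.
bool updateStallDetector(int currentMa) {
    if (currentMa > STALL_CURRENT_MA) {
        ++stallStreak;
    } else {
        stallStreak = 0;  // any normal reading resets the streak
    }
    return stallStreak >= STALL_SAMPLES;
}
```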
I am very proud of my fire sensor. It is based on the Rockhopper design, but since there were no schematics I had to figure a few things out. I found that using four photodiodes instead of one did increase sensitivity, and gave a sharper angular peak — very desirable if oriented properly. As usual, there was some trial and error. I discovered that the photodiode responds to the frequencies emitted at the base of the candle flame. Thus the tea candles that I originally tested with were a disaster, because when they burned for a little while the interior hollowed out and covered that part of the flame. The diodes are also sensitive to candle height, so I spread them out to cover the maximum allowable range in the competition, 2 inches.
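If the four photodiode channels are read separately in software, detection reduces to a peak-over-threshold test, where taking the maximum preserves the sharp angular peak of the best-aligned diode. This is only a sketch under that assumption — a Rockhopper-style circuit may well sum the diodes in hardware instead — and the threshold value here is invented:

```cpp
#include <algorithm>

// Peak-over-threshold flame check for a four-diode sensor.
// The threshold value itself is made up; in practice it has to be
// calibrated against ambient light at the venue.
const int NUM_DIODES = 4;
const int FLAME_THRESHOLD = 600;

// Returns true if the strongest of the four channels clears the threshold.
bool flameDetected(const int readings[]) {
    int peak = *std::max_element(readings, readings + NUM_DIODES);
    return peak > FLAME_THRESHOLD;
}
```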
Figuring out how to mount everything on increasingly limited space was a challenge as well. Ultimately the flame sensor went on a butter container, with the fan mounted on the side. All of these were then attached to a servo, which sits in a rectangular opening cut nicely by my new power saw. Again, all very time consuming.
So by yesterday I realized that I would not have time to get the navigation reliable for the entire maze. And therefore, I have lowered my sights a bit. Abington offers a “single room mode”, where you can pick ahead of time the place you would like the candle to be. As long as your robot can make it to that one room, then, you are in good shape. There is a 3-minute penalty added to your time, which means that you are not going to win unless everyone else crashes (or chooses the same option). I was disappointed to have to do this, but I suspect I’m not the only one. Looking on YouTube, there are a suspiciously high number of videos with robots finding the candle in the first room they look at, and it’s the easiest room to get to.
I have had a number of surprises working on this project. Everything felt very rushed, as I had only two months to work and no experience. I had to make design compromises. These included using an Arduino Mega, motor shield, and breadboards, as opposed to wiring everything individually. One surprise was how well this worked. I thought there would be issues with wires popping out of the breadboard or falling out of the Arduino headers. As it turned out, they worked extremely well. I cut the wires precisely and used the proper housings and…it just worked. What did not work well were the screw terminals. I had many problems with the battery leads coming out and the same with the motor leads. Since screw terminals are supposed to be more reliable, this was an unpleasant discovery.
The Arduino works with C or C++. After years of working with Java and its automatic memory management, I knew going back to C++ would be painful. It was worse than expected. If you write a buggy program on a computer, you usually get a nice error message. On a microcontroller, your only clue may be when things suddenly start to act weird. And in the worst case, you can completely lose the ability to communicate with the board and get rid of the faulty code. At one point I thought a coding error had bricked an expensive board. Finally I found a suggestion of sticking a large capacitor between the ground and reset pins to prevent automatic resets, and that eventually worked. Quite a scare though. Flawed code is quite easy to write, with no warnings about bad pointer dereferences or uninitialized values. I learned to put in a few seconds of delay at the beginning of the program to allow some time for communication before any errors surfaced. That was not 100% foolproof, though, as some global objects get constructed first. So I had to make the constructors dead simple and call separate setup functions on the objects later. Even then, sometimes, problems.
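The constructor/setup split described above looks like this in miniature (the class and names are illustrative, not the robot’s actual code): the constructor does nothing that could touch hardware or fail, and an explicit begin() method does the real work once the board is up and communicating.

```cpp
// Pattern: trivial constructor, explicit begin() called later from setup().
class Encoder {
public:
    Encoder() : pin(-1), started(false) {}  // safe to run before main()

    // Real initialization, deferred until the board is ready.
    void begin(int p) {
        pin = p;
        started = true;
    }

    bool isStarted() const { return started; }
    int getPin() const { return pin; }

private:
    int pin;
    bool started;
};

Encoder leftEncoder;  // global: constructed before setup() ever runs
```

This mirrors how the Arduino core itself behaves: Serial is a global object, but nothing actually happens until Serial.begin() is called from setup().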
I did get to appreciate how much work the Arduino folks did to make a microcontroller usable by “normal” people, because microcontrollers are genuinely hard to work with. The trade-off is that you lose power; for example, the board runs at a slower clock speed than the chip supports, because changing that would break older code. And the Arduino IDE is hardly suitable for a large code base (which I eventually had), with no line numbers and certainly no way to find function usages, etc. There was no time to configure another IDE this round, but next time, that will happen.
This was one of the hardest things I have ever done. The intellectual challenges were constant, with no time to delay or rest. It was harder than my big challenge last year, Mountains of Misery; as hard as that was, it was only ten hours, and only the last hour was truly hard. This required intense drive for weeks, and there were many times I was tempted to throw in the towel.
This morning I tuned the candle-finding algorithm, put in the required red button start switch, and added two joystick functions the kids will need for their remote control portion: turning the fan on/off and swiveling the servo. The Robot Operating System has done very well for the remote control part, it was fun to work with.
Here is a video of FireCheetah navigating the maze and putting out a candle. I got it working even better by the afternoon, but this should give a good idea. Let’s hope he does as well tomorrow!