EyeRobot – The Robotic White Cane

Abstract:
Using the iRobot Roomba Create, I have prototyped a device called eyeRobot. It guides blind and visually impaired users through cluttered and populated environments, using the Roomba as a base to marry the simplicity of the traditional white cane with the instincts of a seeing-eye dog. The user indicates the desired motion by intuitively pushing on and twisting the handle. The robot takes this information and finds a clear path down a hallway or across a room, using sonar to steer the user around static and dynamic obstacles. The user then follows behind the robot, guided in the desired direction by the noticeable force felt through the handle. This robotic option requires little training: push to go, pull to stop, twist to turn. The foresight the rangefinders provide is similar to that of a seeing-eye dog and is a considerable advantage over the constant trial and error that marks the use of the white cane. Yet eyeRobot is still a much cheaper alternative to a guide dog, which costs over $12,000 and is useful for only about five years, while the prototype was built for well under $400. It is also a relatively simple machine, requiring a few inexpensive sensors, various potentiometers, some hardware, and of course, a Roomba Create.


Step 1: Video Demonstration


Step 2: Operation overview

User Control:
The operation of eyeRobot is designed to be as intuitive as possible, to greatly reduce or eliminate training. To begin moving, the user simply starts walking forward; a linear sensor at the base of the stick picks up this motion and starts the robot moving forward. Using this linear sensor, the robot then matches its speed to the user's desired speed: eyeRobot moves as fast as the user wants to go. To indicate that a turn is desired, the user simply twists the handle, and if a turn is possible, the robot responds accordingly.


Robot Navigation:
When traveling in open space, eyeRobot will attempt to keep a straight path, detecting any obstacle that may impede the user, and guiding the user around that object and back onto the original path. In practice the user can naturally follow behind the robot with little conscious thought.

To navigate a hallway, the user should push the robot toward one of the walls on either side; upon acquiring a wall, the robot will begin to follow it, guiding the user down the hallway. When an intersection is reached, the user will feel the robot begin to turn and can choose, by twisting the handle, whether to turn down the new offshoot or continue on a straight path. In this way the robot is very much like the white cane: the user can feel the environment with the robot and use this information for global navigation.


Step 3: Range Sensors

Ultrasonics:
The eyeRobot carries four ultrasonic rangefinders (MaxSonar EZ1), positioned in an arc at the front of the robot to provide information about objects in front of and to the sides of the robot. They inform the robot of an object's range and help it find an open route around that object and back onto its original path.


IR Rangefinders:
The eyeRobot also carries two IR rangefinders (Sharp GP2Y0A02YK), positioned to face out 90 degrees to the right and left to aid the robot in wall following. They can also alert the robot to objects too close to its sides that the user might walk into.


Step 4: Cane position sensors

Linear Sensor:
In order for the eyeRobot to match its speed to that of the user, it senses whether the user is pushing or retarding its forward motion. This is achieved by sliding the base of the cane along a track while a potentiometer senses the cane's position; the eyeRobot uses this input to regulate the robot's speed. The idea of adapting to the user's speed through a linear sensor was actually inspired by the family lawnmower.

The base of the cane is connected to a guide block that moves along a rail. Attached to the guide block is a slide potentiometer that reads the block's position and reports it to the processor. To allow the stick to rotate relative to the robot, a rod runs up through a block of wood, forming a rotating bearing. This bearing is then attached to a hinge so the stick can adjust to the height of the user.
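The speed-matching behavior this sensor enables can be sketched roughly as follows. This is a hypothetical illustration, not the robot's actual ZBasic code: the ADC center value, deadband, and gain are invented for the example, and only the Roomba's 500 mm/s speed limit is a real figure.

```python
# Hypothetical sketch of matching the robot's speed to the user via the
# slide potentiometer: the cane's position relative to center sets the speed.
# POT_CENTER, DEADBAND, and GAIN are assumptions, not measured values.

POT_CENTER = 512     # ADC reading with the cane at rest (10-bit ADC assumed)
DEADBAND = 20        # ignore small jitter around the center position
GAIN = 1.2           # mm/s of speed per ADC count of displacement (assumed)
MAX_SPEED = 500      # Roomba Create's max wheel speed, mm/s

def wheel_speed(pot_reading):
    """Pushing the cane forward (reading above center) speeds the robot up;
    pulling it back slows or reverses it. Returns a clamped speed in mm/s."""
    offset = pot_reading - POT_CENTER
    if abs(offset) < DEADBAND:
        return 0                         # at rest: hold position
    speed = GAIN * offset
    return max(-MAX_SPEED, min(MAX_SPEED, speed))
```

Pushing the handle forward yields a positive speed, pulling back a negative one, and a hard push saturates at the Roomba's limit.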


Twist Sensor:
The twist sensor allows the user to twist the handle to turn the robot. A potentiometer is attached to the end of one wooden shaft, and its knob is inserted and glued into the upper part of the handle. The wires run down the dowel and feed the twist information to the processor.


Step 5: Processor

Processor:
The robot is controlled by a ZBasic ZX-24a sitting on a Robodyssey Advanced Motherboard II. The processor was chosen for its speed, ease of use, affordable cost, and eight analog inputs. It is connected to a large prototyping breadboard to allow quick and easy changes. All power for the robot comes from the power supply on the motherboard. The ZBasic communicates with the Roomba through the cargo bay port and has full control over the Roomba's sensors and motors.


Step 6: Code Overview

Obstacle avoidance:
For obstacle avoidance, eyeRobot uses a method in which nearby objects exert a virtual force on the robot, pushing it away from themselves. In my implementation, the virtual force exerted by an object is inversely proportional to the square of its distance, so the strength of the push increases as the object gets closer, creating a nonlinear response curve:

PushForce = ResponseMagnitudeConstant / Distance²

The pushes from each sensor are summed (sensors on the left side push right, and vice versa) to get a vector for the robot's travel. Wheel speeds are then changed so the robot turns toward this vector. To ensure that an object dead ahead does not produce no response (because the forces on both sides balance), objects directly in front push the robot toward the more open side. Once the robot has passed the object, it uses the Roomba's encoders to correct for the change and get back onto the original vector.
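As a rough illustration of the virtual-force scheme described above, here is a minimal sketch in Python (the robot itself runs ZBasic). The sensor angles, distances, and response constant are made-up values; only the inverse-square push and the dead-ahead tie-break follow the rules described in the text.

```python
# Illustrative sketch of eyeRobot's inverse-square "virtual force" steering.
# RESPONSE_K and the sensor geometry are invented for this example.

RESPONSE_K = 400.0  # ResponseMagnitudeConstant (assumed units)

def steering(readings):
    """readings: list of (angle_deg, distance_cm) from the sonar arc.
    Returns a turn command: positive = steer right, negative = steer left."""
    turn = 0.0
    for angle, dist in readings:
        push = RESPONSE_K / (dist * dist)   # inverse-square push
        if angle < 0:
            turn += push                    # object on the left pushes right
        elif angle > 0:
            turn -= push                    # object on the right pushes left
        else:
            # Dead ahead: push toward the more open side.
            left = min(d for a, d in readings if a < 0)
            right = min(d for a, d in readings if a > 0)
            turn += push if left < right else -push
    return turn

# Obstacle close on the left, clear elsewhere -> positive (steer right)
cmd = steering([(-45, 30), (-15, 200), (15, 200), (45, 200)])
```

The quadratic falloff means distant objects barely nudge the heading, while close ones dominate the sum, which matches the nonlinear response the text describes.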


Wall Following:
The principle of wall following is to maintain a desired distance from, and a parallel angle to, a wall. Issues arise when the robot is turned relative to the wall, because a single sensor then yields misleading range readings: a reading is affected as much by the robot's angle to the wall as by its actual distance from the wall. To determine the angle and eliminate this variable, the robot needs two points of reference to compare. Because eyeRobot has only one side-facing IR rangefinder, it obtains these two points by comparing the rangefinder's readings over time as the robot moves along the wall; the difference between two readings over a known travel distance gives the robot's angle, which it then uses to correct improper positioning. The robot enters wall-following mode whenever it has had a wall alongside it for a certain amount of time, and exits whenever an obstacle in its path pushes it off course or the user twists the handle to bring the robot away from the wall.
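The two-readings-over-time angle estimate can be sketched like this. It is an illustrative reconstruction rather than the robot's actual code; the target distance and correction gains are invented for the example.

```python
# Sketch of single-sensor wall following: estimate the robot's angle to the
# wall from two IR readings taken a known travel distance apart, then apply
# a proportional correction. Gains and the 40 cm target are assumptions.
import math

def wall_angle(d1, d2, travel):
    """d1, d2: side IR range readings (cm) taken 'travel' cm apart.
    Returns the robot's angle to the wall in degrees
    (positive = drifting away from the wall)."""
    return math.degrees(math.atan2(d2 - d1, travel))

def correction(dist, angle, target=40.0, k_dist=0.5, k_angle=1.0):
    """Steering correction (negative = turn toward the wall)."""
    return -k_dist * (dist - target) - k_angle * angle

angle = wall_angle(40.0, 45.0, 10.0)   # moved 10 cm, range grew 5 cm
steer = correction(45.0, angle)        # too far and drifting away -> turn in
```

Parallel travel gives identical readings and a zero angle, which is exactly why a single snapshot cannot distinguish angle from distance.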


Step 7: Parts List

Parts Required:
1x) Roomba Create
1x) Large sheet of acrylic
2x) Sharp GP2Y0A02YK IR rangefinders
4x) MaxSonar EZ1 ultrasonic rangefinders
1x) ZBasic ZX-24a microprocessor
1x) Robodyssey Advanced Motherboard II
1x) Slide potentiometer
1x) Single-turn potentiometer
1x) Linear bearing
1x) Solderless breadboard
Assorted hinges, dowels, screws, nuts, brackets, and wires


Step 8: Motivation and Improvement

Motivation:
This robot was designed to fill the obvious gap between the capable but expensive guide dog and the inexpensive but limited white cane. In developing a marketable and more capable robotic white cane, the Roomba Create was the perfect vehicle for a quick prototype to see if the concept worked. In addition, the prizes would provide economic backing for the considerable expense of building a more capable robot.


Improvement:
I learned a substantial amount building this robot, and here I will attempt to lay out what I have learned as I move on to build a second-generation robot:
1) Obstacle avoidance – I have learned a lot about real-time obstacle avoidance. In building this robot I went through two completely different obstacle avoidance codes, starting with the original object-force idea, moving to the principle of finding and seeking the most open vector, and then returning to the object-force idea with the key realization that the object response should be nonlinear. In the future I will correct my mistake of not researching previously used methods before embarking on a project, as I'm now learning that a quick Google search would have yielded numerous great papers on the subject.
2) Design of the stick sensors – Beginning this project, I thought my only option for a linear sensor was a slide pot and some sort of linear bearing. I now realize a much simpler option would have been to attach the top of the rod to a joystick, so that pushing the stick forward would also push the joystick forward. In addition, a simple universal joint would allow the twist of the stick to be translated into the twist axis of many modern joysticks. This implementation would have been much simpler than the one I currently use.
3) Free-turning wheels – Although this would have been impossible with the Roomba, it now seems obvious that a robot with free-turning wheels would be ideal for this task. A robot that rolls passively would require no motors and a smaller battery, and would thus be lighter. Such a system needs no linear sensor to detect the user's push; the robot would simply roll at the user's speed. The robot could be turned by steering the wheels like a car, and brakes could be added if the user needed to be stopped. For the next-generation eyeRobot I will certainly use this very different approach.
4) Two spaced sensors for wall following – As discussed earlier, problems arose when wall following with only one side-facing sensor, making it necessary to move the robot between readings to get different points of reference. Two sensors with some distance between them would simplify wall following greatly.
5) More sensors – Although this would have cost more money, it was difficult to code this robot with so few windows on the world outside the processor. A more complete sonar array would have made the navigation code much more powerful (but of course sensors cost money, which I didn't have at the time).


Step 9: Conclusion

Conclusion:
The iRobot Create proved an ideal prototyping platform for experimenting with the concept of a robotic white cane. From the results of this prototype, it is apparent that a robot of this type is indeed viable. I hope to develop a second-generation robot from the lessons I have learned using the Roomba Create. In future versions of eyeRobot I envision a device capable of doing more than just guiding a person down a hallway: a robot that can be put in the hands of the blind for use in everyday life. With this robot, the user would simply speak their destination, and the robot would guide them there without conscious effort. It would be light and compact enough to be easily carried up stairs and tucked away in a closet. It would be able to perform global navigation in addition to local, guiding the user from start to destination without the user's prior knowledge or experience. This capability would go well beyond even the guide dog, with GPS and more advanced sensors allowing the blind to freely navigate the world.
Nathaniel Barshay,
(Entered by Stephen Barshay)
(Special thanks to Jack Hitt for the Roomba Create)


Step 10: Construction and Code

A few extraneous words on construction:
The deck is made from a piece of acrylic cut in a circle with an opening at the back to allow access to the electronics, and is screwed into the mounting holes beside the cargo bay. The prototyping board is screwed into the screw hole at the bottom of the bay. The ZBasic is mounted with an L bracket using the same screws as the deck. Each sonar is screwed into a piece of acrylic, which is in turn attached to an L bracket on the deck (the L brackets are bent back 10 degrees to give a better view). The track for the linear sensor is screwed right into the deck, and the slide pot is mounted with L brackets beside it. A more technical description of the linear sensor and control rod construction can be found in Step 4.


Code:
I have attached the full version of the robot's code. Over the course of an hour I have attempted to clean it up from the three or four generations of code in the file, so it should be easy enough to follow now. If you have the ZBasic IDE, it will be easy to view; if not, use Notepad, starting with main.bas and going through the other .bas files.

Portable USB Solar Charger

Dreamed of making a cheap and “EXTREMELY RELIABLE” portable USB solar charger? Here's a quick tutorial revealing how I made mine on a budget of less than $20!
___________________________________________________________________________________
I have so many uses for it. When we travel and go camping, it gives us an unlimited supply of charging power for our handheld devices, such as iPhones, iPads, speakers, and Android devices. It can charge anything, anytime, anywhere! And when an outrageous storm comes in and blackouts are inevitable, it's a good thing to have a solar charger!

With the help of our trusty USB powerbank, charging at night is possible: the powerbank acts as a battery reservoir, charging during the day.
It takes only 40-120 minutes to fully charge the powerbank, and it also comes with a 4-bar battery indicator!
It's a sustainable and reliable source of energy, ideal for charging USB devices.

Features:
– 10 Volt 3W Solar Panel (Water Proof – Shock Resistant)
– 2800mAh PowerBank (2A Output – iPhone 5 compatible)
–  Self-sustainable – Close to Unlimited USB Power 😀

Also, please support and visit my site: ASCAS.ph (incompatible with IE -Under Renovation)
Enjoy Reading 😀 Cheers!

Step 1: Tool & Materials

It's recommended to use a solar panel rated at least 3W-10W at 6-10 volts; this shortens your charging time. My parts and materials cost me 725 PHP (about $17 USD). The links below are just alternatives. On the next page, I give a list of cheap, quality powerbanks from DealExtreme.com, with free shipping. The price is up to you; try to hunt down clearance sales.

My Parts – (w/ Alternative Links):
1.) Solar Panel (mine: 10V 400mAh | 3W)
2.) USB PowerBank (2800 mAh – w/ battery indicator)
3.) 4 Port USB Hub
4.) 7805 Regulator Chip
5.) Micro USB Cable (one end stripped)
6.) Leatherman Multitool (from: Instructables Prize)
7.) A short length of stranded wire
8.) Superglue (Gorilla or MightyBond)

Step 2: Choosing Your PowerBank (Optional Step)

Here's a variety of powerbanks I recommend, all from DealExtreme.com with free shipping.
You can charge your powerbank even while devices are plugged into it. You can skip this step if you already have a powerbank 😀

The Price List:
1.) 2600mAh External Battery Mobile Power Bank ($6.30)
2.) Compact 3000mAh Portable Rechargeable Power Bank w/ LED Indicator ($9.20)
3.) Power Bank Aluminum Alloy Housing Case w/ Protective Board ($10.40)
4.) Ultrathin External 4000mAh Power Battery Charger w/ Touch Control ($14.30)
5.) Ultrathin External 4000mAh Power Battery Charger w/ Touch Control (Black Edition $14.30)

Step 3: Soldering The 7805 Regulator

Since my solar panel produces 10 volts (3W) while the powerbank needs to be fed 5V of USB power, a regulator must be added in order to charge the powerbank safely. Without the 7805 regulator, the powerbank's internals might get damaged by over-voltage.

1st.) Follow the simplified schematic diagram above; read it carefully!
2nd.) Solder the micro USB plug to the 7805 first.
3rd.) Solder two wires to your 7805, to be connected to your solar panel (+ & –).
4th.) Use a small droplet of superglue to mount the regulator on your solar panel's terminal block.
5th.) Trim the heat-sink mount of your 7805 chip if necessary.
6th.) Solder the two wires of your 7805 to the solar panel. Observe polarity! (+ & –)

FYI: a switching regulator gets more juice from your solar panel!
Since the 7805 is limited to 1 ampere, you might want to buy a high-efficiency 5V switching regulator from DealExtreme.com for only $3.80 + free shipping!
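For the curious, here is the arithmetic behind that advice: a linear 7805 drops the panel's extra voltage as heat, so only about half the panel's power reaches the powerbank, while a buck (switching) regulator converts most of it. The 85% switcher efficiency used below is a typical assumed figure, not a measured one.

```python
# Back-of-the-envelope regulator comparison. Panel voltage and current are
# the values given in this guide; the 85% switcher efficiency is assumed.

V_IN, V_OUT = 10.0, 5.0      # panel voltage in, USB voltage out
I_LOAD = 0.4                 # panel's rated current, A (400 mA)

# Linear 7805: the full voltage drop is dissipated as heat.
p_out = V_OUT * I_LOAD                 # 2.0 W delivered to the powerbank
p_in_linear = V_IN * I_LOAD            # 4.0 W drawn from the panel
eff_linear = p_out / p_in_linear       # 0.5 -> half the panel's power is heat

# Buck converter at an assumed 85% efficiency needs far less input power.
p_in_switching = p_out / 0.85          # ~2.35 W drawn for the same 2 W out
```

In other words, with a linear regulator the panel works twice as hard for the same charge current, which is exactly why the switching option "gets more juice."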

OPTIONAL: You can now charge your device directly from the panel, even without the powerbank! 

Step 4: Mounting Your Devices

It's now time to mount your devices. I used heavy double-sided adhesive to mount the powerbank and the USB hub behind the solar panel. If you plan to mount these devices permanently, it's better to use hot glue or epoxy so they stay put.

1st.) Mount The PowerBank
2nd.) Connect the charging cable of the solar panel to the powerbank’s charge-input.
3rd.) Plug-in your USB Hub/Port to your powerbank’s output.
4th.) It's now time to charge your devices! Just plug them into your USB hub!

Step 5: Unleash 4 Ports of USB POWER! It’s Ready 😀

Plug in your devices and you're done! Thanks for reading!

Battery powered hand warmer


This is my first instructable so be kind. I wanted to make a hand warmer that was battery powered and rechargeable. I welcome any and all criticism. If you have any questions please feel free to ask.

Step 1: Parts List


Tools & Parts

  • Heat resistant tape
  • Hot glue gun
  • knife
  • Soldering iron
  • Balsa wood
  • 28 awg wire (about three feet)
  • Switch
  • AA battery holder
  • Cover cloth
  • AA batteries

Step 2: Prep Work

  1. Start by outlining the battery holder on the balsa wood with a pencil. Leave about 1/2 inch extra on the length to make room for the switch.
  2. Cut the stenciled rectangle out with a knife.
  3. Take 3 ft. of wire (I used 28 gauge copper bead wire) and wrap it around the wood, being sure to keep the wires from touching as you spiral it across the balsa.
  4. Tape the ends of the wire to the wood with heat-resistant tape.
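If you want a rough idea of how warm the coil will run, here is a back-of-the-envelope estimate. Every number is an assumption (standard 28 AWG copper resistance from wire tables, a typical alkaline internal resistance, and a guessed two-cell holder, since the build doesn't specify), so treat it as a sanity check only.

```python
# Rough sanity check on the heating coil's power draw.
# All values are assumptions labeled below, not measurements from this build.

R_PER_FT = 0.0649          # 28 AWG copper, ohms per foot (standard table value)
wire_ft = 3.0              # wire length used in this build
r_wire = R_PER_FT * wire_ft            # ~0.19 ohm heating element

cells = 2                  # ASSUMED two-AA holder
v_batt = 1.5 * cells       # nominal pack voltage
r_internal = 0.3 * cells   # ASSUMED ~0.3 ohm per alkaline cell under load

i = v_batt / (r_wire + r_internal)     # current drawn, A
p_wire = i * i * r_wire                # heat dissipated in the coil, W
```

A couple of watts spread over the wrapped coil is consistent with the gentle warmth described later; note that the battery's internal resistance, not the wire, limits the current.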

Step 3: Install and Drive on!

  • After finding a suitable place for the copper-wire heating source, cut an appropriate slot for the switch.
  • After prepping the switch, glue it in place with either hot glue or CA glue.

Step 4: Glue and Solder

  • Solder all connections in place and prep for the cloth cover. (Heat-resistant tape is amazing at this stage!)

Step 5: Finish with Hot Glue

  • Hot glue the cloth over the heating coils.
  • Make sure to wrap it fully around the heating coils. (Don't worry, the coils should not produce enough heat to cause harm: theoretically around 100-140 degrees Fahrenheit.)
  • After the cloth is secure, hot glue the heating element to the battery holder.
  • It should last between 1 and 2 hours.
  • Enjoy the warmth during the winter!

(Disclaimer: Be sure to properly check your connections and the overall temperature.)

Shapeoko 2 + Arduino UNO R3 + grbl 9g = 8bit Laser Diode Photo Engraving


My son and I invented/developed a new concept of “on-the-fly” 8-bit laser diode photo engraving over two years ago. We have come a long way since those early days of experimentation, and here are the instructions for our latest build on a Shapeoko 2.

Photos can be engraved on different materials using a variable-intensity-controlled laser diode to get 8-bit shading. The standard laser photo engraving process prior to our development was to TTL modulate (pulse) the laser, marking the material with burnt spots/dots from a dithered black & white image to give the illusion of shades.

Commercial CO2 laser engraving machines still use this old-school method to engrave photos today. Higher-end CO2 engraving machines do use 256 separate power levels to 3D engrave, but with their excessive laser power it's like a bull in a china shop: hard to implement this 8-bit photo engraving process as successfully as we have with a considerably lower-wattage laser diode.

Since our development of this concept, many hobbyists, makers, and businesses have followed our progression and applied this very unique process on their own machines to engrave 8-bit grayscale images using a laser diode. There are two presented here on Instructables, one very successful campaign on Kickstarter, and many, many hobbyists throughout the world using our software programs to laser diode engrave photos with our very unique 8-bit engraving process. More examples of our laser diode engravings can be seen here using this method.

This method of varying the intensity of a laser diode requires a proper “image to gcode” program, a computer-controlled CNC machine, a motor controller, a modulated laser diode driver, and a laser diode in the 1W-5W range to do the 8-bit shading on the materials. We prefer a 445nm-wavelength laser diode, and the one we used in this project has a max output of 2.5W.
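The pixel-to-power mapping at the heart of an image-to-gcode program can be sketched like this. This is not PicLaser Lite's actual output format (and this particular build varies intensity through the Z axis and encoder rather than spindle words); the command names, S-value scaling, and pixel size below are illustrative assumptions.

```python
# Minimal sketch of the "image to gcode" idea: each pixel's 8-bit gray value
# maps to a laser power level, emitted as the machine sweeps across a row.
# Command format, S scaling, and 0.1 mm pixel size are assumptions.

def row_to_gcode(pixels, y, pixel_mm=0.1, max_s=255):
    """pixels: one image row of 8-bit gray values (0=black ... 255=white).
    Darker pixels get more laser power so they engrave darker."""
    lines = [f"G0 Y{y * pixel_mm:.2f}"]          # move to the row
    for x, gray in enumerate(pixels):
        power = max_s - gray                     # invert: black -> full power
        lines.append(f"G1 X{x * pixel_mm:.2f} S{power}")
    return lines

gcode = row_to_gcode([0, 128, 255], y=0)
```

The key point is that shading comes from 256 distinct power levels per pixel rather than from dithered on/off dots, which is the difference from the TTL-pulse method described above.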

With the years of experimenting on different machines with ball screws, linear ways, stepper & servo motors and controller software, we decided to experiment with a stepper belt drive Shapeoko 2 run by an Arduino UNO R3 this time.

We are the second owners of this Shapeoko 2; our very good friend John Champlain purchased it new from Inventables. This is the actual machine he used for the development of our grbl-related engraving software programs. John uses an electronic DAC circuit he designed and built to vary the modulation voltage to the laser diode driver for varied-intensity laser diode control, and he was the first to succeed with our varied-intensity laser diode concept using grbl and an Arduino UNO. The Arduino UNO R3 we are using on this build was purchased by John from RadioShack and shipped to us for testing and experimentation, so we used it on this Shapeoko 2 build.

We used the stock 3-axis v5 grbl motor shield that comes with the Shapeoko, and the UNO is flashed with grbl 9g. We use our image-to-gcode raster engraving program PicLaser Lite, our PicEdit Lite image editing program, and our PicSender program to handle streaming the large raster gcode files to the Arduino & grbl.

These three outstanding software programs were written by our very good friend John Champlain for Arduino grbl-controlled CNC raster engraving machines, and PicLaser Lite has the option to generate gcode for other CNC controllers as well.

We were really surprised by the performance and excellent results we were able to achieve in our laser diode photo engraving experimentation with the Shapeoko 2 controlled by an Arduino UNO.

Some modifications to the Shapeoko 2 and a lot of experimenting with the settings were needed to get everything tuned in just right, and here are our instructions for how we achieved success.

Step 1: Changes to the Shapeoko’s Table


As a machinist of 40 years, clamping materials in machines comes naturally to me. The MDF board table that comes stock was just not to my liking, so a new table was needed, with a way to clamp my engraving materials in place precisely.

We found that McMaster-Carr sells aluminum T-slot track for a 1/4″ bolt, so I calculated how many pieces we would need to cover the travel of the Shapeoko with 1″ spacers in between them. Nine was what we needed for the 12″ travel of the X axis. The lengths are 24″ and stick out the front and back, but this was not a problem.

The 20mm square aluminum extrusion framing on the Shapeoko 2 for the original MDF bed had slots on all four sides. I needed only two slots, 180 degrees from each other, with the other two sides unslotted so the T-slot track could be screwed down to them. McMaster-Carr sells this 20mm framing just as I needed, so we could attach it to the end plates of the Y axis MakerSlide supports and tie the center of the table together. We ordered three pieces 24″ long. Again, these are longer than the originals and stick out the sides further, but that did not affect anything, and no cutting to length was necessary. The MMC part number for the T-slot track is 1850A14, and the 20mm square framing is 5537T117.

The spacers we used between the T-slot tracks are 1/2″ square aluminum tubing, two between each track. We had that in stock at our shop, left over from a previous job, and just cut them to the same 24″ length.

To tie all these table parts together took some calculating and drilling holes in the T-slot track and 20mm framing for #8 pan-head sheet metal screws. I included a drawing with the general dimensions. The Shapeoko's dimensions between end plates may vary slightly, so some adjustments may be required.

We ran all the screws in loosely, then used a bar clamp to pull the T-slot track and spacers up tight, made sure everything was square and flush, and then tightened all the #8 screws.

As shown in the picture, two pieces of aluminum stock were added as the starting reference for the material and to ensure the material would be square in the Shapeoko. We added a Shop Fox cam clamp to hold the material in place; its placement is adjustable in the T-slot tracks.

Safety is the number one priority for everyone here, including our pets, so we added a laser light shield to the Shapeoko 2, mounted with L brackets from our local hardware store.

Step 2: Adding a MA3 Magnetic Shaft Encoder to the Z axis


When John was doing his experimenting with this Shapeoko, he used an electronic DAC to control the modulation from the Z axis step and direction pins on the Arduino. We prefer a more mechanical/electrical option and have used one successfully through all our builds since the concept began. The MA3 has 10-bit (1024-count) resolution for its analog output voltage. MA3 Shaft Encoder

The MA3 outputs a 0-5V analog voltage based on rotation, so to vary this output voltage with Z axis depths in the gcode, we needed to timing-belt the MA3 to the Z axis stepper motor. This also gives us a way to move the Z axis and laser diode up and down to adjust for material height and maintain the proper focal distance.

We found that, since the stepper motor on the Shapeoko is direct drive to the Z axis screw, a 1:4 ratio worked best. We found a 40-tooth MXL timing belt pulley on eBay with the required 5mm bore to fit the NEMA 17 stepper motor.

The stepper motor sticks up too far in our opinion, so we lowered it by 13mm, replacing the three 50mm standoffs with 37mm standoffs instead. The flex coupling was not modified in any way to lower the stepper motor; it just took some readjusting on the stepper and Z axis screw shafts. The MMC part number for these standoffs is 92080A445.

The MA3-A10-125-B part number we used has a .125″ diameter shaft, and the MMC part number 1375K11 MXL timing belt pulley has 10 teeth and fits the encoder's shaft. It is meant for a .125″-wide timing belt, which is plenty strong for this use. The MMC 1/8″-wide timing belt is Kevlar-reinforced; its part number is 1679K87, and the trade size is 100MXL.
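Putting the numbers above together, the Z-to-voltage chain can be sketched as follows. The 4:1 belt ratio and the MA3's 0-5V single-turn analog output are from this build; the 1.25mm Z screw pitch and the zeroed starting encoder position are assumptions made for illustration only.

```python
# Sketch of how a Z axis move becomes a laser modulation voltage via the
# belted MA3 encoder. Screw pitch and start position are ASSUMED values.

STEPPER_TO_ENCODER = 4.0   # 40T stepper pulley driving a 10T encoder pulley
SCREW_PITCH_MM = 1.25      # ASSUMED M8 Z screw: mm of travel per stepper rev
V_FULL_SCALE = 5.0         # MA3 analog output spans 0-5 V over one turn

def modulation_voltage(z_offset_mm, start_fraction=0.0):
    """Analog voltage out of the MA3 after the Z axis moves z_offset_mm.
    One full encoder turn (0.3125 mm of Z travel here) sweeps the full 0-5 V."""
    encoder_turns = (z_offset_mm / SCREW_PITCH_MM) * STEPPER_TO_ENCODER
    fraction = (start_fraction + encoder_turns) % 1.0
    return V_FULL_SCALE * fraction

# Half an encoder turn of Z travel lands mid-scale
v = modulation_voltage(0.15625)
```

The 4:1 step-up is what makes tiny Z moves sweep the encoder's full voltage range, giving fine intensity control without large physical motion.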

An aluminum bracket had to be made to mount the encoder and the stepper cooling fan. We used a piece of .125″-thick aluminum angle 2.500″ wide and cut one side 1.625″ long and the other 4.00″ long. Since the MA3 is a magnetic encoder, we needed to maintain a center-to-center shaft distance of 3.00″ to avoid magnetic interference from the stepper motor.

The fan we mounted to cool the stepper is also positioned low to avoid this magnetic interference, but we bent the aluminum slightly to aim the airflow at the stepper motor. Since we are pushing this stepper motor pretty hard, with increased amps, higher accelerations, and quick direction changes, we had to add heat sinks to all four sides. We found heat sinks on Amazon that fit the NEMA 17 stepper perfectly. With some thermal transfer paste applied to the stepper motor, stainless safety wire was used to tighten all four of them around the motor's sides. A cutout on one heat sink was needed to clear the stepper motor's wires.

Step 3: Changing belts and pulleys on the X & Y axis


The resolution of the X & Y axis movement and stretching of the timing belts on the stock Shapeoko caused some image reproduction quality issues, so we made some changes.

Since the stock pulleys on the stepper motors are 20 teeth, we wanted to change them to the lowest tooth count possible so we could increase the steps in the grbl settings for finer incremental moves. We found a 15-tooth MXL pulley at MMC, letting us increase the steps while traveling the same distance. MMC also sells 1/4″-wide MXL Kevlar-reinforced timing belts to minimize stretching. These belts can be found on page 1076 of their online catalog.

The MMC part number 1375K34 15T timing belt pulley has a 3/16″ bore, so we had to use a 5mm reamer to open up the bore to fit the nema 17 stepper motor shaft.

Since the pulley ratios and belts were changed on the X & Y axes, we used a dial indicator to set up the steps/mm for those axes. Belt tension plays a part, so we measured rather than calculating what it should be. The travel was tested with 1.00″ movements back and forth, and it worked out to 52.850 steps/mm for both the X and Y axes with the jumpers set at 8X microstepping on the grbl shield. This has been confirmed by the engraving size matching the PicLaser Lite settings we used.
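As a cross-check on that measured value, the theoretical grbl steps/mm computed from the drivetrain numbers comes out very close. The MXL belt pitch (0.08″ = 2.032mm) and the 200 full-step motor are standard figures assumed here rather than stated in the build.

```python
# Theoretical grbl steps/mm for the modified X & Y drivetrain.
# MXL pitch and 200 steps/rev are standard assumed values for this hardware.

MOTOR_STEPS = 200        # full steps per rev, typical NEMA 17
MICROSTEPS = 8           # jumpers set at 8X on the grbl shield
PULLEY_TEETH = 15        # the new 15T MXL pulleys
BELT_PITCH_MM = 2.032    # MXL tooth pitch (0.08 inch)

steps_per_mm = (MOTOR_STEPS * MICROSTEPS) / (PULLEY_TEETH * BELT_PITCH_MM)
# ~52.49 steps/mm theoretical, vs. the 52.850 measured with a dial indicator;
# the small difference is plausibly belt tension and stretch in the real drive.
```

The close agreement suggests the dial-indicator calibration mainly corrected for belt behavior rather than any error in the drivetrain math.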

Step 4: Adding a Laser Diode to the Shapeoko


The Shapeoko 2 already comes with a spindle mount, so adding a laser diode is fairly easy: bolt it to the spindle clamp's tapped screw holes. The pictures are self-explanatory, but others may want to do this differently based on the diode they are going to use or the materials they have access to. We had a piece of 1/2″-thick aluminum plate and a CPU heat sink, and we bored a hole of the laser diode module's 12mm diameter between them, to a depth of the module's length, so the module is sandwiched between the two parts to hold it in place. A through-hole was also needed for the wires to exit. Some thermal paste between the laser diode's module and the heat sink/aluminum plate is recommended.

A fan is needed to blow smoke away, cool the heat sink and diode, and keep contaminants away from the lens. We also screwed the Flexmod P3 laser driver's power MOSFET down to the 1/2″ aluminum plate, with thermal paste between them, so the plate acts as a heat sink instead of the one that was supplied with it. The upper spindle clamp screw holes made a nice place to attach a junction wiring board, and we used these tapped holes for mounting the voltmeter as well.

The key dimension here is to have the Laser Diode's lens approximately 2.75″ to 3.00″ from the table when the Z axis is all the way down and bottomed out. We made a 1/4″ aluminum plate that screws to the bottom of the Z axis MakerSlide and bottoms out on the Spindle Mounting Plate. This is a reference for setting the placement of the Laser Diode from the table surface, focusing the lens based on the table surface, and moving the Laser Diode up from there for the material height.

Step 5: Wiring the Components Together and Laser Diode Amp Settings

Picture of Wiring the Components Together and Laser Diode Amp Settings
DSC00483.JPG
DSC00517.JPG
DSC00487.JPG
DSC00491.JPG

Since we are using a Flexmod P3 to control the laser's intensity, here is the manual that explains how to set it up properly: Flexmod P3 Manual

Its rated input voltage is 5VDC–24VDC. We use a 12VDC power supply with a 12.5A max output on this build. Since we are also powering the heat sink cooling fan and the grbl shield/Arduino cooling fan with this power supply, we like to have extra amps so the fans do not slow down when the laser power goes up and draws more current.

The settings we used on the Flexmod P3 are a 150mA threshold and 1.4A at full 5V modulation voltage. This was for the NDB7875 9mm Nichia diode that we used. This diode is capable of higher wattage output, but we lowered the max amps to make it more dependable and to match the power range we needed for this particular build.

On this setup we are getting approximately 2W max output using an AR coated 3 element glass lens. The 150mA threshold gives us a nice laser pointer to line up on the material for our starting engraving point.

We also recommend using a Lasorb for ESD protection. These diodes are expensive, and we learned our lesson not using one in our years of experience with them. When they get static shocked, they turn into very expensive LEDs.

😦

Step 6: Setting up grbl 9g, grbl Shield & Focusing the Lens

Picture of Setting up grbl 9g, grbl Shield & Focusing the Lens
DSC00544.JPG

First I need to explain how the Laser Diode gets focused properly. As explained in Step 4, the Laser Diode's lens needs to be properly positioned off the table.

I will explain how to focus the lens further on, but will jump to the grbl settings first because they affect this process. The PicSet screenshot shows all of our settings on our Shapeoko 2 for varied intensity controlled laser engraving.

The Z axis jumper is set for 4X on the grbl shield, and with the 1.25mm pitch screw the steps should be 640, but with that step setting the Z axis stepper will lose steps with the higher accels and very fast movement changes. We divided that number by three and set it at 213.333 steps/mm.

With the 1:4 ratio to the MA3 encoder, this gives us .037″ of gcode movement through the whole range of the MA3 encoder's 0–5V output to the Laser Diode driver's modulation. Since we use a minimum depth of Z.0000″ and a maximum depth of Z-.0255″ in the PicLaser Lite settings for 8bit shading, the extra distance lets us zero the Z axis at a higher starting burning power for different materials. It also gives us a safety margin at the higher end, so the encoder does not jump back to 0V again.

With the steps/mm set at 1/3, the Z axis only moves .0085″ total over the full .0255″ movement in the gcode and will not affect the focal distance on the material.
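The 1/3 scaling works out like this (a small Python sketch of the arithmetic in this step; with $102 at one third of the true value, every commanded Z move produces one third of the physical motion):

```python
# The 1/3 Z steps/mm trick from this step: grbl's $102 is set to one third
# of the true value, so commanded Z moves produce 1/3 the physical travel.
full_steps, microsteps, pitch_mm = 200, 4, 1.25
nominal = full_steps * microsteps / pitch_mm   # 640 steps/mm in theory
actual = nominal / 3                           # 213.333, entered as $102

gcode_travel_in = 0.0255                 # full 8bit shading range in the Gcode
real_travel_in = gcode_travel_in / 3     # the Z axis really moves only .0085"
```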

We have the direction reversed in our grbl settings because of how the encoder is driven from the stepper: jogging up moves down and jogging down moves up. I know this part may be confusing, but it would have been more difficult to mount the MA3 flipped 180 degrees to work the other way.

To focus the Laser Diode, the Z axis needs to move all the way down until it bottoms out on the 1/4″ aluminum plate and the stepper slips. Zero the Z axis in PicSender there. We use a .375″ thick piece of black anodized aluminum to shoot the laser beam on for focusing. Since the steps are at 1/3 of what they should be, we jog up a total of 1.125″. The voltage to the modulation will be between 0–5V, so we jog slightly up or down, whichever direction is closest, to get 1V on the voltmeter. A little power coming from the diode is needed to see the focal spot for adjustment.

Before turning on the laser with On(M03) in PicSender, proper laser safety glasses are absolutely necessary. Make sure no one else, and no pets, are anywhere nearby when this or any other lasering operation is performed.

When you're sure you and everyone else are safe, turn on the laser and make sure the beam projects onto the black anodized aluminum. Without burning your fingers by blocking the beam coming out of the lens, carefully rotate the lens one way or the other until the focal point is the smallest size possible. We are getting a .005″ diameter on our setup, and that is the optimum size for engraving photos. We use a little hot melt glue on three or four sides of the lens thread to the Laser Diode module to ensure it will not move and change the focus.

Attached are two Gcode files for testing the burn line width: one for use with varied intensity Laser Diode control and the other for TTL control.

The Gcode burns 6 line pairs, reducing the step over by .001″ each time: the first pair steps over .010″ and the last .005″. Looking very closely at them with a loupe or magnifying glass, when you see the two burn lines meet, that is the burn line width to use for your Pixel Size setting in PicEdit Lite and the Pixel Resolution setting in PicLaser Lite when generating the Gcode. The last pair of lines that meet should be the .005″ step over to achieve the best image reproduction results for Laser Diode engraving.
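The attached files are the ones to use; just to illustrate the pattern, a hypothetical generator for this kind of test Gcode might look like the following Python sketch (the `burn_test_gcode` helper, the feedrate, and the M-codes are illustrative assumptions, not the attached files):

```python
# Hypothetical sketch of the burn-line-width test pattern: six line pairs,
# with the gap between the lines of each pair shrinking .001" per pair.
def burn_test_gcode(pair_spacing=0.05, length=0.25, feed=30.0):
    """Return Gcode lines for six test pairs, gaps from .010" down to .005"."""
    lines = ["G90", "G20", "M03"]            # absolute mode, inches, laser on
    x = 0.0
    for gap in (0.010, 0.009, 0.008, 0.007, 0.006, 0.005):
        for dx in (0.0, gap):                # the two lines of this pair
            lines.append(f"G0 X{x + dx:.4f} Y0.0000")
            lines.append(f"G1 X{x + dx:.4f} Y{length:.4f} F{feed:.1f}")
        x += pair_spacing
    lines.append("M05")                      # laser off
    return lines
```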

Step 7: Our Software Settings for Varied Intensity Laser Engraved Photos

Picture of Our Software Settings for Varied Intensity Laser Engraved Photos
Wagner1.JPG
Wagner3.JPG
PLL3.JPG
PLL2.JPG

This will walk through the settings we use in our programs to laser engrave photos. These instructions explain how to engrave with “Varied Intensity”; we will explain “TTL” (pulsing) in a later step. Our programs will generate Gcode for both processes.

First, finding or taking a photo worth the time to laser engrave is sometimes a challenge. We looked around here in our shop and saw our little friend and helper Wagner sitting in his LazyBoy Captain's chair, wagging his little tail a mile a minute, trying to tell us something. It just had to be: he wanted us to laser engrave him for this Instructable. Since he was a pup he has always been a happy little guy, wagging his tail, and that's why we called him Wagner.

With our 16MP Sony Cyber-Shot, I proceeded to take pictures of Wagner, selected the best one, and brought it into PicEdit Lite for re-sizing. If you notice in the picture, Wagner's tail is blurred, but that's alright; it will engrave the image just like it is. We recommend using high quality, high resolution photos in this laser engraving process.

First we adjust the pixel size based on the size of the material we are engraving on. The 1/4″ thick Poplar we are using is 5.500″ wide, so we cut it 7.500″ long. We found that sanding the Poplar (or any other type of wood we are laser engraving) with 180g sandpaper helps bring out the detail when engraving. We try to use wood that does not have a lot of grain, which affects the image reproduction; we found select boards of Poplar work for our use.

The second picture shows the default PicEdit Lite settings and pixel size just as the photo was taken. With our focal spot and burn line size from the 9mm Nichia LD being .005″, we will calculate the pixel size for a .006″ Pixel Resolution setting in PicLaser Lite for engraving at a 45 degree angle.

Pixel Resolution is how the gcode is generated for the step over and step ahead incremental moves in our program. If we used a horizontal or vertical engraving angle with a .006″ Pixel Resolution and a .005″ laser burn line, lines would show up in the engraving. By using a 45 degree angle in the settings, the .005″ laser burn line overlaps slightly and no lines show up in the laser engraving.

To calculate the engraving size based on our material size, a simple multiplication is required. Our Poplar is 5.500″ wide and we want the image to be 5.25″ tall, so just multiply 5.25 × 1.666 = 8.7465. Since PicLaser Lite calculates at 100 steps per inch, multiply 8.7465 × 100 = 874.65. That is close enough to 875, so that is what I used in the “Height” pixel setting. I typed in 875 and then clicked in the “Width” box, and it changed the width pixel size automatically, maintaining the aspect ratio. The width pixel size is 1167, so divide that by 100 = 11.67, then divide that by 1.666 = 7.000″. The engraving size will be 7″ × 5.25″. This will be confirmed in PicLaser Lite.
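The same pixel-size arithmetic in Python form (the 1.666 factor is just the pixels-per-inch at .006″ resolution divided by PicLaser Lite's 100 steps per inch):

```python
# Sketch of the pixel-size math from this step, at .006" Pixel Resolution.
PIXEL_RES_IN = 0.006
px_per_inch = 1 / PIXEL_RES_IN              # ~166.6 pixels per engraved inch
                                            # (the 1.666 factor is this / 100)
height_in = 5.25
height_px = round(height_in * px_per_inch)  # 875, entered as the Height pixels

width_px = 1167                             # PicEdit Lite keeps the aspect ratio
width_in = width_px * PIXEL_RES_IN          # works back to about 7.002"
```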

Adjusting the image’s Sharpness, Contrast, Brightness and Gamma takes some trial and error. These settings will be based on the original image and the burning power range of your Laser Diode.

The third picture shows the image adjustments we used, based on Wagner’s picture and our Laser Diode’s power range on the Shapeoko 2. In PicEdit Lite there is a “Preview Gray Scale” selection to get a good idea of how your engraving will come out.

The fourth picture shows the settings we used in PicLaser Lite to 8bit laser engrave Wagner. Notice in picture five that with the Pixel Resolution set at .006″, the image will engrave at 7.002″ × 5.25″. This confirms our calculations for the pixel sizes.

Step 8: Setting up the Shapeoko for Laser Engraving

Picture of Setting up the Shapeoko for Laser Engraving
PS2.JPG
DSC00512.JPG
DSC00504.JPG

First we need to clamp our 7.5″×5.5″×.25″ piece of Poplar and set all axes' starting points. PicLaser Lite generates the gcode with the lower left corner of the image as the X&Y starting zero point. Our stops are set up for this as shown in the first picture.

Next is jogging the Z axis down to start setting the correct focal point for the material. The Z axis is reversed in motion, so we use the Z+ jog button in PicSender to jog down. Set the Z axis to incremental moves and jog until it bottoms out on the 1/4″ aluminum plate and you hear the stepper motor slip, then zero the Z axis. Since $102=213.333 (Z steps/mm) is 1/3 of what it normally would be, we need to jog up .750″ instead of .250″ for the material thickness so the focal point is correct.

When we do this, the voltage from the MA3 shaft will fall in-between 0–5V. We then jog the Z axis in the direction closest to 0V. If it's above 3V, jogging past 5V will cause it to jump back to 0V again. Our starting point for the Poplar is Z-.002 from 0V, which is 300mV on the meter. This is the power level where the laser beam just starts to burn the lightest shades in the image onto the wood. We jog the Z axis there and then zero it.

Since our machine is not near the computer's keyboard, we use a USB mini number keypad to do the jogging. We need to position the laser beam on the lower left corner of the material. Since our Z starting burning point is 300mV, we jog back close to 0V on the meter to use the laser beam as an alignment pointer. We jog at large increments to get close to the corner of our piece of Poplar, then set the X&Y jog increment to .0100″.

Time to put the laser safety glasses on again. Turn the laser On(M03) and start jogging to the corner. When the laser beam projects on the corner, zero the X&Y axes there and turn the laser Off(M03). Now the Z axis can be jogged back to zero with 300mV on the meter.

Since our Wagner engraving size is 7.002″×5.25″, we need to jog X&Y to the starting zero point. For centering on our piece of Poplar, X needs to be jogged .249″ positive and Y needs to be jogged .125″ positive. Zero the X&Y axes again there.
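The centering offsets are simply half the leftover material in each direction:

```python
# Centering the 7.002" x 5.25" Wagner engraving on the 7.5" x 5.5" blank:
# the offset in each axis is half the leftover material.
board_w, board_h = 7.500, 5.500
image_w, image_h = 7.002, 5.250

x_offset = round((board_w - image_w) / 2, 3)   # jog X this much positive
y_offset = round((board_h - image_h) / 2, 3)   # jog Y this much positive
print(x_offset, y_offset)                      # 0.249 0.125
```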

Step 9: Laser Engraving 8bit Shades with Varied Intensity

We generated the Wagner laser engraving Gcode in PicLaser Lite, but we are going to add an enhancement to this engraving that our full image-to-Gcode software program, PicEngrave Pro 4, has as an option when generating the Gcode. This option is called FRC (Feed Rate Change).

Our best friend, John Champlain developed/invented this very unique Gcode process for engraving photos with laser intensity control and it has really made our “On-The-Fly” laser engravings stand out! A special thanks from us goes out to you John!

We wrote a stand-alone software program that does the same thing by taking the gcode already generated and adding a varied feedrate to the end of each line in the Gcode file, based on the Z axis min & max depths. It has a percentage setting to change the feedrate over the minimum-to-maximum depth range. It slows the feedrate down in darker areas when the laser power increases, and speeds it up in lighter areas when the laser power decreases, which allows us to expand the lighter and darker shade range. It really helps us fine tune our varied intensity laser engraving process.

For the Wagner Gcode we used a 60IPM feedrate with a 30% reduction at full depth. White shades will run at 60IPM and black shades will run at 42IPM. All shades in-between will vary in that 60–42 inches per minute range, based on the Z axis Gcode depths that control the intensity of the Laser Diode.
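The FRC idea can be sketched as a linear interpolation between the base feedrate and the reduced one (a simplified stand-in for the actual program, with `frc_feedrate` as a hypothetical helper):

```python
# Sketch of the FRC (Feed Rate Change) idea: the feedrate scales linearly
# with the Z depth that sets laser power, so darker shades burn slower.
def frc_feedrate(z_in, z_max=-0.0255, base_ipm=60.0, reduction=0.30):
    """z_in: the Z depth on this Gcode line (0 = white, z_max = black)."""
    frac = z_in / z_max                  # 0.0 for white .. 1.0 for black
    return base_ipm * (1.0 - reduction * frac)

# white runs at the full 60 IPM, black is reduced 30% to 42 IPM
```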

Step 10: 1bit TTL (pulsing) Laser Engraving

Here I will explain how to Laser Diode engrave a dithered black & white image using the standard TTL (pulsing) method. We will use the same Wagner image, but we will dither it first. PicEdit Lite has 11 different algorithms to choose from; we will use the “Atkinson” option and leave the default setting at .125 for this engraving.

Editing for size, sharpening, contrast, brightness and gamma must all be done first, and then saved. Then reopen the image file, select Atkinson, and then select the “Dither” button. You will notice after it's dithered that the image consists of a lot of dot patterns to give the illusion of shades. More dots are condensed in darker areas and fewer in lighter areas. This is the type of image commercial CO2 laser engraving machines require and use.

We brought this image into PicLaser Lite to generate the code, but this requires a slightly different minimum and maximum depth setting for the Z axis. We only want the laser to come ON in the black dot areas and stay OFF in the white areas, so a Z-.0015″ max is used this time. The MA3 magnetic shaft encoder goes from 0V to 5V and then back to 0V, and reversing the rotation direction changes this, so a different grbl setting is needed for TTL: Z.0000 in the Gcode gives 0V for white, and Z-.0015 gives full power with 5V for black. As the code is running, it pulses the laser ON and OFF to engrave the image this way.

PicSet allows us to save different grbl settings profiles, so we need to load the ones for TTL engraving. I went through the same routine of setting the focal point based on material height, except the jogging direction is reversed this time, and we needed to set the Z zero starting point a little differently.

We jogged Z until the voltmeter jumped to 5V, then jogged back in the other direction in .0001″ incremental moves until the voltmeter jumped back to 0V again. From there, we jogged .0005″ more in the same direction and then zeroed the Z axis.

For TTL modulation (not varied) Laser Diode drivers, the Z axis direction pin #7 can be used to pulse the Laser Diode with Z-up and Z-down moves in the Gcode, if $3= in 9g grbl is set properly. The Z negative direction in the Gcode will make pin 7 go high (5V) and turn the Laser Diode ON, and the Z positive direction will make pin 7 go low (0V) and turn the Laser Diode OFF. This pulses the laser ON and OFF, creating the black and white shade illusion with the Gcode generated from the dithered image.
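To illustrate the TTL scheme, here is a hypothetical Python sketch that emits one scan line of pulsing Gcode from a row of dithered pixels (the `ttl_row_gcode` helper name and feedrate are assumptions; PicLaser Lite generates this for you):

```python
# Hypothetical sketch of 1bit TTL Gcode: Z-.0015 pulses the diode ON for a
# black pixel and Z0.0000 turns it OFF for white, while X scans the row.
PIXEL_IN = 0.006   # Pixel Resolution from PicLaser Lite

def ttl_row_gcode(row, y_in, feed_ipm=60.0):
    """row: 0 (white) / 1 (black) pixels for one scan line."""
    g = [f"G0 X0.0000 Y{y_in:.4f} Z0.0000"]       # laser off at line start
    for i, px in enumerate(row):
        z = -0.0015 if px else 0.0                # laser ON only on black
        g.append(f"G1 X{(i + 1) * PIXEL_IN:.4f} Z{z:.4f} F{feed_ipm:.1f}")
    return g
```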

Be Careful, Be Safe, and Have Fun!!

Speech Controlled Quadropod

Picture of Speech Controlled Quadropod

This is my first post on Instructables and I am super excited to share my knowledge!

My original robot post is here: Spryo SpoonTail which is just my robot showing some tricks but with a tethered control.

In this Instructable I am going to show you how to make a quadropod of your own and use speech recognition via Bluetooth to control your bot.

Gotta thank Oddbot for introducing me to robotics and helping me on a lot of projects. I am gonna reference his Instructable in certain stages to make this one shorter (with his permission of course!).

Step 1: Understanding The Project

Picture of Understanding The Project

This project is not simple; in fact, if you haven't worked with Arduino and Python, this project will be quite challenging. But trust me, the end product is completely worth it. Additionally, you will discover how simple and powerful both Arduino and Python really are.

It is worth mentioning that the main reason I am making this Instructable is to show that remote control via Bluetooth using speech recognition is quite possible and ain't that hard. Also, if you already have a robot of your own, or even an RC car you have hacked, you can use this tutorial just for the speech recognition and control part.

Step 2: Getting Started

Picture of Getting Started
software_required.jpg

Hardware Required:

Software Required:

  • Arduino – programming the quadropod
  • Python 3.4 – communication
  • Eclipse or Geany – python interface
  • Bitvoicer ($5) – speech recognition or Google Speech Recognition via python (free but involves more coding)

Skills Recommended:

  • Arduino knowledge
  • Python language
  • Patience (a lot required)

Note:

I am not advertising anything here; I am just linking the products that I used for my project. You are free to use tools/software/products of your liking, or even make some of your own (think 3D printing).

Step 3: Chassis Assembly

Picture of Chassis Assembly
DSCF0527[1].jpg
DSCF0541[1].jpg

If you choose to use your own robot like an RC car, teddy bear or even a home automation device, feel free to skip this step.

Remember Oddbot, who I mentioned before? He's basically my virtual mentor. What you are going to do now is follow steps 1 to 7 from his Instructable – Playful Puppy Robot – or you could use this PDF Manual to assemble the chassis of the quadropod.

That’s it for this step.

Step 4: Wiring

Picture of Wiring
photo 2 (1).JPG
board[1].jpg
wiring_hc_board.jpg
photo 1 (2).JPG

Let's connect the servos first; they go in like this:

  • Front Left Hip servo – pin 46
  • Front Right Hip servo – pin 52
  • Rear Left Hip servo – pin 28
  • Rear Right Hip servo – pin 13
  • Front Left Knee servo – pin 47
  • Front Right Knee servo – pin 53
  • Rear Left Knee servo – pin 29
  • Rear Right Knee servo – pin 12

Note: The servos can be connected to whichever pins you find convenient; you just have to change the code to correspond to the changed pins.

Then the HC-06 bluetooth module (figure 4):

  • GND – GND
  • VCC – +5V
  • TXD – RX0 / D0 (Signal)
  • RXD – TX0 / D1 (Signal)

Figure 1: Shows the layout of the majestic DAGU Spider Board (note I drew the GND, 5V and Signal regions on the board for those of you who are slightly too lazy to go through the manuals, which I really, really recommend you do anyway).

Figure 2: Shows the splendid HC 06 Bluetooth module which will be our “RC” for this project. For this specific project, we will only be using the module to receive messages.

Figure 3: Wiring of the HC 06 to an Arduino UNO (in case you just want to test it with an UNO or similar board for your specific project).

Figure 4: Plugging the HC 06 module into the Spider Board (the GND and VCC connections are up to you, as long as they are in the same line).

Figure 5: What it should roughly look like with everything plugged in (don't worry about the shiny two ‘eyes’ on the front of my bot, they're just for show, and the servo on the back is for controlling his tail, but the tail broke 😦 will fix it soon).

Step 5: Coding

Arduino:

Launch Arduino and open a new sketch: File >> New or Ctrl + N. Unfortunately I do not have sufficient time to teach you the basics of Arduino, but you can look up some tutorials if you feel like it.

Copy-paste the following code (or just download the attached file). I owe Oddbot again for helping me out with the motion part of the code looooong, looong back.

The code:

/*  Title: Speech Controlled Quadropod
    Author: Adith Jagadish Boloor
    Comments: This code is free for you to use, modify or simply stare at when you are really bored.
*/

#include <Servo.h> // Servo Library

// Starting positions of the servos
#define FLHcenter      1500
#define FRHcenter      1500
#define RLHcenter      1520
#define RRHcenter      1450

#define FLKcenter      1150
#define FRKcenter      1220
#define RLKcenter      1220
#define RRKcenter      1210

// Here you can change servo pin numbers corresponding
// to your specific connections
#define FLHpin 46      // Front Left  Hip  servo pin
#define FRHpin 52      // Front Right Hip  servo pin
#define RLHpin 28      // Rear  Left  Hip  servo pin
#define RRHpin 13      // Rear  Right Hip  servo pin

#define FLKpin 47      // Front Left  Knee servo pin
#define FRKpin 53      // Front Right Knee servo pin
#define RLKpin 29      // Rear  Left  Knee servo pin
#define RRKpin 12      // Rear  Right Knee servo pin

// Holds the most recent command byte received over serial
char INBYTE;

// LEDs!!!!! If you choose to use 'em
int led1 = 27;
int led2 = 11;
int led3 = 45;
int led4 = 26;

// Some constants for the bot's motions, yes you may
// play with them to get the best performance
int Time = 90;
int LShift = 280;
int RShift = 280;
int Raise = 550;

// Define servos
Servo FLHservo;                                               
Servo FRHservo;
Servo RLHservo;
Servo RRHservo;
Servo FLKservo;
Servo FRKservo;
Servo RLKservo;
Servo RRKservo;

void setup()
{
  // Attach servos (digitally)
  FLHservo.attach(FLHpin); 
  FRHservo.attach(FRHpin);  
  RLHservo.attach(RLHpin);  
  RRHservo.attach(RRHpin);  
  FLKservo.attach(FLKpin);  
  FRKservo.attach(FRKpin);  
  RLKservo.attach(RLKpin);  
  RRKservo.attach(RRKpin); 
  
  // Set LED pins as outputs
  pinMode(led1, OUTPUT);
  pinMode(led2, OUTPUT);
  pinMode(led3, OUTPUT);
  pinMode(led4, OUTPUT);
  
   
  Serial.begin(9600);
  
  // Center servos
  FLHservo.writeMicroseconds(FLHcenter);
  FRHservo.writeMicroseconds(FRHcenter);  
  RLHservo.writeMicroseconds(RLHcenter);  
  RRHservo.writeMicroseconds(RRHcenter);  
  FLKservo.writeMicroseconds(FLKcenter+700);  
  FRKservo.writeMicroseconds(FRKcenter+700);  
  RLKservo.writeMicroseconds(RLKcenter+700);  
  RRKservo.writeMicroseconds(RRKcenter+700);
  delay(500);
  
  // Prints out the controls mapped to different letters
  Serial.println("wasd - movement");
  Serial.println("z - sleep");
  Serial.println("x - play dead");
  Serial.println("i - sit");
  Serial.println("u - stand");
  Serial.println("q - wake up");
}

void loop() {
  Serial.println("Ready to accept command");
  while (!Serial.available());   // stay here so long as COM port is empty   
  INBYTE = Serial.read();        // read next available byte
  delay(50);

  if (INBYTE == 'q') // wake up - flashes LEDs if you used them
  { 
    for (int i = 0; i < 4; i++)       // blink all four LEDs four times
    {
      digitalWrite(led1, LOW);
      digitalWrite(led2, LOW);
      digitalWrite(led3, LOW);
      digitalWrite(led4, LOW);
      delay(200);
      digitalWrite(led1, HIGH);
      digitalWrite(led2, HIGH);
      digitalWrite(led3, HIGH);
      digitalWrite(led4, HIGH);
      if (i < 3) delay(200);          // same timing as the unrolled original
    }
  }
  else if (INBYTE == 'z') //Sleeps
  {
    FLHservo.writeMicroseconds(FLHcenter); 
    FRHservo.writeMicroseconds(FRHcenter);  
    RLHservo.writeMicroseconds(RLHcenter);  
    RRHservo.writeMicroseconds(RRHcenter);  
    FLKservo.writeMicroseconds(FLKcenter+700);  
    FRKservo.writeMicroseconds(FRKcenter+700);  
    RLKservo.writeMicroseconds(RLKcenter+700);  
    RRKservo.writeMicroseconds(RRKcenter+700);
    delay(1000);
    for (int i = 0; i < 3; i++)       // blink all four LEDs three times
    {
      digitalWrite(led1, HIGH);
      digitalWrite(led2, HIGH);
      digitalWrite(led3, HIGH);
      digitalWrite(led4, HIGH);
      delay(200);
      digitalWrite(led1, LOW);
      digitalWrite(led2, LOW);
      digitalWrite(led3, LOW);
      digitalWrite(led4, LOW);
      delay(200);
    }
  }
  else if (INBYTE == 'u') // stand Up
  {
    // Center servos
    FLHservo.writeMicroseconds(FLHcenter); 
    FRHservo.writeMicroseconds(FRHcenter);  
    RLHservo.writeMicroseconds(RLHcenter);  
    RRHservo.writeMicroseconds(RRHcenter);  
    FLKservo.writeMicroseconds(FLKcenter);  
    FRKservo.writeMicroseconds(FRKcenter);  
    RLKservo.writeMicroseconds(RLKcenter);  
    RRKservo.writeMicroseconds(RRKcenter); 
    delay(1000);
  }
  else if (INBYTE == 'x') // plays dead X
  {
    FLHservo.writeMicroseconds(FLHcenter);
    FRHservo.writeMicroseconds(FRHcenter);  
    RLHservo.writeMicroseconds(RLHcenter);  
    RRHservo.writeMicroseconds(RRHcenter);  
    FLKservo.writeMicroseconds(FLKcenter+700);  
    FRKservo.writeMicroseconds(FRKcenter+700);  
    RLKservo.writeMicroseconds(RLKcenter+700);  
    RRKservo.writeMicroseconds(RRKcenter+700);
    delay(1000);
  }
  else if (INBYTE == 'i') //sits
  {
    FLHservo.writeMicroseconds(FLHcenter);
    FRHservo.writeMicroseconds(FRHcenter);  
    RLHservo.writeMicroseconds(RLHcenter-600);  
    RRHservo.writeMicroseconds(RRHcenter+600);  
    FLKservo.writeMicroseconds(FLKcenter);  
    FRKservo.writeMicroseconds(FRKcenter);  
    RLKservo.writeMicroseconds(RLKcenter+700);  
    RRKservo.writeMicroseconds(RRKcenter+700);
  }
  else if (INBYTE == 'h') //shake Hands
  {
  FLHservo.writeMicroseconds(FLHcenter-600);
  FRHservo.writeMicroseconds(FRHcenter);  
  RLHservo.writeMicroseconds(RLHcenter-600);  
  RRHservo.writeMicroseconds(RRHcenter+600);  
  FLKservo.writeMicroseconds(FLKcenter+600);  
  FRKservo.writeMicroseconds(FRKcenter);  
  RLKservo.writeMicroseconds(RLKcenter-700);  
  RRKservo.writeMicroseconds(RRKcenter+700);
  delay(1000);
  FLKservo.writeMicroseconds(FLKcenter+300);
  delay(200);
  FLKservo.writeMicroseconds(FLKcenter+600);
  delay(200);
  FLKservo.writeMicroseconds(FLKcenter+300);
  delay(200);
  FLKservo.writeMicroseconds(FLKcenter+600);
  delay(200);
  }
  else if(INBYTE == 'w') // run forward
  {
  LShift=300;
  RShift=300;
  for(int i=0;i<5;i++)
  {
    Run();
  }
  }
  else if(INBYTE == 'a') // turn left
  {
  LShift=-300;
  RShift=300;
  for(int i=0;i<3;i++)
  {
    Run();
  }
  }
  else if(INBYTE == 'd') // turn right
  {
  LShift=300;
  RShift=-300;
  for(int i=0;i<3;i++)
  {
    Run();
  }
  }
  else if(INBYTE == 's') // run back
  {
  LShift=-300;
  RShift=-300;
  for(int i=0;i<5;i++)
  {
    Run();
  }
  }
}

void Run()
{
  FRKservo.writeMicroseconds(FRKcenter+Raise);         // raise front right leg 
  RLKservo.writeMicroseconds(RLKcenter+Raise-10);      // raise rear  left  leg
  FLHservo.writeMicroseconds(FLHcenter+LShift);        // move  front left  leg backward
  RRHservo.writeMicroseconds(RRHcenter-RShift);        // move  rear  right leg backward
  delay(Time/2);
  FRHservo.writeMicroseconds(FRHcenter+RShift);        // move  front right leg forward
  RLHservo.writeMicroseconds(RLHcenter-LShift);        // move  rear  left  leg forward
  delay(Time);
  FRKservo.writeMicroseconds(FRKcenter);               // lower front right leg 
  RLKservo.writeMicroseconds(RLKcenter);               // lower rear  left  leg
  delay(Time);
  
  FLKservo.writeMicroseconds(FLKcenter+Raise);         // raise front left  leg
  RRKservo.writeMicroseconds(RRKcenter+Raise);         // raise rear  right leg
  FRHservo.writeMicroseconds(FRHcenter-RShift);        // move  front right leg backward
  RLHservo.writeMicroseconds(RLHcenter+LShift);        // move  rear  left  leg backward
  delay(Time/2);
  FLHservo.writeMicroseconds(FLHcenter-LShift);        // move  front left  leg forward
  RRHservo.writeMicroseconds(RRHcenter+RShift);        // move  rear  right leg forward
  delay(Time);
  FLKservo.writeMicroseconds(FLKcenter);               // lower front left  leg
  RRKservo.writeMicroseconds(RRKcenter);               // lower rear  right leg
  delay(Time);  
}

Just before you can upload the code, do the following:

Open Device Manager and look up Ports (COM & LPT). Note the active port for your Arduino. It might say USB Serial Port (COM3); for me the COM port is 3. THIS MIGHT BE DIFFERENT FOR YOU.

Tools >> Board >> Arduino Mega or Mega 2560 (image 2)

Tools >> Processor >> ATMega 1280 (image 3)

Tools >> Port >> COM3 (3 for my case only!!) (image 4)

Now you can hit the upload button, and if all goes smoothly it should upload successfully; the robot should be on its belly with four legs stretched out symmetrically and a steady LED blinking on the HC 06.

If you get this error while uploading:

avrdude: stk500_getsync(): not in sync: resp=0xe0

Just disconnect the HC-06 module, upload the code, and hook it back up.

Okay, I realized something while I was testing and writing this tutorial: you don't need both BitVoicer and Python, you just need one of them. BitVoicer does everything for you and its speech recognition engine is quite powerful, but it costs $5. Python's speech recognition uses the Google Speech API, so it is free but can sometimes give some really weird results and involves more coding. The choice is yours.

Bluetooth

Okay, just before we move on to the speech recognition bit via BitVoicer or Python, let's link the Bluetooth of your PC to the HC 06 of your bot. Make sure the Bluetooth on your PC works before proceeding, because it took me 5 hours to realize that my laptop didn't come with inbuilt Bluetooth and I had to order a module.

Now power up your bot either through USB or the battery pack. OMG when does this tutorial end!!!! Patience! You’re almost there.

  1. Right Click on Bluetooth Devices icon in the tray bar and click add new device.
  2. Let it search and search and search until HC-06 pops up.
    If it doesn't show up after some time, try checking if you can find the device using your mobile phone (every phone except Apple phones should be able to detect it). If you can't, you know that there's something wrong with your HC-06 and you need to fix it by checking the connections.
  3. Double click it.
  4. On the next dialogue box select Enter the device’s pairing code and type in 1234. It should then install the HC 06 to your computer.
  5. Now, this part is important: open up Device Manager (Start >> Device Manager).
  6. Go to Ports (COM & LPT) and note down the COM ports used by the Bluetooth (COM10 and COM21 in my case).

BitVoicer (Option 1)

You can download BitVoicer from here.

First things first: if you are not an American English speaker, you might want to change the language/accent that BitVoicer will try to recognize. To do this, go to File >> Additional Languages and install your preferred language/accent.

Now we are ready to add some commands. For starters we are going to add just two commands namely stand up and sit down. You can edit them or add more commands later. The steps are as follows:

  1. Create a new Schema by going to File >> New.
  2. Change the default command data type to char.
  3. Click Add New Sentence
  4. Type in “stand up” (no quotes)
  5. Click Add New Sentence
  6. Type in “sit down” (again no quotes)
  7. Now, under Sentence Anagrams, change the data type of both commands to char if it isn't already.
  8. For stand up type the command as u and for sit down type i.
  9. Your screen should look like Image 5.
  10. Save your schema in a directory that is convenient for you.
  11. Now open up preferences (File >> Preferences) and fill in the settings. The only three things that will be different for you are the Default Output Folder, the Speech Recognition Language, and the Port Name.
  12. The Port Name corresponds to the COM port your Bluetooth uses, which you found in step 6 of the Bluetooth section. Try the lower number first.
  13. Hit Save.
  14. Go to the next step.
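Whichever option you pick, the PC side ultimately just sends a single character over Bluetooth and the Arduino reacts to it. Here is a minimal sketch of that phrase-to-character mapping (the dict and helper names are mine, for illustration only):

```python
# Phrase-to-command mapping from the schema above: the bot only ever
# receives a single character ('u' or 'i') over Bluetooth.
COMMANDS = {
    "stand up": "u",
    "sit down": "i",
}

def to_command(phrase):
    # Normalize the recognized phrase; unknown phrases send nothing
    return COMMANDS.get(phrase.lower().strip())

print(to_command("Stand Up"))  # the character the bot would receive
```

Adding a new trick later is just a matter of adding a new phrase/character pair here and a matching case in the Arduino sketch.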

Python (Option 2)

First, download Python (for this tutorial I am going to use version 3.4) from https://www.python.org/download/releases/3.4.1/. Why 3.4 and not 2.7? Well, the speech recognition library we are using works only on the 3.4 version of Python (ya, I know that sucks, but hey, 3.4 > 2.7, right?).

Install it into C:\Python34 for convenience's sake. Next, download and install Eclipse, which is an IDE you can use to write your Python code. You can download it from http://www.eclipse.org/downloads/. Install it into C:\eclipse.

If you get this error while running it…

A Java Runtime Environment (JRE) or Java Development Kit (JDK) must be available in order to run Eclipse. No Java virtual machine was found after searching the following locations:
C:\eclipse\jre\bin\javaw.exe
javaw.exe in your current PATH

Go to the first answer in http://stackoverflow.com/questions/2030434/eclipse-no-java-jre-jdk-no-virtual-machine

www.stackoverflow.com is the best place to find help with any Python issue.

Use this tutorial to install Python into Eclipse, except instead of Python 2.6 you should choose Python 3.4. Alternatively, use this tutorial to install Python into Geany, if you chose to use Geany instead; I hear it is much nicer than Eclipse, but I haven't had enough time to migrate to it. Python's extra functionality comes from packages called 'libraries', so let's go ahead and install the ones we need for this project. For those of you who already know how to install libraries, just install the listed ones and go on to the code.

Libraries Required:

  • PyBluez 0.20 – Bluetooth (setup)
  • PyAudio 0.2.8 – provides bindings for speech recognition (setup)
  • SpeechRecognition 1.1.0 – speech recognition (download)

PyBluez and PyAudio can be installed by downloading the "….win32-py3.4.exe" versions from the links above and simply running the executable (setup) file.

The SpeechRecognition library is a bit tricky if you're new to Python. Open the 'download' link above and click Downloads. Download "SpeechRecognition-1.1.0.tar.gz (md5)". Extract all the CONTENTS of the downloaded archive (not the folder that says SpeechRecognition-1.1.0, but what's in it) into the Python 3.4 directory (C:\Python34), using WinRAR or any extractor you feel like using. If a dialogue box appears asking whether to replace existing files, click Yes to All. Now navigate to the C: drive and, while holding Shift, right-click the folder that says Python34 and click "Open command window here". Type in "python setup.py install" (without quotes) and hit Enter. Now you should have all the required libraries installed. Yay!

Open eclipse.

Under the PythonTest folder from the tutorial, create a new file called Speech_Controlled_Quadrapod.py, the same way you created the pytest.py file in the tutorial. Now paste or download…

The code:

# Title: Speech Controlled Quadrapod
# Author: Adith Jagadish
# Description: Uses Google's Speech Recognition API and python bluetooth library to control a quadrapod 
# How to: Say "stand up" to make bot stand up and say "sit down" to make it sit. 
# Say "stop listening to terminate the program"

import bluetooth
import speech_recognition as sr
   
port = 1
bd_addr = "xx:xx:xx:xx:xx:xx"
sock = bluetooth.BluetoothSocket( bluetooth.RFCOMM )
sock.connect((bd_addr, port))
r = sr.Recognizer()
r.pause_threshold = 0.6
r.energy_threshold = 400
MESSAGE = "continue"
print("Ready to accept command...\n")
while (MESSAGE != "stop listening"):
    while (MESSAGE == "continue"):
        try:
            # use the default microphone as the audio source
            with sr.Microphone() as source:
                # listen for the first phrase and extract it into audio data
                audio = r.listen(source,timeout = None)
            # recognize speech using Google Speech Recognition
            # (call recognize() once and reuse the result, so we don't
            # hit the API twice for the same audio)
            result = str(r.recognize(audio))
            print("Me: " + result)
            MESSAGE = result
        except LookupError:
            # speech is unintelligible                            
            MESSAGE = "continue"
            print("Could not understand audio")
        if(MESSAGE == "sit down"):
            sock.send("i")
        elif(MESSAGE == "stand up"):
            sock.send("u")
        elif(MESSAGE == "stop listening"):
            break
        MESSAGE = "continue"
sock.close()
print("\nProgram terminated.")

The only change you need to do is in the line:

bd_addr = "xx:xx:xx:xx:xx:xx"

Right click on the Bluetooth icon in the tray and click "Show Bluetooth Devices". Right click on HC-06 and click Properties. Go to the Bluetooth tab, copy the unique identifier number, and paste it in place of xx:xx:xx:xx:xx:xx.
You are good to go on to testing.
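Since it's easy to mistype the address, here is a small optional check (the helper name is hypothetical, not part of the project code) that verifies the string you pasted has the usual six colon-separated hex pairs before you run the main script:

```python
import re

# Hypothetical helper: check that a Bluetooth address string has the
# standard six colon-separated hex-pair shape, e.g. 98:D3:31:FB:19:AA
def looks_like_bt_addr(addr):
    return bool(re.fullmatch(r"([0-9A-Fa-f]{2}:){5}[0-9A-Fa-f]{2}", addr))

print(looks_like_bt_addr("98:D3:31:FB:19:AA"))  # a well-formed address
print(looks_like_bt_addr("xx:xx:xx:xx:xx:xx"))  # the untouched placeholder fails
```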

Step 6: Testing and Debugging

Picture of Testing and Debugging

Well, there's not much to say here beyond what you'd expect.

Testing:

  1. Power up the quadrapod or your robot.
  2. Turn on your Bluetooth if you haven’t done so already on your computer.
  3. Make sure your microphone is working.
  4. Launch BitVoicer or Python, whichever you decided to use.
  5. Run the program.
  6. Say “stand up” or “sit down” as clearly as you can and see the magic unfold!
  7. Voila!

Now that you know the basics you can add more lines of code to make your bot do even more cool stuff!

If you weren't lucky enough to get it working on the first go, it's okay! It happens to the best of us.

Debugging:

  1. Try Googling your issue.
  2. Make sure your wiring is correct.
  3. Make sure your code is copied correctly.
  4. If it still doesn’t work, post a comment here and I’ll try my best to help you out!

3D Printed Microcontroller Dice Roller

Picture of 3D Printed Microcontroller Dice Roller
demo_gif.gif

This is a just-for-fun project I did in the Digilent MakerSpace. Usually whenever I play board games I use a dice-rolling smartphone app since dice are so easy to lose. I thought I would try making my own hardware version though. It turned out really well and was a lot of fun. In this Instructable I will go over how I designed the circuitry, wrote the code for the microcontroller, made the housing, and put it all together!

I entered this project in several contests on Instructables. If you like this project, please vote for it by clicking the “Vote!” ribbon in the top right corner.

Note: If you’re viewing this on the Instructables App, there may be some formatting issues with lists and images. I would recommend viewing it on a computer. If you only have access to a mobile device though, you can view it in your mobile browser just fine. It’s only on the App that it has formatting issues.

Step 1: Materials

Picture of Materials

Here’s a list of the parts I used for this project.

  • chipKIT uC32 Microcontroller and USB cable
    • You can also use a lot of other microcontrollers such as the Arduino Uno, or chipKIT Uno32
  • 7 green 5mm LEDs
  • Push button (approx. 6mm wide)
  • 2 zUNO clips to hold the microcontroller board
    • You can order some or even 3D print your own zUNO clips for free at the link above.
  • 3D printed housing (file below)
    • Not everyone has access to a 3D printer, but there are plenty of other options here. A thick card stock or paperboard will work perfectly. Also, if you want a 3D printed version and don’t have access to a printer, there are several companies (such as i.materialise, Sculpteo, Ponoko, and Shapeways) that will print a design for you and mail it to you.
  • For putting everything together you just need some basic tools: wire cutters, a soldering iron, and some glue (hot glue and super glue both work well).

My 3D print design can be found at this link. From this page you can order a print from one of the companies I mentioned above. Or, if you will be printing it yourself, you can download the file here too. Tinkercad will also allow you to make a copy of the file so you can make your own edits. You can make changes to the design to use a different size of LEDs or a different button. You can also make it bigger and even add text or logos!
Now onto the design. I will start by explaining the circuitry in the next step. After that, I will talk about programming the microcontroller. And last but not least, putting it all together!

Note: If you are going to be printing the housing, I would start that now as you continue through the remaining steps. Then the print will be done when you’re ready to put everything together.

Step 2: The Circuit

Picture of The Circuit

The circuit for this project is very simple. We just need a power and ground connection to the LEDs and the button wired appropriately. Note: the “power” to the LEDs comes from the microcontroller’s Digital I/O pins.

I included a schematic view of the uC32 so you can see how everything will eventually be connected to the board.

Step 3: Programming the Microcontroller

I wrote my code in MPIDE. MPIDE is completely free and can be downloaded here. Note: MPIDE can program just about any microcontroller, including all chipKIT and Arduino boards. If you're using an Arduino board and already have Arduino's IDE, that will work just as well.

If you’re brand new to microcontrollers and programming in an IDE, here is a quick tutorial that explains how to install MPIDE and use it to communicate with your board: Intro to MPIDE.

The code I wrote for the microcontroller can be seen below. There are three main portions to this “sketch”:

  • The roll() function which makes it look as if the die is actually rolling. This is a function I created myself and it is included in the code below.
  • Generating a random number between 1 and 6 with the rand() function. (This function is predefined in the stdlib.h header file.)
  • And a series of if statements to light up the correct LEDs based on the random value that is generated.

This code can be copied directly from the box below and pasted into your IDE. Then you can upload it directly to your board from there. If you’re unsure how to upload the sketch, check out the “Intro to MPIDE” tutorial linked above.

#include <stdlib.h> // needed to use the rand() function

// Initializing the pin used for button input
const int btnPin = 26;

// Initializing pins to control each LED
const int led_left_top = 33;     // LED in top left corner of die
const int led_left_center = 32;  // LED on the left side of the die in the center row
const int led_left_bottom = 31;  // LED in the bottom left corner of the die
const int led_right_top = 30;    // LED in the top right corner of the die
const int led_right_center = 29; // LED on the right side of the die in the center row
const int led_right_bottom = 28; // LED in the bottom right corner of the die
const int led_center = 27;       // LED in the center of the die

// Creating a function to simulate the action of a die rolling
// Function will make LEDs flicker through numbers 1-6
void roll()
{
  int i = 0; // Counting variable for loop
  
  for (i = 0; i < 5; i++) // cycle 5 times
  {
    // Display 1
    digitalWrite(led_left_top, LOW);
    digitalWrite(led_left_center, LOW);
    digitalWrite(led_left_bottom, LOW);
    digitalWrite(led_right_top, LOW);
    digitalWrite(led_right_center, LOW);
    digitalWrite(led_right_bottom, LOW);
    digitalWrite(led_center, HIGH);
    delay(50); // wait 50ms before displaying next value
    
    // Display 2
    digitalWrite(led_left_top, HIGH);
    digitalWrite(led_left_center, LOW);
    digitalWrite(led_left_bottom, LOW);
    digitalWrite(led_right_top, LOW);
    digitalWrite(led_right_center, LOW);
    digitalWrite(led_right_bottom, HIGH);
    digitalWrite(led_center, LOW);
    delay(50);
    
    // Display 3
    digitalWrite(led_left_top, HIGH);
    digitalWrite(led_left_center, LOW);
    digitalWrite(led_left_bottom, LOW);
    digitalWrite(led_right_top, LOW);
    digitalWrite(led_right_center, LOW);
    digitalWrite(led_right_bottom, HIGH);
    digitalWrite(led_center, HIGH);
    delay(50);
    
    // Display 4
    digitalWrite(led_left_top, HIGH);
    digitalWrite(led_left_center, LOW);
    digitalWrite(led_left_bottom, HIGH);
    digitalWrite(led_right_top, HIGH);
    digitalWrite(led_right_center, LOW);
    digitalWrite(led_right_bottom, HIGH);
    digitalWrite(led_center, LOW);
    delay(50);
    
    // Display 5
    digitalWrite(led_left_top, HIGH);
    digitalWrite(led_left_center, LOW);
    digitalWrite(led_left_bottom, HIGH);
    digitalWrite(led_right_top, HIGH);
    digitalWrite(led_right_center, LOW);
    digitalWrite(led_right_bottom, HIGH);
    digitalWrite(led_center, HIGH);
    delay(50);
    
    // Display 6
    digitalWrite(led_left_top, HIGH);
    digitalWrite(led_left_center, HIGH);
    digitalWrite(led_left_bottom, HIGH);
    digitalWrite(led_right_top, HIGH);
    digitalWrite(led_right_center, HIGH);
    digitalWrite(led_right_bottom, HIGH);
    digitalWrite(led_center, LOW);
    delay(50);
  }
} // End of roll function

// Initializing variable to represent die value
int die_value = 0;

void setup () // configure microcontroller pins
{
  //Set button pin for input
  pinMode(btnPin, INPUT);
  
  // Set LED pins to output
  pinMode(led_left_top, OUTPUT);
  pinMode(led_left_center, OUTPUT);
  pinMode(led_left_bottom, OUTPUT);
  pinMode(led_right_top, OUTPUT);
  pinMode(led_right_center, OUTPUT);
  pinMode(led_right_bottom, OUTPUT);
  pinMode(led_center, OUTPUT);
}

void loop ()
{
  // Check for button press
  if (digitalRead(btnPin) == HIGH) // if button was pressed
  {
    roll(); // Simulate die roll
    die_value = (rand() % 6) + 1; // Randomly select die value: generates a number between 1 and 6
    
    if (die_value == 1) // only center LED is lit
    {
      digitalWrite(led_left_top, LOW);
      digitalWrite(led_left_center, LOW);
      digitalWrite(led_left_bottom, LOW);
      digitalWrite(led_right_top, LOW);
      digitalWrite(led_right_center, LOW);
      digitalWrite(led_right_bottom, LOW);
      digitalWrite(led_center, HIGH); 
    }
    else if (die_value == 2) // only the top left and bottom right LEDs are lit
    {
      digitalWrite(led_left_top, HIGH);
      digitalWrite(led_left_center, LOW);
      digitalWrite(led_left_bottom, LOW);
      digitalWrite(led_right_top, LOW);
      digitalWrite(led_right_center, LOW);
      digitalWrite(led_right_bottom, HIGH);
      digitalWrite(led_center, LOW);
    }
    else if (die_value == 3) // the top left, bottom right, and center LEDs are lit
    {
      digitalWrite(led_left_top, HIGH);
      digitalWrite(led_left_center, LOW);
      digitalWrite(led_left_bottom, LOW);
      digitalWrite(led_right_top, LOW);
      digitalWrite(led_right_center, LOW);
      digitalWrite(led_right_bottom, HIGH);
      digitalWrite(led_center, HIGH);
    }
    else if (die_value == 4) // the four LEDs on the corners are lit
    {
      digitalWrite(led_left_top, HIGH);
      digitalWrite(led_left_center, LOW);
      digitalWrite(led_left_bottom, HIGH);
      digitalWrite(led_right_top, HIGH);
      digitalWrite(led_right_center, LOW);
      digitalWrite(led_right_bottom, HIGH);
      digitalWrite(led_center, LOW);
    }
    else if (die_value == 5) // the 4 corner LEDs and the center LED are lit
    {
      digitalWrite(led_left_top, HIGH);
      digitalWrite(led_left_center, LOW);
      digitalWrite(led_left_bottom, HIGH);
      digitalWrite(led_right_top, HIGH);
      digitalWrite(led_right_center, LOW);
      digitalWrite(led_right_bottom, HIGH);
      digitalWrite(led_center, HIGH);
    }
    else // die_value == 6 and all LEDs are lit except the center one
    {
      digitalWrite(led_left_top, HIGH);
      digitalWrite(led_left_center, HIGH);
      digitalWrite(led_left_bottom, HIGH);
      digitalWrite(led_right_top, HIGH);
      digitalWrite(led_right_center, HIGH);
      digitalWrite(led_right_bottom, HIGH);
      digitalWrite(led_center, LOW);
    }
    delay(4000); // displays die value for 4000ms, or 4 seconds
  }
  else // LEDs stay off if button has not been pressed
  {
    digitalWrite(led_left_top, LOW);
    digitalWrite(led_left_center, LOW);
    digitalWrite(led_left_bottom, LOW);
    digitalWrite(led_right_top, LOW);
    digitalWrite(led_right_center, LOW);
    digitalWrite(led_right_bottom, LOW);
    digitalWrite(led_center, LOW);
  }
}
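As a sanity check on the series of if statements in the sketch, the face-to-LED mapping can be modeled in a few lines of Python (a model I wrote for illustration; the position names mirror the variables in the sketch). A convenient property falls out: each face lights exactly as many LEDs as its pip count.

```python
import random

# Model of the sketch's face-to-LED mapping, using the same position
# names as the Arduino code (led_left_top, led_center, etc.).
FACES = {
    1: {"center"},
    2: {"left_top", "right_bottom"},
    3: {"left_top", "right_bottom", "center"},
    4: {"left_top", "left_bottom", "right_top", "right_bottom"},
    5: {"left_top", "left_bottom", "right_top", "right_bottom", "center"},
    6: {"left_top", "left_center", "left_bottom",
        "right_top", "right_center", "right_bottom"},
}

def roll_die():
    # Equivalent of the sketch's (rand() % 6) + 1
    return random.randint(1, 6)

# Every face lights exactly as many LEDs as its value shows
for value, leds in FACES.items():
    assert len(leds) == value
```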

Step 4: Putting It All Together

This is the part that I think is the most fun! For one, I love soldering. And it’s really awesome getting to see the design come together.

Here’s a brief summary of what we need to do:

  • Place the LEDs in the top piece of the housing.
  • Solder all the connections to the LEDs.
  • Place the button in the square hole of the front panel of the housing.
  • Solder the button connections.
  • Attach the zUNO clips to the base of the housing.
  • Assemble the cube and attach it to the base.

I started by placing all of the LEDs and soldering their connections. For the housing I printed, the LEDs fit perfectly in the holes such that they stayed tightly in place without any glue. Depending on your printer you may need a little bit of glue to help them stay in place.

For soldering the LEDs, it is very important that the LEDs are connected with the correct orientation. The anodes of the LEDs will be connected to the digital I/O pins of the microcontroller, and the cathodes of the LEDs will be connected to Ground.

As you can see in my picture, I connected all of the cathodes together with blue wires. Ground connections do not have to be specific, so we can connect all of the cathodes together and then connect that group to ground with a single wire (the black one in the image). This makes it a lot simpler than having seven separate ground wires. After connecting all the cathodes together and to a single ground wire, I connected a separate wire to each of the anodes of the different LEDs. These are the red wires in the picture. As you can see, I used a marker to put some tally marks on each one. This makes it easier to keep track of which wire will be connected to each pin of the microcontroller.


Next, I put the button in the square hole of the front panel. This required a little bit of hot glue on the back to hold it in place. The picture above doesn’t show my nicest looking work, but when we put the cube together this side won’t be seen.

Recall from the schematic that only 3 of the 4 pins on the button are connected. One is connected to ground through a 220Ω resistor. Another pin is connected to the microcontroller’s 5V power source. The other pin is connected to pin 26 of the microcontroller. Pin 26 reads whether the button has been pressed or not. You can see how I soldered the wires in the picture above. Notice that I labeled the wires with one dot, two dots, and a dash. This is to keep track of what wire goes to which pin on the microcontroller. One dot is for the wire that goes to pin 26. Two dots is for the wire that connects to the microcontroller’s 5V supply. The dash is for the wire that connects to the microcontroller’s GND.

For the 220Ω resistor at the ground connection, I just put one end of a resistor into the end of my connector. The exposed pin of the 220Ω resistor will be connected directly to the GND pin of the microcontroller. If you aren’t using MTE cables like I am, you can just solder a resistor to the other end of your wire.


The next step is to attach the zUNO clips to the base. The holes on the zUNO clips were a little too big for the pins on the base, but a little bit of hot glue over the joint makes it fit well and stay in place.


Now let’s put the whole cube together. Use glue as necessary to hold the sides of the cube together. Also try to keep the wires lined up nicely so they will all be coming out of the front panel when we place the cube on the base.


All that’s left now is to put the cube on the base! I didn’t need any glue to hold the cube in place because the cube fit into the holes of the base perfectly. But use glue as needed to keep the cube attached. The final result is shown below! Move on to the next step to see it in action.

Step 5: The Final Result!

Lastly, mount the microcontroller onto the zUNO clips and plug in all the wires.

The pin numbers that each wire needs to connect to are shown in the code in step 3, but to review:

  • The LEDs are connected to pins 27-33.
  • The ground wire of the LEDs is connected to the microcontroller’s GND.
  • The wire that carries the button’s state (whether it’s been pressed or not) is connected to pin 26.
  • Another wire of the button is connected to the microcontroller’s 5V supply, and the button’s last wire is connected to the microcontroller’s GND through a 220Ω resistor.

Then just upload the sketch from the IDE to your microcontroller and it’s ready to go! Check it out in the video above!

Please don’t hesitate to ask any questions if you have any!

How to make a Playful Puppy Robot

Picture of Playful Puppy Robot
This guide will show you how to make a surprisingly simple puppy robot in just 3-4 hours.
No soldering is required and it is Arduino compatible for easy programming.
The kit with all parts except batteries is now being sold!

The controller comes with an ATmega8 but this is easily upgradeable if required.
The sample code provided allows the puppy to sit, shake hands, lie down, beg and walk on its hind legs.
This robot is fully interactive and is controlled by hand gestures.

Required tools:
– #1 Phillips screwdriver.
– Small (2.4mm or 3/32 inch) flat head screwdriver.
– Long nose pliers

Required parts:
(Available around the world from online stores that sell DAGU products)

– QuadBot chassis kit.
– Pan / tilt kit.
– IR compound eye.
– Magician robot controller.
– Pack of jumper wires (male to male and female to female).
– 1x 1N5400 3A rectifier diode
– 4x small servo extension cables 100mm (4 inch).
– 2x 25mm (1 inch) brass hex spacers with suitable mounting screws.
– Small cable ties and two larger cable ties to hold the battery holder.

NOTE: The robot works best with a 7.4V 2300mAh battery as shown in the videos, but it can also work with 6x NiMH AA batteries in a 3×2 battery holder. There are two variations of the sample code depending on which battery you use. The only difference is that the NiMH batteries are heavier, so the robot needs different code to balance on its hind legs.
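For a rough sense of how the two battery options compare, here is the nominal pack-voltage arithmetic (per-cell figures are standard nominal values, not measurements from this kit):

```python
# Nominal per-cell voltages (standard chemistry figures, assumed)
NIMH_V = 1.2      # rechargeable NiMH AA cell
ALKALINE_V = 1.5  # alkaline AA cell (too high for these servos)
cells = 6

nimh_pack = round(cells * NIMH_V, 1)          # lands close to the 7.4 V LiPo
alkaline_pack = round(cells * ALKALINE_V, 1)  # overshoots it considerably
print(nimh_pack, alkaline_pack)
```

This is part of why the later battery step warns against alkaline cells: six of them sit well above the 7.4 V the servos and sample code expect.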

Step 1: Center your servos

Picture of Center your servos
The QuadBot chassis kit includes 8x miniature servos and the Pan/Tilt kit includes 2 miniature servos. These servos have clear cases so you should be able to see through them well enough to check that they are centered.

When the servo is centered it can physically turn the same amount in each direction (approximately 90 degrees). Before you begin assembly check that all 10 of your servos are centered. This will save you time in later steps. Once the robot is assembled, the center position of each servo can be refined in the software if required.
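For reference, "centered" corresponds to the servo's neutral control pulse. The sketch below uses typical hobby-servo timing (generic assumed values, not figures from this kit's datasheet) to show how pulse width maps to shaft angle:

```python
# Typical hobby-servo convention (assumed): a ~1000-2000 microsecond
# pulse sweeps roughly -90 to +90 degrees, so 1500 us is the center.
def pulse_to_angle(pulse_us, min_us=1000, max_us=2000):
    span = max_us - min_us
    return (pulse_us - min_us) / span * 180.0 - 90.0

print(pulse_to_angle(1500))  # the centered position
print(pulse_to_angle(2000))  # full deflection one way
```

Refining a servo's center in software, as mentioned above, amounts to nudging that neutral pulse width a little in one direction.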

Step 2: Attach the foam rubber feet

Picture of Attach the foam rubber feet
Instructions are included with the QuadBot chassis and can also be downloaded from my DAGU product support site. I will go through them in detail here.

There are 8 identical leg segments included in the kit. Select 4 of them and attach foam rubber feet using the 3 x 12mm panhead screws. There are only 4 of these screws and they are the largest so they are easy to find.

NOTE:
Self-tapping screws have a coarse thread and a tapered tip that can drill into soft materials like plastic and wood.

Pan head screws have an extra-large head, the equivalent of using a separate washer and screw. The pan head screws in this kit are also self-tapping screws.

Machine screws have a fine thread and are designed to screw into machined parts where a thread is already present.

Step 3: Mount knee servos to legs

Picture of Mount knee servos to legs
Mount 4 of your servos on the leg segments with the foam rubber feet. Use 2.3 x 8mm self tapping screws.
Make sure the output shaft is away from the foot as shown in the photo.

Step 4: Mount servo horns on the thigh segments

Picture of Mount servo horns on the thigh segments
A servo horn is a piece of plastic that fits onto the output shaft of the servo and allows you to attach control rods or other devices using self tapping screws.

Take the 4 unused leg segments and fit servo horns as shown in the photo below using 2 x 6mm self tapping screws. There are only 8 of these; use two for each servo horn.

Step 5: Attach your thigh segment to your legs

Picture of Attach your thigh segment to your legs
Once you have mounted the servo horns on the thigh segments they can be attached to your knee servos using 4 of the 8 supplied 2 x 8mm pan head screws.  With your knee servo centered the thighs should be at 90 degrees to the leg segment with the foam rubber foot.

Step 6: Left and Right legs

Picture of Left and Right legs
So far all of your legs should look identical. Now we will separate them into left and right legs. The only difference between a left leg and a right leg is the way in which the thigh servo is mounted.

Mount your thigh servos as shown in the photo using the 2 x 8mm self tapping screws. Note the difference in the position of the output shaft in the left leg thigh servos compared to the right leg thigh servos: they are mirror images.

Step 7: Attach your legs to the mounting plate

Picture of Attach your legs to the mounting plate
The mounting plate of the QuadBot chassis is the laser cut acrylic panel that you mount all your parts on. Mount your left and right legs as shown in the photo.

NOTE: the rounded end is the rear of your robot. The hole in the center of the rear is where you will mount your tail.

Step 8: Fit mounting spacers to pan servo.

Picture of Fit mounting spacers to pan servo.
a8 insert pan servo mounting screws.jpg
a9 attach pan servo spacers.jpg
The brass hex spacers used to mount the pan / tilt kit require a 3mm screw, but the miniature servo housing has a hole designed for a 2mm screw. Previously I tried using a 3mm drill bit, but it bites into the plastic, which can break the servo mounts, and it can accidentally dig into the servo cable.

Since the hole only needs to be opened by a small amount, I found it easier and safer to use my Phillips head screwdriver to ream out the holes on each side. Do not press too hard; you only need to ream out each side of the hole by a small amount so that the 3mm screw can be inserted without too much force.

Once you have the 3mm screw in all the way, continue turning the screw a few times so that the screw thread can drill out the hole a bit more. You can now easily fit your 25mm (1 inch) brass hex spacers. Use your pliers if necessary to hold the brass spacers when tightening the screws. Do not overtighten otherwise you will crack the servo mounts.

Step 9: Attach the pan servo to the pan bracket

Picture of Attach the pan servo to the pan bracket
The pan / tilt kit comes with two large brackets. Each bracket has a different hole pattern on it to suit many different sensors. Look at the photo below and make sure you attach the correct bracket to the pan servo as the compound eye cannot be mounted if you use the wrong bracket.

The assembly instructions for the pan / tilt kit are included in the kit. They can also be downloaded from my product support site. Use 2 x 6mm pan head screws to mount the bracket to the round servo horn, and another 2 x 6mm pan head screw to mount the servo horn onto the pan servo as shown.

Step 10: Attach the tilt servo locating bracket

Picture of Attach the tilt servo locating bracket
The tilt servo locating bracket doesn't actually hold the tilt servo in place. Its main purpose is to prevent the tilt servo from twisting while allowing some play so the servo can self-center.

This bracket is mounted using two 3 x 6mm machine thread screws and two 3mm nuts. Leave these nuts slightly loose at this stage so that the bracket can move forwards and backwards slightly. We will tighten them once the servo is aligned.

Step 11: Attach the servo horn to the tilt bracket

Picture of Attach the servo horn to the tilt bracket
Use two 2 x 6mm pan head screws to attach a round servo horn to the tilt bracket. Use holes that will allow the center hole of the servo horn to align with the bracket as shown without allowing too much play if the screws come loose.

Step 12: Align the tilt servo

Picture of Align the tilt servo
Use a 2.3 x 12mm screw to align your tilt servo. The servo will be a firm fit in the tilt servo locating bracket. Adjust the position so that the servo aligns with the pan bracket, leaving room for spacer washers and a servo horn. Once everything is aligned you can tighten the 3mm screws holding the tilt servo locating bracket so that it won't move.

Step 13: Inserting the spacer washers

Picture of Inserting the spacer washers
a15 insert spacer washers.jpg
a16 align spacer washers.jpg
a17 pan tilt complete.jpg
There are some spacer washers that must be inserted so that the tilt assembly can rotate smoothly. The easiest way I know to insert these spacer washers is to lay the pan / tilt assembly on its side with the tilt bracket at about 40 degrees when the tilt servo is centered, as shown in the first photo.

Stack your spacer washers neatly and insert them as best you can with your hand or long nose pliers as shown in the second photo.

Use a plastic cable tie to gently push the washers into position as shown in the third photo. Once the holes line up it’s easy to insert your 2.3 x 10mm screw through the brackets and washers into the servo output shaft.

Tighten all the way, being careful not to overtighten, and then back off 1/4 of a turn. The tilt bracket should be able to tilt up and down fairly easily. Loosen the screw a little more if required.

Do the same on the other side with the 3 x 10mm machine screw and nylon nut.
NOTE: The nylon nut should be slightly loose so that the tilt assembly can move easily.

Step 14: Mount the eye

Picture of Mount the eye
The IR compound eye is a simple sensor that measures ambient light and IR light reflected from an object. The eye has seven connectors: Vcc (+5V) and ground for power, an ENable pin that turns the IR LEDs on or off, and four analog outputs for up, down, left and right.

Attach the four mounting spacers on the tilt bracket as shown in the first photo. Make sure the 3mm nuts are tight.

Connect 80mm female to female jumper wires to the eye as shown in the second photo. I’ve used red for Vcc (+5V), black for ground (0V), orange for the enable pin and white for the 4 analog outputs.

Because 80mm is not long enough, I have then attached 80mm male to male jumper wires. This gives me 160mm female to male jumper wires as shown in the third photo.

By running the wires under the tilt bracket and then over the tilt servo as shown, I find the head can move freely in any direction without pulling on the wires.

Step 15: Mount the head onto the body

Picture of Mount the head onto the body
Use two 3 x 10mm pan head machine screws to mount the head. Notice the holes used in the first photo. The servo is mounted slightly to one side so that the pan servo's output shaft is centered rather than the servo body. The pan servo is mounted a reasonable distance back to improve the balance of the robot and to help protect the pan servo if the robot runs into a wall.

Step 16: Mount your batteries

Picture of Mount your batteries
This robot works best with a small LiPo battery (7.4V, 2300mAh), but rechargeable AA NiMh batteries are more common and easier to recharge, so we will use 6x AA NiMh batteries for these instructions. Do not use alkaline batteries: the voltage is too high and they cannot put out enough current to drive the servos, which will cause your processor to continually reset. Use good quality batteries with at least 2000mAh capacity.

2 x 3 cell battery holders are fairly common and are small enough to fit between the legs of the robot. Place some tape on the ends of the wires before you start so that they can't short out before you are ready to connect them, and then install 6x fully charged batteries.

Cable tie your battery holder to the center of the mounting plate with the wires on the right hand side. Make sure all legs can move inward about 45 degrees without hitting it. Do not make the cable ties too tight! You should be able to just slide the battery holder sideways out of the cable ties when it is time to recharge the batteries.

Step 17: Mount the Magician robot controller

Picture of Mount the Magician robot controller
Attach the four supplied brass spacers to the corners of the controller PCB. Now mount the board on the chassis with the power connection on the right hand side and the analog pins closest to the head as shown in the second photo.

Step 18: Wire up the head

Picture of Wire up the head
Connect your pan servo to A4 and your tilt servo to A5 with the white signal wire towards the female header. The software reconfigures these pins as servo outputs.

Fit small black and red female to female jumper wires to Vcc and ground as shown in the photo. These will provide power for the infrared compound eye. Check the third photo if you're not certain where to connect something, or download the manual from my support site. Connect the red and black power wires from the eye.

The motor control circuitry connects to the processor via 4 jumpers. Remove the jumpers on D8 and D10, and leave the jumpers on D7 and D9. This allows us to use D8 to control the IR LEDs on the eye, while D10 might be used to control a small speaker or LED. The left motor control circuit is still connected and can be used to drive a small shaker motor to make the tail wag.

Connect your orange jumper wire to D8 and connect your 4 analog outputs from your eye.
A0 is up.
A1 is left.
A2 is down.
A3 is right.

Step 19: Power configuration

Picture of Power configuration
The power configuration for this robot is slightly unusual.

10 miniature servos combined can draw more than 3 amps when the robot is running around. This is far more than the Magician's regulator can handle. The Magician board does allow you to power up to 8 servos directly from the battery, but the servos are rated at 6V and the battery is 7.2V!

To solve these problems, the pan and tilt servos which are not heavily loaded are run from analog pins 4 and 5. These pins have been converted to servo outputs in the software and have power supplied by the 5V regulator (normally used for sensors). This leaves the 8 leg servos to be powered directly from the battery.

What we do here is use diodes to drop the voltage to a safe level. Most silicon diodes have a forward voltage drop of about 0.6V regardless of the current. The Magician controller already has a 3A diode in series with the battery to protect against accidentally connecting the power the wrong way around.

This diode drops the voltage by about 0.6V. By adding a second diode in place of the jumper that selects the servo power, we effectively have two 3A diodes in series with the battery, reducing the voltage to the servos by a total of 1.2V. Our leg servos now get the 6V they are rated for.

The leads on a 3A diode are slightly thicker than a male header pin. Bend the leads close to the body of the diode as shown and fit two small female to female jumper wires. They will be a tight fit, but this is good as it ensures a good electrical connection. Fit a red wire to the anode and a black wire to the cathode.

Now remove the servo power selection jumper and fit the diode instead. Attach the red wire to VBAT and the black wire to the center pin. The servos will now receive power through this diode. If you accidentally connect the diode the wrong way around then the servos will not get power but no damage will be done.

NOTE: If you want your robot to stand on its hind legs with NiMh batteries then you may need to leave the diode out and use the original jumper. This will give your servos 6.6V and more power, but it slightly increases the risk of damaging the servos.

Step 20: Connect the leg servos

Picture of Connect the leg servos
Start by fitting the servo extension cables to the front leg servos. Now plug in your leg servos.

D2 – Rear left hip
D3 – Rear left knee
D4 – Front left hip
D5 – Front left knee
D6 – Front right knee
D11 – Front right hip
D12 – Rear right knee
D13 – Rear right hip

Make sure your white / orange control wire is closest to the female header.

Step 21: Connect your batteries.

Picture of Connect your batteries.
Make sure the power switch is in the off position and then remove the tape from your black (- negative) wire. Strip the wire back, twist the strands together, and then bend the end over before inserting it into the screw terminal as shown in the photo. This will help to ensure a good connection. Now remove the tape on the red (+ positive) wire and do the same.

Step 22: Add a tail

Picture of Add a tail
Perhaps the easiest way to make a tail is to use a short length of spiral wrap. If you don't have spiral wrap, you can make some by feeding a drinking straw through a pencil sharpener. The pencil sharpener will probably need a new blade, as it has to be sharp.

Plug one end with some hot glue or putty, then use a self-tapping screw to mount it at the back of the chassis as shown in the second photo. If you can get hold of a small shaker motor from an old mobile phone, mount it on the tail and connect it to the left motor output. The sample code will shake the tail harder as your hand gets closer.

Step 23: Installing the software

Picture of Installing the software
To install the sample code you must first have the Arduino IDE (version 0018 or later) running on your computer. Download here.

The Magician controller uses the CP2102 USB interface IC. Depending on your OS, you may need to install drivers. You can download the USB drivers as well as the QuadBot Puppy program from my support site.

Once you have the Arduino IDE running, you can open the program. Make sure you have the version suited to the batteries you are using so that the robot will be able to stand on its hind legs. You will see that the code has a constants tab and an IOpins tab.

The constants tab stores values such as the center positions of the servos. This makes it quick and easy to fine tune your code. The IOpins tab is like a map of your wiring: it tells you which device is connected to which pin.

Go to the Tools menu and select your board type. The Magician controller is compatible with "Arduino NG or older w/ ATmega8".

Now plug in your USB cable. After a few seconds your computer should detect the USB interface of the Magician controller. Go back to the tools menu and select your USB virtual serial port.

Press the reset button on the controller and upload the program.

Step 24: Controlling your robot

Picture of Controlling your robot
The sample code tracks your hand movements with the compound eye and responds accordingly. If you leave the robot alone for 3 seconds, it will sit down out of boredom. Once it sits down it gets lazy and will only track your hand with its head.

If you slowly bring your hand within range and move your hand to a paw then the robot will lift that paw for you to shake hands. If you bring your hand between the paws it will lie down.

Once it is sitting, you must excite the robot to get it back on its feet. Taking your hand to one side of the robot, or above it, so that it can no longer see your hand will do this.

When the robot is standing it will try to maintain a set distance from your hand. It will walk forward, move backward or turn in order to keep your hand in range. Do not move your hand too quickly or it may lose sight of it.

If you move your hand directly over its head while the robot is standing, it will try to jump up onto its rear legs. If your hand stays in range of the eye, the robot will try to walk on its back legs to follow your hand. This trick might not work with the NiMh batteries, as they are heavier than the LiPo battery used in the demo video.

Step 25: Trouble shooting

Picture of Trouble shooting
The robot probably won't work very well at first. Each servo is a little different, so some tweaking is required. At the start of the main loop add the line "return;". This will cause the robot to stand still indefinitely.

You can now adjust the servo center positions in the constants tab until the rear hips are at 90 degrees to the body and the front hips are slightly forward. All knees should bend inward slightly. If any of your servos are drastically out of alignment then you may need to reseat the servo horn.

The head should face straight forward, looking up at about a 30 – 40 degree angle. Once your robot's servos are set correctly you can remove the "return;" line from your main loop.

If the head does not track your hand then check your eye connections. If the head turns away from your hand then you may have accidentally swapped your wires or mounted the eye upside down (the wires should come out at the bottom). Double check everything in steps 14 and 18. Are your servos plugged in correctly?

If the robot frequently resets then you may have a bad battery or battery connection. Check your batteries have a good charge and are making a good connection with the battery holder terminals. Are the battery terminal screws tight? Some cheap batteries may not be able to deliver enough current.

If the head does not follow at all then check your eye has power (5V) and that the orange wire is connected to D8. Double check everything in steps 14 and 18.

If the head works but the legs don't move, then the diode that replaced the servo power jumper in step 19 may be the wrong way around or not plugged into the correct pins. Check step 20 to make sure you have the servos plugged in correctly with the signal wire closest to the female header.

If the robot walks strangely then you may have accidentally swapped hip and knee servos. The hip servo moves the leg forward and backward; the knee servo moves the leg up and down.

If the robot cannot stand on its hind legs then check you have the correct version of the code. NiMh batteries are heavier than LiPo batteries, which affects the robot's balance. You may need to play with the code or adjust the position of your batteries a bit.

If you are using NiMh batteries then you will probably need to remove the voltage drop diode and re-install the voltage selection jumper for the servo power. This will run your servos at 6.6V, giving them more power, but there is a greater chance of a servo stripping a gear or overheating.

Step 26: Give your puppy some personality!

Picture of Give your puppy some personality!
The sample code is just the beginning. Experiment with it! If you make a mess of it, you can always download the original code and start again.

Moods such as happy and bored are easily programmed into the robot. Use a counter to measure happiness by how much a person plays with the robot; when they stop playing, the counter counts down.

Boredom sets in when you don’t play with it for a short while. Perhaps your robot will be naughty if it becomes too bored?