As I qualified for the “Infoeducatie” National Contest, robotics section, I started building a new arm, at 1:1 scale, stronger and capable of moving on three axes. To do this, I used three independent motors for the shoulder joint, one for the elbow, and one for rotation (in the biceps). In all, there are five 12 V DC motors (this model), which I chose for their integrated gearbox with a powerful ratio (1:131) as well as for their compactness.
Here are the sketches I gave to the lathe operator:
And here is the result:
For the skeleton, I again used plexiglass, as in the previous version, but this time a thicker sheet (6 mm). Here is the draft:
The following PDFs are exported from CAD. This project is open source, so feel free to cut your own parts from my sketches.
Last Christmas, I received a special gift from my parents: a sensor called Xtion, from Asus, designed especially for developers. It is both a webcam and an IR (depth) camera, which together provide 3D vision. It is very similar to the Kinect sensor, but Asus’s version is more flexible in terms of OS support. I am interested in working on Linux (Ubuntu), so this is the perfect sensor for me.
As a beginner in this field, I started with various tutorials on Computer Vision and body tracking (OpenNI). After several weeks of practice, I decided how I would use this sensor: I built a robotic arm that mirrors the movements of my physical arm in real time. I called it “The Xtion Arm”.
Here is my first version:
It is made of plexiglass (4 mm) and has two servos for the two joints (elbow and shoulder).
My movements are captured by the Asus Xtion and processed in Processing, which sends the coordinates of the two joints through a dedicated port to the Arduino application, which converts the 3-axis coordinates into angles. These values are then sent to the Arduino board, which commands the servos.
To make the software do this, I wrote three applications. The first one, “the Server”, is developed in Processing and reads the joint coordinates in “real-world” dimensions. The second application is “the Client”. The 9600 bit/s transmission between Server and Client can run either over serial (local: both the arm and the sensor are connected to the same computer) or over the Internet (sensor + Server on computer 1, Xtion Arm + Client on computer 2). The third application runs on the Arduino; it communicates with the Client and sends values to the servos. All these codes are published on my GitHub account here.
After my success with the Spy Robot, I decided to combine all my projects into one in order to create an Android. The purpose of this project is to control a human-like robot. The idea comes from what I did with the pan & tilt system that imitates a human neck: if I place the IMU sensor on my head, it sends my head’s orientation to the servos that control the camera. In this way, the Android’s head (the camera) moves simultaneously with mine, recording and streaming images in real time, which gives me the opportunity to collect data from the place where the Android is sent.
I divided the Android into three smaller projects:
Next year, I will design and build one of the Android’s arms, which will give the robot one of its most important abilities.