In late 2024 I decided to switch my main coding efforts from software to hardware. I am of the view that LLMs will largely commodify software development, at least for the purposes for which I code. Humans are still far from achieving similarly powerful LLM applications in the physical world due to issues with training data, error tolerance, and analytical complexity. For that reason, far more abundant possibilities lie ahead in the hardware domain to satisfy both my intellectual curiosity and potential commercial applications.
This blog tracks my progress from being a near-beginner in electronics, robotics, and hardware. My rules are simple: I follow only my curiosity and drive. I am not following any particular textbook, course, or kit in a linear manner. I imagine a new idea, try it, and iterate through trial and error. I then move on to more sophisticated ideas that spark my interest, building on my previous progress. NB: I am currently in full-time military training, so I progress insofar as I have free time in the evenings or at weekends.
My only aim is to be as competent as possible as quickly as possible.
Week c/23rd September 2024
Bought a Raspberry Pi 4B, MicroSD card, small monitor and keyboard to assemble. Flashing the MicroSD correctly takes far longer than expected, with several failed attempts. Eventually I have it up and running, with the OS GUI displaying properly.
One of the many attempts to flash the RPi with Raspbian
Weeks c/30th September - 10th November
Iterating basic electronic circuits using logic implemented on the RPi. I buy a big box of various beginner components and work my way through it. I start with something as basic as turning on an LED, then add button control. I move on to new components and features: LED bar graphs that ‘flow', PWM manipulation of an LED, RGB LEDs, potentiometers, DC motors, servo motors, the 74HC595 shift register, ultrasonic ranging and attitude sensing. This is the one stage where I am not looking to iterate on the same thing. I am purely building up a basic and broad knowledge of various components with which I can then build more meaningful projects.
Using a potentiometer to drive variable RPM in a DC motor
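The PWM experiments above boil down to mapping an input reading onto a duty cycle. A minimal sketch of that mapping, assuming an 8-bit ADC for the potentiometer; the function name, the BCM pin number and the 1 kHz frequency in the commented hardware section are illustrative, not my exact wiring:

```python
def adc_to_duty(raw, max_raw=255):
    """Map a raw ADC reading (e.g. from a potentiometer) to a 0-100% PWM duty cycle."""
    raw = max(0, min(raw, max_raw))   # clamp out-of-range readings
    return 100.0 * raw / max_raw

# On the Pi itself (not runnable off-device), this would drive the LED or motor:
# import RPi.GPIO as GPIO
# GPIO.setmode(GPIO.BCM)
# GPIO.setup(18, GPIO.OUT)           # pin 18 is an assumption
# pwm = GPIO.PWM(18, 1000)           # 1 kHz software PWM
# pwm.start(0)
# pwm.ChangeDutyCycle(adc_to_duty(reading))
```

The same mapping works for LED brightness and motor RPM alike; only the load on the pin changes.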
Week c/11th November
I assemble a 6 DOF robotic arm for the first time, although some of the cheap servos I bought clearly don’t have sufficient torque to hold themselves upright. I order stronger servos for next week, especially for servos 2 (the shoulder DOF across the vertical plane) and 3 (the elbow).
Week c/18th November
A working prototype for the robotic arm is up and running. I have not connected any HMI like a joystick or switch so I am having to manually input servo angles one-by-one in a loop through servos 1-6. It is accurate and responsive to command but the control loop is unusably slow.
Working prototype lifts a loose servo motor into the box
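Entering servo angles one by one means converting each angle to a PWM duty cycle. A sketch of that conversion, assuming the common hobby-servo convention (50 Hz signal, 0.5-2.5 ms pulse width sweeping 0° to 180°); cheap servos often deviate from these exact endpoints:

```python
def angle_to_duty(angle, freq_hz=50, min_pulse_ms=0.5, max_pulse_ms=2.5):
    """Convert a servo angle (0-180 degrees) to a PWM duty cycle in percent.

    Assumes the usual hobby-servo convention: a 50 Hz signal whose
    0.5-2.5 ms pulse width sweeps the horn from 0 to 180 degrees.
    """
    angle = max(0, min(angle, 180))                  # clamp to the servo's range
    pulse_ms = min_pulse_ms + (max_pulse_ms - min_pulse_ms) * angle / 180
    period_ms = 1000.0 / freq_hz                     # 20 ms at 50 Hz
    return 100.0 * pulse_ms / period_ms
```

At 50 Hz this gives 2.5% duty at 0° and 12.5% at 180°, which is the range most RPi servo tutorials use.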
Week c/25th November
I add an ADC module and a joystick - the same type you would find on a PlayStation controller. As with everything, I quickly get to the state where it works imperfectly and executes code without errors, yet I am far from getting it to run smoothly. The joystick inputs x and y values without issue, yet the two values are always similar if not identical, despite my clearly rotating it around the x-y plane trying to obtain different values. I'm unable to obtain two differentiated values, and it seems to come down to the cheap nature of the joystick. I take only the x-value and use it to control servo 3 (the elbow), moving the forearm up and down.
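One way to sanity-check a joystick like this is to normalise each raw ADC axis into a -1 to 1 range with a deadzone, then log both axes side by side; if they still track each other after normalisation, the fault is in the hardware. A sketch, assuming an 8-bit ADC; the function name and deadzone value are my own choices:

```python
def normalise_axis(raw, max_raw=255, deadzone=0.08):
    """Map a raw 8-bit ADC joystick reading to -1.0 .. 1.0 with a centre deadzone."""
    centre = max_raw / 2                 # 127.5 for an 8-bit ADC
    value = (raw - centre) / centre
    if abs(value) < deadzone:            # ignore noise around the resting position
        return 0.0
    return max(-1.0, min(1.0, value))
```

The deadzone also stops a slightly off-centre resting position from commanding constant slow servo drift.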
Week c/2nd December
Still very jittery, but I have now managed to incorporate the click or ‘switch' input of the joystick to control servo 6 - the gripper as the end effector. This works as well as it reasonably could: the gripper has enough torque to pick up objects even at awkward angles, and fully opens and closes in a binary manner. The only way to improve is to make the click more responsive, given the apparent limit to how quickly the gripper can be clicked open and closed. I have made enough progress that I face a dilemma: press ahead to pursue interesting new projects, or optimise what I have already built to ensure I am iterating on strong foundations. I lean towards the former for now, and order a series of new components from AliExpress.
The joystick clearly allows for a faster control loop, but only allows for movement in the vertical plane and not horizontally
Week c/9th December
I’ll be needing to take a step back on that choice from last week. Servo jitteriness is worse than ever, and there is clearly an issue with the servos, as most of them have a tendency to rotate across their full range of motion unprompted. CAD is something I will start learning soon. It will help with making more rigorous designs, and with 3D printing parts.
Week c/16th December
A new batch of equipment arrives from AliExpress: a speakerphone, 2x HC-05s, a flex sensor, a 5MP camera (both normal vision and infrared), new mini breadboards, a Raspberry Pi Pico, and a Li-ion battery pack. I get to work testing all the new components to understand the basics: RGB manipulation with the camera, getting a stream of reliable inputs from the flex sensor, and reliably integrating the Li-ion battery pack with an RPi through a buck converter.
Weeks c/23rd and 30th December
I am away from all my resources for two weeks so conventional progress is halted. I have time to draw soft plans for future ideas, consolidate what I have done so far and what I need to do in the coming weeks and months.
Week c/6th January 2025
Productive. I optimise the MPU6050 accelerometer to output useful and stable variables across the x, y and z axes. I am unable to use the DMP with the hardware I have, but using raw data output I standardise the accx/accy/accz outputs as a proportion of g (9.81 m/s²), which I will be putting to use on the robot arm. I also go back to the drawing board with the robot. As mentioned in previous weeks, I got too far ahead of myself, so I return to servo-by-servo numerical input with a total focus on minimising servo jitteriness. I refine my angle-to-duty-cycle calculations to end unnecessary PWM signals after a servo completes its movement, ensuring the servo is not continuously bombarded with signals. I retain a separate angle-to-duty-cycle function for closing the gripper end effector, the one movement that requires constant PWM signals to maintain its grip. I also begin plans for larger, semi-humanoid forms which will integrate more advanced ‘intelligent' features around language and sensors. I order more components (e.g. a longer ribbon cable for the camera) and start to sketch what it may look like and how I will fit all the components together.
Spent six hours integrating and optimising an MPU6050 accelerometer ready for use as a reliable input
Servo jitteriness finally removed, but requiring basic keyboard entry for system control
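The jitter fix described this week can be sketched as follows: command the target duty cycle, wait for the horn to settle, then zero the duty cycle so no further pulses reach the servo, with the gripper as the one exception that keeps holding. The `move_servo` helper and the fixed settle time are my own illustration, not the exact code; `pwm` stands in for RPi.GPIO's PWM object:

```python
import time

def angle_to_duty(angle):
    """Hobby-servo convention: 50 Hz, 0.5-2.5 ms pulses -> 2.5-12.5% duty."""
    return 2.5 + 10.0 * max(0, min(angle, 180)) / 180

def move_servo(pwm, angle, hold=False, settle_s=0.3):
    """Drive a servo to `angle`, then stop the PWM signal unless `hold` is set.

    `pwm` is anything with a ChangeDutyCycle method (RPi.GPIO's PWM object on
    the Pi). The gripper passes hold=True so it keeps squeezing its load.
    """
    pwm.ChangeDutyCycle(angle_to_duty(angle))
    time.sleep(settle_s)            # give the horn time to reach the target
    if not hold:
        pwm.ChangeDutyCycle(0)      # 0% duty = no pulses = no holding jitter
```

With no pulses arriving, the servo simply holds position under gear friction instead of hunting around the setpoint.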
Week c/13th January
Using a WASD (+ extra keys) input system to control the 6-DOF robot arm. There is no static jitteriness as before, although the control feedback loop during movements causes shudders. An Xbox controller will ultimately replace the keyboard, but for now it proves a useful transition in isolating angle-to-duty functions for individual servos rather than running them in a single continuous loop. I also wire up the 9900mAh 12V battery pack (designed for a robotic vacuum cleaner) to a buck converter to allow for a self-contained power system independent of mains power. Later down the line I will want to mount sensors and arms onto a vehicle for autonomous mobility. Work has now begun on a haptic control glove for the robot arm. This may prove to be only a side-project, depending on how well I can optimise it, or the primary form of robotic arm control if I can get it to be as user-friendly as an Xbox controller. I also managed to create a working prototype of a python programme incorporating the API for ChatGPT. (Ironically, ChatGPT was useless at explaining how to set up its own API through python following the deprecation of the old openai.ChatCompletion.create method.) Once I have incorporated a custom GPT model plus TTS and STT functionality, it will act as the front end for conversational capability and essentially mimic consciousness.
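A keyboard scheme like this reduces to a lookup from key to (servo index, angle step). A sketch with hypothetical bindings; the specific keys, step size and joint assignments are illustrative:

```python
# Hypothetical key bindings: each key nudges one joint by a fixed step.
KEY_BINDINGS = {
    "a": (1, +5), "d": (1, -5),   # base rotation (servo index, degrees)
    "w": (2, +5), "s": (2, -5),   # shoulder up/down
    "q": (3, +5), "e": (3, -5),   # elbow up/down
}

def apply_key(angles, key):
    """Return updated per-servo angles after one keypress, clamped to 0-180."""
    if key not in KEY_BINDINGS:
        return angles                    # unknown keys leave the arm untouched
    idx, step = KEY_BINDINGS[key]
    new = list(angles)
    new[idx] = max(0, min(180, new[idx] + step))
    return new
```

Because each keypress touches exactly one joint, each servo's angle-to-duty function can be exercised in isolation, which is what makes this a useful stepping stone before continuous controller input.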
Week c/20th January
A less productive week. I was away from my kit for a few days, but the arrival of new components has spurred some progress. I have attached a camera to the end of the robot arm, which allows a user to see what they are grabbing through the screen (useful for when I make a mobile robot). I have also incorporated the Xbox controller for more practical real-time control. It was easy to set up a preliminary connection, but I have been frustrated by attempts to optimise smooth servo movement (it doesn’t jitter when still, but it does between movements). As with everything in this domain, getting something to work is easy but optimising is hard.
Week c/27th January
In a single three-hour session I have got the speakerphone system up and running, as well as slightly smoothing the Xbox-to-servo control loops (although there is still plenty to do on that front). The speakerphone can output any audio from the RPi, and its microphone can transcribe speech to text to a sufficiently accurate degree.
My track-driven chassis has arrived. I assembled it and connected it up to the motor driver and RPi. Foolishly I didn’t get a connection splitter: the motors will be powered at 12V directly from the battery, while the RPi and logic system will be powered at 5V. I don’t yet have a way of separating the two, so although I have my 10,000mAh 12V battery connected, the RPi has to be powered from the mains. I have ordered the splitter to arrive ASAP. Making the tracks move back and forth and rotate smoothly as the x and y values of the joystick oscillate between 0 and 1 proves easier than expected.
On the downside, containing the battery, RPi, main breadboard and motor driver all on the same chassis is proving chaotic. Given my iterative nature I don’t have a proper design to fit them together ergonomically, so the robot will look very messy for the foreseeable future, with parts hanging off it.
I eventually programme the joystick to act in conjunction with the LT button (the L2 equivalent on a PS controller) being pressed and held. This differentiates between movement of the whole robot with the tracks (LT held) and movement of the 6DOF arm (LT released).
A primitive first go integrating the motor control system
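Turning one stick into two track speeds is a standard differential-drive ("tank") mix. A sketch of the arithmetic, assuming axes already normalised to -1 to 1; the function name and scaling choice are my own:

```python
def mix_tracks(x, y):
    """Mix joystick axes (-1..1) into (left, right) track speeds (-1..1).

    Pushing forward (y) drives both tracks equally; pushing sideways (x)
    turns by driving the tracks at different speeds, or in opposite
    directions on the spot.
    """
    left = y + x
    right = y - x
    scale = max(1.0, abs(left), abs(right))  # keep outputs within -1..1
    return left / scale, right / scale
```

Each output would then be converted to a PWM duty cycle and direction pins on the motor driver, with the LT check deciding whether the stick is routed here or to the arm.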
Week c/3rd February
I have been procrastinating on getting up to speed with CAD. Like most of this, I have no prior experience, but the growing entanglement of processors, end effectors and wires is at best ugly and at worst dysfunctional. Using CAD will introduce me to the idea-to-final-design-to-3D-printing pipeline I need to become familiar with. It also makes me think critically about end products in and of themselves, rather than a rambling series of bolted-on components, and therefore about how to package them ergonomically.
Otherwise, my T-split connectors have arrived and now allow for two separate circuits at different voltages: a 12V circuit running straight to the motor driver and motors, and everything else (logic, 6DOF arm etc.) running on 5V. This has finally allowed my robot to move independently, which was a great milestone to hit. Rather than a sprawling entanglement of servos and wires across my desk, I have finally created a holistic thing: mobile, powered and controlled as a single entity.
Using front-end custom GPTs via OpenAI I have also created a semblance of a ‘conscience'. Ideally I would feed this through the same python programme that brings together all the other features, but I have yet to find a smooth STT and TTS API that works as well as having the custom GPT running in the background. It can glitch slightly in speech if the CPU is overloaded with servo and motor movements.
Doing some research on my seemingly incurable servo jitteriness and erratic movements, I have bought a PCA9685 servo driver, which should smooth controls across all servos during real-time Xbox control.
Moving via remote control for the first time
The same movement caught in real-time from the camera stream to my laptop
Weeks c/10th, c/17th, c/24th February
No progress.
Week c/3rd March
Finally integrate a PCA9685 servo driver, which eliminates jitteriness in real time and across multiple planes simultaneously. (Compare with week c/6th January, where jitteriness was only removed on one plane in isolation and in response to numerical control input.)
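What the PCA9685 changes is who generates the pulses: the chip produces hardware-timed PWM on 16 channels with 12-bit resolution, so the Pi only writes target tick counts over I²C instead of bit-banging software PWM. A sketch of the pulse-width-to-tick arithmetic that driver libraries (e.g. Adafruit's) perform internally; the function name is my own:

```python
def pulse_to_ticks(pulse_us, freq_hz=50, resolution=4096):
    """Convert a servo pulse width in microseconds to PCA9685 'off' ticks.

    The PCA9685 divides each PWM period into 4096 ticks; at 50 Hz the
    period is 20,000 us, so one tick is roughly 4.88 us.
    """
    period_us = 1_000_000 / freq_hz
    return round(pulse_us * resolution / period_us)
```

In practice a high-level library hides even this; with Adafruit's ServoKit, for instance, setting a channel is roughly `ServoKit(channels=16).servo[0].angle = 90` (call shown from memory, so check the current docs). Because the pulse timing happens on the chip, Python-side delays no longer translate into jitter.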
I also rewire the circuit to 10x the battery capacity with a 100,000mAh Li-ion battery. The old 10,000mAh battery used to run flat after about 10 minutes of driving, arm use and RPi processing, which would imply an improbable average current draw of roughly 60A (10Ah drained in a sixth of an hour). More likely the battery overstated its actual capacity.
Long overdue… Smooth, controller-input real time control of the arm is achieved thanks to the servo driver
Weeks c/10th, 17th March
Slower progress. Diminishing marginal returns are setting in. I feel as if I am stuck optimising for incremental gains as what I have assembled with these components has reached its natural limit. After a few failed updates (the arm smashed into the camera and broke it…) I am stepping back from the frustrations of being at a developmental cul-de-sac. At this juncture I need to plan from scratch for something more ambitious which a) allows me to start afresh and escape any initial errors or flaws I embedded as a beginner on my current project and b) involves a whole new series of components which give me scope for improving my skills. This will necessarily involve upscaling the current size of my project by a factor of at least 2-3x, ideally closer to 5x. With that comes a chance to move beyond a relatively small breadboard, incorporate larger 3D-printed body parts, and develop a new mode of movement.
Week c/24th March
Starting from scratch now. I’ve decided to build a crude ‘humanoid’: a camera and screen sat on top of a ~1m-high shaft, which moves around on a wheel-driven base with two 6DOF arms halfway up. This represents a far more thorough ground-up approach than anything I have done so far. I am sketching and then designing component parts from first principles, to be manufactured or printed. I am slowly expanding my stack of capabilities so that I can take a totally novel idea and produce it. Key objectives: intelligent use of visual data from the camera (primarily facial recognition), incorporation of two larger 6DOF arms, and coherent design and assembly to produce a working prototype.
Further optimised servo control and movement on the old arm, and had a CAD model produced of the top bracket.
Some rough sketches and calculations
Getting the main powertrain and arm to work smoothly in tandem
Week c/31st March
React + Vite, JSON, finalise the top bracket, continue processing, order brackets for the larger servos.