
Prototyping and testing of the balancing robot

I have made some progress on the balancing robot project. In this video, I show the first version of the PCB. It had a few problems, but I still got the robot to balance. Some of the problems included:

  • A design error in the voltage divider and filter capacitor for the battery voltage measurement. The capacitor blew up 🙂
  • I forgot to connect the MISO, MOSI, and SCK lines for the SPI bus.
  • A problem with the footprint for the wireless module.
  • And a few more minor things…

Apparently, I need to check everything more carefully, especially the schematic, before I start designing the PCB itself. I have also learned from many previous PCB projects, including this one, that you should always create footprints for important or unusual components yourself. Community-made footprints always have some errors, or design decisions that I don’t like or want. Even official footprints for components can have major problems.

In this video, I also show some of my planning of the remote controller for this robot. It will be a custom remote controller with an LCD and two analog control sticks. It should be able to control the robot, possibly in several different modes. It will also have some kind of menu system for adjusting PID values and other settings.

Robot balancing by itself. The battery is just lying on top. The robot will of course have a top plate for the battery when it is done.
Detailed shot of the first version of the PCB. Here you can also see the wires I had to solder on to connect the SPI bus for the IMU sensor.

DIY FrSky telemetry multi-sensor

I made my own “multi-sensor” for the FrSky radio system. An Arduino Nano reads a few different sensors and sends all the data to the Smart-Port connector on the RC receiver. This way I can get information such as battery voltage, speed, altitude, GPS position, and temperatures on my RC radio when I fly my RC planes. This is perfect for planes that don’t have a flight controller onboard.

My design and hardware are based on the FrSky telemetry library from Pawelsky: https://www.rcgroups.com/forums/showthread.php?2245978-FrSky-S-Port-telemetry-library-easy-to-use-and-configurable

I actually started experimenting with making my own telemetry sensors a few years ago, so my code is quite old. Back then, this was the only library available. I still think it works well, but today there are other alternatives available.

At first, I just soldered a GPS and a servo connector to an Arduino Nano, but later I made a few PCBs to make the sensors easier to assemble. My first versions had voltage monitoring of the individual cells in the flight battery, but I found it annoying to have to plug in the battery’s balance connector in the plane.

This is my third iteration of the board. It has:

  • a single input for LiPo battery voltage monitoring, with a voltage divider and filter capacitor
  • a BMP280 barometer for altitude and variometer measurements
  • a connector for a Beitian BN-220 GPS, or any other NMEA-capable GPS, for speed and position data
  • two Dallas DS18B20 temperature sensors, one onboard and one attached with cables
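
To give an idea of the sensor-reading side, here is a minimal Arduino sketch of how inputs like these can be read. The pin numbers, divider resistor values, and I2C address are assumptions for illustration, and the actual Smart-Port transmission (handled by Pawelsky's library) is left out:

#include <Adafruit_BMP280.h>
#include <OneWire.h>
#include <DallasTemperature.h>

const int BATT_PIN = A0;          // battery voltage through the divider (assumed pin)
const float R_TOP = 10000.0;      // divider resistor towards the battery (assumed)
const float R_BOTTOM = 3300.0;    // divider resistor towards ground (assumed)

Adafruit_BMP280 bmp;              // barometer on the I2C bus
OneWire oneWire(2);               // DS18B20 data line on pin 2 (assumed)
DallasTemperature dallas(&oneWire);

void setup() {
  Serial.begin(9600);
  bmp.begin(0x76);                // a common BMP280 I2C address
  dallas.begin();
}

void loop() {
  // Scale the ADC reading back up to the real battery voltage.
  float vAdc = analogRead(BATT_PIN) * (5.0 / 1023.0);
  float vBatt = vAdc * (R_TOP + R_BOTTOM) / R_BOTTOM;

  float altitude = bmp.readAltitude(1013.25);   // relative to standard sea-level pressure

  dallas.requestTemperatures();
  float tOnboard  = dallas.getTempCByIndex(0);  // onboard sensor
  float tExternal = dallas.getTempCByIndex(1);  // cable-attached sensor

  // On the real board these values are handed to Pawelsky's S.Port library
  // instead of being printed.
  Serial.print(vBatt);     Serial.print(" V  ");
  Serial.print(altitude);  Serial.print(" m  ");
  Serial.print(tOnboard);  Serial.print(" C  ");
  Serial.println(tExternal);
  delay(500);
}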

Update 2022-01-07: I fixed an error in the PCB design and made a few small adjustments in the code. The links are now the updated versions.

Link to my schematic and PCB: https://oshwlab.com/Axbri/frsky-telemetry-v3

Download link for my code, schematic, and PCB Gerber files:
Axels FrSky tememetry sensor V4.zip

Lidar Robot

This is a video of my DIY Lidar robot. Here it is using a spinning laser distance sensor (a Xiaomi robot vacuum spare part) to drive around and avoid obstacles. The sensor is connected to a Raspberry Pi running a Python script that contains the main behavior program. There is no mapping going on; the robot just drives forward and turns away from anything that is too close. The Raspberry Pi then sends serial data to an Arduino that controls the stepper motors driving the robot.

The Xiaomi laser distance sensor is connected to a Teensy 3.2 running code from this project: https://github.com/getSurreal/XV_Lidar_Controller

The Teensy 3.2 runs PID speed control for the DC motor that spins the sensor. It also reads the binary sensor data and sends it as easy-to-understand ASCII messages over serial to the Raspberry Pi. The sensor spins at 5 revolutions per second and makes a distance measurement for every degree, resulting in 360 × 5 = 1800 measurements per second. The accuracy is within a few centimeters. This sensor uses trigonometry to measure distance, so in that sense it is not a real Lidar sensor.
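
To show roughly what that looks like, here is a heavily simplified, PI-flavored version of such a speed loop in Arduino style. This is not the actual XV_Lidar_Controller code; the pin, gains, and the stubbed speed measurement are placeholders:

const int MOTOR_PWM_PIN = 9;
const float TARGET_RPS = 5.0;          // the sensor should spin at 5 rev/s
float kp = 40.0, ki = 15.0;            // made-up gains
float integral = 0.0;

float getMeasuredRps() {
  // In the real controller the rotation speed is decoded from the sensor's
  // own data packets; stubbed here so the sketch compiles.
  return 5.0;
}

void setup() {
  pinMode(MOTOR_PWM_PIN, OUTPUT);
}

void loop() {
  float error = TARGET_RPS - getMeasuredRps();
  integral += error * 0.02;                        // 20 ms loop time
  float output = kp * error + ki * integral;
  analogWrite(MOTOR_PWM_PIN, constrain((int)output, 0, 255));
  delay(20);
}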


Rebuild of my “original” balancing robot

This is the first self-balancing robot I built that actually worked well. More info about it here. My absolute first self-balancing robot was the “Equaipose bot”; link to the page about it here.

This video has gained some attention on YouTube, and I think that is why I get at least one email every other week from people asking about the code and schematics for this robot. I have not shared the code or any details about this robot, since it is very poorly built and coded. The robot uses multiple Arduinos just to keep its balance and drive the motors; later I also added another Arduino to manage the ultrasonic sensors and some other things. There was also a Raspberry Pi that I planned to use for computer vision experiments, but I never got around to doing that with this robot.

Now that I know a lot more about self-balancing robots, Arduino programming, and DIY robots in general, I decided to make a new version of this robot: a complete rebuild of it. My intention is to make the robot less complicated and also better. I stripped down almost everything in the robot and started over, still using the same old wooden chassis. The new version is based on a single Arduino MEGA R3 controlling everything in the robot. It reads the MPU6050 IMU, does all the filtering and balancing calculations, and drives the motors using hardware timers for maximum precision. The new code for this robot is heavily based on the code for my “mini balancing robot”, with some improvements; more info about that robot here. I still use the same old stepper motors, model airplane wheels, and “Big Easy Driver” stepper motor driver boards. The Arduino MEGA also controls and reads four HC-SR04 ultrasonic sensors using interrupts. Those sensors are used for obstacle avoidance.
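
As a rough sketch of how a single Arduino can do all of that, here is a heavily simplified version of the core loop: a complementary filter on the MPU6050, a PID on the tilt angle, and a Timer1 interrupt that generates the step pulses. All pins, gains, and scale factors are made up for illustration, and the direction pin handling is left out:

#include <Wire.h>

const int STEP_PIN = 8;                     // step input of one Big Easy Driver (assumed)
float angle = 0.0;                          // filtered tilt angle in degrees
float integral = 0.0, lastError = 0.0;
volatile unsigned int stepPeriod = 65535;   // timer ticks between step pulses

void setup() {
  Wire.begin();
  Wire.beginTransmission(0x68);             // wake up the MPU6050
  Wire.write(0x6B); Wire.write(0);
  Wire.endTransmission();

  pinMode(STEP_PIN, OUTPUT);
  noInterrupts();                           // Timer1 in CTC mode, prescaler 8:
  TCCR1A = 0;                               // one tick = 0.5 us on a 16 MHz MEGA
  TCCR1B = _BV(WGM12) | _BV(CS11);
  OCR1A = stepPeriod;
  TIMSK1 |= _BV(OCIE1A);
  interrupts();
}

ISR(TIMER1_COMPA_vect) {
  OCR1A = stepPeriod;                       // reload with the latest speed
  digitalWrite(STEP_PIN, !digitalRead(STEP_PIN));  // two toggles = one step pulse
}

void loop() {
  // Read accelerometer and gyro (no error handling; which axes to use
  // depends on how the IMU is mounted).
  Wire.beginTransmission(0x68);
  Wire.write(0x3B);
  Wire.endTransmission(false);
  Wire.requestFrom(0x68, 12);
  int16_t ax = Wire.read() << 8 | Wire.read();
  Wire.read(); Wire.read();                 // skip ay
  int16_t az = Wire.read() << 8 | Wire.read();
  Wire.read(); Wire.read();                 // skip temperature
  Wire.read(); Wire.read();                 // skip gx
  int16_t gy = Wire.read() << 8 | Wire.read();

  // Complementary filter: trust the gyro short-term, the accelerometer long-term.
  float accAngle = atan2((float)ax, (float)az) * 57.296;
  float gyroRate = gy / 131.0;              // deg/s at the default +-250 deg/s range
  angle = 0.98 * (angle + gyroRate * 0.004) + 0.02 * accAngle;

  // PID on the tilt angle, setpoint 0 = upright (made-up gains).
  float error = angle;
  integral += error * 0.004;
  float output = 25.0 * error + 80.0 * integral + 0.4 * (error - lastError) / 0.004;
  lastError = error;

  // Convert the controller output to a step rate for the timer interrupt.
  float speed = constrain(output, -2000.0, 2000.0);        // steps per second
  unsigned int period = (fabs(speed) < 1.0)
                        ? 65535
                        : (unsigned int)(1000000.0 / fabs(speed));
  noInterrupts(); stepPeriod = period; interrupts();       // atomic 16-bit write

  delayMicroseconds(4000);                  // crude 250 Hz loop timing
}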

The new version of the robot works now. It balances and drives around avoiding obstacles, but I still have a few things to fix in the code. I want to add a pushbutton for control and a buzzer for feedback. I should also add battery voltage measurement and implement a low-voltage cutoff to prevent damage to the battery.

I hope to release a video of the robot, together with the code and schematic, in a couple of weeks.

The servo seen in the pictures is not implemented in the first version of the code, but the intention is that the servo will eventually let the robot raise itself up and start balancing on its own. Maybe I will also add some sort of remote control, since it is a frequently requested feature, but that will be in a later version.

 

GPS navigation robot – Two waypoints

This video shows a robot I built that drives between two waypoints using GPS.

I built this robot last summer, about a year ago. Back then, I never really managed to get the software part of the robot working. The weather got worse as fall came, and I lost interest in this project and started working on other things instead.

I have written all the code myself, apart from the functions used to calculate the course and distance between GPS waypoints; for that I used functions from the TinyGPS library. The code runs on an Arduino Due, and the robot uses PID control to steer towards the waypoints. The robot uses a combination of the GPS course and the integrated signal from a yaw gyro to determine its current heading. The robot also has a compass, but it does not seem to work very reliably, so I do not use it. The robot also has sonar and other sensors, but they are not used in this video. Expect more videos and info about this robot in the near future.
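
The steering itself is conceptually simple. Here is a small sketch of the idea; the course_to and distance_between helpers really are provided by TinyGPS, while the waypoint coordinates, gain, and the stubbed position and heading are placeholders:

#include <TinyGPS.h>

const float WAYPOINT_LAT = 58.4100, WAYPOINT_LON = 15.6200;   // made-up waypoint

float currentLat = 58.4099, currentLon = 15.6199;   // updated from the GPS in reality

float fusedHeading() {
  // On the robot this blends the GPS course with the integrated yaw gyro;
  // stubbed here so the sketch compiles.
  return 0.0;
}

float steerTowardsWaypoint() {
  float targetCourse = TinyGPS::course_to(currentLat, currentLon,
                                          WAYPOINT_LAT, WAYPOINT_LON);
  float distance = TinyGPS::distance_between(currentLat, currentLon,
                                             WAYPOINT_LAT, WAYPOINT_LON);

  // Heading error wrapped to the range -180..180 degrees.
  float error = targetCourse - fusedHeading();
  while (error > 180)  error -= 360;
  while (error < -180) error += 360;

  if (distance < 3.0) {
    // Within a few meters: the waypoint counts as reached.
  }

  // Only the P term is shown; the robot runs a full PID on this error.
  return 2.0 * error;                                 // made-up gain
}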

Real time procedural water demo

I added procedural water to the “Procedural terrain engine”.

Normally, when creating water, one often uses tiling du/dv and normal textures to create distortion effects and lighting. This is how I did it in the Realization Engine. Here, I am instead generating those textures procedurally on the GPU in real time. This allows me to adjust different parameters to change the look of the water. It also removes the repeating patterns that are clearly visible when using texture images loaded from disk.
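
In the engine this happens per pixel in a GLSL fragment shader, but the idea can be illustrated with a small C++ function: derive the surface normal (and a du/dv-style offset) from a procedural height function using finite differences, instead of sampling a texture. The height function below is just a placeholder:

#include <cmath>

struct Vec3 { float x, y, z; };

// Placeholder height function -- the real engine uses animated noise.
float height(float x, float z, float time) {
  return 0.05f * std::sin(x * 4.0f + time) * std::cos(z * 3.0f + time * 0.7f);
}

// Central differences give the slope of the surface; the normal tilts
// against that slope. The same offsets can drive a du/dv distortion.
Vec3 waterNormal(float x, float z, float time) {
  const float e = 0.01f;                    // finite-difference step
  float hx = height(x + e, z, time) - height(x - e, z, time);
  float hz = height(x, z + e, time) - height(x, z - e, time);
  Vec3 n = { -hx / (2 * e), 1.0f, -hz / (2 * e) };
  float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
  return { n.x / len, n.y / len, n.z / len };
}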

The Procedural terrain engine is my own graphics engine, written in C++ using OpenGL, GLSL and GLFW, to experiment with different types of procedural generation.

Robot and Rocks – game trailer

I worked on this project mostly during the summer, with the purpose of learning more about the Unity game engine and game development in general. In the last couple of weeks, I have tried to finish all the features I started to implement and make a playable game out of it. This game is also my entry in the annual “LiU Game Awards” competition.

You can download the game and try it for yourself here: http://robotandrocksgame.brinkeby.se

Procedural terrain engine demo

This is a demo I made using C++, OpenGL and GLFW. It is a procedurally generated landscape in which the user can “walk around”. The terrain is generated using simplex noise and is made up of chunks that are loaded and removed as the user walks across the terrain. It is possible to walk infinitely (or at least very, very far) in one direction without reaching an edge or crashing the program. The chunks are rendered at different levels of detail depending on their distance from the camera, to improve performance.
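
As a rough illustration of how this can work, here is a small sketch of the chunk management idea: keep a square of chunks around the camera loaded, and pick each chunk's level of detail from its distance. The names, chunk size, and thresholds are made up, not the engine's actual values:

#include <cmath>

const float CHUNK_SIZE = 64.0f;   // world units per chunk (made up)
const int VIEW_RADIUS = 8;        // chunks kept loaded in each direction

// Pick a level of detail from the distance between chunk center and camera.
int lodForChunk(int chunkX, int chunkZ, float camX, float camZ) {
  float dx = chunkX * CHUNK_SIZE + CHUNK_SIZE / 2 - camX;
  float dz = chunkZ * CHUNK_SIZE + CHUNK_SIZE / 2 - camZ;
  float dist = std::sqrt(dx * dx + dz * dz);
  if (dist < 2 * CHUNK_SIZE) return 0;   // full detail near the camera
  if (dist < 4 * CHUNK_SIZE) return 1;   // half resolution
  if (dist < 8 * CHUNK_SIZE) return 2;   // quarter resolution
  return 3;                              // coarsest mesh far away
}

void updateChunks(float camX, float camZ) {
  int centerX = (int)std::floor(camX / CHUNK_SIZE);
  int centerZ = (int)std::floor(camZ / CHUNK_SIZE);
  for (int x = centerX - VIEW_RADIUS; x <= centerX + VIEW_RADIUS; x++) {
    for (int z = centerZ - VIEW_RADIUS; z <= centerZ + VIEW_RADIUS; z++) {
      int lod = lodForChunk(x, z, camX, camZ);
      // ... generate this chunk's mesh from simplex noise at 'lod' if it is
      // not already loaded; chunks outside the radius get unloaded ...
      (void)lod;
    }
  }
}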

Real time cloth simulation

This is a project I made together with two other students: Mikael Lindhe and Eleonora Petersson. This project was made in the course “Modelling Project TNM085” at Linköping University. The video demonstrates two pieces of cloth that are simulated in two different ways.

The first cloth is represented with particles that are connected to each other using constraints. This means that whenever the cloth moves, the distances between the particles are corrected to make the cloth retain its shape. This is the “usual” method for simulating cloth in computer games.
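
A minimal sketch of what such a distance constraint can look like, using Verlet-style particles (the structs and the 50/50 split of the correction are illustrative simplifications, not our project's actual code):

#include <cmath>
#include <vector>

struct Particle {
  float x, y, z;     // current position
  float px, py, pz;  // previous position (for Verlet integration)
  bool pinned;       // pinned particles (e.g. the top corners) never move
};

struct Constraint {
  int a, b;          // indices of the two connected particles
  float restLength;  // the distance the constraint tries to keep
};

// Pull the two particles back towards their rest distance. Running this a
// few times per frame over all constraints makes the cloth keep its shape.
void satisfyConstraint(std::vector<Particle>& p, const Constraint& c) {
  Particle& A = p[c.a];
  Particle& B = p[c.b];
  float dx = B.x - A.x, dy = B.y - A.y, dz = B.z - A.z;
  float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
  float diff = (dist - c.restLength) / dist;   // relative error in the distance
  if (!A.pinned) { A.x += dx * 0.5f * diff; A.y += dy * 0.5f * diff; A.z += dz * 0.5f * diff; }
  if (!B.pinned) { B.x -= dx * 0.5f * diff; B.y -= dy * 0.5f * diff; B.z -= dz * 0.5f * diff; }
}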

The second cloth is simulated using a method where the particles are connected to each other using springs. When the cloth moves, forces are applied to all particles to correct them back to their original distances from each other. This method proved to be more computationally heavy and less stable than the first method.
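
For comparison, here is a sketch of the same connection expressed as a spring: instead of moving the particles directly, a force proportional to the stretch is accumulated and integrated later. The stiffness and damping values are made up:

#include <cmath>

struct SpringParticle {
  float x, y;        // 2D here for brevity
  float vx, vy;      // velocity
  float fx, fy;      // force accumulator, integrated once per frame
};

// Hooke's law plus simple velocity damping. Because the correction only acts
// through forces that are integrated later, stiff cloth needs small time
// steps, which is part of why this method was heavier and less stable.
void applySpring(SpringParticle& a, SpringParticle& b, float restLength) {
  const float stiffness = 120.0f, damping = 0.5f;   // made-up values
  float dx = b.x - a.x, dy = b.y - a.y;
  float dist = std::sqrt(dx * dx + dy * dy);
  float f = stiffness * (dist - restLength);        // spring force magnitude
  float fx = f * dx / dist, fy = f * dy / dist;     // along the spring direction
  a.fx += fx - damping * a.vx;  a.fy += fy - damping * a.vy;
  b.fx -= fx + damping * b.vx;  b.fy -= fy + damping * b.vy;
}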

I was mainly working on the graphics part for this project, while the others focused more on the simulation part. It was the first time I developed a basic rendering system for modern OpenGL in C++ from scratch. It was also the first time I made a program that updates vertex buffer data for an object every frame.