In Pi Wars 2019 there were two tasks that involved moving in a straight line between two walls. To achieve this, we had sonar distance sensors on each side of the robot, and code that determined the angle we were travelling at and corrected it to keep us centred between the two walls.
This meant that for the maze challenge, when turning right or left, the turn didn't have to be particularly accurate - as the wall centring code would correct any errors.
For 2021, we're inside an arena of our own making, so the environment is more controlled. We still need the robot to move around the arena in a straight line and correct any errors when turning.
However, instead of using distance sensors on each side, I wondered if I could use the camera. By looking at the wall we're travelling towards, and at the angle of its base, we can determine whether we're heading straight at the wall - or are pointing slightly left or right - and then correct accordingly.
But first - how to determine where the horizon is?
I went back to my vision processing code and added a mode to the GUI called Horizon mode. This takes the image and works out the horizon line - i.e. where the bottom of the black wall meets the lighter floor.
The processing is fairly straightforward. First we convert the image to black and white. (This is a standard function from the PIL library; it simply converts any pixel with a value below 128 to black and the rest to white.)
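A minimal sketch of that thresholding step using PIL (Pillow). The function name and the explicit 128 cut-off are my own choices, but the behaviour matches the conversion described above:

```python
from PIL import Image

def to_black_and_white(img: Image.Image, threshold: int = 128) -> Image.Image:
    """Convert to greyscale, then map pixels below the threshold to
    black (0) and everything else to white (255)."""
    grey = img.convert("L")
    return grey.point(lambda p: 255 if p >= threshold else 0)

# Example: a 4x1 greyscale gradient becomes black, black, white, white.
img = Image.new("L", (4, 1))
img.putdata([0, 100, 150, 255])
bw = to_black_and_white(img)
print(list(bw.getdata()))  # [0, 0, 255, 255]
```

(PIL's built-in `convert("1")` applies dithering by default, so an explicit `point()` threshold like this gives a cleaner result for line detection.)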
The centre image in the GUI shows this conversion.
We then take the central 20% of the image columns (since we only want to focus on the wall dead ahead and ignore any side walls) and divide this into 10 equal columns. For each of these columns, we go through the pixels to find the longest black run; the bottom of that run we save as a horizon point. At the end we should have a list of 10 horizon points.
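This step could look something like the sketch below, where `bw` is the thresholded image as rows of 0 (black) / 255 (white) pixels. Collapsing each column strip to a single "mostly black" column is my assumption about how a strip is scanned; the function name and parameters are illustrative:

```python
def horizon_points(bw, num_points=10, central_fraction=0.2):
    """Return (x, y) horizon points: the bottom of the longest black
    run in each of `num_points` strips from the image centre."""
    height, width = len(bw), len(bw[0])
    start = int(width * (0.5 - central_fraction / 2))  # skip side walls
    span = int(width * central_fraction)
    strip_w = span // num_points
    points = []
    for i in range(num_points):
        x0 = start + i * strip_w
        # One True/False per row: is this strip's row mostly black?
        col = [sum(row[x0:x0 + strip_w]) < 128 * strip_w for row in bw]
        # Walk down the column tracking the longest black run.
        best_len, best_bottom, run_start = 0, 0, None
        for y, black in enumerate(col + [False]):  # sentinel closes a run
            if black and run_start is None:
                run_start = y
            elif not black and run_start is not None:
                if y - run_start > best_len:
                    best_len, best_bottom = y - run_start, y - 1
                run_start = None
        points.append((x0 + strip_w // 2, best_bottom))
    return points
```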
We then fit a line of best fit through these points and extrapolate it across the width of the image to produce the horizon. (To calculate the line of best fit we're using the linear least squares method.)
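A small sketch of the fit, using the linear least squares formulae directly (`numpy.polyfit` with degree 1 would give the same result). The angle calculation at the end is what feeds the steering decision:

```python
import math

def fit_horizon(points):
    """Fit y = m*x + c through (x, y) horizon points and return the
    slope, intercept, and the line's angle in degrees."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    c = (sy - m * sx) / n
    return m, c, math.degrees(math.atan(m))

# A perfectly level horizon gives an angle of 0 degrees.
m, c, angle = fit_horizon([(x, 50) for x in range(10)])
print(round(angle, 1))  # 0.0
```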
We then use the angle of this line to determine whether we need to veer left or right to move directly towards the wall - which also gives our sonar sensor the best chance of returning accurate distances.
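The steering decision itself reduces to something like the rule below. The direction convention (negative angle means veer right) comes from the description of the image further on; the dead band value is a hypothetical tuning constant of my own:

```python
DEAD_BAND = 1.0  # degrees within which we treat the heading as straight

def steering(angle_degrees: float) -> str:
    """Map the horizon line's angle to a steering correction."""
    if angle_degrees < -DEAD_BAND:
        return "veer right"
    if angle_degrees > DEAD_BAND:
        return "veer left"
    return "straight"

print(steering(-3.5))  # veer right
```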
The image above shows the horizon line overlaid on the original image. Here the line has a small negative angle, which means we need to veer slightly right to straighten up.
If we do find we're moving towards a side wall, then there's some additional processing to move away from the wall if the angle is greater than a certain amount.
The video below shows the robot moving backwards and forwards within the arena. It does get confused a couple of times, but for the most part corrects the direction of travel to face the wall.