Dead Reckoning

Back in the days before GPS and other modern navigational tools, airplanes used a navigation technique known as Dead Reckoning. The basic idea was that if you knew how fast you were going and what direction you were traveling, you could keep track of your position.

We are going to use something similar to keep track of our robot’s position when it cannot see the tower.  We have a gyro which will tell us what direction the robot is heading, and encoders on the wheels which will tell us how far it has moved in that direction.  If we check periodically (say every 20th of a second) and compute how far it has moved in the X and Y directions, we can keep track of its absolute position.
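
To make the idea concrete, here is a minimal standalone sketch of a single dead reckoning step.  The numbers, and the convention that 0 degrees points along +Y, are illustrative assumptions only; the sign conventions we actually use on the robot are established by the code later in this lesson.

public class DeadReckoningStep
{
	public static void main(String[] args)
	{
		// Current estimated position, in inches
		double x = 0;
		double y = 0;
		
		// Suppose that, since the last update, the robot moved 2 inches
		// while heading 30 degrees (hypothetical numbers)
		double distance = 2.0;
		double heading  = 30.0;
		
		// Convert the heading to radians and split the movement into
		// its X and Y components
		double radians = heading * Math.PI / 180;
		
		x += distance * Math.sin(radians);		// about 1.0 inch
		y += distance * Math.cos(radians);		// about 1.73 inches
		
		System.out.println("x = " + x + ", y = " + y);
	}
}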

The main trouble with this approach is that each time we do our incremental calculation there will be a small error and, over time, these errors will add up until there is a significant error in the robot’s position.  Fortunately, every time the robot turns so that it can see the tower, we will be able to reacquire its absolute position on the field.

We are, of course, going to need access to the encoders attached to the drive system.  Rather than using the encoder objects returned directly by our DriveSubsystem class, we are going to create copies, which will allow us to control the zero point of the encoders.  First we declare two member variables to hold the left and right encoders:

	private Encoder			m_leftEncoder		= null;
	private Encoder			m_rightEncoder		= null;

Then, in the Init() function, we create copies of the left and right encoders used by the DriveSubsystem.  Note the call to the Clone() function.  This returns a copy of the referenced encoder.  The copy still accesses the same physical device, but has its own zero point, as sketched below.

		m_leftEncoder	= Robot.m_driveSubsystem.GetLeftEncoder().Clone();
		m_rightEncoder	= Robot.m_driveSubsystem.GetRightEncoder().Clone();
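
How Clone() is implemented inside robotCore is not important here, but the idea of a copy that reads the same physical device while keeping its own zero point can be sketched roughly as follows.  The class EncoderView and the method ReadHardwareCount() are hypothetical, for illustration only; they are not part of robotCore.

public class EncoderView
{
	private int m_zeroOffset = 0;
	
	// Stands in for whatever actually reads the shared physical counter;
	// this is not a real robotCore call
	private int ReadHardwareCount()
	{
		return(0);
	}
	
	// Each view can be reset independently without affecting other views
	public void Reset()
	{
		m_zeroOffset = ReadHardwareCount();
	}
	
	// The value this view reports is relative to its own zero point
	public int get()
	{
		return(ReadHardwareCount() - m_zeroOffset);
	}
}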

Now each time our Update() function is called we want to compute the distance the robot has traveled since the last call.  To do this we will need member variables to store the previous encoder values:

	private int				m_lastLeftEncoder	= 0;
	private	int				m_lastRightEncoder	= 0;

Then, in our Update() function, in the case where we cannot see the tower, we will use the yaw angle and the distance traveled to compute a new X and Y from the previous position, as follows:

	private void Update()
	{
		PiCameraData	data 			= Robot.m_piCamera.GetCameraData();
		int				yaw 			= m_gyro.GetYaw();
		int				leftEncoder		= m_leftEncoder.get();
		int 			rightEncoder	= m_rightEncoder.get();
		
		
		if ((data != null) && IsTargetVisible(data))
		{
			double	top				= (data.m_y1 + data.m_y2) / 2;
			double	center			= (data.m_x + (data.m_width / 2));
			double	angleToCenter	= 0.1014*center - 34.061;
			double	distance 		= (0.000271*top*top + 0.0146*top + 16.64) * (268.46 / (268.46 - 0.0577*angleToCenter*angleToCenter));
			double	angle 			= yaw + angleToCenter;
			double	x 				= -distance * Math.sin(angle * Math.PI/180);
			double	y 				= distance * Math.cos(angle * Math.PI/180);	
			
			synchronized(this)
			{
				m_position	= new RobotPosition(x, y, yaw);
			}
		}
		else
		{
			double	deltaLeft	= (leftEncoder - m_lastLeftEncoder) / k_ticksPerInch;
			double	deltaRight	= (rightEncoder - m_lastRightEncoder) / k_ticksPerInch;
			double	distance	= (deltaLeft + deltaRight) / 2;
			
			// We can only dead reckon once we have an initial position from the tower
			if ((m_position != null) && (distance != 0))
			{
				double	radians	= (m_position.m_yaw * Math.PI) / 180;
				double	deltaX	= distance * Math.sin(radians);
				double	deltaY	= distance * Math.cos(radians);
							
				m_position.m_x	+= deltaX;
				m_position.m_y	-= deltaY;
			}
			
		}
		
		m_lastLeftEncoder	= leftEncoder;
		m_lastRightEncoder	= rightEncoder;
	}

At the beginning of the function we get the current encoder values for the left and right.  Then, if we cannot see the tower, we determine the amount the left and right encoders have changed since the last call and take their average to get the distance the robot has moved.  Note that we are using the constant k_ticksPerInch to convert from the raw encoder value to inches.  Previously we found that the robot moves 38.5 inches in 8000 encoder counts, which means we define k_ticksPerInch to be 8000/38.5, or about 208 counts per inch.
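
For example, with k_ticksPerInch of about 208, if the left encoder has advanced 2100 counts since the last call and the right encoder 2060 counts (hypothetical numbers, purely for illustration), deltaLeft is about 10.1 inches, deltaRight is about 9.9 inches, and their average gives a distance of about 10 inches.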

Next, using trigonometry, we compute the change in the X and Y coordinates and apply that change to the current position.  Finally, we save the left and right encoder values for next time.
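
Continuing the example, if the robot has moved 10 inches while its yaw was 30 degrees, then deltaX is 10 × sin(30°) = 5 inches and deltaY is 10 × cos(30°) ≈ 8.66 inches (again hypothetical numbers, just to show the scale of a single update).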

Your Navigator.java file should now look like:

package robot;

import robot.PiCamera.PiCameraData;
import robotCore.Encoder;
import robotCore.Gyro;
import robotCore.RobotBase;

public class Navigator 
{
	private static final double k_ticksPerInch = 8000.0/38.5;	// about 208 encoder counts per inch
	
	private Gyro			m_gyro		= new Gyro();
	private RobotPosition	m_position	= null;
	private Encoder			m_leftEncoder		= null;
	private Encoder			m_rightEncoder		= null;
	private int				m_lastLeftEncoder	= 0;
	private	int				m_lastRightEncoder	= 0;
	
	public class RobotPosition
	{
		public double	m_x;
		public double	m_y;
		public double	m_yaw;
		
		RobotPosition(double x, double y, double yaw)
		{
			m_x		= x;
			m_y		= y;
			m_yaw	= yaw;
		}
	}
	
	public RobotPosition GetPosition()
	{
		synchronized(this)
		{
			return(m_position);
		}
	}
	
	private boolean IsTargetVisible(PiCameraData data)
	{
		return(	(data.m_y1 > 2) &&
				(data.m_y2 > 2) &&
				(data.m_x > 2) &&
				((data.m_x + data.m_width) < 638));
	}
	
	private void Update()
	{
		PiCameraData	data 			= Robot.m_piCamera.GetCameraData();
		int				yaw 			= m_gyro.GetYaw();
		int				leftEncoder		= m_leftEncoder.get();
		int 			rightEncoder	= m_rightEncoder.get();
		
		
		if ((data != null) && IsTargetVisible(data))
		{
			double	top				= (data.m_y1 + data.m_y2) / 2;
			double	center			= (data.m_x + (data.m_width / 2));
			double	angleToCenter	= 0.1014*center - 34.061;
			double	distance 		= (0.000271*top*top + 0.0146*top + 16.64) * (268.46 / (268.46 - 0.0577*angleToCenter*angleToCenter));
			double	angle 			= yaw + angleToCenter;
			double	x 				= -distance * Math.sin(angle * Math.PI/180);
			double	y 				= distance * Math.cos(angle * Math.PI/180);	
			
			synchronized(this)
			{
				m_position	= new RobotPosition(x, y, yaw);
			}
		}
		else
		{
			double	deltaLeft	= (leftEncoder - m_lastLeftEncoder) / k_ticksPerInch;
			double	deltaRight	= (rightEncoder - m_lastRightEncoder) / k_ticksPerInch;
			double	distance	= (deltaLeft + deltaRight) / 2;
			
			// We can only dead reckon once we have an initial position from the tower
			if ((m_position != null) && (distance != 0))
			{
				double	radians	= (m_position.m_yaw * Math.PI) / 180;
				double	deltaX	= distance * Math.sin(radians);
				double	deltaY	= distance * Math.cos(radians);
							
				m_position.m_x	+= deltaX;
				m_position.m_y	-= deltaY;
			}
			
		}
		
		m_lastLeftEncoder	= leftEncoder;
		m_lastRightEncoder	= rightEncoder;
	}
	
	public void Init()
	{
		m_gyro.Reset();
		
		m_leftEncoder	= Robot.m_driveSubsystem.GetLeftEncoder().Clone();
		m_rightEncoder	= Robot.m_driveSubsystem.GetRightEncoder().Clone();
		
		// Arrange for Update() to be called periodically (every 20th of a second)
		RobotBase.Schedule(new Runnable() 
			{
				@Override
				public void run() 
				{
					Update();
				}
			
			}, 50);
	}
}

Now, if we deploy and run our program, we should see the robot’s position changing even when it cannot see the tower.
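
One quick way to verify this is to print the position periodically.  Here is a minimal sketch, assuming you have a reference to your Navigator instance (the name m_navigator, and the place you call this from, such as a periodic robot loop, are assumptions about how the rest of your Robot code is organized, not part of this lesson):

	// Hypothetical debugging snippet: read and print the dead-reckoned position
	Navigator.RobotPosition position = m_navigator.GetPosition();
	
	if (position != null)
	{
		System.out.println("x = " + position.m_x + 
						   ", y = " + position.m_y + 
						   ", yaw = " + position.m_yaw);
	}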

Next: Plotting the position