- Mon Mar 12, 2007 8:28 am
#27069
emf wrote:
    This stuff is going to be hard to test since it'll always have to be done on the hardware. It would be *really* nice to have sets of test data that also included physical measurements of the angles so we could analyze how much error we're getting. Something tells me I'm going to spend a few hours driving figure eights in empty parking lots trying to get some tricky data...

Actually there's a pretty straightforward way of checking errors without precise measurements from an unbiased, industrial-strength IMU. I do it visually.
I coded an artificial horizon in OpenGL which is overlaid on a video feed from a wireless camera. The camera is mounted at zero roll and pitch relative to the IMU (like an aircraft nosecam).
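The overlay itself boils down to turning the filter's roll and pitch into a line in image coordinates. A minimal sketch of that mapping (not the actual overlay code; it assumes a simple pinhole camera with a made-up focal length in pixels, and a small-angle treatment of pitch):

```python
import math

def horizon_endpoints(roll_deg, pitch_deg, width=640, height=480, f_px=500.0):
    """Approximate image-space endpoints of the horizon line for a
    forward-looking camera. Assumes a pinhole model: pitch shifts the
    line vertically by roughly f * tan(pitch), roll rotates it about
    the image centre. f_px is a hypothetical focal length in pixels."""
    roll = math.radians(roll_deg)
    pitch = math.radians(pitch_deg)
    cx, cy = width / 2.0, height / 2.0
    cy += f_px * math.tan(pitch)      # nose down -> horizon moves up the frame (image y grows downward)
    # Unit direction of the horizon line, rotated by the negative roll
    dx, dy = math.cos(-roll), math.sin(-roll)
    half = float(width)               # long enough to span the whole frame
    return ((cx - half * dx, cy - half * dy),
            (cx + half * dx, cy + half * dy))
```

Feeding these two endpoints to a GL line-drawing call each frame is all the overlay needs; the exact focal length only matters if you also want the pitch ladder spacing to match the lens.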
So if you point the camera at an object with a known orientation (imagine a house window frame) and jiggle it around, you can see how closely the OpenGL horizon line tracks the real frame. If you're saving the video, you can even review it frame by frame, manually calculate the real tilt (by doing some hand trigonometry on the frame edges in the image) and compare it to the IMU data used to draw the OpenGL overlay.
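The "hand trigonometry" step is just an atan2 on two pixels picked off a paused frame. A sketch, assuming you read the two (x, y) pixel coordinates of a nominally horizontal edge (the window-frame top) out of your video player; the function and variable names here are illustrative, not from the project:

```python
import math

def roll_from_edge(p1, p2):
    """Roll angle (degrees) implied by a nominally horizontal edge
    picked from a paused video frame. p1 and p2 are (x, y) pixel
    coordinates; image y grows downward, so dy is negated to get
    the usual sign convention (right wing down = positive roll)."""
    dx = p2[0] - p1[0]
    dy = p2[1] - p1[1]
    return math.degrees(math.atan2(-dy, dx))

# Hypothetical comparison against what the filter reported for that frame:
# error_deg = roll_from_edge((52, 310), (590, 296)) - imu_roll_deg
```

Do this for a handful of frames spread across the run and you get a rough per-frame error series without any reference IMU.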
Now would be a good time to mention that you'll never be able to fine-tune the filter to track the frame perfectly. Lenses have slight distortion, and the bias is not 100% trackable by ANY existing filter. But it does provide a nice reference for measuring improvements.
It works kind of like the old gunsight alignment method: point at any part of a target, fire, then adjust the scope to place the X over the bullet hole.
Comments?