
Posted

Love the IMU concept but have a couple of problems I hope you can help with.

 

1. I've downloaded the latest Brick Viewer, but when it's connected to the IMU the image of it appears for a fraction of a second and then vanishes. The background is just black most of the time. Occasionally it flickers back for an instant. I'm using Windows 7 - any suggestions?

 

2. I'm most interested in the yaw value at present (for robotics purposes), but I'm not getting the expected results:

 

a - Yaw ranges from 0 to 85 or 86, not 0 to 90 as expected.

b - I thought it might give me 0 to 360 degrees, but if I align it to the zero position and then spin it anti-clockwise, it goes from zero up to 85 (at -90 degrees), back to 0 at 180 degrees, then to -85 at -270 degrees, and finally back down to zero at the starting position.

 

3. If I keep it perfectly still, the yaw value will vary by 1-2 degrees at times. I've tried playing with the convergence values and it hasn't improved the situation - the temperature is pretty stable. I haven't tried calibrating yet, but I'd have thought that since it's stationary the values would be a lot more stable.

Posted

1. I've downloaded the latest Brick Viewer, but when it's connected to the IMU the image of it appears for a fraction of a second and then vanishes. The background is just black most of the time. Occasionally it flickers back for an instant. I'm using Windows 7 - any suggestions?

I will look into that tomorrow. What happens if you press "Save Orientation"?

 

Edit: I just tried it on Windows 7; unfortunately it worked as expected. Is there some kind of error message or anything comparable? I am not sure what to do about your problem. I will test all of the Windows versions we have in our test VMs; perhaps one of them can reproduce this problem.

 

Edit 2: Nope, can't reproduce it. The IMU is drawn with OpenGL; could that be the problem? Do you have graphics card drivers installed?

 

2. I'm most interested in the yaw value at present (for robotics purposes), but I'm not getting the expected results:

 

a - Yaw ranges from 0 to 85 or 86, not 0 to 90 as expected.

b - I thought it might give me 0 to 360 degrees, but if I align it to the zero position and then spin it anti-clockwise, it goes from zero up to 85 (at -90 degrees), back to 0 at 180 degrees, then to -85 at -270 degrees, and finally back down to zero at the starting position.

Yes, that is the problem with Euler angles. First of all, you are hitting a gimbal lock; that is why you can't reach 90 degrees. See here: http://en.wikipedia.org/wiki/Gimbal_lock

 

Second, when you say it goes back to zero (from -90, when it should go to -180), I am sure the roll changes from 180 or -180 to 0 and the pitch changes from 0 to 180 or -180, which means the yaw is correct! Test this: find the 0, 0, 0 position with the IMU Brick and the Brick Viewer and memorize that position. Then go to some other position for which you think the values are incorrect and write down the roll, pitch and yaw there. Then return to the memorized position and turn the Brick successively in the roll, pitch and yaw direction by the degrees you wrote down (in that order! Euler angle rotations are not commutative).

 

You will find that you can reach every position as expected, as long as you don't hit a gimbal lock.

 

This gimbal lock concept is not easy to grasp, but there is mathematically no way around it if you use Euler angles. Every IMU that outputs Euler angles has this problem. I can only recommend that you try quaternions or rotation matrices (which are not easy to understand either); there is no gimbal lock with those.

 

3. If I keep it perfectly still, the yaw value will vary by 1-2 degrees at times. I've tried playing with the convergence values and it hasn't improved the situation - the temperature is pretty stable. I haven't tried calibrating yet, but I'd have thought that since it's stationary the values would be a lot more stable.

I would try to get the values at a high frequency with a callback (set the period to 2 ms) and average over 50 or 100 values. That way the values will be a lot more stable.
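
Something along these lines should work (untested sketch; HOST, PORT and UID are placeholders, and the class, callback and method names plus the 1/100 degree unit are what I remember from the Python bindings, so check them against the documentation):

from collections import deque
from tinkerforge.ip_connection import IPConnection
from tinkerforge.brick_imu import BrickIMU

HOST = 'localhost'
PORT = 4223
UID = 'XXXXXX'  # UID of your IMU Brick

yaw_window = deque(maxlen=100)  # keep the last 100 yaw values

def cb_orientation(roll, pitch, yaw):
    # orientation values are assumed to arrive in 1/100 degree
    yaw_window.append(yaw / 100.0)
    if len(yaw_window) == yaw_window.maxlen:
        print('averaged yaw: %.2f' % (sum(yaw_window) / len(yaw_window)))

ipcon = IPConnection()
imu = BrickIMU(UID, ipcon)
ipcon.connect(HOST, PORT)

imu.register_callback(imu.CALLBACK_ORIENTATION, cb_orientation)
imu.set_orientation_period(2)  # new values every 2 ms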

Posted

Hi Coldwilson,

 

Some other users seem to have problems with brickv, too. We have tested it on several systems and can't reproduce the error. What graphics card do you have? Can you post your OpenGL version?

 

 

Posted

I thought a bit about what you want to do (determine the direction your robot is facing), and I think the "correct" way to do it is as follows:

 

Multiply the quaternion with a vector pointing along the y axis, take the x and y components of the result and calculate the angle of that vector. The resulting angle will be exactly what you want.

 

Pseudocode:

q = getQuaternion()
v1 = Vector3d(0, 1, 0)
v2 = q*v1
angle = atan2(v2.x, v2.y)

 

The funny thing is that I could have used this on the IMU Brick to calculate the blinking of the direction LEDs (instead I used the magnetometer values directly). This would have been much better, since it also works if the IMU is not lying flat in the x-y plane. I will change that in the next firmware version.

Posted

Hi, me again  8)

 

I couldn't resist; I had to try it. To multiply a quaternion (q) with a 3D vector (v) you have to represent the vector as a quaternion with w=0 (v') and calculate:

 

q*v'*q^-1 (where q^-1 is the conjugate of the quaternion, which for a unit quaternion equals its inverse)

 

After that you can calculate the angle with atan2. If you insert (0, 1, 0, 0) for v' and simplify everything as far as possible, you get:

 

int yaw_angle = atan2(2*(x*y - w*z), w*w + y*y - x*x - z*z)*180/PI

 

where x, y, z and w are the quaternion values from the IMU.
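
In Python the whole thing looks like this (untested sketch; I assume get_quaternion() hands you x, y, z and w of a more or less unit quaternion). The second function does the rotation the long way, as a cross-check of the simplification:

import math

def quat_mul(a, b):
    # Hamilton product of two quaternions given as (w, x, y, z)
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def yaw_from_quaternion(x, y, z, w):
    # simplified formula from above: heading of the rotated y axis in degrees
    return math.atan2(2.0*(x*y - w*z), w*w + y*y - x*x - z*z) * 180.0 / math.pi

def yaw_by_rotation(x, y, z, w):
    # same result the long way: q * v' * q^-1 with v' = the y axis as a pure quaternion
    q = (w, x, y, z)
    q_conj = (w, -x, -y, -z)
    _, vx, vy, _ = quat_mul(quat_mul(q, (0.0, 0.0, 1.0, 0.0)), q_conj)
    return math.atan2(vx, vy) * 180.0 / math.pi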

 

Really neat if you ask me. And there is of course no gimbal lock!

 

Edit: I uploaded version 1.0.1 of the IMU firmware with the improved direction calculation for the direction LEDs (implemented as described above).

Posted

Wow, that's great. Thanks Borg, I'll give that code a try. Great idea on the sampling approach as well - I should have thought of that.

 

So I checked out the Brick Viewer on my Windows 7 PC and also on Windows 7 in a VMware virtual machine on my Mac. On the PC the OpenGL version was 4.2. I updated the Nvidia driver to the latest 295.73, but it made no difference (and didn't update the OpenGL version). Pressing Save Orientation makes no difference either. The image very occasionally appears for a fraction of a second.

 

On the virtual machine version of Windows 7 it's running OpenGL 2.1, and it works perfectly there.

 

While you're in a great problem-solving frame of mind, can you think of an easy way to do the save orientation (just for yaw) in Python? I'd like the orientation at the moment I turn on the robot to become the zero point. I can do it by basically taking the orientation at startup with get orientation and adding or subtracting that offset from each current reading. I wondered whether there is a more system-based way to do it, so that the Brick thinks it's now calibrated to the new yaw position, but I don't see a function that does this specifically.

 

 

Posted

So I checked out the Brick Viewer on my Windows 7 PC and also on Windows 7 in a VMware virtual machine on my Mac. On the PC the OpenGL version was 4.2. I updated the Nvidia driver to the latest 295.73, but it made no difference (and didn't update the OpenGL version). Pressing Save Orientation makes no difference either. The image very occasionally appears for a fraction of a second.

 

On the virtual machine version of Windows 7 it's running OpenGL 2.1, and it works perfectly there.

We found a Lenovo netbook that has the same problem, so we are able to reproduce it now. However, I haven't figured out what the problem is yet.

 

While you're in a great problem-solving frame of mind, can you think of an easy way to do the save orientation (just for yaw) in Python? I'd like the orientation at the moment I turn on the robot to become the zero point. I can do it by basically taking the orientation at startup with get orientation and adding or subtracting that offset from each current reading. I wondered whether there is a more system-based way to do it, so that the Brick thinks it's now calibrated to the new yaw position, but I don't see a function that does this specifically.

 

Well, that is what quaternion multiplications are for: http://www.cprogramming.com/tutorial/3d/quaternions.html

 

In the Brick Viewer, when you press "Save Orientation" I do:

 

self.rel_x = x
self.rel_y = y
self.rel_z = z
self.rel_w = w

 

and whenever I get new quaternion values I do:

 

# conjugate
x = -x
y = -y
z = -z

# multiply
wn = w * self.rel_w - x * self.rel_x - y * self.rel_y - z * self.rel_z
xn = w * self.rel_x + x * self.rel_w + y * self.rel_z - z * self.rel_y
yn = w * self.rel_y - x * self.rel_z + y * self.rel_w + z * self.rel_x
zn = w * self.rel_z + x * self.rel_y - y * self.rel_x + z * self.rel_w

 

And I work with wn, xn, yn and zn from there on. I suppose you could do the same.
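
Put together with the yaw formula from my earlier post, a sketch for your robot could look like this (untested; I again assume get_quaternion() returns x, y, z, w, and depending on the convention you may have to flip the sign of the result):

import math

class RelativeYaw:
    # Zero the yaw at startup, then report the heading relative to that.
    # Expects an already connected IMU Brick object from the Python bindings.
    def __init__(self, imu):
        self.imu = imu
        self.rel_x, self.rel_y, self.rel_z, self.rel_w = imu.get_quaternion()

    def yaw(self):
        x, y, z, w = self.imu.get_quaternion()

        # conjugate the current quaternion ...
        x, y, z = -x, -y, -z

        # ... and multiply it with the saved reference quaternion (as above)
        wn = w * self.rel_w - x * self.rel_x - y * self.rel_y - z * self.rel_z
        xn = w * self.rel_x + x * self.rel_w + y * self.rel_z - z * self.rel_y
        yn = w * self.rel_y - x * self.rel_z + y * self.rel_w + z * self.rel_x
        zn = w * self.rel_z + x * self.rel_y - y * self.rel_x + z * self.rel_w

        # heading of the rotated y axis, no gimbal lock
        return math.atan2(2.0 * (xn * yn - wn * zn),
                          wn * wn + yn * yn - xn * xn - zn * zn) * 180.0 / math.pi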

 

On startup you should set the convergence speed to a high value (like 250) for 2 seconds or so; after that the IMU has found its position. On a robot you should then be able to go down to something like 5, since you aren't making huge accelerations. Then get the current quaternion and save it, and you are ready to go 8).
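
In Python the startup sequence could be as simple as this (sketch, using an already connected imu object and the sketch class from above; the convergence method name is the one I remember from the bindings):

import time

imu.set_convergence_speed(250)  # converge quickly to the real orientation
time.sleep(2)
imu.set_convergence_speed(5)    # low value is enough once the robot is driving

tracker = RelativeYaw(imu)      # saves the current quaternion as the zero position
print('relative yaw: %.1f' % tracker.yaw())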

 

We don't have a "save orientation" function in the Brick API itself, since that would need 16 additional floating-point multiplications per ms, and I would have a hard time squeezing them in - the microcontroller on the IMU Brick is practically running at its limit. On the PC side the multiplications don't matter at all, even if you use something like a BeagleBoard or a Raspberry Pi.

Posted
int yaw_angle = atan2(2*(x*y - w*z), w*w + y*y - x*x - z*z)*180/PI

 

Borg, you are an absolute genius. I modified your IMU example code to run the formula above and set the quaternion update frequency to 2 ms for fun as well. I just needed to add the line import math and change atan2 to math.atan2 and PI to math.pi, and it worked brilliantly.
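
For anyone else following along, it boils down to something like this (paraphrasing my code; I assume the quaternion callback delivers x, y, z and w):

import math

def cb_quaternion(x, y, z, w):
    # Borg's formula, with math.atan2 and math.pi instead of atan2 and PI
    yaw = math.atan2(2.0 * (x * y - w * z),
                     w * w + y * y - x * x - z * z) * 180.0 / math.pi
    print('yaw: %.1f' % yaw)

imu.register_callback(imu.CALLBACK_QUATERNION, cb_quaternion)
imu.set_quaternion_period(2)  # quaternion updates every 2 ms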

 

Now it gives me perfectly accurate results. There's no gimbal lock issue, and it goes from 0 to -180 degrees clockwise and then from +180 back down to 0 as I continue clockwise to the start again. I was delighted to see that it also gives exactly the same result regardless of the orientation the device is held at, so it will show an accurate yaw position even on very rough terrain.

 

Your last note has blown my mind somewhat for the moment, but I'll try to figure it out. It's not a big issue; there are lots of ways to tackle what I need to do. Those quaternions are very advanced maths.

 

One day, after I get my maze robot finished, I'd like to have a go at a quadrocopter robot using the IMU. Perhaps it's time for 3D flying maze competitions? 8)
