iphone - Getting displacement from accelerometer data with Core Motion

I am developing an augmented reality application that (at the moment) just wants to display a simple cube on top of a surface, and to be able to move in space (both rotating and displacing) to look at the cube from all angles. The camera calibration problem doesn't apply here, since I ask the user to place the iPhone on the surface he wants to put the cube on and then press a button to reset the attitude. Finding the camera rotation is very simple with the gyroscope and Core Motion. I do it this way:

// Make the attitude relative to the reference attitude captured
// when the user placed the phone on the surface.
if (referenceAttitude != nil) {
    [attitude multiplyByInverseOfAttitude:referenceAttitude];
}

CMRotationMatrix mat = attitude.rotationMatrix;

// Expand the 3x3 rotation matrix into the 4x4 column-major layout
// expected by OpenGL ES.
GLfloat rotMat[] = {
    mat.m11, mat.m21, mat.m31, 0,
    mat.m12, mat.m22, mat.m32, 0,
    mat.m13, mat.m23, mat.m33, 0,
    0, 0, 0, 1
};

glMultMatrixf(rotMat);

This works really well. Problems arise, however, when I try to find the displacement in space during an acceleration. The Apple Teapot example with Core Motion just adds the x, y and z values of the user acceleration vector to the position vector. This (apart from not making much physical sense) has the result of returning the object to its original position after an acceleration, since the acceleration goes from positive to negative or vice versa. They did it like this:

translation.x += userAcceleration.x;
translation.y += userAcceleration.y;
translation.z += userAcceleration.z;

What should I do to find the displacement from the acceleration at some instant (with a known time difference)? Looking at some other answers, it seems I have to integrate twice to get velocity from acceleration and then position from velocity. But there is no example in code whatsoever, and I don't think that is really necessary. Also, there is the problem that when the iPhone is lying still on a plane, the accelerometer values are not null (there is some noise, I think). How much should I filter those values? Am I supposed to filter them at all?


1 Reply


Cool, there are people out there struggling with the same problem, so it is worth spending some time on :-)

I agree with westsider's statement, as I spent a few weeks experimenting with different approaches and ended up with poor results. I am sure that there won't be an acceptable solution for either larger distances or slow motions lasting for more than 1 or 2 seconds. If you can live with some restrictions like small distances (< 10 cm) and a given minimum velocity for your motions, then I believe there might be a chance to find a solution - no guarantee at all. If so, it will take you a pretty hard time of research and a lot of frustration, but if you get it, it will be very, very cool :-) Maybe you will find these hints useful:

First of all, to make things easy, just look at one axis, e.g. x, but consider both left (-x) and right (+x) so you have a representative situation.

Yes, you are right: you have to integrate twice to get the position as a function of time. For further processing you should store the first integration's result (== velocity), because you will need it at a later stage for optimisation. Do it very carefully, because every tiny bug will lead to huge errors after a short period of time.
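As a rough illustration of the structure (my sketch, not from the original answer), here is a minimal double integration for the x axis only. It assumes a fixed update interval dt and hypothetical state variables vX and posX kept between device motion callbacks; the naive rectangle rule is used here just to show the shape of the computation - see the note on Simpson's rule below.

// Minimal double-integration sketch for the x axis (hypothetical names).
// aX is userAcceleration.x from CMDeviceMotion, reported in g,
// so it is converted to m/s^2 first. dt is the sample interval.
- (void)integrateX:(double)aX dt:(double)dt {
    double aMetric = aX * 9.81;   // g -> m/s^2
    vX   += aMetric * dt;         // first integration: velocity
    posX += vX * dt;              // second integration: position
    // Keep vX around - it is needed later for drift compensation.
}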

Always bear in mind that even a very small error (e.g. < 0.1%) will grow rapidly after integrating twice. The situation becomes even worse after one second if you configure the accelerometer at, let's say, 50 Hz: 50 ticks are processed and the tiny, seemingly negligible error will outrun the "true" value. I would strongly recommend not relying on the trapezoidal rule but using at least Simpson's rule or a higher-degree Newton-Cotes formula.
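For reference (again my own sketch, not part of the answer), Simpson's rule combines three consecutive samples spaced h apart. A hypothetical helper could look like this, advancing the velocity once every two accelerometer ticks:

// Simpson's rule over the interval covered by three consecutive samples
// f0, f1, f2 with spacing h: integral ~= (h / 3) * (f0 + 4*f1 + f2).
static double SimpsonStep(double f0, double f1, double f2, double h) {
    return (h / 3.0) * (f0 + 4.0 * f1 + f2);
}

// Usage with a buffer a[] of acceleration samples at 50 Hz (h = 1/50 s),
// applied once every second tick:
vX += SimpsonStep(a[i - 2], a[i - 1], a[i], 1.0 / 50.0);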

Once you have managed this, you will have to keep an eye on setting up the right low-pass filtering. I cannot give a general value, but as a rule of thumb, experimenting with filtering factors between 0.2 and 0.8 will be a good starting point. The right value depends on the business case: for instance, what kind of game, how fast it needs to react to events, ...
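For illustration (my sketch, with an assumed starting factor), the classic exponential low-pass filter applied to one axis looks like this:

// Simple exponential low-pass filter; kFilterFactor is the tuning
// constant discussed above (start somewhere between 0.2 and 0.8).
static const double kFilterFactor = 0.4;   // assumed starting value

filteredAX = kFilterFactor * rawAX + (1.0 - kFilterFactor) * filteredAX;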

Now you will have a solution which works pretty well under certain circumstances and within a short period of time. But then, after a few seconds, you will run into trouble because your object drifts away. Now you will enter the difficult part of the solution, which I eventually failed to handle within the given time scope :-(

One promising approach is to introduce something I call "synthetic forces" or "virtual forces". This is a strategy to react to several bad situations that make the object drift away even though the device remains still (without moving) in your hands. The most troubling one is a velocity greater than 0 without any acceleration. This is an unavoidable result of error propagation and can be handled by slowing down artificially, that means introducing a virtual deceleration even if there is no real counterpart. A very simplified example:

// If there is still some velocity along x but no significant acceleration
// has been seen for a while (here: 0.3 s), damp the velocity artificially.
if (vX > 0 && (now - lastAccelerationXTimestamp) > 0.3) {
    vX *= 0.9;
}


You will need a combination of such conditions to tame the beast. A lot of trial and error is required to get a feeling for the right way to go, and this will be the hard part of the problem.
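To give one concrete, purely illustrative idea of such a combination (my sketch; the names and thresholds are assumptions, not values from the answer), the snippet below adds a second condition on top of the damping above: a zero-velocity reset when the filtered acceleration has stayed below a noise threshold for a while.

// Purely illustrative combination of "virtual force" conditions
// (hypothetical names and thresholds).
static const double kNoiseThreshold = 0.02;  // assumed noise floor in g
static const double kQuietInterval  = 0.5;   // assumed quiet time in seconds

if (fabs(filteredAX) < kNoiseThreshold) {
    // No meaningful acceleration: if this persists, assume the device
    // is at rest and reset the velocity to stop the drift completely.
    if (now - lastSignificantAccelerationTimestamp > kQuietInterval) {
        vX = 0.0;
    }
} else {
    lastSignificantAccelerationTimestamp = now;
}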

If you ever manage to crack the code, pleeeease let me know, I am very curious to see whether it is possible in general or not :-)

Cheers Kay

