Hi, I hope I can give as much information as necessary to help me out. I have better-than-average mathematical ability and am a competent software engineer, but I have limited knowledge of physics. My attempts so far to model the behaviour below have gone better than I expected, but I really need to do this 'properly'!
I am creating an application in which a ball rolls across a 2D surface (imagine a ball on a table, viewed from directly above in a bird's-eye view).
This surface is not flat: at any location I can tell you how much the surface is tilted relative to the ground (assuming the ground is flat) - I can give two angles, one for X and one for Y.
The ball will be 'free rolling' (I think that is the correct term) in that it will never slip, slide, bounce, or fly. I simply want to give it a push in direction D and have it roll merrily along the contours of the surface until friction (if that's the right term?) slows the ball to a stop.
My original model was very basic: an xspeed and a yspeed, each accelerated by a fraction of a gravity value that was higher or lower depending on the amount of X or Y slope. I then simply subtracted a constant 'drag' value acting against the X and Y speeds.
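To make the question concrete, here is roughly what my current update loop does, as a simplified Python sketch (the names, constants, and the use of sin(slope) for the gravity fraction are my own simplifications, not exact code):

```python
import math

GRAVITY = 9.81   # m/s^2
DRAG = 0.5       # constant drag subtracted each step (the part I suspect is wrong)
DT = 1.0 / 60.0  # timestep in seconds

def step(x, y, xspeed, yspeed, slope_x, slope_y):
    # accelerate each axis by a fraction of gravity set by that axis's slope
    xspeed += GRAVITY * math.sin(slope_x) * DT
    yspeed += GRAVITY * math.sin(slope_y) * DT
    # subtract a fixed drag from each axis, ignoring speed and direction
    xspeed -= DRAG * DT
    yspeed -= DRAG * DT
    return x + xspeed * DT, y + yspeed * DT, xspeed, yspeed
```

You can already see the problem: starting from rest on flat ground, the drag term gives the ball a speed in the negative direction instead of leaving it still.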
It works, but I do get some odd results, which I think come down to the fact that the drag is not relative to the X or Y speed - in other words, you might not be travelling in X at all, yet drag starts accelerating you in that direction. Not totally sure, to be honest, but I am at the point of seeking help!
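Thinking out loud, what I suspect I need is drag that acts as a vector opposing the current velocity, and that can stop the ball but never reverse it. Something like this sketch (again my own guess, with a made-up coefficient):

```python
import math

DRAG_COEFF = 0.5  # rolling-resistance deceleration, a made-up value

def apply_drag(xspeed, yspeed, dt):
    speed = math.hypot(xspeed, yspeed)
    if speed == 0.0:
        return 0.0, 0.0          # a resting ball stays at rest
    decel = DRAG_COEFF * dt      # speed lost this step
    if decel >= speed:
        return 0.0, 0.0          # drag may stop the ball, never reverse it
    scale = (speed - decel) / speed
    return xspeed * scale, yspeed * scale
```

Is that roughly the right idea, or is there a proper treatment of rolling resistance I should be using instead?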
This forum looks like a good place to start. I am sure I have not been very clear on the above, so please ask away.
Many thanks in advance.