Originally Posted by SoupRKnowva
i dunno, you complained about math being an approximation when you were using a method that was an approximation to begin with...seemed weird to me. Wouldn't using an accurate method to begin with have been better?
No, it would not.
And if you want to know why, take a few advanced physics courses and you'll realize that almost all physical laws rest on a lot of approximations. Everything in physics is one big approximation, and if you didn't use such approximations, it would be close to impossible to work with; it would be too complicated.
Furthermore, many common approximations introduce errors of only a tiny fraction of a percent, and refusing to make them when they genuinely simplify the math would be stupid. The only reason not to use an approximation is if the lost information is essential -- or at the very least significant -- to what you're calculating, which very often turns out not to be the case.
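To make that concrete, here's a quick sketch (my own illustration, not from the original posts) using the classic small-angle approximation sin(θ) ≈ θ, which is what turns the exact pendulum equation into the simple harmonic one. The script just compares the approximation against the exact sine for a few angles:

```python
import math

# Small-angle approximation: sin(theta) ≈ theta.
# This is the step that makes the pendulum equation solvable by hand.
# For small angles the relative error scales like theta^2 / 6,
# i.e. it shrinks rapidly as the angle gets smaller.
for deg in (0.1, 1.0, 5.0):
    theta = math.radians(deg)
    rel_err = abs(math.sin(theta) - theta) / math.sin(theta)
    print(f"{deg:>4} deg: relative error {rel_err:.2e}")
```

Run it and you'll see the error at 1 degree is on the order of 0.005%, and it drops by a factor of 100 for every factor of 10 the angle shrinks -- exactly the kind of "lost information" that's almost never significant to what you're calculating.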
Still, it's unfortunate that we have to make such approximations, but that's the reality.
Again, this is not really relevant to the discussion.