Okay, so I know that dx is a differential (a small change in the function's input value) and that it can be used to approximate the corresponding change in the function's value via dy = f'(x)dx. So far so good.
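Just to show what I mean by that, here's a quick numerical check I did myself (my own sketch, not from the book), using f(x) = x^2 so that f'(x) = 2x:

```python
# Sanity check of dy = f'(x) dx for f(x) = x**2, so f'(x) = 2x.
# (My own illustration, not from the book.)

def f(x):
    return x ** 2

def f_prime(x):
    return 2 * x

x, dx = 3.0, 0.01
actual_change = f(x + dx) - f(x)   # exact change in f over the step dx
dy = f_prime(x) * dx               # the differential approximation

print(actual_change)  # roughly 0.0601
print(dy)             # 0.06
```

The two numbers come out close, which is the part I'm fine with.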

Then my book goes on to give the derivative rules in differential notation. An example is the power rule: d(u^n) = nu^(n-1)**du**

I've emphasized the du because it seems to have no business being there. You have (nu^(n-1)), which is the rule to find the *exact* derivative of a power function, and then we stupidly approximate it by multiplying by an arbitrary du. And if the derivative we've found really is exact, then wouldn't du=0, thus invalidating our results entirely?
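To be fair, I did try checking the formula numerically, and it does come out right even when u is itself a function of x (the du then carries the inner derivative along). Here's my own sketch, with u = sin(x) and n = 5 as a made-up example; it still doesn't tell me *why* the du belongs there:

```python
import math

# Numerical check of d(u**n) = n * u**(n-1) * du, where u depends on x.
# Here u = sin(x) and n = 5, so du = cos(x) * dx.
# (My own example, not from the book.)

x, dx = 1.0, 1e-4
n = 5

def u(t):
    return math.sin(t)

actual_change = u(x + dx) ** n - u(x) ** n   # exact change in u**n
du = math.cos(x) * dx                        # differential of u
predicted = n * u(x) ** (n - 1) * du         # the book's formula

print(actual_change, predicted)
```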

It gets worse with integrals. The book insists on a seemingly meaningless dx in ALL of them: you're told to write it down, then ignore it while you happily integrate, and it never appears in the solution at all. What?
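The only place I can see a "dx" actually doing any work is in a Riemann sum, where the slice width is a real factor in the product. Here's a rough sketch (again my own, not from the book) approximating the integral of x^2 from 0 to 1, which should be 1/3; but the book never seems to connect this Δx to the dx it tells me to ignore:

```python
# A Riemann sum: the one place a "dx" visibly does work.
# Approximates the integral of x**2 on [0, 1], which equals 1/3.

def riemann_sum(f, a, b, n):
    dx = (b - a) / n                                  # width of each slice
    return sum(f(a + i * dx) * dx for i in range(n))  # left endpoints

approx = riemann_sum(lambda x: x ** 2, 0.0, 1.0, 100_000)
print(approx)  # close to 1/3
```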

I'm sorry for ranting. I know I shouldn't be so condescending towards dx, since I'm the one who doesn't understand it. But it frustrates me that I can't find a proper explanation anywhere; it's treated as something *obviously* everyone should just understand, and I don't.

It's a conspiracy of mathematicians! Lol, I'm better now. Or I will be, once I understand dx.