That is, if f'(x) = f(x) and g'(x) = g(x) then f = kg for some constant k.

Proof:

Consider

h(x) = f(x) / g(x)

By the quotient rule,

h'(x) = [f'(x)g(x) - f(x)g'(x)] / g(x)^2 = [f(x)g(x) - f(x)g(x)] / g(x)^2 = 0

Since h'(x) = 0 everywhere, h(x) = constant, k say, and so f(x) = k g(x).
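As a quick numerical illustration of the lemma (not a substitute for the proof), take f(x) = 3e^x and g(x) = e^x, both of which satisfy y' = y, and check that f/g really is constant:

```python
import math

# Both f and g satisfy y' = y; the lemma says f/g is some constant k.
def f(x): return 3 * math.exp(x)
def g(x): return math.exp(x)

ratios = [f(x) / g(x) for x in (-2.0, 0.0, 1.5, 4.0)]
k = ratios[0]
assert all(abs(r - k) < 1e-9 for r in ratios)  # constant everywhere; here k = 3
```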

I think that should enable you to do what you want.

Bob

This was posted by yeyui on

http://forums.xkcd.com/viewtopic.php?f=17&t=36281

I have kept the post but edited the variables to suit your problem.

First some change of variable magic. With the substitution t = a*u,

integral from a to ab of dt/t = integral from 1 to b of du/u

Now apply this magic to the function of interest,

f(x) = integral from 1 to x of dt/t

f(ab) = integral from 1 to ab of dt/t
      = integral from 1 to a of dt/t + integral from a to ab of dt/t
      = f(a) + f(b)

So this function has the property f(ab) = f(a) + f(b), which means that it is some logarithm.

Now "just" evaluate it at any particular point to show that it is the right one.
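A numerical sanity check of the two claims, taking f(x) = integral from 1 to x of dt/t and approximating it with a crude midpoint rule (a sketch, not part of the original post):

```python
import math

def f(x, n=100_000):
    # f(x) = integral from 1 to x of dt/t, via the midpoint rule.
    h = (x - 1.0) / n
    return sum(h / (1.0 + (i + 0.5) * h) for i in range(n))

a, b = 2.0, 3.5
assert abs(f(a * b) - (f(a) + f(b))) < 1e-6  # f(ab) = f(a) + f(b)
assert abs(f(a) - math.log(a)) < 1e-6        # and f agrees with the natural log
```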

Bob

The Taylor series uses the derivatives, so you are still using the derivative of log there, unless we get the Taylor series of log in a different manner.
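For what it's worth, there is a standard way to get the series for ln(1+x) without differentiating log: integrate the geometric series 1/(1+t) = 1 - t + t^2 - ... term by term. A sketch (it still leans on the integral definition of log, so it dodges Taylor's theorem rather than integration):

```python
import math

# Integrating 1 - t + t^2 - ... from 0 to x term by term gives
# x - x^2/2 + x^3/3 - ..., which should equal ln(1 + x) for |x| < 1.
def log1p_series(x, terms=200):
    return sum((-1) ** (n + 1) * x ** n / n for n in range(1, terms + 1))

x = 0.5
assert abs(log1p_series(x) - math.log(1 + x)) < 1e-12
```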

I suspect there's a circular argument lurking here as power series probably depend on natural logs somewhere, but maybe it's ok.

Bob

For the function

y = a^x

at (0, 1), the derivative is:

limit as h -> 0 of (a^h - 1) / h

Even though I don't know what that is, it will have a value; let's say k.

Now the derivative at other points:

dy/dx = limit as h -> 0 of (a^(x+h) - a^x) / h = a^x * limit as h -> 0 of (a^h - 1) / h = k * a^x

So all graphs in the family have the property that the gradient function at x is a^x times the gradient at (0, 1).
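That property is easy to check numerically with a central-difference approximation (a sketch; `slope` is just a hypothetical finite-difference helper, not from the post):

```python
def slope(func, x, h=1e-6):
    # Central-difference approximation to the gradient of func at x.
    return (func(x + h) - func(x - h)) / (2 * h)

a = 3.0
f = lambda x: a ** x
k = slope(f, 0.0)  # gradient at (0, 1); the post calls this k
for x in (0.5, 1.0, 2.0):
    assert abs(slope(f, x) - k * a ** x) < 1e-4  # gradient at x is k * a^x
```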

In the family there will be one value of a for which k = 1.

Call that one a = e.

Then, for y = e^x,

dy/dx = k * e^x = 1 * e^x = e^x

so e^x is its own derivative.
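One can also hunt for that special value of a numerically: approximate k for each a and bisect for k = 1. The answer should land near 2.718... (a sketch, not part of the original post):

```python
import math

def k(a, h=1e-6):
    # Approximate gradient of a^x at (0, 1): (a^h - 1)/h for small h.
    return (a ** h - 1.0) / h

lo, hi = 2.0, 3.0          # k(2) < 1 < k(3), so the root lies in between
for _ in range(60):
    mid = (lo + hi) / 2
    if k(mid) < 1.0:
        lo = mid
    else:
        hi = mid
e_est = (lo + hi) / 2
assert abs(e_est - math.e) < 1e-4  # the a with k = 1 is (approximately) e
```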

Now suppose

y = a^x

Taking logs base e for the first expression:

ln y = x ln a

Differentiating wrt x:

(1/y) * dy/dx = ln a, so dy/dx = y ln a = a^x * ln a

which means we now know the value of k: k = ln a ... and

so [still working on this last bit but I think I'll post before I lose it all]

No good. I seem to be stuck here, because if k = ln a this becomes ln x, and I was trying hard to avoid that. I seem to have gone too far and proved the log is base e. I'll come back to it later after a think.
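For what it's worth, a finite-difference check agrees that this k is ln a (a sketch, not part of the original post):

```python
import math

a = 5.0
h = 1e-6
# Gradient of a^x at (0, 1) via a central difference; should be ln a.
k = (a ** h - a ** (-h)) / (2 * h)
assert abs(k - math.log(a)) < 1e-6

# And at a general x the gradient should be a^x * ln a.
x = 1.7
grad = (a ** (x + h) - a ** (x - h)) / (2 * h)
assert abs(grad - a ** x * math.log(a)) < 1e-4
```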

Bob

Are you wanting a proof from first principles? I usually start with the derivative of a^x, then e^x, then reverse these for the log.

If you may assume d(e^x)/dx = e^x then it will only take a few lines.
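Presumably those few lines go something like this (a sketch, assuming d(e^x)/dx = e^x and the chain rule):

```latex
y = \ln x \;\Longrightarrow\; e^{y} = x
\;\Longrightarrow\; e^{y}\,\frac{dy}{dx} = 1
\;\Longrightarrow\; \frac{dy}{dx} = \frac{1}{e^{y}} = \frac{1}{x}
```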

Bob
