Math Is Fun Forum
  Discussion about math, puzzles, games and fun.

Topic review (newest first)

{7/3}
2013-05-24 21:30:24

Thanks

bob bundy
2013-05-24 16:40:18

There is a whole set of functions that differentiate to give themselves, but they are all multiples of each other.

That is, if f'(x) = f(x) and g'(x) = g(x) then f = kg for some constant k.

Proof:

Consider

$$h(x) = \frac{f(x)}{g(x)} \qquad\Rightarrow\qquad h'(x) = \frac{f'(x)\,g(x) - f(x)\,g'(x)}{g(x)^2}$$

Show this is equal to zero, which means that h(x) = constant.

I think that should enable you to do what you want.
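
For instance, taking g(x) = e^x as the comparison function (just one possible choice, treating e^x as a known function with g' = g), a sketch of how it plays out:

$$h(x) = \frac{f(x)}{e^{x}} \qquad\Rightarrow\qquad h'(x) = \frac{f'(x)\,e^{x} - f(x)\,e^{x}}{e^{2x}} = \frac{f'(x) - f(x)}{e^{x}} = 0$$

so h(x) is some constant k and f(x) = k e^x; the condition f(0) = 1 then forces k = 1, so f(x) = e^x and a = e does the job.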


Bob

{7/3}
2013-05-24 12:27:51

That was an awesome proof, but I need one more favor: if f'(x) = f(x) and f(0) = 1 then f(x) = a^x for some constant a. How do I prove this?

anonimnystefy
2013-05-24 06:07:33

I really like that proof. Really elegant.

bob bundy
2013-05-24 05:35:27

OR

This was posted by yeyui on

http://forums.xkcd.com/viewtopic.php?f=17&t=36281

I have kept the post but edited the variables to suit your problem.

First some change of variable magic: substituting t = au (so dt = a du),

$$\int_a^{ab}\frac{dt}{t} = \int_1^{b}\frac{a\,du}{au} = \int_1^{b}\frac{du}{u}$$

Now apply this magic to the function of interest, f(x) = ∫_1^x dt/t:

$$f(ab) = \int_1^{ab}\frac{dt}{t} = \int_1^{a}\frac{dt}{t} + \int_a^{ab}\frac{dt}{t} = \int_1^{a}\frac{dt}{t} + \int_1^{b}\frac{du}{u} = f(a) + f(b)$$

So this function has the property f(ab)=f(a)+f(b) which means that it is some logarithm.

Now "just" evaluate it at any particular point to show that it is the right one.
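
One possible reading of that evaluation step (a sketch, with the base written as c so it does not clash with the a and b above): f(x) = ∫_1^x dt/t is continuous and strictly increasing with f(1) = 0, and the addition law gives f(c^q) = q f(c) for every rational q; continuity extends this to all real exponents.  So if c is the point where

$$f(c) = \int_1^{c}\frac{dt}{t} = 1,$$

then f(c^y) = y for every y, i.e. f(x) = log_c x.  (That c is the number usually called e, but the argument never needs to say so.)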

Bob

anonimnystefy
2013-05-24 02:10:32

Hi Bob

The Taylor series uses the derivatives, so you are still using the derivative of log there. Unless we get the Taylor series of log in a different manner.

bob bundy
2013-05-24 01:32:01

OR

expand 1/t as a geometric series and integrate term by term.  With t = 1 + u,

$$\int_1^x\frac{dt}{t} = \int_0^{x-1}\frac{du}{1+u} = \int_0^{x-1}\left(1 - u + u^{2} - u^{3} + \cdots\right)du = (x-1) - \frac{(x-1)^{2}}{2} + \frac{(x-1)^{3}}{3} - \cdots$$

and the right hand side is the power series for the log of x, base e, taken about x = 1.

I suspect there's a circular argument lurking here as power series probably depend on natural logs somewhere, but maybe it's ok.

Bob

bob bundy
2013-05-23 23:26:37

This is how I do powers and logs:

For the function

$$y = a^{x}$$

at (0,1), the derivative is:

$$\lim_{h\to 0}\frac{a^{0+h} - a^{0}}{h} = \lim_{h\to 0}\frac{a^{h} - 1}{h}$$

Even though I don't know what that is, it will have a value; let's say k.

Now the derivative at other points:

$$\lim_{h\to 0}\frac{a^{x+h} - a^{x}}{h} = a^{x}\lim_{h\to 0}\frac{a^{h} - 1}{h} = k\,a^{x}$$

So all graphs in the family have the property that the gradient function at x is a^x times the gradient at (0,1).

In the family there will be one value of a for which k = 1

Call that one a = e

then

$$\lim_{h\to 0}\frac{e^{h} - 1}{h} = 1$$

so

$$\frac{d(e^{x})}{dx} = 1 \times e^{x} = e^{x}$$

Now suppose

$$y = a^{x}, \qquad\text{so that}\qquad x = \log_a y$$

Taking logs base e for the first expression:

$$\ln y = x\ln a$$

Differentiating wrt x

$$\frac{1}{y}\frac{dy}{dx} = \ln a \qquad\Rightarrow\qquad \frac{dy}{dx} = y\ln a = a^{x}\ln a$$

which means we now know the value of k ... and

$$k = \ln a$$

so  [still working on this last bit but I think I'll post before I lose it all]

$$\int_1^x\frac{dt}{t} = k\log_a x$$

No good.  I seem to be stuck here because if k = ln a this becomes ln x, and I was trying hard to avoid that.  I seem to have gone too far and proved the log is base e.  I'll come back to it later after a think.
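
One way to read where this lands (a sketch): the relation a few lines up already gives a logarithm to some base, because

$$k\log_a x = \log_{a^{1/k}} x,$$

so the integral is log to the base a^{1/k} whichever a is used; choosing the a with k = 1 just names that base e.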

Bob

{7/3}
2013-05-23 21:50:21

Proof from first principles will be better

bob bundy
2013-05-23 21:14:11

hi {7/3}

Are you wanting a proof from first principles?  I usually start with the derivative of a^x, then e^x, then reverse these for the log.

If you may assume d(e^x)/dx = e^x then it will only take a few lines.
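
A sketch of the sort of few lines this could be (my reconstruction of the route described, also assuming the inverse function rule and the fundamental theorem of calculus): with y = log_e x we have x = e^y, so dx/dy = e^y = x, hence dy/dx = 1/x; since log_e 1 = 0, the fundamental theorem of calculus then gives

$$\int_1^x\frac{dt}{t} = \Big[\log_e t\Big]_1^{x} = \log_e x,$$

so the statement holds with a = e.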

Bob

{7/3}
2013-05-23 16:42:10

Help me prove this:

$$\int_1^x\frac{dt}{t} = \log_a x$$

for some constant a [I cannot use the fact this is ln(x)]
