It says:
Prove that if a != b then f(x) = 1/((x-a)(x-b)) is symmetric about x = (a+b)/2.
I have no idea how to go about this problem. I took the derivative of f(x) and found that the derivative is zero at x = (a+b)/2. But what does that tell me?
Thanks.
Last edited by LuisRodg (2007-11-12 09:43:41)
f(x) is symmetrical about the line x = k if f(k+x) = f(k-x). That's what it means.
So, following on, this is how I'd work it out:
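(The working here was originally posted as an image that hasn't survived. A reconstruction, applying the definition above with k = (a+b)/2, would run:)

```latex
\begin{aligned}
f\!\left(\tfrac{a+b}{2}+x\right)
&= \frac{1}{\left(\tfrac{a+b}{2}+x-a\right)\left(\tfrac{a+b}{2}+x-b\right)}
 = \frac{1}{\left(x-\tfrac{a-b}{2}\right)\left(x+\tfrac{a-b}{2}\right)}
 = \frac{1}{x^2-\left(\tfrac{a-b}{2}\right)^2},\\[4pt]
f\!\left(\tfrac{a+b}{2}-x\right)
&= \frac{1}{\left(\tfrac{a+b}{2}-x-a\right)\left(\tfrac{a+b}{2}-x-b\right)}
 = \frac{1}{\left(-x-\tfrac{a-b}{2}\right)\left(-x+\tfrac{a-b}{2}\right)}
 = \frac{1}{x^2-\left(\tfrac{a-b}{2}\right)^2}.
\end{aligned}
```

The two expressions agree for every x, so f(k+x) = f(k-x) and f is symmetric about x = (a+b)/2. (The condition a != b only ensures f is defined somewhere other than the two poles.)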
Last edited by luca-deltodesco (2007-11-12 10:40:32)
The Beginning Of All Things To End.
The End Of All Things To Come.
OK, got it. I just never learned that a function is symmetrical about x = k if f(k+x) = f(k-x).
What about this:
Show that y=x+3 is an oblique asymptote of the graph of f(x)=x^2/(x-3)
I don't even know what an oblique asymptote is. I guess my problem lies in pre-calc and not calculus itself.
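(For reference: an oblique, or slant, asymptote is a line y = mx + c that the graph approaches as x → ±∞; it appears when the numerator's degree exceeds the denominator's by exactly one. Here polynomial division settles it:)

```latex
\frac{x^2}{x-3} = x + 3 + \frac{9}{x-3},
\qquad\text{so}\qquad
f(x) - (x+3) = \frac{9}{x-3} \to 0 \text{ as } x \to \pm\infty,
```

which is exactly the statement that y = x + 3 is an oblique asymptote of the graph of f.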
Another way to do the first question.
Now you can easily see that
Last edited by JaneFairfax (2007-11-13 01:46:13)
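(The steps of this post were also images that haven't survived. A plausible reconstruction of "another way" — a guess at the intended argument, not necessarily what was originally posted — uses the substitution t = x - (a+b)/2:)

```latex
f(x) = \frac{1}{(x-a)(x-b)}
     = \frac{1}{\left(t+\tfrac{b-a}{2}\right)\left(t-\tfrac{b-a}{2}\right)}
     = \frac{1}{t^2-\left(\tfrac{b-a}{2}\right)^2},
\qquad t = x - \tfrac{a+b}{2},
```

an even function of t. An even function is symmetric about t = 0, i.e. about x = (a+b)/2.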