C is much faster than JavaScript, but JavaScript (or Java, its big cousin) is very portable. I can write a JavaScript program, put it on the website, and nearly everyone can use it straight away. With C, I would have to compile different versions for different operating systems, create a download-and-install process, and then hope people actually install it.

So it depends on what you want.

Anyway, JavaScript is really easy (once you get past the weird way you have to interact with the web page).

BTW, I am not even very good at JavaScript (I am better at C, Visual Basic, and SQL), but its syntax is pretty standard, perhaps even a little too simple.

But once you write a few scripts and people start using them, you start thinking, "hey, this is pretty useful!"

How fast is JavaScript? Also, I'd have to learn JavaScript, and I don't claim to learn new stuff that quickly. Also, the code that I write may not follow the standards you would be happy with. I still tend to use goto statements. I like them better than for loops and while loops because you can jump out of them without a break statement.

Further, my program is still experimental, but if you'd like to see it, you can do whatever you want with it, if you have the time. Or maybe you could still convince me to learn JavaScript. When I ran my C program to calculate the cosines of ninety-one angles, using 120 iterations of a formula (120 terms), it took around five minutes to finish (I wasn't timing it exactly). After about five minutes, I had a file with cosines from 0 to 90 degrees. I don't know enough number theory to say how accurate the series is at a given term, but I was holding my calculations out to 500 places after the decimal for that run.

That way it can be used by everyone!

I knew the answer, but I couldn't prove it like you did! Neat.

The reason I am thinking about the accuracy of numbers is that I recently wrote a computer program in C that does +, -, *, and / out to 1000 or more digits. Now I want to add significant figures and plus-or-minus values to the input numbers and keep track of roughly how far off the results are when you do multiple operations in a row. The program calculated the cosine of 22 degrees to 200 digits, which was one of my recent posts. Thanks again. If anyone wants a copy of the source code, I'd be happy to share it, but I don't know what format is best.

Let y = x^2.

Then log y = 2 log x.

On differentiating:

dy/y = 2 dx/x

Hence the percentage accuracy of a square root (x in this particular case) should be 1/2 of the percentage accuracy of the number itself (y in this particular case).

Oops, that's not what I meant.

If a number greater than four is only accurate to within a factor of four (multiply or divide by 4), what is the accuracy of its square root?
