Could you clarify your question? I am not really an expert in complexity theory (which I have already amply demonstrated), and thus am not certain exactly what you mean by the term.
Certain things don't count as actual "operations" when you are analyzing the complexity of an algorithm. For example, when doing binary search, you repeatedly divide by 2, but we (typically) don't count that as an operation; the only operation that actually counts is a comparison. That is how we come up with binary search being O(log n): there are log(n) comparisons made in the worst case.
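That counting convention can be made concrete. A minimal sketch (the function and counter names are my own, for illustration): binary search instrumented so that only comparisons against the key are counted.

```c
/* Binary search over a sorted array that counts only the
   comparisons against the key -- the one operation the
   complexity analysis charges for.  Returns the index of
   `key`, or -1 if it is absent; *comparisons ends up at
   most about log2(n) + 1. */
int binary_search(const int *a, int n, int key, int *comparisons)
{
    int lo = 0, hi = n - 1;
    *comparisons = 0;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;
        ++*comparisons;              /* one counted operation */
        if (a[mid] == key)
            return mid;
        if (a[mid] < key)
            lo = mid + 1;
        else
            hi = mid - 1;
    }
    return -1;
}
```

Searching a 1024-element array this way never takes more than 11 counted comparisons, matching the log2(1024) = 10 bound (plus one for the final check).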
For primality testing, "is a divisor" is typically considered the only operation, and its cost is 1. Note that this doesn't exactly reflect the real world, since checking whether 13 divides a 20-digit number takes much longer than checking a 3-digit number. But it is accurate enough for comparing algorithms.
In this system, your algorithm for finding a divisor is O(n), which is pretty good.
Would it not be O(sqrt(10^n))? Unless you're doing something fancier than checking all integers less than sqrt(n) to see if they are a divisor.
Prime factorization would then be O(n·2^(2n)), which is about what other prime factorization algorithms get.
How do you get this? If a number n is composite, its smallest prime factor is at most sqrt(n). We can then divide by that factor and find the prime factorization of the quotient (its prime factors are among the same candidates). Taking a very generous upper bound, suppose every step searches all the way to sqrt(n) and the divisor found is only 2, i.e. we do the full search each time and the number shrinks as slowly as possible. Since every prime factor is at least 2, there can be at most log2(n) of them. The complexity would then be O(√n · log n).
Here, I am taking n to be the number, not the number of digits. If you want to rework this for n being the number of digits, please, by all means. I don't.
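The bound above (at most log2(n) prime factors, each found by at most √n trial divisions) can be checked mechanically. A sketch, my own illustrative code rather than anything from the thread, counting each "does d divide the number?" check as one operation per the convention discussed earlier:

```c
/* Trial-division factorization: fills `factors` with the prime
   factors of n (with multiplicity) and returns how many there
   are; *tests counts every "does d divide n?" check.  Since
   every prime factor is at least 2, the number of factors is
   at most log2(n), and each costs at most sqrt(n) tests. */
int factorize(unsigned long n, unsigned long *factors, unsigned long *tests)
{
    int count = 0;
    *tests = 0;
    for (unsigned long d = 2; d * d <= n; ++d) {
        for (;;) {
            ++*tests;
            if (n % d != 0)
                break;
            factors[count++] = d;   /* divide d out and retry it */
            n /= d;
        }
    }
    if (n > 1)                      /* whatever is left is prime */
        factors[count++] = n;
    return count;
}
```

For example, 360 = 2·2·2·3·3·5 yields six factors, comfortably under log2(360) ≈ 8.5.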
Amalcas
2008-11-22 12:40:20
Yes, the choice of "n" is arbitrary; however, in this case, it seems to make more sense for "n" to be the number of digits than the number itself. And yes, I am using n as the number of digits.
And what do you consider to be an operation?
Could you clarify your question? I am not really an expert in complexity theory (which I have already amply demonstrated), and thus am not certain exactly what you mean by the term.
Ricky
2008-11-21 17:11:40
For numerical algorithms, the "n" in complexity is measured as the number of digits (in binary, usually);
Typically, yes, but not always. "n" just refers to your metric, which is different for different problems.
In this system, your algorithm for finding a divisor is O(n), which is pretty good. Prime factorization would then be O(n·2^(2n)), which is about what other prime factorization algorithms get. An O(√n) complexity for the brute force strategy would result in O(2^(n/2)).
You're now using n as the number of digits? And what do you consider to be an operation?
Amalcas
2008-11-21 12:54:50
Ah, my mistake. I was not thinking when I wrote that; the point I was trying to make is untrue. But yes, comparison to existing algorithms is probably the most relevant metric.
I talked this over with a friend, and the problem is the units we've been measuring complexity in. For numerical algorithms, the "n" in complexity is measured as the number of digits (in binary, usually). If "n" were simply the number itself, many very slow algorithms (such as prime factorization) would appear to have very good running times: functions of the number that seem fairly small, such as the square root, still become quite large eventually, and measuring in digits reflects this. In this system, your algorithm for finding a divisor is O(n), which is pretty good. Prime factorization would then be O(n·2^(2n)), which is about what other prime factorization algorithms get. An O(√n) complexity for the brute force strategy would result in O(2^(n/2)).
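The units point can be seen concretely. In the sketch below (my own, not from the thread), the naive smallest-divisor search for a prime n performs about √n divisibility tests: modest as a function of the value, but exponential, roughly 2^(b/2) for b binary digits, as a function of the digit count.

```c
/* Naive smallest-divisor search, instrumented: *ops counts the
   divisibility tests performed.  For a prime n the loop runs
   about sqrt(n) times -- small as a function of the value of n,
   but exponential in the number of digits of n. */
unsigned long smallest_divisor(unsigned long n, unsigned long *ops)
{
    *ops = 0;
    for (unsigned long d = 2; d * d <= n; ++d) {
        ++*ops;
        if (n % d == 0)
            return d;
    }
    return n;   /* no divisor up to sqrt(n): n is prime (or n < 4) */
}
```

For the 6-digit prime 104729 this performs 322 tests; a 20-digit prime would need on the order of 10^10.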
Supposedly, a primality test in polynomial time was announced a few days ago. I'll post more info here once I get the link from my friend. This test, however, doesn't give a polynomial time method for prime factorization.
Ricky
2008-11-19 12:46:26
Another problem is that log(n) -> n as n -> ∞...so for large numbers factorization still becomes impractical.
I think you mean log(n) -> ∞ as n -> ∞. But even so, can you name a non-trivial problem where that doesn't happen? For almost every problem there is (interesting or not), solutions take longer as n increases, and in an unbounded fashion. So this isn't really a criticism of prime factorization, since essentially no complexities that arise in practice are bounded, apart from constant time.
Rather, one should be comparing a complexity to the (known) lower bound for the problem and to the best known algorithm. In this respect, log(n) would be good for prime factorization.
Amalcas
2008-11-19 11:08:57
Prime factorization is a nondeterministic polynomial problem (NP). In short, this means that, as far as we know, a computer must have the ability to "magically" guess right answers in order to complete the algorithm in a reasonable (polynomial, or P) time (e.g. always guess an actual divisor first). Specifically, prime factorization is believed to be super-polynomial but sub-exponential (taking a time that is somewhere between polynomial and exponential). Assuming we had a function performing like yours, you would get polynomial complexity for prime factorization (O(log^2(n)), I think; but that might imply impossibly good complexities for naive algorithms...). However, no one knows if NP = P or not; that is, such an algorithm could exist, though I highly doubt it, even if factorization is polynomial. Another problem is that log(n) -> n as n -> ∞...so for large numbers factorization still becomes impractical. I would encourage you to look up computational complexity yourself. I only know a very little...and my explanation is more than likely riddled with flaws.
Also, the "lots of security" is, as far as I know, just the RSA cryptographic algorithm; it's just that it's one of the most popular cryptographic algorithms, being a public-key algorithm (thus providing both secrecy and authenticity) that is reasonably secure. In fact, it was the first big solution to the secrecy vs. authenticity problem, to my knowledge.
mikau
2008-09-28 15:26:06
At least as far as we know right now.
right now? so you mean, we don't know for sure that such an algorithm can't exist?
Ricky
2008-09-28 12:21:35
Given a bound on the number (being extremely arbitrary, let's pick 2^32), there are faster ways than the square root of n. Simply construct a list of all primes and use binary search. Or you could use more "fancy" primality tests with strong pseudo-primes, and probably other wacky things that I've never heard about.
In the unbounded case, mathsyperson nailed it. At least as far as we know right now.
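A sketch of the bounded-case idea (the limit 100000 here is arbitrary, chosen far smaller than 2^32 just to keep the illustration light): sieve the primes once, then answer each primality query by binary search in the sorted table.

```c
#include <string.h>

#define LIMIT 100000   /* arbitrary bound, for illustration only */

static int primes[LIMIT];   /* generously sized; ~9592 entries used */
static int nprimes = 0;

/* Sieve of Eratosthenes: fill the sorted prime table once. */
void build_table(void)
{
    static char composite[LIMIT + 1];
    if (nprimes > 0)
        return;                      /* already built */
    memset(composite, 0, sizeof composite);
    for (int p = 2; p <= LIMIT; ++p) {
        if (composite[p])
            continue;
        primes[nprimes++] = p;
        for (int m = 2 * p; m <= LIMIT; m += p)
            composite[m] = 1;
    }
}

/* Primality by binary search: O(log nprimes) comparisons. */
int is_prime(int n)
{
    int lo = 0, hi = nprimes - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;
        if (primes[mid] == n) return 1;
        if (primes[mid] < n)  lo = mid + 1;
        else                  hi = mid - 1;
    }
    return 0;
}
```

After the one-time sieve cost, each query is a handful of comparisons, which is the sense in which the bounded problem is easy.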
mathsyperson
2008-09-28 10:29:28
I'm fairly certain that's not possible. If such a function existed, then taking two 100-digit primes, multiplying them, and entering the product would get you the smaller of the two primes only around 100 times slower than if it had been two single-digit primes.
Lots of security relies on the fact that enormous numbers are practically impossible to factorise, and the existence of such an algorithm would make this untrue.
The best one I can think of would be O(√n).
mikau
2008-09-28 09:31:51
Suppose we had a function firstDivisor(unsigned int n) that returns the smallest divisor of n greater than 1. Observe that this number is always prime, and when n is prime, the output is n itself.
Is it possible that such an algorithm could be O(log n)? I was wondering if I could construct one today (hoping to move another algorithm from O(n^2) to O(n log n)),
but since the output of firstDivisor is n when n is prime, we can create a primality test with the expression (n == firstDivisor(n)),
and thus have a primality test that's O(log n). Is such an algorithm possible?
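For reference, here is a naive version of the function described above (only the name and signature come from the post; the trial-division body is my own sketch), together with the primality test built from it. This is the O(√n) brute force that the rest of the thread measures against.

```c
/* Smallest divisor of n that is greater than 1.  It is always
   prime, and equals n exactly when n is prime (for n > 1).
   Naive trial division: O(sqrt(n)) divisibility tests, not the
   hoped-for O(log n). */
unsigned int firstDivisor(unsigned int n)
{
    for (unsigned int d = 2; d * d <= n; ++d)
        if (n % d == 0)
            return d;
    return n;
}

/* The primality test from the post: n is prime iff it equals
   its own first divisor (guarding against n = 0 or 1). */
int isPrime(unsigned int n)
{
    return n > 1 && n == firstDivisor(n);
}
```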