• Welcome to Valhalla Legends Archive.
 

Algorithms

Started by Banana fanna fo fanna, November 26, 2003, 05:01 PM


Banana fanna fo fanna

Hey Yoni...

Want to explain O(n) notation to me and how to figure it out?

UserLoser.

How about the algorithm(s) used to hash NLS with War3? :P

Yoni

Quote from: St0rm.iD on November 26, 2003, 05:01 PM
Hey Yoni...

Want to explain O(n) notation to me and how to figure it out?

I started to type a detailed explanation but it came out too huge (I didn't finish it) so I'll do this differently. How much do you know already? Ask more specific questions about it (unless you totally don't know anything about it).

Here it is without a detailed explanation:

n is the size of the input of the algorithm.
Count how many steps the algorithm takes (as a function of n).
Then keep only the most significant term of the result, drop its coefficient, and put it inside the O().
For example, if you calculated that Algorithm X takes 7n^3 + 2n^2 - 3n + 8 steps to complete when given an input size of n, then the 7n^3 is the most significant part, and the coefficient (7) can be dropped, giving: O(n^3).
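To make the step-counting concrete, here's a small sketch (my own example, not from Yoni's post) that instruments worst-case bubble sort and counts its comparisons:

```python
def bubble_sort_steps(n):
    """Count the comparisons bubble sort makes on a worst-case input of size n."""
    data = list(range(n, 0, -1))  # reversed list: the worst case
    steps = 0
    for i in range(len(data)):
        for j in range(len(data) - 1 - i):
            steps += 1  # one comparison
            if data[j] > data[j + 1]:
                data[j], data[j + 1] = data[j + 1], data[j]
    return steps

# The comparison count works out to n*(n-1)/2 = 0.5n^2 - 0.5n.
# The dominant term is 0.5n^2; drop the coefficient: O(n^2).
print(bubble_sort_steps(10))  # 45, which is 10*9/2
```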

The significance of Big-O notation is that it lets you tell whether one algorithm is more efficient than another for large inputs.
Assume you have two algorithms, Algorithm A and Algorithm B, which do the same thing in different ways. Let's say that with an input size of n, Algorithm A takes 1000n steps to complete, and Algorithm B takes 0.25n^2 steps. Then the efficiency is O(n) for Algorithm A, and O(n^2) for Algorithm B.

Considering the number of steps, it may seem that Algorithm B is more efficient (with a small input such as n=2, Algorithm A takes 2000 steps, and Algorithm B takes just 1 step). But, and this is the important point, Algorithm B is O(n^2), which is considered less efficient than Algorithm A which is O(n), and this is noticeable with very large inputs. It is proven mathematically that when you compare algorithms with different Big-O time efficiencies, for sufficiently large inputs the less efficient one is always slower.
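You can see the crossover directly by plugging numbers into the two step counts from the example (1000n and 0.25n^2; the specific counts are just the illustration above, not real algorithms):

```python
def steps_a(n):
    return 1000 * n        # Algorithm A: O(n)

def steps_b(n):
    return 0.25 * n ** 2   # Algorithm B: O(n^2)

for n in (2, 1000, 4000, 100000):
    print(n, steps_a(n), steps_b(n))
# B wins for small n, but the curves cross where 1000n = 0.25n^2,
# i.e. at n = 4000; past that point the O(n) algorithm wins forever.
```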

If the algorithm doesn't depend on the size of the input (always completes in a constant amount of steps), then we say its time efficiency is O(1).

Here are some common time efficiencies you will see in simple (high school, etc.) algorithms, from most efficient to least efficient:

O(1)
O(log(n))
O(n)
O(n*log(n))
O(n^2)
O(n^3)
O(2^n)
O(n!)

As a rule, O(n^a * log(n)) is less efficient than O(n^a), but more efficient than O(n^b) where b is any number bigger than a. This is also proven mathematically.
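A quick way to get a feel for that ordering is to evaluate each growth function side by side (a throwaway sketch; the function list just mirrors the table above):

```python
import math

growth = [
    ("O(1)",       lambda n: 1),
    ("O(log n)",   lambda n: math.log2(n)),
    ("O(n)",       lambda n: n),
    ("O(n log n)", lambda n: n * math.log2(n)),
    ("O(n^2)",     lambda n: n ** 2),
    ("O(2^n)",     lambda n: 2 ** n),
]

# At n=20 the values already span six orders of magnitude.
for name, f in growth:
    print(f"{name:11} n=10: {f(10):>10.1f}   n=20: {f(20):>12.1f}")
```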

Quote from: UserLoser. on November 26, 2003, 06:25 PM
How about the algorithm(s) used to hash NLS with War3? :P

How about no? :P

Adron

Generally, f(n) being O(n) means that f(n)/n stays bounded as n approaches infinity. f(n) being o(n) means that f(n)/n approaches zero as n approaches infinity.
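A numerical illustration of the distinction (not a proof), using two made-up functions: f(n) = 3n + 7 is O(n) because f(n)/n stays bounded (it tends to 3), while g(n) = sqrt(n) is o(n) because g(n)/n tends to 0:

```python
import math

for n in (10, 1000, 100000):
    f_ratio = (3 * n + 7) / n        # bounded: approaches 3
    g_ratio = math.sqrt(n) / n       # vanishes: approaches 0
    print(n, round(f_ratio, 4), round(g_ratio, 4))
```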