In my Foundations of Computer Science class, we are learning about growth functions (big-oh, big-omega, and big-theta). I understand the concept of these three, but I still have a few questions:

1- If a function grows faster past a certain point, does that make it better in programming because it grows faster, or does it just take up more space?

2- If f(n) is big-oh of g(n), does that automatically make g(n) big-omega of f(n)?

3- On a graph where n is the intersection point of the two functions, what is that point essentially saying? My belief (at least) is that once that value of n is reached, the faster-accelerating function takes off and never looks back.

Any help would be greatly appreciated. My teacher is a scary Russian guy who speaks broken English, and I can't understand a word he says!

These had better not be homework questions!

1) By "grows faster" I assume you mean the running time, which is usually what these notations are used to describe. Growing fast is a bad thing: it means that as the number of inputs n increases, the algorithm's computing time grows much larger.

2) Yes, and not just "I think so" — it follows directly from the definitions. f(n) = O(g(n)) if and only if g(n) = Ω(f(n)); the two statements are equivalent.

3) Essentially these describe the runtime of your algorithms. The point where they intersect is the crossover: for inputs larger than that n, one function is permanently more efficient than the other. Note that below the crossover, the asymptotically "worse" algorithm can actually be cheaper, which is why constant factors matter for small inputs.
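To make point 3 concrete, here's a small sketch with two made-up runtime functions (the functions f and g below are illustrative examples, not from the original question): f(n) = 100n is linear with a large constant factor, and g(n) = n² is quadratic. They intersect at n = 100, and past that point the quadratic one "takes off and never looks back."

```python
# Two hypothetical runtime functions (illustrative only):
# f(n) = 100*n  -- linear, but with a big constant factor
# g(n) = n*n    -- quadratic
# They intersect where 100*n == n*n, i.e. at n == 100.

def f(n):
    return 100 * n

def g(n):
    return n * n

# Below the crossover, the asymptotically "worse" function is cheaper:
print(f(10), g(10))      # f is more expensive here: 1000 vs 100
# At the crossover they are equal:
print(f(100), g(100))    # 10000 vs 10000
# Past the crossover, the quadratic function dominates forever:
print(f(1000), g(1000))  # 100000 vs 1000000
```

So the intersection point is exactly where "which algorithm is faster?" flips, and big-oh is a statement about what happens to the right of some such point.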

No, this is not HW. I'm just trying to teach this stuff to myself for the reasons I gave. But thank you so much for clarifying this for me!