Computational Complexity
The subject of computational complexity (encompassing both time complexity and space complexity) is one of the most math-heavy topics we discuss in CS50 AP, but also perhaps one of the most fundamentally important in the real world. As we begin to write programs that process and analyze larger and larger data sets, it becomes increasingly important to understand exactly how much our algorithms tax our computers. How much time do they take to run? How much RAM do they consume? In this section, you will begin to discuss the way in which computer scientists measure these costs, in particular by considering the theoretical worst-case and best-case scenarios for a given algorithm.
Lecture
Shorts
Notes
Thought Questions
- In what ways can we measure the resources that our programs consume?
- Is it always better to choose an algorithm that runs in O(n) over one that runs in O(n²)? Why or why not?
- Why do you think that we analyze algorithms from a theoretical standpoint using asymptotic notation, instead of simply measuring runtime in seconds or the like?
- In what ways does this adherence to asymptotic notation (disregarding constants and lower order terms) hinder our ability to speak about algorithms in the real world?
- How might we use time complexity analysis to our benefit as programmers before we even write any code?
Problem