thinking computationally

chapter 50

--- thinking logically and concurrently ---

sequence, selection and iteration are the three control structures within the structured approach
the motivation for limiting yourself to these basics of programming is to aid comprehension and readability
pseudocode is a way of expressing algorithms, like a flowchart, without tying them to a particular programming language
while b is positive
increment a
decrement b
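the pseudocode loop above can be sketched in python (the function name is an assumption; a and b are assumed non-negative integers):

```python
# a minimal sketch of the pseudocode loop: repeatedly move one
# unit from b to a until b reaches zero
def add_by_counting(a, b):
    while b > 0:      # while b is positive
        a += 1        # increment a
        b -= 1        # decrement b
    return a          # a now holds the original a + b
```

note the loop uses only iteration and sequence, staying within the structured approach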


while there are similarities between concurrent and parallel computation, they have different uses
concurrent processing interleaves several processes on one processor; this relies on scheduling algorithms
parallel processing is where many processors execute code simultaneously
this is useful in computationally intensive cases such as 3d graphics
concurrency is used in operating systems, which must manage many tasks at once
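a minimal sketch of concurrent tasks, using python threads (the task bodies are assumptions; the scheduler decides the interleaving):

```python
# several tasks run concurrently; the order in which they complete
# is chosen by the scheduler, not by the program
import threading

results = []

def task(name):
    results.append(name)  # each task records that it ran

threads = [threading.Thread(target=task, args=(i,)) for i in range(4)]
for t in threads:
    t.start()             # hand each task to the scheduler
for t in threads:
    t.join()              # wait until every task has finished
```

all four tasks complete, but their ordering in results may differ between runs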

chapter 51

--- problem recognition ---

what makes passwords secure?
good algorithms should terminate for any input
if this is not the case, the problem is not computable
moreover, good algorithms should terminate sooner rather than later
in situations where this is not yet possible, we call the problem practically unsolvable
for sufficiently unpredictable passwords, brute force is currently too slow

enumeration involves checking all possible solutions
for example, all orderings or subsets
as a consequence, it has unfavourable time complexities
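tying enumeration back to the password example, a brute-force search can be sketched as follows (the helper name and the lowercase-only alphabet are assumptions):

```python
# enumeration sketch: try every lowercase string up to max_len
# until the target is found; the candidate count grows
# exponentially with length, hence the unfavourable complexity
from itertools import product
import string

def brute_force(target, max_len=4):
    for length in range(1, max_len + 1):
        for letters in product(string.ascii_lowercase, repeat=length):
            guess = "".join(letters)
            if guess == target:
                return guess       # found the password
    return None                    # not found within max_len
```

at most 26 + 26**2 + 26**3 + 26**4 candidates are checked here; each extra character multiplies the work by 26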

simulation is the creation of a model of a real system
for example, a physics engine or pricing model
the program can only be an approximation of this system due to necessary abstractions
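a toy simulation in the spirit of a physics engine (the time step, gravity constant and function name are assumptions of the model):

```python
# simulation sketch: a falling object modelled in fixed time steps;
# the result is an approximation, and shrinking dt improves it
def simulate_fall(height, dt=0.01, g=9.81):
    t, v, y = 0.0, 0.0, height
    while y > 0:
        v += g * dt    # gravity accelerates the object
        y -= v * dt    # position updated from current velocity
        t += dt        # advance the simulated clock
    return t           # approximate time to hit the ground
```

the stepwise update is the abstraction: real motion is continuous, so the computed time only approximates the true value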

divide and conquer algorithms, such as binary search, repeatedly split a problem into smaller sub-problems
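binary search can be sketched as follows (assuming a sorted list; it halves the remaining search space on each comparison, giving O(log n) time):

```python
# divide and conquer sketch: binary search on a sorted list
def binary_search(items, target):
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid        # found: return its index
        elif items[mid] < target:
            lo = mid + 1      # discard the lower half
        else:
            hi = mid - 1      # discard the upper half
    return -1                 # target not present
```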

chapter 52

--- solving problems ---
  1. visualisation
  2. big data
  3. time complexity
  4. heuristic methods

having a strong visual image of the problem is incredibly useful
often in programming this structure is not evident
big companies have an abundance of data
how exactly should they draw conclusions from it?
data mining, the extraction of patterns from large datasets, is this increasingly useful skill

big O notation classifies algorithms by how their running time grows as input size increases
we consider the asymptotic growth of this time, ignoring constant factors
for example, bubble sort is O(n^2) whereas merge sort is O(n log n)
algorithms with non-polynomial time complexity are called intractable
their exact solutions take too long to compute
however, heuristic methods can help, as they trade off accuracy and completeness for speed
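the two sorts compared above can be sketched side by side (these are plain textbook versions, not tuned implementations):

```python
# bubble sort: repeatedly swap out-of-order neighbours, O(n^2)
def bubble_sort(xs):
    xs = list(xs)
    for i in range(len(xs)):
        for j in range(len(xs) - 1 - i):
            if xs[j] > xs[j + 1]:
                xs[j], xs[j + 1] = xs[j + 1], xs[j]
    return xs

# merge sort: divide the list, sort halves, merge them, O(n log n)
def merge_sort(xs):
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    merged = []
    while left and right:   # merge the two sorted halves
        merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    return merged + left + right
```

doubling n roughly quadruples bubble sort's work but only slightly more than doubles merge sort's, which is what the two complexity classes predict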

pipelining is where the sub-tasks of successive operations overlap in time, reducing the overall time taken
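the staged structure of a pipeline can be sketched with python generators (the stage names and the doubling step are assumptions; real pipelining would run the stages on separate hardware or threads so they genuinely overlap):

```python
# pipeline sketch: each stage consumes items as soon as the
# previous stage yields them, rather than waiting for a full batch
def read(items):
    for x in items:
        yield x            # stage 1: produce raw values

def transform(stream):
    for x in stream:
        yield x * 2        # stage 2: process each value as it arrives

def collect(stream):
    return list(stream)    # stage 3: gather the results

result = collect(transform(read([1, 2, 3])))
```

each item flows through all three stages before the next item is read, mirroring how pipeline stages hand work along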