Posts

Showing posts from February, 2025
  Week 7 – CST370 – Design and Analysis of Algorithms This week we learned about several programming techniques and algorithms. We started with counting sort, an algorithm that sorts without using comparisons; it is used to sort integer values within a known range. The next algorithm we studied was radix sort. This algorithm is also non-comparison based and sorts numbers digit by digit, so no overall range is required. You apply counting sort to each digit, sorting on the range 0-9 for decimal numbers. We also covered dynamic programming. This is an algorithm design technique for solving problems with overlapping subproblems. Instead of solving overlapping subproblems over and over, you solve each one once and record the results in a table from which a solution to the original problem can then be obtained. An example of this technique can be illustrated using Fibonacci Numbers where you must compute F(n) by adding F(n-1) + F(n-2) for n > ...
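To make these ideas concrete, here is a minimal Python sketch of the techniques from this week: a stable counting sort on a single decimal digit, a radix sort that applies it digit by digit, and a memoized Fibonacci. The function names and the use of `lru_cache` as the "table" are my own choices for illustration, not something prescribed by the course.

```python
from functools import lru_cache

def counting_sort_by_digit(arr, exp):
    """Stable counting sort of non-negative integers by the digit at place `exp`."""
    counts = [0] * 10
    for x in arr:
        counts[(x // exp) % 10] += 1      # tally each digit value 0-9
    for d in range(1, 10):
        counts[d] += counts[d - 1]        # prefix sums give final bucket positions
    output = [0] * len(arr)
    for x in reversed(arr):               # reverse pass keeps the sort stable
        digit = (x // exp) % 10
        counts[digit] -= 1
        output[counts[digit]] = x
    return output

def radix_sort(arr):
    """Apply counting sort to each decimal digit, least significant first."""
    exp = 1
    while arr and max(arr) // exp > 0:
        arr = counting_sort_by_digit(arr, exp)
        exp *= 10
    return arr

@lru_cache(maxsize=None)
def fib(n):
    """DP Fibonacci: each F(k) is computed once and its result is cached."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)
```

The reverse pass in `counting_sort_by_digit` matters: stability is what lets radix sort preserve the order established by earlier (less significant) digit passes.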
  Week 6 – CST370 – Design and Analysis of Algorithms This week we learned about AVL trees, 2-3 trees, heaps, and hashing. An AVL tree is a binary search tree in which every node has a balance factor of 0, 1, or -1. This means that the height of a node’s left subtree minus the height of the node’s right subtree must be either 0, 1, or -1. When inserting a new node into an AVL tree, you must recheck that the balance factor of each node is still within range. If the tree is unbalanced, you must perform a single right, single left, double left-right, or double right-left rotation to rebalance it. A 2-3 tree is a tree whose nodes are either 2-nodes or 3-nodes. A 2-node contains a single key and has two children: keys in the left child are less than the key, and keys in the right child are greater. A 3-node contains two ordered keys and three children: keys in the left child are less than both keys, keys in the right child are greater than both keys, and keys in the middle child fall between the two. ...
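The balance-factor rule is easy to check in code. Below is a small sketch (my own helper names, with the common convention that an empty tree has height -1) that computes each node's balance factor and verifies the AVL property; the rotations themselves are omitted for brevity.

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def height(node):
    """Height of a subtree; an empty tree has height -1 by convention."""
    if node is None:
        return -1
    return 1 + max(height(node.left), height(node.right))

def balance_factor(node):
    """Height of the left subtree minus height of the right subtree."""
    return height(node.left) - height(node.right)

def is_avl_balanced(node):
    """True if every node's balance factor is -1, 0, or +1."""
    if node is None:
        return True
    return (abs(balance_factor(node)) <= 1
            and is_avl_balanced(node.left)
            and is_avl_balanced(node.right))
```

After every insertion, an AVL implementation would walk back up the insertion path running a check like `balance_factor` and applying the appropriate rotation wherever the factor reaches ±2.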
  Week 5 – CST370 – Design and Analysis of Algorithms This week we learned about Quicksort, binary tree traversal, decrease-and-conquer, and presorting. Quicksort is based on the divide-and-conquer technique. With Quicksort, you sort an array by selecting an element as the pivot. You then partition the array so that elements less than or equal to the pivot end up to the left of the pivot and elements greater than or equal to it end up to the right. At the end of this first partition, the pivot ends up in its final position. This leaves you with two subarrays, which also need to be partitioned. You continue partitioning the subarrays until all elements in the array are sorted. Quicksort has a time complexity of O(n log n) in the best and average cases and O(n²) in the worst case. A binary tree is a set of nodes that is either empty or consists of a root and two disjoint binary trees, called the left and right subtrees of the root. There are three classic binary tree t...
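The partition-then-recurse process can be sketched as follows. This version is one possible realization of the description above: it assumes the Lomuto partition scheme with the last element as the pivot (the course description leaves the pivot choice open), and it sorts in place.

```python
def partition(arr, lo, hi):
    """Lomuto partition: place arr[hi] (the pivot) in its final position."""
    pivot = arr[hi]
    i = lo - 1
    for j in range(lo, hi):
        if arr[j] <= pivot:               # elements <= pivot go to the left side
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[hi] = arr[hi], arr[i + 1]
    return i + 1                          # pivot's final index

def quicksort(arr, lo=0, hi=None):
    """In-place quicksort over arr[lo..hi]."""
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        p = partition(arr, lo, hi)        # pivot lands at index p
        quicksort(arr, lo, p - 1)         # recurse on the left subarray
        quicksort(arr, p + 1, hi)         # recurse on the right subarray
    return arr
```

The O(n²) worst case shows up exactly when the chosen pivot is repeatedly the smallest or largest element, e.g. running this version on an already-sorted array.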
  Week 4 – CST370 – Design and Analysis of Algorithms   This week we continued learning about the divide and conquer technique. We specifically learned how to apply it to the Mergesort algorithm. Mergesort sorts an array by dividing it into two halves and sorting those two halves recursively. Lastly, you merge the two sorted arrays into a single sorted array. The algorithm merges the two sorted arrays with two pointers that are initialized to point at the first elements of each array being merged. The two elements are compared, and the element with the smaller value is added to the new sorted array. The pointer at the smaller element is then advanced, and the next pair of elements is compared. This is repeated until the end of one array is reached. At that point, the remaining elements of the other array are simply copied to the new array. One of the main advantages of Mergesort is that it always has a time complexity of O(n log n) regardless of the input data distribution...
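The split-sort-merge steps above can be sketched in a few lines of Python. This is a minimal illustrative version (function names are my own) that returns a new sorted list rather than sorting in place; the `merge` helper is the two-pointer procedure described in the post.

```python
def mergesort(arr):
    """Sort by splitting in half, sorting each half recursively, then merging."""
    if len(arr) <= 1:
        return arr                        # a 0- or 1-element list is already sorted
    mid = len(arr) // 2
    return merge(mergesort(arr[:mid]), mergesort(arr[mid:]))

def merge(left, right):
    """Two-pointer merge of two sorted lists into one sorted list."""
    i = j = 0
    merged = []
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:           # take the smaller head element
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])               # copy whichever tail remains
    merged.extend(right[j:])
    return merged
```

Because the merge always does linear work per level and the halving always gives log n levels, the O(n log n) bound holds no matter how the input is distributed, which is the advantage noted above.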