Posts

  Week 8 – CST462s – Race, Gender, Class in the Digital World Service-Learning Reflections My service-learning experience was great. I learned a lot about the bug triaging process, bibisecting, and the LibreOffice open-source software. The initial process went well; our site supervisor created a guide for volunteers who wish to contribute, which was highly informative and easy to follow. Once I reviewed the guide and the resources that came with it, I got started with bug testing and logging hours in the first week. Although the guide created by our site supervisor was great and covered everything needed to get started, I would also include several videos showing the bug triaging process from beginning to end. Some people might be ready after just reading the material, but I feel more confident performing a process I have already seen. The most impactful part of this experience was knowing that I was helping to improve open-source software which is used by hundreds of mi...
  Week 7 – CST370 – Design and Analysis of Algorithms This week we learned about several programming techniques and algorithms. We started with counting sort, an algorithm that sorts without using comparisons; it is used to sort integer values that fall within a known range. The next algorithm we studied was Radix sort. This algorithm is also non-comparison based and sorts numbers digit by digit, so no overall range is required. You apply counting sort to each digit position, using the range 0-9 for decimal numbers. We also covered dynamic programming, an algorithm design technique for solving problems with overlapping subproblems. Instead of solving overlapping subproblems over and over, you solve each one once and record the results in a table, from which a solution to the original problem can then be obtained. An example of this technique can be illustrated using Fibonacci numbers, where you must compute F(n) by adding F(n-1) + F(n-2) for n > ...
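To make the dynamic programming idea concrete, here is a minimal Python sketch (the function name fib_dp is my own, not from the course materials) that fills a table of Fibonacci values bottom-up so each overlapping subproblem is solved only once:

```python
def fib_dp(n):
    """Compute F(n) bottom-up, recording each subproblem result in a table."""
    if n < 2:
        return n
    table = [0] * (n + 1)   # table[i] will hold F(i)
    table[1] = 1
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]  # reuse the recorded results
    return table[n]

print(fib_dp(10))  # 55
```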
  Week 6 – CST370 – Design and Analysis of Algorithms This week we learned about AVL trees, 2-3 trees, heaps, and hashing. An AVL tree is a binary search tree in which every node has a balance factor of 0, 1, or -1. This means that the height of a node’s left subtree minus the height of the node’s right subtree must be either 0, 1, or -1. When inserting a new node into an AVL tree, you must recheck that the balance factor of each node is still within range. If the tree is unbalanced, you must perform a right, left, double right-left, or double left-right rotation to rebalance the tree. A 2-3 tree is a tree whose nodes are either 2-nodes or 3-nodes. A 2-node contains a single key and has two children: the left child holds keys less than the key and the right child holds keys greater than the key. A 3-node contains two ordered keys and three children: the left child holds keys less than both keys, the right child holds keys greater than both keys, and the middle child holds keys between the two keys. ...
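As a small illustration of the balance-factor idea, here is a Python sketch (the Node class and function names are my own, not from the course) that computes subtree heights and checks that every node’s balance factor is -1, 0, or 1:

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def height(node):
    """Height of a subtree; an empty subtree has height -1."""
    if node is None:
        return -1
    return 1 + max(height(node.left), height(node.right))

def is_avl_balanced(node):
    """True if every node's balance factor (left height minus right height) is -1, 0, or 1."""
    if node is None:
        return True
    balance = height(node.left) - height(node.right)
    return abs(balance) <= 1 and is_avl_balanced(node.left) and is_avl_balanced(node.right)

root = Node(2, Node(1), Node(3))   # a small balanced BST
print(is_avl_balanced(root))       # True
```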
  Week 5 – CST370 – Design and Analysis of Algorithms This week we learned about Quicksort, binary tree traversal, decrease and conquer, and presorting. Quicksort is based on the divide and conquer technique. With Quicksort, you sort an array by selecting an element as the pivot. You then partition the array so that elements less than or equal to the pivot are to the left of the pivot and elements greater than or equal to the pivot are to the right. At the end of this first partition, the pivot ends up in its final position. This leaves you with two subarrays, which also need to be partitioned. You continue to partition the subarrays until all elements in the array are sorted. Quicksort has a time complexity of O(n log n) in the best and average cases and O(n^2) in the worst case. A binary tree is a set of nodes that is either empty or consists of a root and two disjoint binary trees called the left and right subtrees of the root. There are three classic binary tree t...
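Here is a minimal Quicksort sketch in Python using the Lomuto partition scheme (one common way to implement the partition step; the course version may differ in details), where the pivot lands in its final position after each partition:

```python
def partition(a, low, high):
    """Partition a[low..high] around the pivot a[high]; return the pivot's final index."""
    pivot = a[high]
    i = low - 1
    for j in range(low, high):
        if a[j] <= pivot:               # element belongs on the left side of the pivot
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[high] = a[high], a[i + 1]
    return i + 1

def quicksort(a, low=0, high=None):
    """Sort a in place: partition, then recursively sort the two subarrays."""
    if high is None:
        high = len(a) - 1
    if low < high:
        p = partition(a, low, high)     # pivot is now in its final position
        quicksort(a, low, p - 1)
        quicksort(a, p + 1, high)

data = [5, 3, 1, 9, 8, 2, 4, 7]
quicksort(data)
print(data)  # [1, 2, 3, 4, 5, 7, 8, 9]
```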
  Week 4 – CST370 – Design and Analysis of Algorithms This week we continued learning about the divide and conquer technique. We specifically learned how to apply it to the Mergesort algorithm. Mergesort sorts an array by dividing it into two halves and sorting those halves recursively. Lastly, it merges the two sorted halves into a single sorted array. The algorithm merges the two sorted arrays with two pointers that are initialized to point at the first elements of each array being merged. The two elements are compared, and the element with the smaller value is added to the new sorted array. The pointer for the smaller element is then advanced, and the two elements are compared again. This is repeated until the end of one array is reached. At that point, the remaining elements of the other array are simply copied to the new array. One of the main advantages of Mergesort is that it always has a time complexity of O(n log n) regardless of the input data distribution...
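A minimal Python sketch of this process is shown below; the merge function implements the two-pointer comparison described above (function names are my own):

```python
def mergesort(a):
    """Recursively sort each half of the list, then merge the two sorted halves."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(mergesort(a[:mid]), mergesort(a[mid:]))

def merge(left, right):
    """Merge two sorted lists by repeatedly taking the smaller front element."""
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i]); i += 1   # advance the pointer of the smaller element
        else:
            result.append(right[j]); j += 1
    result.extend(left[i:])                  # copy whatever remains in either list
    result.extend(right[j:])
    return result

print(mergesort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]
```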
  Week 3 – CST370 – Design and Analysis of Algorithms This week we learned about brute-force string matching and exhaustive search. We also learned about depth-first search and breadth-first search. Brute-force string matching is an algorithm used to match a pattern against a string. You align the pattern against the first m characters of the text and compare from left to right until either all m pairs of characters match or a mismatching pair is encountered. If a mismatch happens, you simply shift the pattern one character to the right and begin the process again. Exhaustive search is a brute-force approach to combinatorial problems. Some of these problems include the traveling salesman problem, the knapsack problem, and the assignment problem. The traveling salesman problem asks for the shortest route through a given set of cities that visits every city and ends at the starting city. The problem is naturally modeled with a weighted graph, where cities are vertices and distances are edge weights. The knapsack problem involves it...
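Here is a minimal Python sketch of brute-force string matching (the function name is my own) that tries each alignment of the pattern and shifts one position to the right after a mismatch:

```python
def brute_force_match(text, pattern):
    """Return the index of the first occurrence of pattern in text, or -1 if there is none."""
    n, m = len(text), len(pattern)
    for i in range(n - m + 1):          # each possible alignment of the pattern
        j = 0
        while j < m and text[i + j] == pattern[j]:
            j += 1                      # characters match so far; keep comparing
        if j == m:                      # all m pairs of characters matched
            return i
    return -1                           # no alignment produced a full match

print(brute_force_match("NOBODY_NOTICED_HIM", "NOT"))  # 7
```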
  Week 2 – CST370 – Design and Analysis of Algorithms This week we covered asymptotic notations, non-recursive algorithms, and recursive algorithms. We use the order of growth to determine an algorithm’s efficiency, and we use three notations to rank the order of growth: big oh, big omega, and big theta. Big oh is the set of all functions with a lower or the same order of growth as g(n), a simple function against which we compare an algorithm’s operation count. Big omega is the set of all functions with a higher or the same order of growth as g(n). Lastly, big theta is the set of all functions that have the same order of growth as g(n). Non-recursive algorithms solve problems iteratively, using loops that execute instructions until a condition is met. An example of a non-recursive algorithm is the max element algorithm, where you must find the value of the largest number in an array. A recursive algorithm calls itself with smaller values and returns the result for the current input by carrying out basic operati...
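To illustrate the non-recursive case, here is a minimal Python sketch of the max element algorithm (the function name is mine), which scans the array once with a loop:

```python
def max_element(a):
    """Return the largest value in a non-empty list by scanning it once."""
    largest = a[0]
    for value in a[1:]:
        if value > largest:   # the basic operation: one comparison per element
            largest = value
    return largest

print(max_element([3, 17, 5, 42, 9]))  # 42
```

The basic operation here is the comparison inside the loop, executed n - 1 times for an array of n elements, which is why the algorithm’s order of growth is linear.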