Design and analysis of computer algorithms: sorting and order statistics


Chapter 1: Awakening

In the year 2219, in the city of Chronos, Dr. Emily Reed awoke from a deep slumber, her mind filled with hazy memories of a past she could barely grasp. She found herself in a world where time was no longer a linear concept but rather an intricate web woven together through advanced technology.

Chapter 2: The Timekeepers

Dr. Reed soon discovered that the city was governed by an elite group of individuals known as the Timekeepers. They possessed the ability to manipulate time, ensuring the smooth functioning of society in the ever-shifting realm. Each Timekeeper was assigned a specific sector and was responsible for maintaining the delicate balance of time within it.

Chapter 3: The Forbidden Zone

While exploring Chronos, Dr. Reed stumbled upon a hidden archive, where she uncovered a dark secret. Deep within the forbidden zone, she found records of an ancient society that had once mastered time control. According to the forbidden texts, their reckless experimentation led to catastrophic consequences, resulting in the collapse of entire civilizations.

Chapter 4: The Time Rift

Driven by curiosity and a desire to understand her own past, Dr. Reed embarked on a risky journey to locate the mythical Time Rift. Legends whispered that it held the answers to the mysteries of time and could potentially change the course of history.

Chapter 5: The Paradox

The city’s Timekeepers, led by the enigmatic Time Master, tried to stop Dr. Reed from uncovering the truth. They believed that tampering with time could lead to dire consequences. As she ventured further into the Time Rift, Dr. Reed began to witness glimpses of alternate realities and paradoxes, causing her to question the very fabric of her existence.

Chapter 6: The Final Revelation

In a climactic encounter with the Time Master, Dr. Reed learned that she herself was a descendant of the ancient time-wielders. The collapse of their civilization had scattered their bloodline across time, and she was the last remaining link to their lost knowledge. The Time Master revealed that the city of Chronos was built upon the ruins of their once-great society.

Chapter 7: Restoring the Order

Armed with her newfound understanding, Dr. Reed made a bold decision. She chose to confront the dangers of time manipulation head-on and restore the order that had been lost. With the help of a loyal group of rebels, she waged a battle against the corrupt Timekeepers, fighting not just for her own past, but for the destiny of all who lived in Chronos.

Chapter 8: The Timekeepers’ Redemption

Through a series of mind-bending events, Dr. Reed unraveled the true nature of time. Together with the Timekeepers, she developed a new system that would ensure the responsible use of time manipulation, providing balance and preserving the harmony of the universe.

Epilogue:

The city of Chronos thrived once more, standing as a symbol of the triumph of knowledge and responsibility over power and corruption. Dr. Emily Reed became a revered figure, known as the Great Time Guardian, her name forever etched in the annals of history. And as the future unfolded, the people of Chronos found solace in the notion that time, though a mystery, held the potential for both wonder and enlightenment.

Abstract

Sorting is the process of rearranging a set of elements according to a specified order. Sorting algorithms realize this rearrangement using different principles and techniques; common examples include bubble sort, selection sort, insertion sort, quick sort, and merge sort.

Order statistics is the problem of finding the kth order statistic, i.e., the kth smallest (or kth largest) element, of a set of elements. A selection algorithm solves this problem by repeatedly partitioning the set and discarding elements that cannot contain the answer, until the kth order statistic is found.

Sorting algorithms and selection algorithms are closely connected. Order statistics can be found with the help of sorting algorithms, and one common approach is based on quick sort. Quick sort not only sorts elements but also fixes the final position of the pivot element during partitioning. By dividing the elements into subsets that are less than, equal to, and greater than the pivot, the algorithm learns the pivot's rank after each partition and can compare it with the target rank k to decide which subset needs to be processed in the next step. This idea leads directly to the quick-select algorithm for order statistics.

Algorithmic complexity

Log-linear complexity O(n log n)

Log-linear complexity O(n log n) indicates that, as the input size n increases, the execution time of an algorithm grows proportionally to n log n, that is, n multiplied by the base-2 logarithm of n.

As an example, consider searching an ordered list of n elements with binary search, once for each of n query values. A single binary search starts at the middle element of the list and compares it with the target: if the target is smaller than the middle element, the search continues in the first half of the list; if it is larger, the search continues in the second half; if it is equal, the target has been found. The search repeats on the chosen half until the target is found or the remaining range is empty.

Each comparison halves the search range, so a single binary search takes O(log n) time. Performing such a search for each of the n query values therefore takes n * O(log n) time in total, which gives the O(n log n) bound. Merge sort and heap sort, discussed later in this article, are other classic O(n log n) algorithms.
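The following sketch makes this concrete in Java; the class and method names are illustrative only. It runs one O(log n) binary search per query value, so processing n queries costs O(n log n) overall.

public class BinarySearchDemo {
    // Returns the index of target in the sorted array, or -1 if it is absent.
    static int binarySearch(int[] sorted, int target) {
        int low = 0, high = sorted.length - 1;
        while (low <= high) {
            int mid = low + (high - low) / 2;
            if (sorted[mid] == target) {
                return mid;
            } else if (sorted[mid] < target) {
                low = mid + 1;   // continue in the second half
            } else {
                high = mid - 1;  // continue in the first half
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] sorted = {2, 11, 12, 22, 25, 34, 64};
        int[] queries = {22, 3, 64, 90, 2, 25, 11};
        // One O(log n) search per query: O(n log n) overall for n queries.
        for (int q : queries) {
            System.out.println(q + " -> index " + binarySearch(sorted, q));
        }
    }
}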

Complexity O(n^2)

Next, consider quadratic complexity O(n^2). It means that as the input size n increases, the execution time of the algorithm grows proportionally to the square of n.

As an example, consider an algorithm that counts inversions in a list of n elements, i.e., pairs of elements that appear in the wrong order. The algorithm iterates over each element and compares it with every element that comes after it in the list; whenever an earlier element is larger than a later element, the inversion count is incremented by 1. Finally, the algorithm returns the total number of inversions.

In this algorithm, each element is compared with every element that follows it, which requires (n-1) + (n-2) + ... + 1 = n(n-1)/2 comparisons in total, so the time complexity is O(n^2).
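A minimal Java sketch of this quadratic algorithm follows; names such as countInversions are illustrative only.

public class InversionCount {
    // Counts pairs (i, j) with i < j and arr[i] > arr[j].
    static int countInversions(int[] arr) {
        int count = 0;
        for (int i = 0; i < arr.length - 1; i++) {
            for (int j = i + 1; j < arr.length; j++) {
                if (arr[i] > arr[j]) {
                    count++;   // an earlier element is larger than a later one
                }
            }
        }
        return count;
    }

    public static void main(String[] args) {
        int[] arr = {64, 34, 25, 12, 22, 11, 90};
        System.out.println("Inversions: " + countInversions(arr));
    }
}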

Complexity O(n)

Finally, consider the linear complexity O(n), which means that as the input size n increases, the execution time of the algorithm is proportional to n.

As an example, consider an algorithm that sums a list of n elements. The algorithm will sequentially iterate through each element of the list and add them to get the sum.

In this algorithm, each element needs to be added once, and only one traversal is performed, so the time complexity of the algorithm is O(n).
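For completeness, here is a minimal sketch of the linear-time summation (class and method names are illustrative):

public class SumDemo {
    // Adds every element exactly once: a single pass, O(n) time.
    static long sum(int[] arr) {
        long total = 0;
        for (int value : arr) {
            total += value;
        }
        return total;
    }

    public static void main(String[] args) {
        int[] arr = {170, 45, 75, 90, 802, 24, 2, 66};
        System.out.println("Sum: " + sum(arr));
    }
}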

To sum up: log-linear complexity O(n log n) means that the execution time grows proportionally to n log n as the input size n increases; quadratic complexity O(n^2) means that the execution time grows proportionally to the square of n; and linear complexity O(n) means that the execution time is proportional to the input size n.

Sorting algorithms

Radix sort

Radix sort is a non-comparative sorting algorithm. It groups the data to be sorted by the value of each digit and processes the digits in order from the lowest to the highest, until the highest digit has been handled. Radix sort applies to arrays whose keys are non-negative integers (or fixed-length strings); keys with fewer digits are treated as if padded with leading zeros.

The steps to implement radix sort are as follows:

  • First, determine the maximum number of digits d among the keys in the array to be sorted; d is the number of sorting rounds.
  • In round i (starting from the lowest digit), distribute the data into groups according to the i-th digit of each key, using a stable method such as counting sort or bucket sort, and then rearrange the array by reading the groups in order.
  • Repeat until the highest digit has been processed; the array is then fully sorted.
    The following uses Java code to illustrate the implementation of radix sort:
import java.util.*;

public class RadixSort {
    public static void radixSort(int[] arr) {
        if (arr == null || arr.length == 0) {
            return;
        }

        // Get the maximum value in the array to be sorted
        int max = Arrays.stream(arr).max().getAsInt();

        // Calculate the number of digits in the maximum value
        int digitNum = String.valueOf(max).length();

        // Define a bucket list, each bucket represents a digit 0-9
        List<List<Integer>> buckets = new ArrayList<>();
        for (int i = 0; i < 10; i++) {
            buckets.add(new ArrayList<>());
        }

        // Sort by each digit, from the lowest to the highest
        for (int i = 0; i < digitNum; i++) {
            // Put the elements of the array into the corresponding buckets
            for (int j = 0; j < arr.length; j++) {
                int num = (int) Math.pow(10, i);
                int digit = (arr[j] / num) % 10;
                buckets.get(digit).add(arr[j]);
            }

            // Rearrange the array in bucket order
            int index = 0;
            for (List<Integer> bucket : buckets) {
                for (int num : bucket) {
                    arr[index++] = num;
                }
                bucket.clear();
            }
        }
    }

    public static void main(String[] args) {
        int[] arr = {170, 45, 75, 90, 802, 24, 2, 66};
        radixSort(arr);
        System.out.println(Arrays.toString(arr));
    }
}

Bubble sort

Bubble sort is a simple sorting algorithm. It repeatedly compares adjacent elements and swaps them if they are in the wrong order, until a complete pass makes no swaps. Each pass of the loop “bubbles” the largest (or smallest) remaining element to its correct position, which is why it is called bubble sort.

Here are the steps for bubble sort:

  • First, starting from the first element of the array, the current element is compared with the next element.
  • If the current element is larger than the next element, swap their positions.
  • Continue comparing to the next element until the end of the array is reached.
  • Repeat steps 1-3 until no more swaps occur. This means that the array has been sorted.
    Next, let’s implement bubble sort using Java:
public class BubbleSort {
    public static void bubbleSort(int[] array) {
        int n = array.length;
        for (int i = 0; i < n - 1; i++) {
            boolean swapped = false;
            for (int j = 0; j < n - 1 - i; j++) {
                if (array[j] > array[j + 1]) {
                    // Swap adjacent elements that are out of order
                    int temp = array[j];
                    array[j] = array[j + 1];
                    array[j + 1] = temp;
                    swapped = true;
                }
            }
            // If no swaps occurred in this pass, the array is already sorted
            if (!swapped) {
                break;
            }
        }
    }

    public static void main(String[] args) {
        int[] array = {64, 34, 25, 12, 22, 11, 90};
        bubbleSort(array);
        System.out.println("Sort results:");
        for (int i = 0; i < array.length; i++) {
            System.out.print(array[i] + " ");
        }
    }
}

Comparison sort

Comparison sorting refers to algorithms that order data using only comparison operations between elements. Two basic families of comparison sorts are exchange sorts and selection sorts.

Exchange sorting orders data by exchanging elements that are out of order. Bubble sort, which swaps adjacent out-of-order elements, is the simplest example. Quick sort is also an exchange sort: it partitions the sequence into two parts such that every element of one part is less than or equal to every element of the other, and then sorts each part in the same way; applying this process recursively orders the entire sequence.

Selection sorting orders data by repeatedly selecting the smallest (or largest) element. On each pass over the unsorted part of the sequence, the smallest (or largest) remaining element is selected and placed at the front of the unsorted part, until every element is in its correct position. The Java implementation below illustrates selection sort:

public class SelectionSort {
    public static void selectionSort(int[] arr) {
        int n = arr.length;
        for (int i = 0; i < n - 1; i++) {
            // Find the index of the smallest element in the unsorted part
            int minIndex = i;
            for (int j = i + 1; j < n; j++) {
                if (arr[j] < arr[minIndex]) {
                    minIndex = j;
                }
            }
            // Move it to the front of the unsorted part
            int temp = arr[minIndex];
            arr[minIndex] = arr[i];
            arr[i] = temp;
        }
    }

    public static void main(String[] args) {
        int[] arr = {64, 34, 25, 12, 22, 11, 90};
        selectionSort(arr);
        System.out.println("Sorted array:");
        for (int i = 0; i < arr.length; i++) {
            System.out.print(arr[i] + " ");
        }
    }
}

Heap sort

Heap sort is a comparison sorting algorithm based on a binary heap; it achieves O(n log n) time complexity even in the worst case. The main idea of heap sort is to build the sequence to be sorted into a max-heap (or min-heap), then repeatedly swap the element at the top of the heap with the last element of the heap, shrink the heap by one, and re-heapify. This process is repeated until the heap contains a single element, at which point the sequence is fully sorted.

public class HeapSort {
    public static void main(String[] args) {
        int[] arr = {4, 6, 8, 5, 9, 2, 1, 3, 7};
        heapSort(arr);
        for (int num : arr) {
            System.out.print(num + " ");
        }
    }

    public static void heapSort(int[] arr) {
        int n = arr.length;

        // Build a max heap
        for (int i = n / 2 - 1; i >= 0; i--) {
            heapify(arr, n, i);
        }

        // Extract elements from the heap one by one
        for (int i = n - 1; i > 0; i--) {
            // Swap the current root (maximum) with the last element
            int temp = arr[0];
            arr[0] = arr[i];
            arr[i] = temp;

            // Heapify the reduced heap
            heapify(arr, i, 0);
        }
    }

    private static void heapify(int[] arr, int n, int i) {
        int largest = i; // Initialize largest as root
        int left = 2 * i + 1; // Left child
        int right = 2 * i + 2; // Right child

        // If left child is larger than root
        if (left < n && arr[left] > arr[largest]) {
            largest = left;
        }

        // If right child is larger than largest so far
        if (right < n && arr[right] > arr[largest]) {
            largest = right;
        }

        // If largest is not root
        if (largest != i) {
            int swap = arr[i];
            arr[i] = arr[largest];
            arr[largest] = swap;

            // Recursively heapify the affected sub-tree
            heapify(arr, n, largest);
        }
    }
}

Quick sort

Quick sort is a commonly used comparison sorting algorithm with an average time complexity of O(n log n). Its main idea is to select a pivot element and partition the sequence to be sorted into two subsequences, one containing the elements smaller than the pivot and the other containing the elements larger than the pivot. The same operation is then applied recursively to each subsequence until its size is 1 or 0, at which point the entire sequence is sorted.

public class QuickSort {
    public static void main(String[] args) {
        int[] arr = {4, 6, 8, 5, 9, 2, 1, 3, 7};
        quickSort(arr, 0, arr.length - 1);
        for (int num : arr) {
            System.out.print(num + " ");
        }
    }

    public static void quickSort(int[] arr, int low, int high) {
        if (low < high) {
            int pivotIndex = partition(arr, low, high);

            // Recursively sort the two sub-arrays
            quickSort(arr, low, pivotIndex - 1);
            quickSort(arr, pivotIndex + 1, high);
        }
    }

    private static int partition(int[] arr, int low, int high) {
        int pivot = arr[high]; // Use the last element as the pivot
        int i = low - 1; // Index of the last element known to be <= pivot

        // Traverse through all elements before the pivot
        for (int j = low; j <= high - 1; j++) {
            // If the current element is less than or equal to the pivot
            if (arr[j] <= pivot) {
                i++; // Grow the "less than or equal" region

                // Swap the elements
                int temp = arr[i];
                arr[i] = arr[j];
                arr[j] = temp;
            }
        }

        // Move the pivot element to its final position
        int temp = arr[i + 1];
        arr[i + 1] = arr[high];
        arr[high] = temp;

        // Return the index of the pivot element
        return i + 1;
    }
}

Merge sort

Merge Sort is a commonly used comparison sorting algorithm with a time complexity of O(n log n). The main idea of merge sort is to divide the sequence to be sorted into two subsequences, then recursively sort the two subsequences, and finally merge the two ordered subsequences into an ordered sequence.

public class MergeSort {
    public static void main(String[] args) {
        int[] arr = {4, 6, 8, 5, 9, 2, 1, 3, 7};
        mergeSort(arr, 0, arr.length - 1);
        for (int num : arr) {
            System.out.print(num + " ");
        }
    }

    public static void mergeSort(int[] arr, int low, int high) {
        if (low < high) {
            int middle = (low + high) / 2;

            // Recursively sort the left sub-array
            mergeSort(arr, low, middle);

            // Recursively sort the right sub-array
            mergeSort(arr, middle + 1, high);

            // Merge the sorted sub-arrays
            merge(arr, low, middle, high);
        }
    }

    private static void merge(int[] arr, int low, int middle, int high) {
        int n1 = middle - low + 1;
        int n2 = high - middle;

        // Create temporary arrays
        int[] left = new int[n1];
        int[] right = new int[n2];

        // Copy data to temporary arrays
        for (int i = 0; i < n1; i++) {
            left[i] = arr[low + i];
        }
        for (int j = 0; j < n2; j++) {
            right[j] = arr[middle + 1 + j];
        }

        // Merge the temporary arrays back into the original array
        int i = 0, j = 0;
        int k = low;
        while (i < n1 && j < n2) {
            if (left[i] <= right[j]) {
                arr[k] = left[i];
                i++;
            } else {
                arr[k] = right[j];
                j++;
            }
            k++;
        }

        // Copy the remaining elements of the left sub-array
        while (i < n1) {
            arr[k] = left[i];
            i++;
            k++;
        }

        // Copy the remaining elements of the right sub-array
        while (j < n2) {
            arr[k] = right[j];
            j++;
            k++;
        }
    }
}

Order statistics in the field of algorithms

Order statistics is an important concept in algorithm design and analysis. It focuses on finding the kth smallest element in a set of data, where k is a given positive integer. Order statistic algorithms can be used to solve a range of problems, such as finding the median or selecting the kth largest element.

In algorithm design, order statistics gives us an efficient way to handle such problems. The core idea is to break the problem into smaller subproblems with a divide-and-conquer approach and to solve these subproblems through appropriate selection and recursion.

The most classic selection algorithm is “quick select”, which is based on the idea of the quick sort algorithm. It picks a pivot element, partitions the data set into two subsets around the pivot, and then compares the pivot's rank with k to decide which subset to continue searching in. Because each partition step takes time linear in the size of the current subset, and only one of the two subsets is recursed into, the expected total running time is O(n), where n is the size of the data set.
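A minimal quick-select sketch in Java follows; the class and method names are illustrative, and it reuses the same Lomuto partition scheme as the quick sort code above.

public class QuickSelect {
    // Returns the kth smallest element of arr (k is 1-based).
    public static int quickSelect(int[] arr, int k) {
        int low = 0, high = arr.length - 1;
        int target = k - 1; // convert to a 0-based rank
        while (low <= high) {
            int pivotIndex = partition(arr, low, high);
            if (pivotIndex == target) {
                return arr[pivotIndex];   // pivot landed exactly at rank k
            } else if (pivotIndex < target) {
                low = pivotIndex + 1;     // answer lies in the right subset
            } else {
                high = pivotIndex - 1;    // answer lies in the left subset
            }
        }
        throw new IllegalArgumentException("k is out of range");
    }

    // Same Lomuto partition as in the QuickSort example above.
    private static int partition(int[] arr, int low, int high) {
        int pivot = arr[high];
        int i = low - 1;
        for (int j = low; j < high; j++) {
            if (arr[j] <= pivot) {
                i++;
                int temp = arr[i]; arr[i] = arr[j]; arr[j] = temp;
            }
        }
        int temp = arr[i + 1]; arr[i + 1] = arr[high]; arr[high] = temp;
        return i + 1;
    }

    public static void main(String[] args) {
        int[] arr = {4, 6, 8, 5, 9, 2, 1, 3, 7};
        // Clone because quickSelect rearranges the array it is given.
        System.out.println("3rd smallest: " + quickSelect(arr.clone(), 3));
    }
}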

In addition to quick select, there are other selection algorithms, such as the linear-time selection algorithm (median of medians), which finds the kth smallest element in O(n) time even in the worst case, and heap-based selection, which runs in O(n log k) or O(n + k log n) time.

Order statistics has a wide range of applications. In practice, we often need to find the median of a data set or the top k largest or smallest elements, and order statistic algorithms provide an efficient solution to these problems.

In summary, order statistics provides an efficient way to find the kth smallest element of a data set. It has a wide range of applications, solves many practical problems, and can be computed with low time complexity: linear expected time with quick select, and even linear worst-case time with median of medians.