Data structure – time complexity and space complexity

1. Algorithm efficiency

1.1 The complexity of an algorithm

After an algorithm is written as an executable program, running it consumes time and space (memory) resources. Therefore, the quality of an algorithm is generally measured along two dimensions, time and space, namely time complexity and space complexity.

Time complexity measures how fast an algorithm runs, while space complexity measures the extra space the algorithm needs. In the early days of computing, machines had very little storage, so space complexity received a great deal of attention. With the rapid development of the computer industry, storage capacity has grown enormously, so today we rarely need to pay special attention to the space complexity of an algorithm.

2. Time complexity

2.1 The concept of time complexity

Definition of time complexity: in computer science, the time complexity of an algorithm is a function that quantitatively describes its running time. The exact time an algorithm takes cannot be determined theoretically; you only know it after running the program on a machine. Testing every algorithm this way is possible but very cumbersome, which is why time complexity analysis was introduced. The time an algorithm takes is proportional to the number of times its statements are executed, so the number of executions of the basic operations is taken as the algorithm's time complexity.

In other words, calculating the time complexity of an algorithm means finding the mathematical relationship between the number of executions of a basic statement and the problem size N.

// How many times in total is the ++count statement in Func1 executed?
void Func1(int N)
{
	int count = 0;
	for (int i = 0; i < N; ++i)
	{
		for (int j = 0; j < N; ++j)
		{
			++count;
		}
	}

	for (int k = 0; k < 2 * N; ++k)
	{
		++count;
	}

	int M = 10;
	while (M--)
	{
		++count;
	}

	printf("%d\n", count);
}

The number of basic operations performed by Func1: F(N) = N^2 + 2*N + 10

In practice, when calculating time complexity we do not need the exact number of executions; an approximate count is enough. For this we use big-O asymptotic notation.

2.2 Big-O asymptotic notation

Big-O notation is a mathematical notation used to describe the asymptotic behavior of a function.

Rules for deriving the big-O order:

1. Replace all additive constants in the running time with the constant 1.

2. In the resulting count function, keep only the highest-order term.

3. If the highest-order term exists and its coefficient is not 1, drop the coefficient. The result is the big-O order.

Using big-O asymptotic notation, the time complexity of Func1 is O(N^2).
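As a quick check, apply the three rules to Func1's count F(N) = N^2 + 2*N + 10: the additive constant 10 is replaced by 1 (rule 1); only the highest-order term N^2 is kept, so 2*N and the constant are dropped (rule 2); the coefficient of N^2 is already 1, so there is nothing more to remove (rule 3). The result is O(N^2).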

As this example shows, big-O asymptotic notation removes the terms that have little influence on the result, expressing the number of executions concisely and clearly.

In addition, some algorithms have best-case, average-case, and worst-case time complexities. In practice, we generally focus on the worst case.
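For example (a minimal sketch; the function name Find below is only illustrative and not part of the examples that follow), consider searching for a value x in an array of N integers: in the best case the first element matches (1 comparison), in the worst case x is at the end or absent (N comparisons), and on average about N/2 comparisons are needed. Taking the worst case, the time complexity is O(N).

// Linear search: return the index of x in a[0..N-1], or -1 if it is not present.
// Best case: 1 comparison; worst case: N comparisons; average: about N/2.
// Taking the worst case, the time complexity is O(N).
int Find(int* a, int N, int x)
{
	for (int i = 0; i < N; ++i)
	{
		if (a[i] == x)
			return i;
	}
	return -1;
}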

2.3 Common time complexity calculation examples

Example 1:

// Calculate the time complexity of Func2?
void Func2(int N)
{
	int count = 0;
	for (int k = 0; k < 2 * N; ++k)
	{
		++count;
	}

	int M = 10;
	while (M--)
	{
		++count;
	}

	printf("%d\n", count);
}

Analysis: the basic operation executes 2N + 10 times. By the big-O derivation rules, the time complexity is O(N).

Example 2:

// Calculate the time complexity of Func3?
void Func3(int N, int M)
{
	int count = 0;
	for (int k = 0; k < M; ++k)
	{
		++count;
	}

	for (int k = 0; k < N; ++k)
	{
		++count;
	}

	printf("%d\n", count);
}

Analysis: the basic operation executes M + N times. Since both M and N are unknowns, the time complexity is O(M + N).

Example 3:

// Calculate the time complexity of Func4?
void Func4(int N)
{
	int count = 0;
	for (int k = 0; k < 100; ++k)
	{
		++count;
	}

	printf("%d\n", count);
}

Analysis: the basic operation executes 100 times, a constant independent of N. By the big-O derivation rules, the time complexity is O(1).

Example 4:

// Calculate the time complexity of BubbleSort?
void BubbleSort(int* a, int n)
{
	assert(a);
	for (size_t end = n; end > 0; --end)
	{
		int exchange = 0;
		for (size_t i = 1; i < end; ++i)
		{
			if (a[i-1] > a[i])
			{
				Swap(&a[i-1], &a[i]);
				exchange = 1;
			}
		}
		if (exchange == 0)
			break;
	}
}

Analysis: the basic operation executes about N times in the best case (the array is already sorted, so one pass with no swaps suffices) and 1 + 2 + ... + (N-1) times in the worst case. Since time complexity is normally taken as the worst case, by the big-O derivation rules the time complexity is O(N^2).
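Spelling out the worst case: 1 + 2 + ... + (N-1) = N*(N-1)/2 = N^2/2 - N/2. Keeping only the highest-order term and dropping its coefficient 1/2 gives O(N^2).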

Example 5:

// Calculate the time complexity of BinarySearch?
int BinarySearch(int* a, int n, int x)
{
	assert(a);
	int begin = 0;
	int end = n - 1;
	// [begin, end] is closed on both sides, hence the <= in the loop condition
	while (begin <= end)
	{
		int mid = begin + ((end - begin) >> 1); // can also be written as int mid = begin + (end - begin) / 2;
		if (a[mid] < x)
			begin = mid + 1;
		else if (a[mid] > x)
			end = mid - 1;
		else
			return mid;
	}
	return -1;
}

Analysis: this is binary search. The basic operation executes once in the best case and about logN times in the worst case, so the time complexity is O(logN).

ps: in algorithm analysis, logN denotes the base-2 logarithm of N; in some places it is written as lgN.
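One way to see the worst-case count: each comparison halves the remaining interval, so after k comparisons about N / 2^k elements are left. The search ends when that reaches 1, i.e. 2^k = N, which gives k = log2(N) comparisons.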

Example 6:

// Calculate the time complexity of the factorial recursion Fac?
long long Fac(size_t N)
{
	if (0 == N)
		return 1;

	return Fac(N-1) * N;
}

Analysis: the function recurses N times, doing a constant amount of work per call, so the time complexity is O(N).

Example 7:

// Calculate the time complexity of the Fibonacci recursion Fib?
long long Fib(size_t N)
{
	if (N < 3)
		return 1;

	return Fib(N-1) + Fib(N-2);
}

Analysis: the recursion makes on the order of 2^N calls, so the time complexity is O(2^N).
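Roughly why: the call tree of Fib(N) is about N levels deep and (nearly) doubles in size at each level, so the total number of calls is on the order of 2^0 + 2^1 + ... + 2^(N-1) = 2^N - 1, hence O(2^N).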

3. Space complexity

Space complexity is also a mathematical expression; it measures the temporary storage space an algorithm occupies while it runs.

Space complexity is not the number of bytes the program occupies, because that is not very meaningful; instead, it is counted in terms of the number of variables.

The rules for calculating space complexity are essentially the same as those for time complexity, and big-O asymptotic notation is used as well.

Note: the stack space a function needs while running (to store parameters, local variables, some register information, etc.) is determined at compile time, so space complexity is determined mainly by the extra space the function explicitly requests at run time.

Example 1:

// Calculate the space complexity of BubbleSort?
void BubbleSort(int* a, int n)
{
	assert(a);
	for (size_t end = n; end > 0; --end)
	{
		int exchange = 0;
		for (size_t i = 1; i < end; ++i)
		{
			if (a[i-1] > a[i])
			{
				Swap(&a[i-1], &a[i]);
				exchange = 1;
			}
		}
		if (exchange == 0)
			break;
	}
}

Analysis: bubble sort uses a constant number of extra variables, so the space complexity is O(1).

Example 2:

// Calculate the space complexity of Fibonacci?
// Return the first n terms of the Fibonacci sequence
long long* Fibonacci(size_t n)
{
	if (n == 0)
		return NULL;

	long long* fibArray = (long long*)malloc((n + 1) * sizeof(long long));
	fibArray[0] = 0;
	fibArray[1] = 1;
	for (int i = 2; i <= n; ++i)
	{
		fibArray[i] = fibArray[i - 1] + fibArray[i - 2];
	}
	return fibArray;
}

Analysis: the function dynamically allocates n + 1 elements, so the space complexity is O(N).

Example 3:

// Calculate the space complexity of the factorial recursion Fac?
long long Fac(size_t N)
{
	if (N == 0)
		return 1;

	return Fac(N-1) * N;
}

Analysis: the function is called recursively N times, opening N stack frames, and each stack frame uses a constant amount of space, so the space complexity is O(N).

4. Complexity OJ exercises

4.1 Rotating array

Solution 1: rotate one position at a time. Save the last element in a temporary variable, shift the other elements back by one, place the temporary value at index 0, and repeat this k times.

Time complexity: O(N*K); in the worst case K is N-1, so the time complexity is O(N^2)

Space complexity: O(1)

void rotate(int* nums, int numsSize, int k) {
    k = k % numsSize;
    for (int i = 0; i < k; i++)
    {
        int tmp = nums[numsSize - 1];
        for (int j = numsSize - 1; j > 0; j--)
        {
            nums[j] = nums[j - 1];
        }
        nums[0] = tmp;
    }
}

Solution 2: reverse the first n-k elements, reverse the last k elements, then reverse the entire array.

Time complexity: the array is traversed about twice in total, so O(N)

Space complexity: O(1)

void reverse(int nums[], int left, int right)
{
    while (left < right)
    {
        int tmp = nums[left];
        nums[left] = nums[right];
        nums[right] = tmp;
        left++;
        right--;
    }
}

void rotate(int* nums, int numsSize, int k) {
    k = k % numsSize;
    reverse(nums, 0, numsSize - k - 1);
    reverse(nums, numsSize - k, numsSize - 1);
    reverse(nums, 0, numsSize - 1);
}

Solution 3: open a new array, put the last k elements of the original array into the first k positions of the new array, put the first n-k elements of the original array into the last n-k positions of the new array, then copy the new array back.

Time complexity: O(N)

Space complexity: O(N)

void rotate(int* nums, int numsSize, int k) {
    k = k % numsSize;
    int arr[numsSize];
    for (int i = 0; i < k; i++)
    {
        arr[i] = nums[numsSize - k + i];
    }
    for (int i = 0; i < numsSize - k; i++)
    {
        arr[k + i] = nums[i];
    }
    for (int i = 0; i < numsSize; i++)
    {
        nums[i] = arr[i];
    }
}

4.2 Disappearing numbers

Solution 1: use the XOR properties 0^a = a and a^a = 0.

XOR all elements of the array together, then XOR the result with every number from 0 to n; what remains is the missing number.

Time complexity: O(N)

Space complexity: O(1)

int missingNumber(int* nums, int numsSize){
    int tmp = 0;
    for (int i = 0; i < numsSize; i++)
    {
        tmp = tmp ^ nums[i];
    }
    for (int i = 0; i <= numsSize; i++)
    {
        tmp = tmp ^ i;
    }
    return tmp;
}

Solution 2: add up the numbers from 0 to n, then subtract each number in the array one by one. The final result is the missing number.

Time complexity: O(N)

Space complexity: O(1)

int missingNumber(int* nums, int numsSize){
    int tmp = numsSize * (numsSize + 1) / 2;
    for (int i = 0; i < numsSize; i++)
    {
        tmp = tmp - nums[i];
    }
    return tmp;
}

4.3 Removing elements

Idea:

Because the array must be modified in place with O(1) space complexity, no extra array can be used. Define two indices, dst and scr. When nums[scr] != val, copy nums[scr] to nums[dst] and increment both scr and dst; when nums[scr] == val, scr skips over it and dst stays put. The final value of dst is the new length of the array after removal.

Time complexity: O(N)

Space complexity: O(1)

int removeElement(int* nums, int numsSize, int val) {
    int scr = 0;
    int dst = 0;
    while (scr < numsSize)
    {
        if (nums[scr] != val)
        {
            nums[dst] = nums[scr];
            scr++;
            dst++;
        }
        else
        {
            scr++;
        }
    }
    return dst;
}

4.4 Remove duplicates from ordered arrays

Idea: two pointers

Note that the array is sorted but not strictly increasing; this is the key to the problem. When a duplicate is encountered, the later distinct elements are copied forward over the duplicates.

1) When the array length is 1, return 1.

2) When the array length is at least 2:

If nums[fast] != nums[slow], then nums[slow + 1] = nums[fast] and both pointers advance;

If nums[fast] == nums[slow], then only the fast pointer continues searching forward.

Time complexity: O(N)

Space complexity: O(1)

int removeDuplicates(int* nums, int numsSize) {
    if (numsSize == 1)
    {
        return 1;
    }
    int fast = 1;
    int slow = 0;
    while (fast < numsSize)
    {
        if (nums[fast] != nums[slow])
        {
            nums[slow + 1] = nums[fast];
            slow++;
            fast++;
        }
        else
        {
            fast++;
        }
    }
    return slow + 1;
}

4.5 Merge two ordered arrays

Idea 1: create a new array, use two pointers into the two input arrays, repeatedly append the smaller of the two current elements to the new array, and finally copy the result back into nums1.

Time complexity: O(n + m)

Space complexity: O(n + m)

void merge(int* nums1, int nums1Size, int m, int* nums2, int nums2Size, int n) {
    int arr[n + m];
    int i = 0;
    int j = 0;
    int k = 0;
    while (i < m && j < n)
    {
        if (nums1[i] > nums2[j])
        {
            arr[k] = nums2[j];
            j++;
            k++;
        }
        else
        {
            arr[k] = nums1[i];
            i++;
            k++;
        }
    }
    if (i == m)
    {
        while (j < n)
        {
            arr[k] = nums2[j];
            k++;
            j++;
        }
    }
    if (j == n)
    {
        while (i < m)
        {
            arr[k] = nums1[i];
            k++;
            i++;
        }
    }
    for (i = 0; i < m + n; i++)
    {
        nums1[i] = arr[i];
    }
}

Idea 2: copy nums2 onto the end of nums1, then sort the combined array with qsort; the cost is dominated by the sort, typically O((m + n) * log(m + n)).

// qsort expects a comparator taking const void* arguments
int cmp(const void* a, const void* b) {
    return *(const int*)a - *(const int*)b;
}

void merge(int* nums1, int nums1Size, int m, int* nums2, int nums2Size, int n) {
    for (int i = 0; i < n; ++i) {
        nums1[m + i] = nums2[i];
    }
    qsort(nums1, nums1Size, sizeof(int), cmp);
}