Algorithm time complexity and space complexity

When we do math problems, there are often both simple and complicated methods for the same problem, and the same goes for computer programs. When an algorithm is written as an executable program, running it consumes time resources and space (memory) resources. Therefore, the quality of an algorithm is generally measured along two dimensions, time and space, that is, the time complexity and space complexity we want to understand today.

1. Time complexity of the algorithm

1. The concept of time complexity

The time complexity of an algorithm is a function that quantitatively describes its running time. The time an algorithm takes is proportional to the number of times its statements are executed, so the number of executions of the basic operations in the algorithm is taken as its time complexity.

That is: finding the mathematical expression that relates the execution count of a basic statement to the problem size N is how the time complexity of the algorithm is calculated.

#include <stdio.h>

void Func1(int N)
{
    int count = 0;
    for (int i = 0; i < N; ++i)
    {
        for (int j = 0; j < N; ++j)
        {
            ++count;
        }
    } // The two nested for loops execute N*N = N^2 times
    for (int k = 0; k < 2 * N; ++k)
    {
        ++count;
    } // Executes 2*N times
    int M = 10;
    while (M--)
    {
        ++count;
    } // Executes M = 10 times
    printf("%d\n", count);
}

int main()
{
    int N = 0;
    Func1(N);
}

The number of basic operations performed by Func1: F(N) = N^2 + 2*N + 10

Input size    Exact count F(N)      Estimate (N^2)
N = 10        F(N) = 130            100
N = 100       F(N) = 10210          10000
N = 1000      F(N) = 1002010        1000000

Summary: the larger N is, the smaller the influence of the lower-order terms (2*N + 10) on the result.

When we calculate the time complexity, we do not necessarily need to calculate the exact number of executions, but only the approximate number of executions. So how to express the time complexity? We need to use the asymptotic representation of Big O.

2. Big O asymptotic representation

Big O notation is a mathematical notation used to describe the asymptotic behavior of a function.

Derive the Big O method:

1. Replace all additive constants in the running time with the constant 1.

2. In the resulting run-count function, keep only the highest-order term.

3. If the highest-order term exists and its coefficient is not 1, drop the coefficient. The result is the Big O order.

After using the big O asymptotic method, the time complexity of Func1 is: O(N^2)

There are best, average and worst cases for the time complexity of the algorithm:

Worst case: maximum number of runs (upper bound) for any input size.

Average case: the expected number of runs for any input size.

Best case: minimum number of runs (lower bound) for any input size.

For example: searching for a value x in an array of length N

Best case: found in 1 comparison

Worst case: found in N comparisons

Average case: found in about N/2 comparisons

In practice, we generally focus on the worst-case operating situation of the algorithm, so the time complexity of searching for data in the array is O(N)

3. Example questions about time complexity:

Example 1: Calculate the time complexity of Func2

void Func2(int N)
{
    int count = 0;
    // Number of executions: 2*N
    for (int k = 0; k < 2 * N; ++k)
    {
        ++count;
    }
    // Number of executions: M = 10
    int M = 10;
    while (M--)
    {
        ++count;
    }
    printf("%d\n", count);
}

Time complexity: O(N)

Example 2: Calculate the time complexity of Func3

void Func3(int N, int M)
{
    int count = 0;
    // Number of executions: M
    for (int k = 0; k < M; ++k)
    {
        ++count;
    }
    // Number of executions: N
    for (int k = 0; k < N; ++k)
    {
        ++count;
    }
    printf("%d\n", count);
}

The time complexity is: O(M + N)

We cannot determine the relative magnitudes of M and N, so neither term can be discarded. (M may be much larger than N, or N much larger than M; nothing in this code tells us which.)

Example 3: Calculate the time complexity of Func4

void Func4(int N)
{
    int count = 0;
    for (int k = 0; k < 100; ++k)
    {
        ++count;
    }
    printf("%d\n", count);
}

Time complexity is: O(1)

As long as the number of operations is a constant, it is written as O(1) in time complexity: replace all additive constants in the running time with the constant 1.

Example 4: Calculate the time complexity of strchr. (strchr is used to find a single character in a string)

const char* strchr(const char* str, int character);

strchr scans the string for a single character, which may be the first character, somewhere in the middle, or the last one.

So there are three cases:

Best case: the character is the first one in the string, found in 1 comparison

Worst case: the character is the last one in the string, found in N comparisons

Average case: found in about N/2 comparisons

So take the worst case time complexity: O(N)

Example 5: Calculating the time complexity of BubbleSort

#include <assert.h>

void Swap(int* px, int* py)
{
    int tmp = *px;
    *px = *py;
    *py = tmp;
}

void BubbleSort(int* a, int n)
{
    assert(a);
    for (size_t end = n; end > 0; --end)
    {
        int exchange = 0;
        for (size_t i = 1; i < end; ++i)
        {
            if (a[i - 1] > a[i])
            {
                Swap(&a[i - 1], &a[i]);
                exchange = 1;
            }
        }
        if (exchange == 0)
            break;
    }
}

This code is bubble sort. Consider the worst case directly: the input array is sorted in reverse order, for example arr[] = {9, 8, 7, 6, 5, 4, 3, 2, 1}.

Bubble sorting this array into ascending order requires the maximum number of comparisons: (N-1) + (N-2) + ... + 1 = N(N-1)/2.

The time complexity of BubbleSort is: O(N^2)

Example 6:

int BinarySearch(int* a, int n, int x)
{
    assert(a);
    int begin = 0;
    int end = n - 1;
    while (begin <= end)
    {
        int mid = begin + ((end - begin) >> 1);
        if (a[mid] < x)
        {
            begin = mid + 1;
        }
        else if (a[mid] > x)
        {
            end = mid - 1;
        }
        else
        {
            return mid;
        }
    }
    return -1;
}

The analysis is as follows: each iteration of the while loop halves the remaining search interval (N -> N/2 -> N/4 -> ...), and in the worst case the loop runs until the interval shrinks to 1. If x halvings reduce N to 1, then 2^x = N, that is, x = log2(N). So the time complexity of binary search is O(log N).

Example 7: Time complexity of calculating factorial recursion Fac

long long Fac(size_t N)
{
    if (0 == N)
        return 1;
    return Fac(N - 1) * N;
}

Factorial recursion: 5! = 5*4*3*2*1

Fac(N) calls Fac(N-1), which calls Fac(N-2), and so on down to Fac(0): N + 1 calls in total, and each call does only a constant amount of work (one comparison and one multiplication).

So the time complexity of factorial recursion Fac is: O(N)

Example 8: Calculating the time complexity of Fibonacci recursion Fib

long long Fib(size_t N)
{
    if (N < 3)
        return 1;
    return Fib(N - 1) + Fib(N - 2);
}

Each call to Fib makes two further calls, so the number of calls roughly doubles at each level of recursion; summing 2^0 + 2^1 + 2^2 + ... over about N levels gives roughly 2^N calls in total. So the time complexity of the algorithm is: O(2^N)

2. Space complexity of the algorithm

1. The concept of space complexity

Space complexity is also a mathematical expression; it is a measure of the additional storage space an algorithm temporarily occupies while it runs.

Space complexity is not the number of bytes the program occupies, because that is not very meaningful; instead, space complexity is calculated by counting variables (it is a rough count). The calculation rules are basically similar to those of time complexity, and Big O asymptotic notation is also used.

The stack space a function needs while running (to store parameters, local variables, some register information, etc.) is determined at compile time, so space complexity is mainly determined by the additional space the function requests at run time.

2. Classic examples of algorithm space complexity

Example 1: Calculate the space complexity of BubbleSort

void BubbleSort(int* a, int n)
{
    assert(a);
    for (size_t end = n; end > 0; --end)
    {
        int exchange = 0;
        for (size_t i = 1; i < end; ++i)
        {
            if (a[i - 1] > a[i])
            {
                Swap(&a[i - 1], &a[i]); // Swap exchanges two ints
                exchange = 1;
            }
        }
        if (exchange == 0)
        {
            break;
        }
    }
}

Only a constant amount of extra space is used, and it does not grow as the input size n increases. Why is it constant? Because the loop variables end and i and the flag exchange are reused on every iteration; no matter how large n is, only a fixed number of variables exists at any moment, so only constant space is needed.

Example 2: The space complexity of Fibonacci (returning the first n terms of the Fibonacci sequence)

Space can be reused and is not calculated cumulatively. (space complexity)

Time is gone forever and time is accumulated. (time complexity)

The essence of destroying space is returning the right to use it. It is like booking a hotel room: while you stay there, the room is yours to use; when you check out, you return the right to use it, and someone else can then book that same room.

#include <stdlib.h>

long long* Fibonacci(size_t n)
{
    if (n == 0)
        return NULL;
    long long* fibArray = (long long*)malloc((n + 1) * sizeof(long long));
    fibArray[0] = 0;
    fibArray[1] = 1;
    for (size_t i = 2; i <= n; ++i)
    {
        fibArray[i] = fibArray[i - 1] + fibArray[i - 2];
    }
    return fibArray;
}

For the iterative version above, the extra space is the malloc'ed array of n + 1 long longs, so the space complexity is O(N). For the recursive Fib from Example 8, although about 2^N calls are made, they are not all alive at the same time: stack frames are reused as branches return (the essence of destroying a stack frame is returning the right to use its space), and the deepest chain of live frames is Fib(N) -> Fib(N-1) -> ... -> Fib(1). So the space complexity of recursive Fib is also O(N), the depth of the deepest recursion chain.

Example 3: Calculating the space complexity of factorial recursion Fac

long long Fac(size_t N)
{
    if (N == 0)
        return 1;
    return Fac(N - 1) * N;
}

The function recurses N times, creating a stack frame each time. The function does not create temporary variables, and the space complexity is O(N).


3. Summary of key points

1. The number of executions of basic operations in an algorithm is the time complexity of the algorithm.

2. Under normal circumstances, we focus on the worst-case scenario of the algorithm, that is, pessimistic expectations.

3. The measure of the additional storage space temporarily occupied by the algorithm during operation is the space complexity of the algorithm.

4. Space can be reused and is not calculated cumulatively (space complexity). Time is gone forever and time is calculated cumulatively (time complexity).

5. The essence of space destruction is to return the right to use it.

I hope you all make progress every day!