All sorting algorithms in this article sort in ascending order.

Common sorting algorithms include:

* Insertion sort: direct insertion sort and Shell sort
* Selection sort: direct selection sort and heap sort
* Exchange sort: bubble sort and quick sort
* Merge sort
* Radix sort
* Counting sort (a non-comparison sort)
<> Insertion sort

Insertion sort works like picking up a playing card and inserting it into the correct position in your hand.

Assume [0, end] is sorted; insert the element at end+1 into [0, end] so that [0, end+1] becomes sorted.
void InsertSort(int* a, int n)
{
    for (int i = 0; i < n - 1; ++i)
    {
        // Single pass: insert a[end + 1] into the sorted range [0, end]
        int end = i;
        int tmp = a[end + 1];
        while (end >= 0)
        {
            if (a[end] > tmp)
            {
                a[end + 1] = a[end];
                --end;
            }
            else
            {
                break;
            }
        }
        a[end + 1] = tmp;
    }
}
Time complexity: O(N^2)
Worst case (reverse order): 1 + 2 + … + (n-1) shifts
Best case: already sorted, O(N)
Stability: stable

<> Shell Sort

Shell sort is an optimization of direct insertion sort.

* Pre-sorting (gap > 1): sort the elements in interleaved groups so the array becomes nearly sorted.
* Direct insertion sort (gap == 1) then finishes the job.

void ShellSort(int* a, int n)
{
    int gap = n;
    while (gap > 1)
    {
        gap = gap / 2;
        // One pass sorts all gap interleaved groups at once
        for (int i = 0; i < n - gap; ++i)
        {
            // Insert a[end + gap] into its group (elements gap apart)
            int end = i;
            int tmp = a[end + gap];
            while (end >= 0)
            {
                if (a[end] > tmp)
                {
                    a[end + gap] = a[end];
                    end -= gap;
                }
                else
                {
                    break;
                }
            }
            a[end + gap] = tmp;
        }
    }
}
Time complexity:
With gap = gap / 2, the number of rounds is about log2(N).
With gap = gap / 3 + 1, it is about log3(N).

When gap is large, each group is short, so a pre-sorting round is O(N).
When gap is small, the array is already nearly sorted, so a round is also close to O(N).

So the time complexity is roughly O(N*log2(N)) or O(N*log3(N)); the commonly quoted average is O(N^1.3).

<> Direct selection sort

Basic idea: each pass selects the smallest (or largest) element from the remaining unsorted elements and places it at the beginning of the sequence, until all elements have been placed.

* From the elements array[i] ~ array[n-1], select the largest (smallest) element.
* If it is not the last (first) element of that range, swap it with the last (first) element of the range.
* Repeat the steps above on the remaining array[i] ~ array[n-2] (array[i+1] ~ array[n-1]) until only one element remains.
The implementation below selects both the maximum and the minimum in each pass, sorting from both ends at once.
void SelectSort(int* a, int n)
{
    int begin = 0, end = n - 1;
    while (begin < end)
    {
        // Single pass: find the indices of the minimum and the maximum
        int mini = begin, maxi = end;
        for (int i = begin; i <= end; i++)
        {
            if (a[i] < a[mini])
            {
                mini = i;
            }
            if (a[i] > a[maxi])
            {
                maxi = i;
            }
        }
        // Swap the minimum to begin and the maximum to end
        Swap(&a[begin], &a[mini]);
        // If begin held the maximum, it has just been moved to mini
        if (begin == maxi)
        {
            maxi = mini;
        }
        Swap(&a[end], &a[maxi]);
        ++begin;
        --end;
    }
}
Time complexity: O(N^2)

<> Heap sort

The logical structure of a heap is a complete binary tree; its physical structure is an array (sequential storage).

The relationships between parent and child node indices follow from the array layout:
leftchild = parent * 2 + 1
rightchild = parent * 2 + 2
parent = (child - 1) / 2

In a max heap, every parent in the tree is greater than or equal to its children. In a min heap, every parent is less than or equal to its children.

Here we first introduce the adjust-down algorithm (used for building the heap), taking a min heap as the example. One very important precondition: adjust-down can only build a min heap if the left and right subtrees are already min heaps. The idea: starting from the root, pick the smaller of the two children and compare it with the parent; if the child is smaller, swap them, then continue adjusting down until a leaf node is reached.

Adjust-down code (the max-heap version, which the heap sort below uses):
// Adjust down (max-heap version)
void AdjustDown(int* a, int n, int root)
{
    int parent = root;
    int child = parent * 2 + 1; // default to the left child
    while (child < n)
    {
        // Pick the larger of the two children
        if (child + 1 < n && a[child + 1] > a[child])
        {
            ++child;
        }
        if (a[child] > a[parent])
        {
            Swap(&a[child], &a[parent]);
            parent = child;
            child = parent * 2 + 1;
        }
        else
        {
            break;
        }
    }
}
<> Building the heap bottom-up

Adjust-down is a single step of heap building. If the left and right subtrees are not already heaps, adjust-down cannot be applied directly. What then? Adjust from the last subtree backwards. On second thought, the leaves need no adjustment, so we start from the last non-leaf node, i.e. the parent of the last element.

The following is the code for building the heap:
// Build the heap: adjust down from the last non-leaf node back to the root
for (int i = (n - 1 - 1) / 2; i >= 0; --i)
{
    AdjustDown(a, n, i);
}
At this point a question arises: for ascending order, should we build a max heap or a min heap?

With a min heap, the minimum sits at the top of the heap. After it is taken out, the remaining elements no longer form a heap, so selecting the next number requires rebuilding the heap at O(N) each time. That is not impossible, but heap sort then has no efficiency advantage: the overall time complexity becomes O(N^2).

The right way is to build a max heap. Swap the first number with the last, stop treating that last number as part of the tree, adjust the first n-1 numbers down to select the next largest, swap it with the second-to-last number, and so on.

The implemented code:
void HeapSort(int* a, int n)
{
    // Build the heap: O(N)
    for (int i = (n - 1 - 1) / 2; i >= 0; --i)
    {
        AdjustDown(a, n, i);
    }
    // For ascending order, build a max heap
    int end = n - 1;
    while (end > 0)
    {
        Swap(&a[0], &a[end]);  // move the current maximum to the end
        AdjustDown(a, end, 0); // restore the heap on the remaining elements
        --end;
    }
}
Time complexity: O(N*logN)

<> Bubble sort

This is the most basic sorting algorithm: two nested loops. The outer loop counts the passes; the inner loop performs the comparisons. A flag variable detects whether the array is already sorted so the loop can stop early.
void BubbleSort(int* a, int n)
{
    for (int i = 0; i < n - 1; i++)
    {
        int exchange = 0;
        for (int j = 0; j < n - i - 1; j++)
        {
            if (a[j] > a[j + 1])
            {
                Swap(&a[j], &a[j + 1]);
                exchange = 1;
            }
        }
        // No swaps in this pass: the array is already sorted
        if (!exchange)
            break;
    }
}
Time complexity: O(N^2)
Best case: O(N)

<> Quick sort

Basic idea: select one element as the key (usually the first or the last); rearrange so that every number to the left of the key is smaller and every number to the right is larger.

After that, only the left part and the right part need to be sorted.
There are three common methods for a single partition pass: the pit-digging method, the left-right pointer method, and the front-back pointer method.

Quick sort hits its worst case on already-sorted input, where it degenerates into bubble sort with time complexity O(N^2). To handle this, the basic quick sort is optimized with median-of-three key selection and small-interval insertion sort.
Median-of-three avoids choosing a key whose value is the minimum or maximum.
Code:
int GetMidIndex(int* a, int left, int right)
{
    int mid = (left + right) >> 1;
    if (a[left] < a[mid])
    {
        if (a[mid] < a[right])
        {
            return mid;
        }
        else if (a[left] > a[right])
        {
            return left;
        }
        else
        {
            return right;
        }
    }
    else // a[left] >= a[mid]
    {
        if (a[mid] > a[right])
        {
            return mid;
        }
        else if (a[left] > a[right])
        {
            return right;
        }
        else
        {
            return left;
        }
    }
}
The quick sort here uses divide and conquer. As the recursion reaches the many small subarrays near the bottom, the number of call stack frames grows exponentially (doubling per level), so small intervals are handed off to insertion sort instead.
Below, we only need to consider the single partition pass. As mentioned earlier, there are three methods; each is described in detail below.
The function interface:
void QuickSort(int* a, int left, int right)
{
    if (left >= right)
    {
        return;
    }
    int keyindex = PartSort1(a, left, right);
    // [left, keyindex - 1] keyindex [keyindex + 1, right]
    // QuickSort(a, left, keyindex - 1);
    // QuickSort(a, keyindex + 1, right);
    // Small-interval optimization: insertion sort for short subarrays
    if (keyindex - 1 - left > 10)
    {
        QuickSort(a, left, keyindex - 1);
    }
    else
    {
        InsertSort(a + left, keyindex - left);
    }
    if (right - (keyindex + 1) > 10)
    {
        QuickSort(a, keyindex + 1, right);
    }
    else
    {
        InsertSort(a + keyindex + 1, right - (keyindex + 1) + 1);
    }
}
<> Pit-digging method

* From the right, find a number smaller than the key and put it into the pit on the left; its old position becomes the new pit.
* From the left, find a number larger than the key and put it into the pit on the right; its old position becomes the new pit.
Code implementation:
int PartSort1(int* a, int left, int right)
{
    int index = GetMidIndex(a, left, right);
    Swap(&a[left], &a[index]);
    int begin = left, end = right;
    int pivot = begin; // the pit starts at the key's position
    int key = a[begin];
    while (begin < end)
    {
        // From the right, find a number smaller than the key
        while (begin < end && a[end] >= key)
        {
            --end;
        }
        // Drop it into the pit on the left; its old position becomes the new pit
        a[pivot] = a[end];
        pivot = end;
        // From the left, find a number larger than the key
        while (begin < end && a[begin] <= key)
        {
            ++begin;
        }
        // Drop it into the pit on the right; its old position becomes the new pit
        a[pivot] = a[begin];
        pivot = begin;
    }
    a[pivot] = key;
    return pivot;
}
<> Left-right pointer method

Basic idea:

* begin starts from the left looking for a number larger than the key; end starts from the right looking for a smaller one. When a large one and a small one have both been found, swap them.
* Finally, swap the values at begin and keyi, and return the index where the two pointers met.

int PartSort2(int* a, int left, int right)
{
    int index = GetMidIndex(a, left, right);
    Swap(&a[left], &a[index]);
    int begin = left, end = right;
    int keyi = begin;
    while (begin < end)
    {
        // From the right, find a number smaller than the key
        while (begin < end && a[end] >= a[keyi])
        {
            --end;
        }
        // From the left, find a number larger than the key
        while (begin < end && a[begin] <= a[keyi])
        {
            ++begin;
        }
        // Swap them
        Swap(&a[begin], &a[end]);
    }
    Swap(&a[begin], &a[keyi]);
    return begin;
}
<> Front-back pointer method

* cur scans forward looking for numbers smaller than the key; each time one is found, ++prev, swap the values at prev and cur (unless they coincide), then ++cur.
* Finally, swap the values at keyi and prev.
int PartSort3(int* a, int left, int right)
{
    int index = GetMidIndex(a, left, right);
    Swap(&a[left], &a[index]);
    int keyi = left;
    int prev = left, cur = left + 1;
    while (cur <= right)
    {
        // When a[cur] is smaller than the key, advance prev and swap (skip self-swaps)
        if (a[cur] < a[keyi] && ++prev != cur)
        {
            Swap(&a[prev], &a[cur]);
        }
        ++cur;
    }
    Swap(&a[keyi], &a[prev]);
    return prev;
}
<> Merge sort

Basic idea:

Merge sort (MERGE-SORT) is an efficient sorting algorithm based on the merge operation and a very typical application of divide and conquer. Merging already-sorted subsequences yields a fully sorted sequence: first make each subsequence sorted, then make the subsequence segments sorted. Merging two sorted lists into one sorted list is called a two-way merge.

<> Recursive solution
#include <stdlib.h>

void _MergeSort(int* a, int left, int right, int* tmp)
{
    if (left >= right)
        return;
    // Divide: [left, mid] and [mid+1, right]
    int mid = (right + left) >> 1;
    // Once both halves are sorted, they can be merged
    _MergeSort(a, left, mid, tmp);
    _MergeSort(a, mid + 1, right, tmp);
    // Merge
    int begin1 = left, end1 = mid;
    int begin2 = mid + 1, end2 = right;
    int index = left;
    while (begin1 <= end1 && begin2 <= end2)
    {
        if (a[begin1] < a[begin2])
        {
            tmp[index++] = a[begin1++];
        }
        else
        {
            tmp[index++] = a[begin2++];
        }
    }
    while (begin1 <= end1)
    {
        tmp[index++] = a[begin1++];
    }
    while (begin2 <= end2)
    {
        tmp[index++] = a[begin2++];
    }
    // Copy back to the original array
    for (int i = left; i <= right; ++i)
    {
        a[i] = tmp[i];
    }
}

void MergeSort(int* a, int n)
{
    int* tmp = (int*)malloc(sizeof(int) * n);
    _MergeSort(a, 0, n - 1, tmp);
    free(tmp);
}
<> Non-recursive solution

Set up gap to simulate the merging of the decomposed array: gap starts at 1 and doubles each round.

#include <stdlib.h>

void MergeSortNonR(int* a, int n)
{
    int* tmp = (int*)malloc(sizeof(int) * n);
    int gap = 1; // number of elements in each half to merge
    while (gap < n)
    {
        for (int i = 0; i < n; i += 2 * gap)
        {
            int begin1 = i, end1 = i + gap - 1;
            int begin2 = i + gap, end2 = i + 2 * gap - 1;
            // The right half does not exist: nothing to merge
            if (begin2 >= n)
            {
                break;
            }
            // The right half is incomplete: clamp it
            if (end2 >= n)
            {
                end2 = n - 1;
            }
            int index = i;
            while (begin1 <= end1 && begin2 <= end2)
            {
                if (a[begin1] < a[begin2])
                {
                    tmp[index++] = a[begin1++];
                }
                else
                {
                    tmp[index++] = a[begin2++];
                }
            }
            while (begin1 <= end1)
            {
                tmp[index++] = a[begin1++];
            }
            while (begin2 <= end2)
            {
                tmp[index++] = a[begin2++];
            }
            // Copy this merged range back
            for (int j = i; j <= end2; ++j)
            {
                a[j] = tmp[j];
            }
        }
        gap *= 2;
    }
    free(tmp);
}
<> Radix sort

123 45 12 9 88 43

Sort them digit by digit: by the ones digit, then the tens digit, then the hundreds digit, and so on.

Ones digit: 12 123 43 45 88 9

Tens digit: 9 12 123 43 45 88

Hundreds digit: 9 12 43 45 88 123

It can only sort integers, and in practice this sort is of little use on its own.

<> Counting sort

Counting sort is a non-comparison sort. The idea is very clever but the range of application is limited: it suits integer data whose values fall within a known, small range.
#include <stdlib.h>
#include <string.h>

void CountSort(int* a, int n)
{
    int max = a[0], min = a[0];
    for (int i = 0; i < n; ++i)
    {
        if (a[i] > max)
        {
            max = a[i];
        }
        if (a[i] < min)
        {
            min = a[i];
        }
    }
    int range = max - min + 1;
    int* count = (int*)malloc(sizeof(int) * range);
    memset(count, 0, sizeof(int) * range);
    // Count the occurrences of each value
    for (int i = 0; i < n; ++i)
    {
        count[a[i] - min]++;
    }
    // Write each value back count[i] times
    int j = 0;
    for (int i = 0; i < range; ++i)
    {
        while (count[i]--)
        {
            a[j++] = i + min;
        }
    }
    free(count);
}
