When.com Web Search

Search results

  1. Sorting algorithm - Wikipedia

    en.wikipedia.org/wiki/Sorting_algorithm

    Sorting small arrays optimally (in the fewest comparisons and swaps) or fast (i.e. taking into account machine-specific details) is still an open research problem, with solutions known only for very small arrays (<20 elements). Similarly, optimal (by various definitions) sorting on a parallel machine is an open research topic.
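
    As a concrete aside on sorting a very small array in the fewest comparisons, the sketch below (not from the article; the function name sort3 is purely illustrative) sorts three values with at most three comparisons, which meets the lower bound of ceil(log2 3!) = 3 comparisons.

    ```python
    def sort3(a, b, c):
        """Sort three values with at most three comparisons, matching the
        information-theoretic lower bound ceil(log2(3!)) = 3."""
        if a > b:
            a, b = b, a      # 1st comparison: ensure a <= b
        if b > c:
            b, c = c, b      # 2nd comparison: ensure b <= c
            if a > b:
                a, b = b, a  # 3rd comparison: restore a <= b
        return a, b, c

    print(sort3(3, 1, 2))  # (1, 2, 3)
    ```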

  2. Bubble sort - Wikipedia

    en.wikipedia.org/wiki/Bubble_sort

    Bubble sort, sometimes referred to as sinking sort, is a simple sorting algorithm that repeatedly steps through the input list element by element, comparing the current element with the one after it and swapping their values if needed. These passes through the list are repeated until no swaps have to be performed during a pass, meaning that the list has become fully sorted.
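
    A minimal sketch of the pass-until-no-swaps behaviour described above (illustrative code, not taken from the article):

    ```python
    def bubble_sort(items):
        """Bubble sort: step through the list, swapping adjacent out-of-order
        elements; stop once a full pass performs no swaps."""
        data = list(items)      # sort a copy
        n = len(data)
        swapped = True
        while swapped:
            swapped = False
            for i in range(n - 1):
                if data[i] > data[i + 1]:
                    data[i], data[i + 1] = data[i + 1], data[i]
                    swapped = True
            n -= 1              # the largest remaining element is now in place
        return data

    print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
    ```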

  3. Insertion sort - Wikipedia

    en.wikipedia.org/wiki/Insertion_sort

    The average case is also quadratic, [4] which makes insertion sort impractical for sorting large arrays. However, insertion sort is one of the fastest algorithms for sorting very small arrays, even faster than quicksort; indeed, good quicksort implementations use insertion sort for arrays smaller than a certain threshold, also when such small arrays arise as subproblems of the recursion.
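
    A sketch of insertion sort over a sub-range, with lo/hi parameters hinting at how a hybrid quicksort might hand small sub-arrays off to it (names and signature are illustrative, not from any particular library):

    ```python
    def insertion_sort(data, lo=0, hi=None):
        """Sort data[lo:hi] in place.  Quadratic on average, but very fast on
        small ranges, which is why hybrid quicksorts delegate sub-arrays below
        some threshold to it."""
        if hi is None:
            hi = len(data)
        for i in range(lo + 1, hi):
            key = data[i]
            j = i - 1
            # shift larger elements one slot right, then drop key into place
            while j >= lo and data[j] > key:
                data[j + 1] = data[j]
                j -= 1
            data[j + 1] = key
        return data

    print(insertion_sort([7, 3, 5, 1]))  # [1, 3, 5, 7]
    ```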

  4. Sorted array - Wikipedia

    en.wikipedia.org/wiki/Sorted_array

    Sorted arrays are the most space-efficient data structure with the best locality of reference for sequentially stored data. [citation needed] Elements within a sorted array are found using a binary search, in O(log n); thus sorted arrays are suited for cases when one needs to be able to look up elements quickly, e.g. as a set or multiset data structure.
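
    A minimal binary-search sketch showing the O(log n) lookup on a sorted array (illustrative; Python's standard bisect module offers the same search ready-made):

    ```python
    def binary_search(sorted_data, target):
        """Return the index of target in sorted_data, or -1 if absent.
        Each step halves the remaining range, giving O(log n) lookups."""
        lo, hi = 0, len(sorted_data) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if sorted_data[mid] == target:
                return mid
            if sorted_data[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1

    print(binary_search([2, 3, 5, 7, 11], 7))  # 3
    print(binary_search([2, 3, 5, 7, 11], 6))  # -1
    ```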

  5. In-place algorithm - Wikipedia

    en.wikipedia.org/wiki/In-place_algorithm

    As another example, many sorting algorithms rearrange arrays into sorted order in-place, including: bubble sort, comb sort, selection sort, insertion sort, heapsort, and Shell sort. These algorithms require only a few pointers, so their space complexity is O(log n). [1] Quicksort operates in-place on the data to be sorted.
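
    As one example of the listed in-place sorts, here is a heapsort sketch that rearranges the array using only a handful of index variables (illustrative code, not from the article):

    ```python
    def heapsort(data):
        """In-place heapsort: build a max-heap inside the array, then repeatedly
        swap the root (current maximum) to the end of the unsorted region."""
        n = len(data)

        def sift_down(root, end):
            # push data[root] down until the heap property holds up to index end
            while 2 * root + 1 <= end:
                child = 2 * root + 1
                if child + 1 <= end and data[child] < data[child + 1]:
                    child += 1
                if data[root] < data[child]:
                    data[root], data[child] = data[child], data[root]
                    root = child
                else:
                    return

        for start in range(n // 2 - 1, -1, -1):   # heapify
            sift_down(start, n - 1)
        for end in range(n - 1, 0, -1):           # pop maxima to the end
            data[0], data[end] = data[end], data[0]
            sift_down(0, end - 1)
        return data

    print(heapsort([4, 10, 3, 5, 1]))  # [1, 3, 4, 5, 10]
    ```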

  6. Selection sort - Wikipedia

    en.wikipedia.org/wiki/Selection_sort

    Selection sort is not difficult to analyze compared to other sorting algorithms, since none of the loops depend on the data in the array. Selecting the minimum requires scanning n elements (taking n − 1 comparisons) and then swapping it into the first position.
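
    A selection sort sketch with a comparison counter, illustrating that the number of comparisons is n(n − 1)/2 regardless of the input order (the counter and names are illustrative additions):

    ```python
    def selection_sort(items):
        """Selection sort: finding each minimum of the unsorted tail costs
        (n - 1), (n - 2), ..., 1 comparisons, i.e. n(n - 1)/2 in total,
        no matter how the input is arranged."""
        data = list(items)
        n = len(data)
        comparisons = 0
        for i in range(n - 1):
            min_idx = i
            for j in range(i + 1, n):   # scan the unsorted tail for the minimum
                comparisons += 1
                if data[j] < data[min_idx]:
                    min_idx = j
            data[i], data[min_idx] = data[min_idx], data[i]  # swap it into place
        return data, comparisons

    print(selection_sort([3, 1, 4, 1, 5]))  # ([1, 1, 3, 4, 5], 10)
    ```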

  7. Cocktail shaker sort - Wikipedia

    en.wikipedia.org/wiki/Cocktail_shaker_sort

    Like most variants of bubble sort, cocktail shaker sort is used primarily as an educational tool. More efficient algorithms such as quicksort, merge sort, or timsort are used by the sorting libraries built into popular programming languages such as Python and Java. [4] [5]
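
    A cocktail shaker sort sketch, i.e. a bidirectional bubble sort that alternates forward and backward passes (illustrative code, not from the article):

    ```python
    def cocktail_shaker_sort(items):
        """Alternate forward passes (bubbling the maximum up) with backward
        passes (bubbling the minimum down) until a round performs no swaps."""
        data = list(items)
        lo, hi = 0, len(data) - 1
        swapped = True
        while swapped:
            swapped = False
            for i in range(lo, hi):            # forward pass
                if data[i] > data[i + 1]:
                    data[i], data[i + 1] = data[i + 1], data[i]
                    swapped = True
            hi -= 1
            for i in range(hi, lo, -1):        # backward pass
                if data[i - 1] > data[i]:
                    data[i - 1], data[i] = data[i], data[i - 1]
                    swapped = True
            lo += 1
        return data

    print(cocktail_shaker_sort([5, 1, 4, 2, 8, 0]))  # [0, 1, 2, 4, 5, 8]
    ```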

  8. Cycle sort - Wikipedia

    en.wikipedia.org/wiki/Cycle_sort

    When the array contains only duplicates of a relatively small number of items, a constant-time perfect hash function can greatly speed up finding where to put an item, [1] turning the sort from Θ(n²) time to Θ(n + k) time, where k is the total number of hashes. The array ends up sorted in the order of the hashes, so the hash function should be chosen to give the desired ordering.
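
    A plain cycle sort sketch, without the hash-based shortcut for duplicate-heavy inputs described above; it illustrates the algorithm's defining property that each array slot is written at most once (illustrative code, not from the article):

    ```python
    def cycle_sort(data):
        """Sort data in place and return the number of writes.  Each element is
        moved directly to its final position; the equality check skips over
        duplicates so they are not rewritten needlessly."""
        writes = 0
        n = len(data)
        for cycle_start in range(n - 1):
            item = data[cycle_start]
            # find item's final position by counting smaller elements after it
            pos = cycle_start
            for i in range(cycle_start + 1, n):
                if data[i] < item:
                    pos += 1
            if pos == cycle_start:      # already in its final position
                continue
            while item == data[pos]:    # skip past equal elements (duplicates)
                pos += 1
            data[pos], item = item, data[pos]
            writes += 1
            # rotate the rest of the cycle until we return to cycle_start
            while pos != cycle_start:
                pos = cycle_start
                for i in range(cycle_start + 1, n):
                    if data[i] < item:
                        pos += 1
                while item == data[pos]:
                    pos += 1
                data[pos], item = item, data[pos]
                writes += 1
        return writes

    arr = [3, 1, 2, 3, 1]
    print(cycle_sort(arr), arr)  # 2 [1, 1, 2, 3, 3]
    ```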