Search results

  1. Bubble sort - Wikipedia

    en.wikipedia.org/wiki/Bubble_sort

    Bubble sort, sometimes referred to as sinking sort, is a simple sorting algorithm that repeatedly steps through the input list element by element, comparing the current element with the one after it, swapping their values if needed. These passes through the list are repeated until no swaps have to be performed during a pass, meaning that the list is sorted. A minimal sketch of the algorithm appears after this list.

  2. Best, worst and average case - Wikipedia

    en.wikipedia.org/wiki/Best,_worst_and_average_case

    Heapsort has O(n) time when all elements are the same. Heapify takes O(n) time and then removing elements from the heap is O(1) time for each of the n elements. The run time grows to O(n log n) if all elements must be distinct. Bogosort has O(n) time when the elements are sorted on the first iteration. In each iteration all elements are checked ...

  3. Sorting algorithm - Wikipedia

    en.wikipedia.org/wiki/Sorting_algorithm

    For example, if any number of elements are out of place by only one position (e.g. 0123546789 and 1032547698), bubble sort's exchanges will get them in order on the first pass, and the second pass will find all elements in order, so the sort will take only 2n time.

  4. Time complexity - Wikipedia

    en.wikipedia.org/wiki/Time_complexity

    For example, simple, comparison-based sorting algorithms are quadratic (e.g. insertion sort), but more advanced algorithms can be found that are subquadratic (e.g. shell sort). No general-purpose sorts run in linear time, but the change from quadratic to sub-quadratic is of great practical importance. A rough insertion sort sketch follows this list.

  5. Algorithmic efficiency - Wikipedia

    en.wikipedia.org/wiki/Algorithmic_efficiency

    For example, bubble sort and timsort are both algorithms to sort a list of items from smallest to largest. Bubble sort organizes the list in time proportional to the number of elements squared (O(n²), see Big O notation), but only requires a small amount of extra memory which is constant with respect to the length ...

  6. Average-case complexity - Wikipedia

    en.wikipedia.org/wiki/Average-case_complexity

    As mentioned above, much early work relating to average-case complexity focused on problems for which polynomial-time algorithms already existed, such as sorting. For example, many sorting algorithms which utilize randomness, such as Quicksort, have a worst-case running time of O(n²), but an average-case running time of O(n log n), where n is the size of the input being sorted. A randomized quicksort sketch follows this list.

  7. Talk:Bubble sort - Wikipedia

    en.wikipedia.org/wiki/Talk:Bubble_sort

    On pipelined architectures, Bubble Sort results in O(N*log(N)) branch mispredictions (that is, the total count of left-to-right minima found during the sort). Insertion sort: O(N). ...and so bubble sort's asymptotic running time is - typically - twice that of insertion sort. When N is small, on a pipelined architecture, it is worse even than that.

  8. Worst-case complexity - Wikipedia

    en.wikipedia.org/wiki/Worst-case_complexity

    In computer science (specifically computational complexity theory), the worst-case complexity measures the resources (e.g. running time, memory) that an algorithm requires given an input of arbitrary size (commonly denoted as n in asymptotic notation). It gives an upper bound on the resources required by the algorithm.
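
The Bubble sort result's description can be made concrete with a short sketch. The following is a minimal Python illustration, not code taken from any of the articles above; the name bubble_sort and the sample data are assumptions. It repeats passes of adjacent compare-and-swap steps and stops after a pass with no swaps, which is also the early-exit behaviour behind the "only 2n time" remark in the Sorting algorithm result.

```python
def bubble_sort(items):
    """Sort a list in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(items)
    while True:
        swapped = False
        for i in range(n - 1):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
        if not swapped:   # a pass with no swaps means the list is sorted
            break
        n -= 1            # the largest unsorted element has bubbled to the end

if __name__ == "__main__":
    data = [1, 0, 3, 2, 5, 4, 7, 6, 9, 8]  # each element is one position out of place
    bubble_sort(data)
    print(data)  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9] after just two passes
```

On this example every misplaced pair is fixed during the first pass, the second pass performs no swaps, and the loop exits, matching the roughly 2n-comparison behaviour described above.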
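
The Time complexity result contrasts quadratic comparison sorts such as insertion sort with subquadratic ones such as shell sort. As a rough illustration only (the function name and example data are assumptions, not taken from the article), a plain insertion sort looks like this; its nested shifting loop is what makes it quadratic in the worst case.

```python
def insertion_sort(items):
    """Sort a list in place by growing a sorted prefix one element at a time."""
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        # Shift larger elements of the sorted prefix one slot to the right.
        while j >= 0 and items[j] > key:
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = key  # drop the element into its correct position

if __name__ == "__main__":
    data = [5, 2, 4, 6, 1, 3]
    insertion_sort(data)
    print(data)  # [1, 2, 3, 4, 5, 6]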
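
The Average-case complexity result's point about randomness can be illustrated with a quicksort that picks its pivot at random, one common way to make the O(n²) worst case unlikely while keeping the expected running time at O(n log n). This is a generic sketch under that assumption, not the article's code; the names are mine.

```python
import random

def quicksort(items):
    """Return a sorted copy; a random pivot keeps the expected time at O(n log n)."""
    if len(items) <= 1:
        return list(items)
    pivot = random.choice(items)
    less = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

if __name__ == "__main__":
    data = [3, 7, 1, 9, 2, 8, 5]
    print(quicksort(data))  # [1, 2, 3, 5, 7, 8, 9]
```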