== Pseudocode ==


=== Fibonacci At Index ===
<pre>
def fib(n):
    # Naive recursive definition: exponential time, O(n) call-stack depth.
    if n <= 1:
        return n
    else:
        return fib(n-1) + fib(n-2)
</pre>
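The naive recursion above recomputes the same subproblems and takes exponential time; a minimal memoized sketch using Python's <code>functools.lru_cache</code> computes each value only once:
<pre>
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Same recurrence, but each value is cached: O(n) time overall.
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)
</pre>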
=== Binary Tree Search ===
<pre>
Iterative-Tree-Search(x, key)
  while x ≠ NIL and key ≠ x.key do
    if key < x.key then
      x := x.left
    else
      x := x.right
    end if
  repeat
  return x
</pre>
[https://en.wikipedia.org/wiki/Binary_search_tree#Iterative_search Source]
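A minimal Python sketch of the same iterative search; the <code>Node</code> class is illustrative (any object with <code>key</code>, <code>left</code> and <code>right</code> attributes would do):
<pre>
class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def tree_search(x, key):
    # Descend from node x until the key is found or the search falls off the tree.
    while x is not None and key != x.key:
        x = x.left if key < x.key else x.right
    return x  # matching node, or None if the key is absent
</pre>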
=== BFS ===
<pre>
procedure BFS(G, root) is
    let Q be a queue
    label root as explored
    Q.enqueue(root)
    while Q is not empty do
        v := Q.dequeue()
        if v is the goal then
            return v
        for all edges from v to w in G.adjacentEdges(v) do
            if w is not labeled as explored then
                label w as explored
                w.parent := v
                Q.enqueue(w)
</pre>
[https://en.wikipedia.org/wiki/Breadth-first_search#Pseudocode Source]
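A minimal Python sketch of the pseudocode above, assuming the graph is given as an adjacency dict mapping each vertex to an iterable of its neighbours:
<pre>
from collections import deque

def bfs(adj, root, goal):
    # adj: dict vertex -> iterable of neighbours (assumed input format)
    explored = {root}
    parent = {root: None}
    q = deque([root])
    while q:
        v = q.popleft()
        if v == goal:
            return v, parent  # parent[] traces the shortest path back to root
        for w in adj.get(v, ()):
            if w not in explored:
                explored.add(w)
                parent[w] = v
                q.append(w)
    return None, parent
</pre>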
=== DFS ===
<pre>
procedure DFS(G, v) is
    label v as discovered
    for all directed edges from v to w that are in G.adjacentEdges(v) do
        if vertex w is not labeled as discovered then
            recursively call DFS(G, w)
</pre>
<pre>
procedure DFS_iterative(G, v) is
    let S be a stack
    S.push(v)
    while S is not empty do
        v = S.pop()
        if v is not labeled as discovered then
            label v as discovered
            for all edges from v to w in G.adjacentEdges(v) do
                S.push(w)
</pre>
[https://en.wikipedia.org/wiki/Depth-first_search#Pseudocode Source]
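A minimal Python sketch of the iterative variant, again assuming an adjacency-dict graph representation:
<pre>
def dfs_iterative(adj, v):
    # Mirrors DFS_iterative above: a vertex is labeled discovered when popped.
    discovered = []
    seen = set()
    stack = [v]
    while stack:
        v = stack.pop()
        if v not in seen:
            seen.add(v)
            discovered.append(v)
            for w in adj.get(v, ()):
                stack.append(w)
    return discovered
</pre>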
=== Dijkstra ===
<pre>
function Dijkstra(Graph, source):
 
    for each vertex v in Graph.Vertices:
        dist[v] ← INFINITY
        prev[v] ← UNDEFINED
        add v to Q
    dist[source] ← 0
    while Q is not empty:
        u ← vertex in Q with min dist[u]
        remove u from Q
       
        for each neighbor v of u still in Q:
            alt ← dist[u] + Graph.Edges(u, v)
            if alt < dist[v]:
                dist[v] ← alt
                prev[v] ← u
    return dist[], prev[]
</pre>
[https://en.wikipedia.org/wiki/Dijkstra%27s_algorithm#Pseudocode Source]
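The pseudocode above extracts the minimum with a linear scan over Q, giving O(V^2) time. A common Python variant replaces Q with a binary heap via <code>heapq</code>; a minimal sketch, assuming the graph is an adjacency dict of (neighbour, weight) lists:
<pre>
import heapq

def dijkstra(adj, source):
    # adj: dict u -> list of (v, weight) pairs (assumed input format)
    dist = {source: 0}
    prev = {}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float('inf')):
            continue  # stale queue entry; a shorter path to u was already settled
        for v, w in adj.get(u, ()):
            alt = d + w
            if alt < dist.get(v, float('inf')):
                dist[v] = alt
                prev[v] = u
                heapq.heappush(pq, (alt, v))
    return dist, prev
</pre>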
== Big O Notation ==
{{Note|Adapted from [https://www.bigocheatsheet.com/ Big-O Cheat Sheet] }}


{| class="wikitable"  
{| class="wikitable"  
|+ Data Structure Operations
|+ Data Structure Operations
|-
|-
! rowspan="3"| Data Structure !! colspan="4"| Time Complexity !! rowspan="3"| Space Complexity
! rowspan="3"| Data Structure !! colspan="4"| Time Complexity !! rowspan="3"| Space Complexity !! rowspan="3"| Notes
|-
|-
! colspan="4"|Average -> Worst
! colspan="4"|Average -> Worst
Line 25: Line 125:
| Binary Search Tree || Θ(log(n)) -> O(n) || Θ(log(n)) -> O(n) || Θ(log(n)) -> O(n) || Θ(log(n)) -> O(n) || O(n)
| Binary Search Tree || Θ(log(n)) -> O(n) || Θ(log(n)) -> O(n) || Θ(log(n)) -> O(n) || Θ(log(n)) -> O(n) || O(n)
|-
|-
| B Tree || Θ(log(n)) -> O(log(n)) || Θ(log(n)) -> O(log(n)) || Θ(log(n)) -> O(log(n)) || Θ(log(n)) -> O(log(n)) || O(n)
| B Tree || Θ(log(n)) -> O(log(n)) || Θ(log(n)) -> O(log(n)) || Θ(log(n)) -> O(log(n)) || Θ(log(n)) -> O(log(n)) || O(n) || Self-balancing
|-
| Heap || || || O(log(n)) || O(log(n)) || || Complete binary tree, one maximally efficient implementation of a priority queue
|}
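As a usage note for the Heap row, Python's <code>heapq</code> module maintains a binary min-heap on a plain list; push and pop are the O(log(n)) insertion and delete-min operations (the values below are only illustrative):
<pre>
import heapq

pq = []                            # plain list used as a binary min-heap
for priority in [5, 1, 3]:
    heapq.heappush(pq, priority)   # O(log(n)) insertion
smallest = heapq.heappop(pq)       # O(log(n)) removal of the minimum
print(smallest)                    # 1
</pre>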
 
{| class="wikitable"
|+ Array Sorting Algorithms
|-
! rowspan="2"| Data Structure !! colspan="3"| Time Complexity !! rowspan="2"| Space Complexity !! rowspan="2"| Notes
|-
! Best !! Average !! Worst
|-
| Bubble Sort || Ω(n) || Θ(n^2) || O(n^2) || O(1)
|-
| Quicksort || Ω(n log(n)) || Θ(n log(n)) || O(n^2) || O(log(n))
|-
| Mergesort || Ω(n log(n)) || Θ(n log(n)) || O(n log(n)) || O(n)
|-
| Insertion Sort || Ω(n) || Θ(n^2) || O(n^2) || O(1)
|-
| Timsort || Ω(n) || Θ(n log(n)) || O(n log(n)) || O(n) || Hybrid of Mergesort and Insertion Sort, originally developed for Python
|-
| Heapsort || Ω(n log(n)) || Θ(n log(n)) || O(n log(n)) || O(1)
|}
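For illustration of the Mergesort row, a minimal Python sketch; note that CPython's built-in <code>sorted()</code> and <code>list.sort()</code> already use Timsort, so this is a teaching sketch rather than something to use in practice:
<pre>
def merge_sort(a):
    # O(n log(n)) time; roughly O(n) extra space for the sublists and merge output.
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
</pre>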
 
{| class="wikitable"
|+ Tree and Graph Search Algorithms
|-
! rowspan="2"| Data Structure !! colspan="3"| Time Complexity !! rowspan="2"| Space Complexity !! rowspan="2"| Notes
|-
! Best !! Average !! Worst
|-
| BFS ||  ||  || O(V + E) = O(b^d) || O(V) = O(b^d)
|-
| DFS ||  ||  || O(b^d) || O(bd) || Implicit graph, without duplicate detection
|-
| DFS, recursive ||  ||  || O(V + E) || O(V)
|-
| Dijkstra ||  ||  || O((V + E) log(V)) || O(V) || With a binary-heap priority queue; O(V^2) with the linear scan used in the pseudocode above
|-
| A* || || || O(E) = O(b^d) || O(V) = O(b^d)
|}
