Commit 3e8b6f1

Update DSA Notes

1 parent d0c8346 commit 3e8b6f1

78 files changed: +3596 −1 lines changed

pages/AA Tree.md

Lines changed: 88 additions & 0 deletions
# Explanation
- The **AA Tree** is a self-balancing binary search tree (BST). It is a variation of the Red-Black Tree with a simpler set of rules: a single per-node "level" replaces red-black colors, which simplifies how the tree is balanced while keeping insertion, deletion, and searching efficient.

# Steps:
- The **balance condition** is enforced with a single balance factor for each node (the "level"), which simplifies the balancing process.

- **Level Rule**: a left child's level is exactly one less than its parent's; a right child's level equals its parent's or is one less; and a right grandchild's level must be strictly less than its grandparent's.

- **Rotation Rule**: two rotations restore balance after insertion or deletion — `skew` (a right rotation that removes a horizontal left link) and `split` (a left rotation that removes two consecutive horizontal right links).

# Time Complexity
- **Insertion**: O(log n)
- **Deletion**: O(log n)
- **Search**: O(log n)

```python
class AATreeNode:
    def __init__(self, key):
        self.key = key
        self.level = 1
        self.left = None
        self.right = None

class AATree:
    def __init__(self):
        self.root = None

    def skew(self, node):
        # A left child on the same level is a horizontal left link: rotate right
        if node and node.left and node.left.level == node.level:
            node = self.rotate_right(node)
        return node

    def split(self, node):
        # Two consecutive horizontal right links: rotate left and promote
        if node and node.right and node.right.right and node.right.right.level == node.level:
            node = self.rotate_left(node)
            node.level += 1
        return node

    def rotate_right(self, node):
        temp = node.left
        node.left = temp.right
        temp.right = node
        return temp

    def rotate_left(self, node):
        temp = node.right
        node.right = temp.left
        temp.left = node
        return temp

    def insert(self, node, key):
        if node is None:
            return AATreeNode(key)

        if key < node.key:
            node.left = self.insert(node.left, key)
        elif key > node.key:
            node.right = self.insert(node.right, key)
        else:
            return node  # Duplicate keys are ignored

        node = self.skew(node)
        node = self.split(node)
        return node

    def search(self, node, key):
        if node is None or node.key == key:
            return node
        elif key < node.key:
            return self.search(node.left, key)
        else:
            return self.search(node.right, key)

    def add(self, key):
        self.root = self.insert(self.root, key)

# Example usage
aatree = AATree()
aatree.add(10)
aatree.add(20)
aatree.add(5)

result = aatree.search(aatree.root, 10)
if result:
    print(f"Found key {result.key}")
else:
    print("Key not found")
```
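
The `skew` and `split` rotations can also be watched in isolation with a stripped-down sketch — a standalone `Node` and free functions, written here purely for illustration:

```python
class Node:
    """Stripped-down AA node: key, level, and two children."""
    def __init__(self, key, level=1, left=None, right=None):
        self.key, self.level, self.left, self.right = key, level, left, right

def skew(node):
    # Remove a horizontal left link (left child on the same level) by rotating right
    if node and node.left and node.left.level == node.level:
        left = node.left
        node.left, left.right = left.right, node
        return left
    return node

def split(node):
    # Remove two consecutive horizontal right links by rotating left and promoting
    if node and node.right and node.right.right and node.right.right.level == node.level:
        right = node.right
        node.right, right.left = right.left, node
        right.level += 1
        return right
    return node

# A horizontal left link: 10 (level 1) with left child 5 (level 1)
skewed = skew(Node(10, left=Node(5)))
print(skewed.key)  # 5: the left child became the subtree root

# Two horizontal right links: 5 -> 10 -> 20, all on level 1
promoted = split(Node(5, right=Node(10, right=Node(20))))
print(promoted.key, promoted.level)  # 10 2: the middle node was promoted
```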

pages/Ackermann Function.md

Lines changed: 27 additions & 0 deletions
# Explanation
- The **Ackermann function** is a well-known recursive function that grows very quickly. It is often used in theoretical computer science to illustrate the difference between primitive recursive functions and general recursive functions.

# Steps
- The function is defined as:

- A(0, n) = n + 1

- A(m, 0) = A(m - 1, 1) for m > 0

- A(m, n) = A(m - 1, A(m, n - 1)) for m > 0 and n > 0

# Time Complexity
- The **time complexity** of the Ackermann function is not expressible in simple Big-O notation, as it grows faster than any primitive recursive function.

```python
def ackermann(m, n):
    if m == 0:
        return n + 1
    elif m > 0 and n == 0:
        return ackermann(m - 1, 1)
    else:
        return ackermann(m - 1, ackermann(m, n - 1))

# Example usage
print(ackermann(3, 4))  # A(3, 4) = 125
```
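
Since the recursion revisits the same (m, n) pairs many times, a memoized variant runs much faster. This sketch uses `functools.lru_cache`; the raised recursion limit is a practical precaution, not part of the definition:

```python
import sys
from functools import lru_cache

sys.setrecursionlimit(100_000)  # The call tree for A(3, n) gets deep

@lru_cache(maxsize=None)
def ackermann_memo(m, n):
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann_memo(m - 1, 1)
    return ackermann_memo(m - 1, ackermann_memo(m, n - 1))

print(ackermann_memo(3, 4))  # 125
```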

pages/Bellman Ford Algorithm.md

Lines changed: 32 additions & 0 deletions
# Explanation
- **Bellman-Ford Algorithm** is a dynamic programming algorithm used for finding the shortest path in graphs, even with negative weight edges. It also detects negative weight cycles.

# Steps:
- Initialize distances to all vertices as infinity, and set the distance to the source vertex to 0.
- Relax all edges `V - 1` times (where `V` is the number of vertices).
- Check for negative weight cycles by attempting to relax all edges once more.

# Time Complexity:
- **O(V * E)**, where `V` is the number of vertices and `E` is the number of edges.

```python
def bellman_ford(graph, source, n):
    dist = [float('inf')] * n
    dist[source] = 0
    for _ in range(n - 1):
        for u, v, w in graph:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # Check for negative-weight cycles
    for u, v, w in graph:
        if dist[u] + w < dist[v]:
            print("Graph contains negative weight cycle")
            return
    return dist

# Example usage
graph = [(0, 1, -1), (0, 2, 4), (1, 2, 3), (1, 3, 2), (1, 4, 2), (3, 2, 5), (3, 4, -3), (4, 3, 3)]
n = 5
source = 0
print("Shortest distances:", bellman_ford(graph, source, n))  # [0, -1, 2, 1, -2]
```
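
To see the cycle check fire, a variant of the function above that returns `None` instead of printing can be run on a graph whose cycle sums to a negative weight:

```python
def bellman_ford_checked(graph, source, n):
    dist = [float('inf')] * n
    dist[source] = 0
    for _ in range(n - 1):
        for u, v, w in graph:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    for u, v, w in graph:
        if dist[u] + w < dist[v]:
            return None  # A further relaxation means a negative weight cycle
    return dist

cycle = [(0, 1, 1), (1, 2, -3), (2, 0, 1)]
print(bellman_ford_checked(cycle, 0, 3))  # None: the cycle 0 -> 1 -> 2 -> 0 sums to -1
```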

pages/Binary Decision Diagram.md

Lines changed: 86 additions & 0 deletions
# Explanation
- A **Binary Decision Diagram (BDD)** is a data structure used to represent boolean functions. It uses a directed acyclic graph where each node represents a decision based on a variable, and the edges represent the outcomes of that decision.

# Steps
- **Construct the decision diagram**: represent the boolean function in terms of binary decisions.
  - Each internal node represents a binary decision based on one variable.
  - The edges represent the outcome of that decision (true or false).

- **Minimize the diagram**: after constructing the initial BDD, apply reduction rules (merging identical subgraphs, removing redundant tests) to minimize the number of nodes and edges. This step makes the BDD more efficient for function evaluation.

- **Evaluate the boolean function**: once the BDD is constructed, evaluate the function by traversing the diagram from the root and following the edge that matches each variable's assigned value.

# Time Complexity
- **Construction**: O(n) for the simple conjunction built below, where `n` is the number of variables; for arbitrary functions the diagram can grow much larger.

- **Evaluation**: O(n), since evaluation follows a single root-to-terminal path that tests each variable at most once.

```python
class BDDNode:
    def __init__(self, variable=None, high=None, low=None, value=None):
        self.variable = variable  # Decision variable name
        self.high = high          # Branch taken when the variable is True
        self.low = low            # Branch taken when the variable is False
        self.value = value        # Truth value for terminal nodes (True/False)

class BDD:
    def __init__(self):
        self.nodes = {}  # Cache so structurally identical nodes are shared

    def add_node(self, variable, high, low):
        """Create a new node for the BDD, or reuse a cached identical one."""
        if (variable, high, low) in self.nodes:
            return self.nodes[(variable, high, low)]
        node = BDDNode(variable, high, low)
        self.nodes[(variable, high, low)] = node
        return node

    def create_bdd(self, variables):
        """Build the BDD for the conjunction (AND) of the given variables."""
        if not variables:
            return BDDNode(value=True)  # Every variable was True
        high = self.create_bdd(variables[1:])  # Variable True: test the rest
        low = BDDNode(value=False)             # Variable False: the AND fails
        return self.add_node(variables[0], high, low)

    def evaluate(self, node, assignments):
        """Evaluate the BDD with the given variable assignments."""
        if node is None:
            return False
        if node.value is not None:  # Terminal node (True or False)
            return node.value
        # Traverse according to the variable's value in assignments
        if assignments.get(node.variable):
            return self.evaluate(node.high, assignments)
        return self.evaluate(node.low, assignments)

# Example usage: a BDD for the boolean function x1 AND x2
bdd = BDD()
bdd_root = bdd.create_bdd(["x1", "x2"])

# Define variable assignments for testing
assignments = {"x1": True, "x2": True}

# Evaluate the BDD with the assignments
result = bdd.evaluate(bdd_root, assignments)
print(f"Result of BDD evaluation: {result}")  # True
```
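
The same x1 AND x2 function can be checked against its full truth table with an even smaller standalone encoding — tuples for internal nodes and Python booleans for terminals, a convention chosen here just for illustration:

```python
# Internal nodes are (variable, high, low); terminals are the booleans themselves.
AND_X1_X2 = ("x1", ("x2", True, False), False)

def evaluate(node, assignment):
    """Follow one root-to-terminal path according to the assignment."""
    while not isinstance(node, bool):
        variable, high, low = node
        node = high if assignment[variable] else low
    return node

for x1 in (False, True):
    for x2 in (False, True):
        print(x1, x2, evaluate(AND_X1_X2, {"x1": x1, "x2": x2}))
```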
Lines changed: 42 additions & 0 deletions
# Explanation
- The **Binary Indexed Tree (BIT)**, also known as a **Fenwick Tree**, is a data structure that efficiently supports dynamic cumulative frequency tables.

- It allows you to compute prefix sums and update elements in **O(log n)** time. It is particularly useful for problems that require frequent range sum queries and updates.

# Steps:
- The tree is implemented using an array, where each node stores a partial sum covering a block of elements.

- **Update Operation**: adds a delta to one element; the change propagates to every node whose block covers that index.

- **Query Operation**: computes the sum of elements from the start to a given index by summing the values of the relevant nodes.

#### Time Complexity:
- **Update**: O(log n)
- **Query**: O(log n)

```python
class FenwickTree:
    def __init__(self, n):
        self.n = n
        self.tree = [0] * (n + 1)  # 1-indexed

    def update(self, index, delta):
        while index <= self.n:
            self.tree[index] += delta
            index += index & -index  # Move to the next node covering this index

    def query(self, index):
        total = 0
        while index > 0:
            total += self.tree[index]
            index -= index & -index  # Drop the lowest set bit: move to the parent prefix
        return total

# Example usage
fenwick = FenwickTree(5)
fenwick.update(1, 3)
fenwick.update(3, 5)

print(f"Sum of first 3 elements: {fenwick.query(3)}")  # Output: 8
print(f"Sum of first 5 elements: {fenwick.query(5)}")  # Output: 8
```
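
Prefix sums compose into range sums: the sum of `a[l..r]` is `query(r) - query(l - 1)`. A self-contained sketch (repeating the class so it runs on its own) adds that as a method:

```python
class FenwickTree:
    def __init__(self, n):
        self.n = n
        self.tree = [0] * (n + 1)

    def update(self, index, delta):
        while index <= self.n:
            self.tree[index] += delta
            index += index & -index

    def query(self, index):
        total = 0
        while index > 0:
            total += self.tree[index]
            index -= index & -index
        return total

    def range_sum(self, left, right):
        """Sum of elements in [left, right], 1-indexed."""
        return self.query(right) - self.query(left - 1)

fenwick = FenwickTree(5)
fenwick.update(1, 3)
fenwick.update(3, 5)
print(fenwick.range_sum(2, 3))  # 5: only index 3 holds a value in [2, 3]
```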

pages/Binary Search.md

Lines changed: 47 additions & 0 deletions
# Explanation
- Binary Search is an efficient algorithm used to find an element in a sorted list or array. It works by dividing the list into two halves and repeatedly narrowing down the search space until the target element is found.

# Steps:
- The **list must be sorted** first.
- Start with the `low` pointer at index 0 and the `high` pointer at the last index of the list.

- Find the middle element: `mid = (low + high) // 2`.

- **Compare the middle element** with the target:
  - If the middle element is equal to the target, return the index.
  - If the middle element is greater than the target, the target must be in the left half, so set `high = mid - 1`.
  - If the middle element is smaller than the target, the target must be in the right half, so set `low = mid + 1`.

- This process repeats until the target element is found or the search space is exhausted.

# Time Complexity:
- The time complexity of Binary Search is **O(log n)**, where **n** is the number of elements in the list or array.

```python
def binary_search(arr, target):
    low = 0
    high = len(arr) - 1

    while low <= high:
        mid = (low + high) // 2  # Find middle index
        if arr[mid] == target:  # Target found
            return mid
        elif arr[mid] < target:  # Target is in the right half
            low = mid + 1
        else:  # Target is in the left half
            high = mid - 1

    return -1  # Target not found

# Example usage
arr = [1, 3, 5, 7, 9, 11, 13]
target = 9
result = binary_search(arr, target)

if result != -1:
    print(f"Element found at index {result}")
else:
    print("Element not found")
```
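
Python's standard library ships the same idea in the `bisect` module; `bisect_left` returns the leftmost insertion point, which doubles as a membership test on sorted data:

```python
from bisect import bisect_left

def binary_search_bisect(arr, target):
    i = bisect_left(arr, target)  # Leftmost index where target could be inserted
    return i if i < len(arr) and arr[i] == target else -1

arr = [1, 3, 5, 7, 9, 11, 13]
print(binary_search_bisect(arr, 9))   # 4
print(binary_search_bisect(arr, 4))   # -1
```

This keeps the O(log n) bound and, unlike the hand-rolled loop, returns the first match when the array contains duplicates.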

pages/Binary Space Partitioning.md

Lines changed: 33 additions & 0 deletions
#### Explanation:
- **Binary Space Partitioning** is a technique used for recursively dividing a space into two half-spaces by hyperplanes (lines in 2D, planes in 3D).

- It is used in **computer graphics**, **collision detection**, and **ray tracing**.

#### Steps:
- The space is recursively divided by choosing hyperplanes (like splitting a 2D plane into two parts).

- BSP trees store this hierarchical structure, which can be traversed for efficient querying of the space.

#### Time Complexity:
- **Insertion**: O(log n) for a balanced tree
- **Querying**: O(log n) for spatial queries on a balanced tree

```python
class BSPTreeNode:
    def __init__(self, partition_plane=None):
        self.partition_plane = partition_plane  # (a, b, c) for the 2D line ax + by + c = 0
        self.front = None  # Subtree in the positive half-space
        self.back = None   # Subtree in the negative half-space

class BSPTree:
    def __init__(self):
        self.root = None

    def insert(self, partition_plane, anchor):
        # Minimal sketch: classify each new plane by the side its anchor point
        # falls on. A full BSP builder would also split geometry that crosses
        # an existing partition.
        self.root = self._insert(self.root, partition_plane, anchor)

    def _insert(self, node, partition_plane, anchor):
        if node is None:
            return BSPTreeNode(partition_plane)
        a, b, c = node.partition_plane
        if a * anchor[0] + b * anchor[1] + c >= 0:
            node.front = self._insert(node.front, partition_plane, anchor)
        else:
            node.back = self._insert(node.back, partition_plane, anchor)
        return node

# Example usage
bsp_tree = BSPTree()
bsp_tree.insert((1, 0, 0), (0, 0))  # The line x = 0 becomes the root partition
bsp_tree.insert((0, 1, 0), (2, 1))  # Anchored at x = 2 > 0, stored in the front subtree
```
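
The front/back classification at the heart of BSP traversal can also be sketched on its own; the `(a, b, c)` line encoding used here is an assumption made for illustration:

```python
def classify(plane, point):
    """Which side of the 2D line ax + by + c = 0 a point falls on."""
    a, b, c = plane
    s = a * point[0] + b * point[1] + c
    return "front" if s > 0 else "back" if s < 0 else "on"

print(classify((1, 0, 0), (2, 5)))   # front: x = 2 is on the positive side of x = 0
print(classify((1, 0, 0), (-1, 0)))  # back
print(classify((0, 1, 0), (3, 0)))   # on
```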
