Level 1: Core Definitions
Q1: What is Problem-Solving?
A1: The process of identifying, analyzing, and resolving complex issues using logical and structured methods.
Q2: Why is Problem-Solving important in programming?
A2: It enables developers to break down challenges into manageable steps, design efficient solutions, and debug errors systematically.
Q3: What are the main steps in problem-solving?
A3:
- Understand the problem.
- Plan an approach (algorithm).
- Implement the solution.
- Test and debug.
- Optimize for efficiency.
Q3A: What are some common problem-solving strategies?
A3A: Common strategies include:
- Divide and conquer
- Dynamic programming
- Greedy algorithms
- Backtracking
Q4: What is an Algorithm?
A4: A step-by-step procedure to solve a problem or perform a task (e.g., recipe for baking a cake).
Q4A: What are some fundamental algorithms in computer science?
A4A: Fundamental algorithms include:
- Sorting algorithms (e.g., quicksort, mergesort)
- Searching algorithms (e.g., binary search)
- Graph algorithms (e.g., Dijkstra's shortest path)
Q5: What is Sorting?
A5: Arranging data in a specific order (e.g., ascending, descending).
Q6: What is Searching?
A6: Finding the location of a specific element in a dataset.
Q7: What is Memory Management?
A7: The process of allocating, using, and freeing memory during program execution.
Level 2: Problem-Solving Strategies
Q8: What is Divide-and-Conquer?
A8: Splitting a problem into smaller subproblems, solving them recursively, and combining results (e.g., Merge Sort).
Q9: What is a Greedy Algorithm?
A9: Makes the locally optimal choice at each step to find a global optimum (e.g., Dijkstra’s algorithm).
Q10: What is Dynamic Programming (DP)?
A10: Solves problems by breaking them into overlapping subproblems and storing solutions to avoid recomputation (e.g., Fibonacci series).
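A minimal memoization sketch in C++ for the Fibonacci example above (the fib function and memo map are illustrative names, not a fixed API):

```cpp
#include <cstdint>
#include <iostream>
#include <unordered_map>

// Memoized Fibonacci: each subproblem fib(k) is computed once and cached,
// turning the exponential naive recursion into O(n) time.
std::uint64_t fib(int n, std::unordered_map<int, std::uint64_t>& memo) {
    if (n <= 1) return n;
    auto it = memo.find(n);
    if (it != memo.end()) return it->second;          // reuse stored subproblem
    std::uint64_t result = fib(n - 1, memo) + fib(n - 2, memo);
    memo[n] = result;
    return result;
}

int main() {
    std::unordered_map<int, std::uint64_t> memo;
    std::cout << fib(50, memo) << '\n';               // 12586269025
}
```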
Q11: What is Backtracking?
A11: A trial-and-error method to explore all possible solutions and backtrack if a dead end is reached (e.g., solving Sudoku).
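A small backtracking sketch in C++; it counts N-Queens placements rather than solving Sudoku, but shows the same try/recurse/undo pattern (the solve and safe helpers are illustrative):

```cpp
#include <cstdlib>
#include <iostream>
#include <vector>

// Returns true if a queen at (row, col) conflicts with none placed above it.
bool safe(const std::vector<int>& cols, int row, int col) {
    for (int r = 0; r < row; ++r) {
        if (cols[r] == col || std::abs(cols[r] - col) == row - r) return false;
    }
    return true;
}

// Backtracking: place queens row by row, undo a placement when it leads
// to a dead end, and count every complete arrangement.
int solve(std::vector<int>& cols, int row, int n) {
    if (row == n) return 1;                    // all queens placed
    int count = 0;
    for (int col = 0; col < n; ++col) {
        if (safe(cols, row, col)) {
            cols[row] = col;                   // try this placement
            count += solve(cols, row + 1, n);  // recurse on the next row
            cols[row] = -1;                    // backtrack
        }
    }
    return count;
}

int main() {
    int n = 8;
    std::vector<int> cols(n, -1);
    std::cout << solve(cols, 0, n) << " solutions\n";   // 92 for n = 8
}
```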
Q12: What is Brute-Force?
A12: Exhaustively checks all possible solutions (e.g., linear search).
Level 3: Sorting Algorithms
Q13: What is Bubble Sort?
A13: Compares adjacent elements and swaps them if they’re in the wrong order.
- Time Complexity: O(n²).
- Use Case: Small datasets or educational purposes.
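A minimal Bubble Sort sketch in C++ (the early-exit flag is an optional optimization that gives the O(n) best case on already-sorted data):

```cpp
#include <iostream>
#include <utility>
#include <vector>

// Bubble Sort: repeatedly swap adjacent out-of-order elements.
// After pass i, the i largest elements are in their final positions.
void bubbleSort(std::vector<int>& a) {
    for (std::size_t i = 0; i + 1 < a.size(); ++i) {
        bool swapped = false;
        for (std::size_t j = 0; j + 1 < a.size() - i; ++j) {
            if (a[j] > a[j + 1]) {
                std::swap(a[j], a[j + 1]);
                swapped = true;
            }
        }
        if (!swapped) break;                  // no swaps: already sorted
    }
}

int main() {
    std::vector<int> v{5, 1, 4, 2, 8};
    bubbleSort(v);
    for (int x : v) std::cout << x << ' ';    // 1 2 4 5 8
}
```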
Q14: How does Selection Sort work?
A14: Repeatedly selects the smallest element from the unsorted portion and swaps it with the first unsorted element.
- Time Complexity: O(n²).
Q15: Explain Insertion Sort.
A15: Builds the sorted array one element at a time by inserting each element into its correct position.
- Time Complexity: O(n²) (best case O(n) for nearly sorted data).
Q16: What is Merge Sort?
A16: A Divide-and-Conquer algorithm that splits the array, sorts subarrays recursively, and merges them.
- Time Complexity: O(n log n).
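A compact Merge Sort sketch in C++ that uses std::merge for the combine step (the iterator-based helper is an illustrative structure, not the only way to write it):

```cpp
#include <algorithm>
#include <iostream>
#include <iterator>
#include <vector>

// Merge Sort: split the range in half, sort each half recursively,
// then merge the two sorted halves (uses O(n) temporary storage).
void mergeSort(std::vector<int>::iterator first, std::vector<int>::iterator last) {
    if (last - first < 2) return;             // 0 or 1 elements: already sorted
    auto mid = first + (last - first) / 2;
    mergeSort(first, mid);
    mergeSort(mid, last);
    std::vector<int> merged;
    merged.reserve(last - first);
    std::merge(first, mid, mid, last, std::back_inserter(merged));
    std::copy(merged.begin(), merged.end(), first);
}

int main() {
    std::vector<int> v{38, 27, 43, 3, 9, 82, 10};
    mergeSort(v.begin(), v.end());
    for (int x : v) std::cout << x << ' ';    // 3 9 10 27 38 43 82
}
```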
Q17: How does Quick Sort work?
A17: Selects a pivot, partitions the array into elements less/greater than the pivot, and recursively sorts the partitions.
- Time Complexity: O(n log n) average case, O(n²) worst case.
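A Quick Sort sketch in C++ using the Lomuto partition scheme with the last element as pivot (one of several common pivot choices):

```cpp
#include <iostream>
#include <utility>
#include <vector>

// Lomuto partition: elements smaller than the pivot are moved to its left;
// the pivot then lands in its final sorted position, whose index is returned.
int partition(std::vector<int>& a, int lo, int hi) {
    int pivot = a[hi];
    int i = lo;
    for (int j = lo; j < hi; ++j) {
        if (a[j] < pivot) std::swap(a[i++], a[j]);
    }
    std::swap(a[i], a[hi]);
    return i;
}

// Quick Sort: partition, then recursively sort the two sides of the pivot.
void quickSort(std::vector<int>& a, int lo, int hi) {
    if (lo >= hi) return;
    int p = partition(a, lo, hi);
    quickSort(a, lo, p - 1);
    quickSort(a, p + 1, hi);
}

int main() {
    std::vector<int> v{9, 3, 7, 1, 8, 2};
    quickSort(v, 0, static_cast<int>(v.size()) - 1);
    for (int x : v) std::cout << x << ' ';    // 1 2 3 7 8 9
}
```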
Q18: What is Heap Sort?
A18: Converts the array into a max-heap and repeatedly extracts the largest element.
- Time Complexity: O(n log n).
Level 4: Searching Algorithms
Q19: What is Linear Search?
A19: Checks each element sequentially until the target is found.
- Time Complexity: O(n).
Q20: Explain Binary Search.
A20: Searches a sorted array by repeatedly dividing the search interval in half.
- Time Complexity: O(log n).
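A Binary Search sketch in C++ over a sorted vector (returns -1 when the target is absent):

```cpp
#include <iostream>
#include <vector>

// Binary Search on a sorted array: halve the search interval each step.
int binarySearch(const std::vector<int>& a, int target) {
    int lo = 0, hi = static_cast<int>(a.size()) - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;         // avoids overflow of (lo + hi)
        if (a[mid] == target) return mid;
        if (a[mid] < target) lo = mid + 1;    // target is in the right half
        else hi = mid - 1;                    // target is in the left half
    }
    return -1;                                // not present
}

int main() {
    std::vector<int> v{2, 5, 8, 12, 16, 23, 38};
    std::cout << binarySearch(v, 23) << '\n';   // 5
    std::cout << binarySearch(v, 7) << '\n';    // -1
}
```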
Q21: What is Hashing?
A21: Uses a hash function to map keys to values for O(1) average-time lookups (e.g., hash tables).
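A short example of hash-based lookup using C++'s std::unordered_map (the keys and values are illustrative):

```cpp
#include <iostream>
#include <string>
#include <unordered_map>

// std::unordered_map hashes each key to a bucket, giving O(1)
// average-time insertion and lookup.
int main() {
    std::unordered_map<std::string, int> ages{{"alice", 30}, {"bob", 25}};
    ages["carol"] = 41;                       // insert

    auto it = ages.find("bob");               // average O(1) lookup
    if (it != ages.end()) {
        std::cout << it->first << " is " << it->second << '\n';
    }
}
```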
Level 5: Memory Management
Q22: What is Dynamic Memory Allocation?
A22: Allocating memory at runtime (e.g., malloc, calloc, new).
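A small sketch contrasting the C-style (malloc/calloc/free) and C++-style (new/delete) allocation mentioned above:

```cpp
#include <cstdlib>
#include <iostream>

int main() {
    // C-style allocation: malloc leaves memory uninitialized,
    // calloc zero-initializes it; both must be released with free.
    int* m = static_cast<int*>(std::malloc(4 * sizeof(int)));
    int* c = static_cast<int*>(std::calloc(4, sizeof(int)));
    if (m && c) {
        m[0] = 7;
        std::cout << m[0] << ' ' << c[0] << '\n';   // 7 0
    }
    std::free(m);
    std::free(c);

    // C++-style allocation: new/delete (new[] pairs with delete[]).
    int* arr = new int[4]{1, 2, 3, 4};
    std::cout << arr[3] << '\n';                    // 4
    delete[] arr;
}
```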
Q23: What is a Pointer?
A23: A variable that stores the memory address of another variable.
Q24: What is Static vs. Dynamic Memory?
A24:
- Static: Fixed-size allocation at compile-time (e.g., global variables).
- Dynamic: Flexible allocation at runtime (e.g., malloc).
Q25: What is a Memory Leak?
A25: Occurs when dynamically allocated memory is not freed, leading to wasted resources.
Q26: What is Fragmentation?
A26: Wasted memory due to inefficient allocation:
- Internal: Unused space within a memory block.
- External: Gaps between allocated blocks.
Q27: What is a Dangling Pointer?
A27: A pointer pointing to memory that has been freed, causing undefined behavior.
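A minimal illustration of how a dangling pointer arises after free, and the common defensive habit of nulling out freed pointers:

```cpp
#include <cstdlib>
#include <iostream>

int main() {
    int* p = static_cast<int*>(std::malloc(sizeof(int)));
    if (!p) return 1;
    *p = 42;
    std::free(p);        // p now dangles: the memory is no longer ours
    // *p = 7;           // undefined behavior if uncommented
    p = nullptr;         // defensive fix: a null pointer can be checked safely
    std::cout << "pointer reset\n";
}
```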
Level 6: Data Manipulation
Q28: What is an Array?
A28: A collection of elements stored in contiguous memory locations.
Q29: What is a Linked List?
A29: A data structure where elements (nodes) are linked using pointers.
Q30: How do Stacks and Queues work?
A30:
- Stack: LIFO (Last-In-First-Out) – e.g., undo/redo operations.
- Queue: FIFO (First-In-First-Out) – e.g., print job scheduling.
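A short sketch using C++'s std::stack and std::queue to mirror the undo and print-job examples above (the item strings are illustrative):

```cpp
#include <iostream>
#include <queue>
#include <stack>
#include <string>

int main() {
    // Stack (LIFO): the most recent action is the first one undone.
    std::stack<std::string> undo;
    undo.push("type 'a'");
    undo.push("type 'b'");
    std::cout << "undo: " << undo.top() << '\n';              // type 'b'
    undo.pop();

    // Queue (FIFO): print jobs are served in arrival order.
    std::queue<std::string> printJobs;
    printJobs.push("report.pdf");
    printJobs.push("photo.png");
    std::cout << "printing: " << printJobs.front() << '\n';   // report.pdf
    printJobs.pop();
}
```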
Level 7: Advanced Concepts
Q31: What is Time Complexity?
A31: A measure of the time an algorithm takes relative to input size (e.g., O(n), O(n²)).
Q32: What is Space Complexity?
A32: A measure of the memory an algorithm uses relative to input size.
Q33: What are In-Place Algorithms?
A33: Algorithms that transform the input within the original data structure, using only a small, constant amount of extra memory rather than an auxiliary copy (e.g., Heap Sort, Quick Sort).
Q34: What is Cache Locality?
A34: The tendency of programs to access nearby memory locations in quick succession; structuring data access to exploit it (e.g., using arrays over linked lists) keeps data in the CPU cache and speeds up execution.
Real-World Applications
Q35: Where is Merge Sort used in practice?
A35: In external sorting (large datasets that don’t fit in RAM).
Q36: Why are hash tables used in databases?
A36: For O(1) average-time lookups in indexing (e.g., primary keys).
Q37: How does dynamic memory help in games?
A37: Allows flexible allocation for game objects created/destroyed during gameplay.
Problem-Solving Approaches
Q1: What is the Divide-and-Conquer strategy?
A1: Break a problem into smaller subproblems, solve them recursively, and combine results (e.g., Merge Sort).
Q2: When is a Greedy Algorithm optimal?
A2: When locally optimal choices (e.g., picking the largest coin first) lead to a globally optimal solution (e.g., fractional knapsack).
Q3: How does Dynamic Programming differ from Divide-and-Conquer?
A3: DP solves overlapping subproblems once and stores solutions (memoization), while Divide-and-Conquer solves independent subproblems.
Q4: What is Backtracking?
A4: A trial-and-error approach to explore all possible solutions (e.g., N-Queens, Sudoku).
Q5: Define Brute-Force strategy.
A5: Exhaustively check all possibilities to find a solution (e.g., linear search).
Q6: What does Big-O notation represent?
A6: It describes the upper bound of an algorithm’s time/space complexity as input size grows (e.g., O(n²) for Bubble Sort).
Algorithms
Sorting
Q7: Explain Bubble Sort.
A7: Repeatedly swaps adjacent elements if they are in the wrong order. Complexity: O(n²).
Q8: How does Quick Sort work?
A8: Uses a pivot to partition the array into subarrays, recursively sorting them. Average case: O(n log n).
Q9: What is the advantage of Merge Sort over Quick Sort?
A9: Merge Sort is stable and guarantees O(n log n) time, but uses O(n) extra space.
Q10: When is Insertion Sort preferred?
A10: For small datasets or nearly sorted arrays due to its O(n) best-case time.
Q11: What is Heap Sort?
A11: Builds a max-heap and repeatedly extracts the largest element. Complexity: O(n log n).
Searching
Q12: How does Binary Search work?
A12: Searches a sorted array by repeatedly dividing the search interval in half. Complexity: O(log n).
Q13: What is Hashing?
A13: Maps keys to values using a hash function. Ideal for average O(1) search (e.g., hash tables).
Q14: What is the worst-case time for Linear Search?
A14: O(n), as it checks every element sequentially.
Graph Algorithms
Q15: What is BFS used for?
A15: Breadth-First Search explores all neighbors at the present depth before moving deeper (e.g., shortest path in unweighted graphs).
Q16: How does DFS differ from BFS?
A16: Depth-First Search explores as far as possible along a branch before backtracking (e.g., cycle detection).
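A BFS sketch in C++ over an adjacency-list graph, computing shortest path lengths in an unweighted graph as described above (the small example graph is illustrative):

```cpp
#include <iostream>
#include <queue>
#include <vector>

// BFS: a queue processes vertices level by level, so dist[v] ends up
// being the shortest path length (in edges) from the source.
std::vector<int> bfs(const std::vector<std::vector<int>>& adj, int src) {
    std::vector<int> dist(adj.size(), -1);    // -1 marks "not visited"
    std::queue<int> q;
    dist[src] = 0;
    q.push(src);
    while (!q.empty()) {
        int u = q.front();
        q.pop();
        for (int v : adj[u]) {
            if (dist[v] == -1) {
                dist[v] = dist[u] + 1;
                q.push(v);
            }
        }
    }
    return dist;
}

int main() {
    // Graph: 0-1, 0-2, 1-2, 1-3
    std::vector<std::vector<int>> adj{{1, 2}, {0, 2, 3}, {0, 1}, {1}};
    std::vector<int> dist = bfs(adj, 0);
    for (std::size_t v = 0; v < dist.size(); ++v)
        std::cout << "0 -> " << v << ": " << dist[v] << '\n';
}
```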
Memory Management
Q17: What is Dynamic Memory Allocation?
A17: Runtime allocation of memory (e.g., malloc in C, new in C++).
Q18: What is a memory leak?
A18: Failure to deallocate dynamically allocated memory, causing wasted resources.
Q19: What is a dangling pointer?
A19: A pointer pointing to a deallocated memory location (e.g., after calling free).
Q20: How does calloc differ from malloc?
A20: calloc initializes allocated memory to zero, while malloc leaves it uninitialized.
Q21: What is fragmentation?
A21: Wasted memory due to inefficient allocation. Types:
- Internal: Unused memory within allocated blocks.
- External: Free memory gaps between allocated blocks.
Q22: Explain pointers in C/C++.
A22: Variables storing memory addresses. Used for dynamic data structures (e.g., linked lists).
Q23: How do arrays differ from linked lists in memory?
A23: Arrays use contiguous memory, while linked lists use non-contiguous memory with pointers.
Q24: What is garbage collection?
A24: Automatic memory management (e.g., in Java/Python) that reclaims unused memory.
Data Manipulation
Q25: How are stacks and queues implemented?
A25:
- Stack: LIFO (Last-In-First-Out) using arrays/linked lists.
- Queue: FIFO (First-In-First-Out) using arrays/linked lists.
Q26: What is a binary tree?
A26: A hierarchical data structure where each node has ≤ 2 children. Used in search algorithms (e.g., BST).
Q27: How do trees improve search efficiency?
A27: Balanced trees (e.g., AVL, Red-Black) reduce search time to O(log n).
Best Practices
Q28: Why use smart pointers in C++?
A28: They automate memory deallocation (e.g., unique_ptr, shared_ptr), preventing leaks.
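A brief sketch of unique_ptr and shared_ptr; the Texture struct is a hypothetical placeholder used only to show the automatic cleanup:

```cpp
#include <iostream>
#include <memory>

struct Texture {
    ~Texture() { std::cout << "texture released\n"; }
};

int main() {
    // unique_ptr: sole owner; the object is deleted automatically when
    // the pointer goes out of scope, so there is nothing to forget to free.
    std::unique_ptr<Texture> tex = std::make_unique<Texture>();

    // shared_ptr: reference-counted ownership; the object is freed when
    // the last copy is destroyed.
    std::shared_ptr<int> counter = std::make_shared<int>(0);
    std::shared_ptr<int> alias = counter;                        // count is now 2
    std::cout << "use_count: " << counter.use_count() << '\n';   // 2
}   // all three smart pointers release their memory here
```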
Q29: What are memory pools?
A29: Pre-allocated blocks of memory for efficient allocation/deallocation in real-time systems.
Q30: How to avoid buffer overflow?
A30: Validate input sizes, use safe functions (e.g., strncpy over strcpy), and perform bounds checking.
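A short sketch of a bounded copy with strncpy plus explicit null termination (the buffer size and input string are illustrative):

```cpp
#include <cstdio>
#include <cstring>

int main() {
    const char* input = "a string longer than the destination buffer";
    char dest[16];

    // strncpy bounds the copy to the destination size; unlike strcpy it
    // cannot write past dest, but it may omit the terminating '\0',
    // so terminate explicitly.
    std::strncpy(dest, input, sizeof(dest) - 1);
    dest[sizeof(dest) - 1] = '\0';

    std::printf("%s\n", dest);   // truncated, but no overflow
}
```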
Real-World Applications
Q31: Where is Quick Sort used practically?
A31: In programming language libraries (e.g., C’s qsort) due to its average-case efficiency.
Q32: How do hash tables speed up databases?
A32: They enable O(1) average-time lookups for key-value pairs (e.g., indexing).
Q33: Why is BFS used in peer-to-peer networks?
A33: To locate the shortest path to a resource (e.g., BitTorrent).