This roadmap is designed to help you build skill gradually, with enough depth to matter and enough breathing room to stay consistent. Let’s start with the basics.
1. Weeks 1–2: Language Fundamentals & Basic Programming
Goal: Get comfortable with syntax, writing code, and basic logic.
Choose one language and stick with it throughout your DSA journey. Java, Python, or C++ are all solid options, but Java often strikes the right balance between industry use, OOP fundamentals, and clarity for beginners.
Tip: If you’re confused between Java and Python, Python is beginner-friendly, but Java prepares you better for system design and real-world codebases. Pick what you’ll stay consistent with.
Once your choice is made, focus on the bare-bones basics:
- Variables, data types, loops, functions, conditionals
- Input/output
- Error handling and debugging
Start solving basic problems—like printing patterns, reversing a string, and checking for prime numbers. It’s not about speed. It’s about comfort.
Solve at least 10 entry-level problems for each concept (like loops, functions, and conditionals) on LeetCode or HackerRank to truly grasp it and lock it into long-term memory.
You should also get familiar with your language’s built-in collections. In Python, that means lists, sets, and dictionaries. In Java, look into ArrayList, HashMap, and TreeSet. These will save you time later when solving real problems.
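For instance, here’s a quick sketch of those collections in action (the examples in this roadmap use Python for brevity; the same ideas map directly onto Java’s ArrayList, HashMap, and TreeSet):

```python
scores = [42, 17, 99, 17]          # list: ordered, allows duplicates
unique = set(scores)               # set: de-duplicates automatically

# dict: key-value storage, e.g. counting how often each score appears
counts = {}
for s in scores:
    counts[s] = counts.get(s, 0) + 1

print(unique)   # {17, 42, 99} (sets don't guarantee order)
print(counts)   # {42: 1, 17: 2, 99: 1}
```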
Mini Project: Build a Simple Calculator
Once you’re familiar with variables, conditionals, and functions, put them to use in a small project. Build a calculator that performs basic operations like addition, subtraction, multiplication, and division. No need for UI—just focus on combining what you’ve learned into a working script.
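Here’s one minimal way that calculator could look in Python (a sketch, not the only design; the prompts and function name are up to you):

```python
def calculate(a, b, op):
    """Apply a basic arithmetic operation and return the result."""
    if op == "+":
        return a + b
    if op == "-":
        return a - b
    if op == "*":
        return a * b
    if op == "/":
        if b == 0:
            raise ValueError("Cannot divide by zero")
        return a / b
    raise ValueError(f"Unknown operator: {op}")

if __name__ == "__main__":
    x = float(input("First number: "))
    y = float(input("Second number: "))
    operator = input("Operator (+, -, *, /): ")
    print("Result:", calculate(x, y, operator))
```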
Not to miss:
- Print patterns (pyramids, number grids)
- Basic math operations
- Palindrome check
- Swapping variables, reversing strings
- Prime numbers, factorials
By the End of Week 2, You Should:
- Write clean, functional code for basic problems
- Be comfortable with the syntax and able to debug small issues
- Have developed the habit of solving and learning through code, not just watching
2. Weeks 3-4: Arrays, Linked Lists & Core Linear Structures
Goal: Build strong problem-solving intuition with linear data structures.
There are two main types of data structures: Linear (data is organized sequentially) and Non-linear (data is hierarchical or interconnected). This week, you’ll focus entirely on linear structures—the foundation of most early-stage coding problems.
You’ll start with arrays, and then move into linked lists, stacks, queues, and hash tables. These will train you to think in terms of memory, traversal, and order—skills that carry into more advanced topics like trees, graphs, and dynamic programming.
Once you’ve understood the basics, here’s what to cover:
- Arrays: Traversal, insertion, deletion, prefix sums, sliding window
- Linked Lists: Reversing, cycle detection (Floyd’s algorithm), fast & slow pointers
- Stacks & Queues: Push/pop operations, implemented with arrays or lists
- Hash Tables: Key-value storage, frequency counting, basic hashing logic
Code each structure at least once from scratch. Then practice solving real problems using them.
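For example, here’s a bare-bones singly linked list in Python with append and in-place reversal (class and method names are arbitrary):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

class LinkedList:
    def __init__(self):
        self.head = None

    def append(self, value):
        """Add a node at the tail (O(n) without a tail pointer)."""
        node = Node(value)
        if not self.head:
            self.head = node
            return
        current = self.head
        while current.next:
            current = current.next
        current.next = node

    def reverse(self):
        """Reverse the list in place by re-pointing each node."""
        prev, current = None, self.head
        while current:
            nxt = current.next
            current.next = prev
            prev = current
            current = nxt
        self.head = prev

    def to_list(self):
        out, current = [], self.head
        while current:
            out.append(current.value)
            current = current.next
        return out

ll = LinkedList()
for v in [1, 2, 3]:
    ll.append(v)
ll.reverse()
print(ll.to_list())  # [3, 2, 1]
```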
Solve 15–20 problems across these topics on LeetCode or NeetCode. Filter by tags like “array”, “linked list”, “stack”, “queue”, and “hash table”. Stick to easy and beginner-medium levels for now.
You should also continue applying your core language knowledge—use loops, conditionals, and functions together inside your data structure problems. This is where your skills start to stack up.
Mini Project: To-Do List or Simple Address Book
Use arrays or lists to store items. Allow the user to add, view, or delete entries.
This brings together logic, input handling, and your first taste of how data is structured in real-world tasks.
Not to miss:
- Prefix sum problems
- Sliding window patterns
- Cycle detection in linked lists
- Stack-based problems like Next Greater Element
- Basic hash map challenges, like counting word frequencies
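One item from the list above, Next Greater Element, is worth coding before you move on; here’s a minimal monotonic-stack sketch:

```python
def next_greater(nums):
    """For each element, find the next element to its right that is larger (-1 if none)."""
    result = [-1] * len(nums)
    stack = []                      # indices still waiting for a greater value
    for i, n in enumerate(nums):
        while stack and nums[stack[-1]] < n:
            result[stack.pop()] = n
        stack.append(i)
    return result

print(next_greater([2, 1, 5, 3]))  # [5, 5, -1, -1]
```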
By the End of Week 4, You Should:
- Understand how each linear data structure works and when to use it
- Be comfortable with basic implementation and logic flow
- Start recognizing patterns that appear across multiple problems
You’ll start thinking more about time and space complexity from week 4 onward, once recursion and algorithm design begin to show up.
3. Weeks 5-6: Recursion, Backtracking & Stack Memory
Goal: Learn how problems break down and how to use function calls to explore possibilities.
You’ll learn recursion—not just as a concept, but as a practical tool to break down problems that involve branching, repetition, or reversal. It’s the foundation for solving tree problems, graph traversal, backtracking, and even dynamic programming later on.
At its core, recursion is when a function calls itself to solve a smaller version of the same problem. You’ll also understand how these calls are handled by your computer using something called the call stack, which helps keep track of where you are in the problem.
Once you’re comfortable with recursion, you’ll move on to backtracking. It’s used when you need to try different options and undo decisions if something doesn’t work, like solving a Sudoku puzzle or placing queens on a chessboard without attacking each other.
If solving a problem feels like exploring all options or making choices step-by-step, you’re probably looking at a recursion or backtracking solution.
This week you’ll cover:
- What recursion is and how it works
- Base case vs recursive case
- Visualizing recursion through dry runs
- Stack memory during recursive calls
- What backtracking is and how it improves on brute force
What to Practice:
Start simple with recursion:
- Factorial
- Fibonacci series
- Reverse a string
- The sum of the first N numbers
Then move into beginner-friendly backtracking:
- Generate all subsets
- String permutations
- N-Queens problem
- Sudoku solver (optional but rewarding)
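To make the “choose, explore, un-choose” rhythm concrete, here’s a minimal subset generator in Python (a sketch; names are illustrative):

```python
def subsets(nums):
    """Generate all subsets of nums using backtracking."""
    result = []

    def backtrack(start, current):
        result.append(current[:])          # record the choices made so far
        for i in range(start, len(nums)):
            current.append(nums[i])        # choose
            backtrack(i + 1, current)      # explore
            current.pop()                  # un-choose (backtrack)

    backtrack(0, [])
    return result

print(subsets([1, 2, 3]))  # 8 subsets, from [] to [1, 2, 3]
```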
Solve at least 15–20 problems this week. Struggling is normal—what matters is tracing the flow and building logic step by step.
Mini Project: Folder Explorer
Build a script that takes a folder path and prints all files and subfolders using recursion. It’s a real-world use case of recursion, like navigating a tree structure.
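A rough version of that script in Python, using only the standard library (error handling kept deliberately minimal):

```python
import os

def explore(path, depth=0):
    """Recursively print every file and subfolder under path."""
    name = os.path.basename(path) or path    # basename is empty for paths like "/"
    print("  " * depth + name)
    if os.path.isdir(path):                  # base case: a plain file ends the recursion
        for entry in sorted(os.listdir(path)):
            explore(os.path.join(path, entry), depth + 1)

if __name__ == "__main__":
    explore(".")   # start from the current directory
```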
Not to miss:
- How missing a base case causes infinite recursion
- What the call stack looks like at each step
- Dry-running recursion by hand
- Recognizing backtracking patterns in problems like N-queens or subset sums
By the End of Week 6, You Should:
- Understand how recursion works—both in logic and memory
- Write basic recursive and backtracking functions with confidence
- Recognize when a problem requires exploring multiple paths
- Be ready for tree structures, where recursion shows up everywhere
4. Weeks 7-8: Trees, Binary Search Trees & Heaps
Goal: Understand how hierarchical data is stored, traversed, and optimized for different use cases.
This week, your code starts to reflect how real systems are built.
Everything you’ve learned about recursion, problem-solving, and stack memory starts showing up here—in trees. Unlike arrays and lists, trees don’t go straight. They branch, which means your logic needs to do the same.
Binary Trees teach you how data grows in levels. Binary Search Trees (BSTs) teach you how structure makes searching faster. You’ll start seeing problems that look more like the systems you use every day—file explorers, auto-suggestions, and recommendation engines.
That’s also why we didn’t jump into trees earlier. They click better once you’ve worked with recursion and understand how function calls move through memory. During this week, aim to cover…
- Basics: What is a tree? Root, child, height, depth, leaf nodes
- Binary Trees vs BSTs vs Heaps: Structure and Purpose
- Tree Traversals:
  - In-order
  - Pre-order
  - Post-order
  - Level-order (using queues)
- BST operations: Search, Insert, Delete, Validate
- Min Heaps & Max Heaps: Insert, extract, heapify
- Applications: Autocomplete (Tries), priority queues, shortest paths (Dijkstra starts here)
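To ground the traversal ideas, here’s a small Python sketch of a binary tree node with recursive in-order and queue-based level-order traversal:

```python
from collections import deque

class TreeNode:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def inorder(node):
    """Left, node, right: yields sorted order when the tree is a BST."""
    if node is None:
        return []
    return inorder(node.left) + [node.value] + inorder(node.right)

def level_order(root):
    """Breadth-first traversal using a queue."""
    if root is None:
        return []
    order, queue = [], deque([root])
    while queue:
        node = queue.popleft()
        order.append(node.value)
        if node.left:
            queue.append(node.left)
        if node.right:
            queue.append(node.right)
    return order

#       2
#      / \
#     1   3
root = TreeNode(2, TreeNode(1), TreeNode(3))
print(inorder(root))      # [1, 2, 3]
print(level_order(root))  # [2, 1, 3]
```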
What to Practice:
Start with binary trees:
- Calculate height/depth
- Count nodes, leaves
- Mirror a tree
- Check symmetry and identity
Move into traversal:
- In-order, pre-order, post-order
- Level-order (BFS style with queue)
- Iterative vs recursive approaches
BST problems:
- Insert/delete/search
- Validate BST
- Lowest Common Ancestor
- Kth smallest element
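One common way to tackle “Validate BST” is to pass allowed value bounds down the recursion; a minimal, self-contained sketch:

```python
class TreeNode:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def is_valid_bst(node, low=float("-inf"), high=float("inf")):
    """Every node must stay within the (low, high) range set by its ancestors."""
    if node is None:
        return True
    if not (low < node.value < high):
        return False
    return (is_valid_bst(node.left, low, node.value) and
            is_valid_bst(node.right, node.value, high))

#     5
#    / \
#   3   8
root = TreeNode(5, TreeNode(3), TreeNode(8))
print(is_valid_bst(root))  # True
```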
If you’ve ever wondered how Google prioritizes autocomplete suggestions, this is where it begins. Solve 15–25 targeted problems using tags like “binary-tree”, “bst”, and “heap” on LeetCode or NeetCode.
Project: Directory Tree Visualizer
Create a CLI program that prints folder structures like a tree with indentation.
Not to miss:
- How in-order traversal naturally gives a sorted output in a BST
- How heaps use arrays internally but behave like trees
- What makes a tree balanced, and why that matters
- How the call stack plays into recursive traversals
By the End of Week 8, You Should:
- See recursion as a tool, not a trick
- Feel comfortable navigating and building tree-based structures
- Understand when to use a BST vs when to use a Heap
- Realize you’re not “learning DSA” anymore—you’re learning how real systems think
5. Weeks 9-10: Heaps & Priority-Based Problem Solving
Goal: Learn how to handle problems where priority matters—and how heaps offer a fast, structured way to do it.
By now, you’ve worked with arrays, linked lists, trees, and recursion. This week shifts your focus to a powerful but underused data structure: the heap.
Heaps let you access the largest or smallest element quickly, making them perfect for problems like “get top K elements”, task scheduling, and finding medians in a stream of data. They’re used in real-world systems like operating systems, autocomplete engines, and even cache eviction logic. What you’ll cover:
- What is a Heap?
- Min Heap vs Max Heap
- Binary Heap implementation (array-based)
- Heap Operations:
  - Insert
  - Delete (extract min/max)
  - Heapify
- Priority Queues (built-in in most languages)
- Use Cases:
  - Top K elements
  - Kth smallest/largest
  - The median in a stream
  - Merge K-sorted arrays/lists
  - Task scheduling problems
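In Python, the built-in heapq module treats a plain list as a min heap; for example, “top K largest” can be kept with a heap of size K (a sketch):

```python
import heapq

def top_k_largest(nums, k):
    """Keep a min heap of size k; the smallest of the top k sits at the root."""
    # heapq stores the heap in a flat list: node i's children sit at 2*i + 1 and 2*i + 2.
    heap = []
    for n in nums:
        heapq.heappush(heap, n)
        if len(heap) > k:
            heapq.heappop(heap)   # drop the smallest, keeping only the k largest
    return sorted(heap, reverse=True)

print(top_k_largest([5, 1, 9, 3, 7, 2], k=3))  # [9, 7, 5]
```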
What to Practice:
- Build a heap from an array
- Heapify logic
- Priority queue implementation
- Solve:
  - Kth Largest Element in an Array
  - Top K Frequent Elements
  - Merge K-Sorted Lists
  - Find Median from Data Stream
  - Is Heap / Validate Heap
Target 15–20 problems focused on heap operations and use cases. Use tags like heap, priority-queue, and top-k on LeetCode or NeetCode.
Mini Project: Task Scheduler Simulation
Build a simple CLI-based task scheduler that runs high-priority tasks first using a heap-based priority queue. This reflects how operating systems manage job queues in real-time.
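A minimal sketch of that scheduler in Python, using heapq as the priority queue (here a lower number means higher priority; that’s just the convention chosen for this sketch):

```python
import heapq
import itertools

class TaskScheduler:
    """Runs tasks in priority order; ties are broken by insertion order."""
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()   # tie-breaker so tuples always compare

    def add(self, priority, name):
        heapq.heappush(self._heap, (priority, next(self._counter), name))

    def run_all(self):
        while self._heap:
            priority, _, name = heapq.heappop(self._heap)
            print(f"Running (priority {priority}): {name}")

scheduler = TaskScheduler()
scheduler.add(2, "back up logs")
scheduler.add(1, "serve user request")
scheduler.add(3, "rebuild search index")
scheduler.run_all()   # runs priority 1, then 2, then 3
```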
Not to miss:
- How heaps differ from BSTs: no full sorted order, just a parent-child priority rule
- Heaps are stored as arrays but behave like trees
- Why heapify is often implemented recursively
- When to use heaps over sorting or linear scans
By the End of Week 10, You Should:
- Confidently use heaps in problems that involve ranking, priority, or frequent elements
- Implement heap operations manually and via built-in libraries
- Know when to choose heaps over brute force or sorting
- Be ready to shift into pattern-based problem solving next: two pointers, sliding window, and prefix sums
6. Weeks 11-12: Problem-Solving Patterns That Stick
Goal: Stop solving every problem from scratch—start spotting what works, and reuse it with purpose.
If you’ve followed the roadmap so far, you’ve already written dozens of solutions using arrays, recursion, and heaps. Now it’s time to stop looking at problems as isolated puzzles—and start seeing them as variations of familiar patterns.
This week introduces real-world techniques that help you:
- Reduce time complexity from O(n²) to O(n)
- Solve long-form problems in fewer steps
- Choose the right approach faster during contests or interviews
The goal here is pattern fluency—so when you see a subarray, string window, or pointer-based condition, your brain knows what to try.
What You’ll Learn:
| Technique | What It Solves | Why It Matters |
| --- | --- | --- |
| Two Pointers | Sorting, searching pairs, and duplicate removal | Often replaces nested loops with an O(n) scan |
| Sliding Window | Longest/shortest substrings, max subarrays | Used in string + array problems everywhere |
| Prefix Sum | Range queries, subarray sums | Fast lookup without extra loops |
| Fast & Slow Pointers | Cycle detection, linked list middle | Used in Floyd’s Cycle Detection, etc. |
These are industry-standard mental models that appear in everything from basic DSA to system design.
For each technique:
- Learn the pattern logic (visualize it on paper)
- Code the base version manually
- Practice 3–4 variations (easy ➝ medium)
- Spot it again later (repeat in graphs, DP, etc.)
Two Pointers:
- Remove duplicates from a sorted array
- Merge two sorted arrays
- Container With Most Water
- Three Sum / Two Sum (sorted)
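For instance, Two Sum on a sorted array with two pointers (a sketch; returns the first matching pair of indices, or None):

```python
def two_sum_sorted(nums, target):
    """Walk pointers inward from both ends of a sorted array."""
    left, right = 0, len(nums) - 1
    while left < right:
        total = nums[left] + nums[right]
        if total == target:
            return left, right
        if total < target:
            left += 1        # need a bigger sum, move the low pointer up
        else:
            right -= 1       # need a smaller sum, move the high pointer down
    return None              # no pair found

print(two_sum_sorted([1, 3, 4, 6, 8], target=10))  # (2, 3) -> 4 + 6
```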
Sliding Window:
- Maximum sum subarray of size K
- Longest substring without repeating characters
- Minimum window substring
- Count anagrams in a string
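A sketch of the classic “longest substring without repeating characters” with a sliding window:

```python
def longest_unique_substring(s):
    """Grow the window with `right`; jump `left` forward whenever a repeat appears."""
    last_seen = {}          # character -> most recent index
    left = best = 0
    for right, ch in enumerate(s):
        if ch in last_seen and last_seen[ch] >= left:
            left = last_seen[ch] + 1     # move past the previous occurrence
        last_seen[ch] = right
        best = max(best, right - left + 1)
    return best

print(longest_unique_substring("abcabcbb"))  # 3 ("abc")
```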
Prefix Sum:
- Subarray sum equals K
- Range sum queries
- Find the equilibrium index
- Matrix prefix sum (2D variant, optional)
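And “Subarray Sum Equals K” in its prefix-sum-plus-hash-map form (a sketch):

```python
def count_subarrays_with_sum(nums, k):
    """Count subarrays summing to k using a running prefix sum and a hash map."""
    seen = {0: 1}            # prefix sum -> how many times it has occurred
    prefix = count = 0
    for n in nums:
        prefix += n
        count += seen.get(prefix - k, 0)   # earlier prefixes that close a k-sum window
        seen[prefix] = seen.get(prefix, 0) + 1
    return count

print(count_subarrays_with_sum([1, 2, 3], k=3))  # 2 -> [1, 2] and [3]
```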
Fast & Slow Pointers:
- Detect a cycle in a linked list
- Find the middle of the linked list
- Length of cycle
- Starting point of the cycle
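Both cycle detection and “find the middle” reduce to the same fast/slow pattern; a compact sketch (assuming a simple node class with a .next pointer):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

def has_cycle(head):
    """Floyd's algorithm: the fast pointer laps the slow one only if there's a cycle."""
    slow = fast = head
    while fast and fast.next:
        slow, fast = slow.next, fast.next.next
        if slow is fast:
            return True
    return False

def middle(head):
    """When fast reaches the end, slow sits at the middle."""
    slow = fast = head
    while fast and fast.next:
        slow, fast = slow.next, fast.next.next
    return slow

a, b, c = Node(1), Node(2), Node(3)
a.next, b.next = b, c
print(has_cycle(a))        # False
print(middle(a).value)     # 2
c.next = a                 # create a cycle
print(has_cycle(a))        # True
```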
Aim for 20–25 well-selected medium-to-hard problems this week. Focus on how each solution avoids brute force and scales better.
Make Your Pattern Reference Sheet
Create your own “DSA Pattern Sheet” in Notion, Excel, or a notebook.
Every time you solve a problem using a known technique, log it.
- What was the pattern?
- What problem did it solve?
- Could you use it again somewhere else?
This becomes your cheat code in the final interview phase.
Not to miss:
- Don’t just memorize syntax—focus on why the pattern works
- Learn to map questions to patterns (“sliding window question” vs “prefix sum question”)
- Understand edge cases: what resets the window? When do pointers move?
By the End of Week 12, You Should:
- Spot the right technique in the first 2–3 minutes of reading a question
- Know how to reduce time from O(n²) to O(n) using patterns
- Start grouping problems by “approach,” not just “topic.”
- Be ready for optimization-focused interviews where brute force won’t cut it
7. Weeks 13-14: Greedy Algorithms & Dynamic Programming
Goal: Learn to solve problems that need more than just logic—problems that need strategy.
By now, you’ve built up enough intuition to handle structured problems. But some problems require you to choose smartly, and others demand that you remember what you’ve already solved. That’s where Greedy and Dynamic Programming come in.
Greedy Algorithm
Greedy algorithms make decisions step-by-step, choosing what looks best in the moment, without looking back. They don’t always give the optimal result, but when they do, they’re beautifully efficient. Use greedy when:
- You can prove that local decisions = global optimal
- Sorting and simple selection solve the problem
- You’re optimizing cost, time, or value step-by-step
Think of greedy as the “clean win”—minimal code, maximum impact.
What to practice:
- Activity Selection
- Fractional Knapsack
- Minimum Platforms
- Huffman Encoding Tree
- Jump Game
- Gas Station Circular Tour
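Activity Selection is the cleanest illustration of the greedy idea: sort by finish time, then keep taking the first activity that doesn’t clash. A sketch with made-up intervals:

```python
def max_activities(intervals):
    """Greedy: always pick the activity that finishes earliest and still fits."""
    chosen = []
    last_end = float("-inf")
    for start, end in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_end:        # no overlap with the previous pick
            chosen.append((start, end))
            last_end = end
    return chosen

meetings = [(1, 4), (3, 5), (0, 6), (5, 7), (8, 9), (5, 9)]
print(max_activities(meetings))  # [(1, 4), (5, 7), (8, 9)]
```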
Dynamic Programming
It’s about breaking problems into smaller subproblems, solving them once, and building up the final answer without repeating work. It’s slower to learn, but powerful once it clicks. Use DP when:
- The problem has overlapping subproblems
- There’s an optimal substructure
- You need to make decisions with memory
You’ll start with 1D DP, then move to 2D, and eventually state-based approaches used in system design and competitive programming.
What to practice (1D):
- Fibonacci with memoization
- Climbing Stairs
- House Robber
- Coin Change
- Maximum Subarray (Kadane’s Algorithm)
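Climbing Stairs shows both DP styles in a few lines: top-down with memoization and bottom-up with rolling values (a sketch):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def climb_top_down(n):
    """Ways to climb n steps taking 1 or 2 at a time (memoized recursion)."""
    if n <= 2:
        return n
    return climb_top_down(n - 1) + climb_top_down(n - 2)

def climb_bottom_up(n):
    """Same recurrence, built iteratively with two rolling values."""
    if n <= 2:
        return n
    prev, curr = 1, 2
    for _ in range(3, n + 1):
        prev, curr = curr, prev + curr
    return curr

print(climb_top_down(10), climb_bottom_up(10))  # 89 89
```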
What to practice (2D):
- Longest Common Subsequence (LCS)
- Edit Distance
- 0/1 Knapsack
- Subset Sum
- Matrix Chain Multiplication
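LCS is the canonical 2D table: dp[i][j] holds the answer for the first i characters of one string and the first j of the other. A minimal sketch:

```python
def lcs_length(a, b):
    """Length of the Longest Common Subsequence via a 2D DP table."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1             # extend the common subsequence
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])  # drop one character
    return dp[m][n]

print(lcs_length("abcde", "ace"))  # 3 ("ace")
```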
Project: Scheduling Optimizer
Build a task or job scheduler that picks tasks based on earliest deadlines, profits, or non-overlapping windows. Try a greedy version, then rebuild it using DP.
You’ll start feeling the trade-off between speed and thoroughness.
Not to miss:
- Greedy works fast—but only when it’s provably correct
- DP is hard because it forces you to slow down and think in layers
- Most DP problems look hard, but once you solve 10+, you’ll start seeing patterns and state diagrams repeat
By the End of Week 14, You Should:
- Distinguish between when to use greedy and when to use DP
- Have 15–20 problems solved in each category
- Be able to reduce recursive brute force into an optimized bottom-up DP
- Think in terms of state, decision, and memory, not just loops and conditions
8. Weeks 15-16: Graphs, DFS, BFS, Dijkstra, Topological Sort, Union-Find
Goal: Learn to navigate systems, networks, dependencies, and connections.
Graphs are everywhere. Maps, social networks, dependency chains, scheduling—if something connects to other things, it’s a graph. This week is about movement. How to explore, traverse, and find the best paths across nodes and edges. Here’s what you’ll learn:
Core Concepts:
- Graph representations: adjacency list, matrix
- Directed vs Undirected
- Weighted vs Unweighted
- Cycles and connected components
Traversal Techniques:
- Depth-First Search (DFS)
- Breadth-First Search (BFS)
- Topological Sort (for dependencies)
Shortest Path Algorithms:
- Dijkstra’s Algorithm
- Bellman-Ford (optional, but good to know)
Disjoint Sets (Union-Find):
- Cycle detection
- Kruskal’s algorithm (MST)
- Path compression and union by rank
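Most of these start from the same adjacency-list representation; here’s a compact BFS and DFS sketch on a tiny undirected graph (node labels are arbitrary):

```python
from collections import deque

graph = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C"],
}

def bfs(start):
    """Visit nodes layer by layer using a queue."""
    order, visited, queue = [], {start}, deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)
    return order

def dfs(node, visited=None):
    """Go as deep as possible before backtracking (recursive)."""
    if visited is None:
        visited = set()
    visited.add(node)
    order = [node]
    for neighbor in graph[node]:
        if neighbor not in visited:
            order.extend(dfs(neighbor, visited))
    return order

print(bfs("A"))  # ['A', 'B', 'C', 'D']
print(dfs("A"))  # ['A', 'B', 'D', 'C']
```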
What to Practice
- DFS/BFS traversal
- Detecting cycles
- Topological sort (Kahn’s and DFS-based)
- Number of islands / Floodfill
- Dijkstra’s shortest path
- Union-Find problems (connected components, redundant connections)
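For the Union-Find problems, here’s a small disjoint-set sketch with path compression and union by rank:

```python
class DisjointSet:
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        """Follow parents to the root, flattening the path as we go."""
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])   # path compression
        return self.parent[x]

    def union(self, a, b):
        """Merge the sets containing a and b; return False if already connected."""
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False
        if self.rank[ra] < self.rank[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra                  # attach the shorter tree under the taller
        if self.rank[ra] == self.rank[rb]:
            self.rank[ra] += 1
        return True

ds = DisjointSet(5)
ds.union(0, 1)
ds.union(1, 2)
print(ds.find(0) == ds.find(2))  # True  (same component)
print(ds.union(0, 2))            # False (would create a cycle)
```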
Use visualizations. Draw graphs. Dry-run DFS and BFS on paper.
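Dijkstra’s algorithm is the one most worth dry-running by hand; it’s essentially BFS driven by a priority queue. A short heapq-based sketch on a made-up weighted graph:

```python
import heapq

def dijkstra(graph, source):
    """Shortest distances from source in a weighted graph (non-negative weights)."""
    dist = {node: float("inf") for node in graph}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist[node]:
            continue                      # stale entry; a shorter path was already found
        for neighbor, weight in graph[node]:
            new_dist = d + weight
            if new_dist < dist[neighbor]:
                dist[neighbor] = new_dist
                heapq.heappush(heap, (new_dist, neighbor))
    return dist

roads = {
    "A": [("B", 4), ("C", 1)],
    "B": [("D", 1)],
    "C": [("B", 2), ("D", 5)],
    "D": [],
}
print(dijkstra(roads, "A"))  # {'A': 0, 'B': 3, 'C': 1, 'D': 4}
```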
Project: Build a Graph Visualizer
Use simple input (like a map or task dependencies) and draw connections. Highlight the difference between DFS paths, BFS layers, and shortest path trees.
By the End of Week 16:
- You’ll know how to model real-world problems as graphs
- You’ll be comfortable with traversal and pathfinding
- And you’ll have everything it takes to walk into any interview round—confident, structured, and pattern-aware
9. Weeks 17-21: Revision, Retention & Real-World Readiness
Goal: Tie everything together—and test it under real conditions.
You’re about to reach the finish line. Now it’s less about what you know and more about how well you can recall, apply, and perform under pressure. These weeks are about revising what matters, fixing weak spots, and practicing with intent.
That means going back to problems you couldn’t solve the first time. Re-trying them without help. Spotting what you missed. Grouping problems by pattern instead of topic. And slowly, building a system that reminds you where you’re still shaky.
Don’t just grind random problems. Pick 2–3 a day that feel slightly uncomfortable—the ones that make you pause. Time yourself. Try to solve them without looking up the solution. If you fail, that’s fine. Just make sure you come back to it in a few days.
Also, test yourself under pressure. Sign up for a coding contest. Try a hackathon. Pair up with a friend and do a mock round.
- Revision List: Track problems you couldn’t solve. Re-attempt after a week. Promote only when you solve without help.
- Pattern Log: Group problems by approach—Sliding Window, Binary Search, Backtracking—not by topic. It trains you to think in patterns.
- Cheat Sheet: Keep a one-pager of base cases, transitions, and edge cases for DP, recursion, and graphs. Just the essentials.
- Recode Stack: Re-implement 10–15 important problems from memory. No hints, no peeking. It locks in what you’ve learned.
- Timed Practice: Pick 2 problems a day. Set a timer. No distractions. Focus on accuracy, then speed. Repeat until calm becomes your default.
Common Pitfalls to Avoid
Everyone hits a few bumps while learning DSA. Here are some common ones worth catching early, so you can stay focused on what helps.
1. Switching languages mid-journey
It’s tempting to try something new when things get tough. But constantly shifting between Java, Python, and C++ usually resets your progress more than it moves you forward. The syntax isn’t the problem—the logic is. Stick to one and go deep.
2. Consuming too much, solving too little
Videos and blogs are helpful, but at some point, watching another explanation becomes a way to delay the hard part. If you understand the concept but haven’t written code for it, pause the content. Try it out yourself. Learning happens when you apply, not just absorb.
3. Avoiding dry runs and pen-paper thinking
It feels easier to debug in the editor, but tracing your logic on paper builds real intuition. Whether it’s recursion, loops, or a sliding window, walking through a few inputs manually often reveals what’s missing faster than running it ten times.
4. Relying on memorized solutions
Copy-pasting code or memorizing steps may get you through a problem, but it doesn’t prepare you for variations. If you needed hints the first time, that’s fine. Just make sure to re-solve the same problem later, from a blank screen, without help.
5. Skipping the hard ones
It’s easy to scroll past the problems that look overwhelming, but those are usually the ones that teach you the most. If a question made no sense a week ago, try it again now. These revisits are what push your thinking forward, not the ones you solved in 15 minutes.
One Last Thing
DSA doesn’t just make you interview-ready. It makes you a better engineer. It teaches you how to think clearly when things get messy—how to break down problems, recognize patterns, and build with intention.
You’ll revisit topics. You’ll forget and relearn. That’s part of the process. What matters is you’ve built the habit of solving, thinking, and improving.
If you’ve made it this far, you don’t need shortcuts. You just need consistency.