What the data analyst interview looks like

Data analyst interviews typically follow a structured process that takes 1–3 weeks from first contact to offer. The process tests both technical skills and your ability to communicate insights clearly. Here’s what each stage looks like and what it’s testing.

  • Recruiter screen
    30 minutes. Background overview, tool proficiency (SQL, Excel, Python, Tableau), and salary expectations. They’re filtering for relevant analytics experience and communication clarity.
  • SQL / technical assessment
    45–60 minutes. Live SQL coding or a take-home SQL test. Expect queries involving JOINs, GROUP BY, window functions, subqueries, and CTEs. Some companies add basic statistics questions.
  • Case study / analytics presentation
    45–60 minutes. You’re given a dataset or a business problem and asked to analyze it, draw conclusions, and present your findings. Tests your ability to go from raw data to actionable insight.
  • Behavioral / hiring manager
    30–45 minutes. Stakeholder management stories, examples of driving decisions with data, and how you handle ambiguous requests. Often the final signal before the offer.

Technical questions you should expect

These are the questions that come up most often in data analyst interviews. They cover SQL, statistics, business problem-solving, and data visualization — the core areas you’ll need to demonstrate competence in.

Write a SQL query to find the top 3 products by revenue for each category in the last 90 days.
Tests window functions and filtering — two areas where many candidates struggle.
Use a CTE with ROW_NUMBER() or RANK() partitioned by category and ordered by revenue descending. Filter the date column to the last 90 days in the WHERE clause, then filter the outer query to rank ≤ 3. Discuss the difference between ROW_NUMBER() (no ties) and RANK() (allows ties). Mention that you’d check for NULL revenues and consider whether returned/cancelled orders should be excluded. A clean, readable query with proper aliasing matters as much as getting the right answer.
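A minimal sketch of that query pattern, run against SQLite via Python’s sqlite3 module. The table and column names (order_items, revenue, order_date) are hypothetical, and a fixed “today” of 2024-06-30 is used so the 90-day filter is reproducible; in production you’d filter against CURRENT_DATE.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE order_items (category TEXT, product TEXT, revenue REAL, order_date TEXT);
INSERT INTO order_items VALUES
  ('Toys','Blocks',500,'2024-06-01'), ('Toys','Puzzle',300,'2024-06-02'),
  ('Toys','Doll',200,'2024-06-03'),   ('Toys','Kite',100,'2024-06-04'),
  ('Books','Novel',400,'2024-06-01'), ('Books','Atlas',150,'2024-06-02');
""")

# CTE ranks products within each category by total revenue; the outer
# query keeps only rank <= 3. Swap ROW_NUMBER() for RANK() if ties
# should share a rank.
query = """
WITH ranked AS (
    SELECT category, product,
           SUM(revenue) AS total_revenue,
           ROW_NUMBER() OVER (
               PARTITION BY category
               ORDER BY SUM(revenue) DESC
           ) AS rn
    FROM order_items
    WHERE order_date >= DATE('2024-06-30', '-90 days')
    GROUP BY category, product
)
SELECT category, product, total_revenue
FROM ranked
WHERE rn <= 3
ORDER BY category, rn;
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)
```

Note that 'Kite' (rank 4 within Toys) is excluded, which is exactly the behavior the interviewer is probing for.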
A stakeholder says ‘our conversion rate dropped 15% last week.’ How would you investigate?
This is a root cause analysis question — they want structured thinking, not guessing.
Start by verifying the metric: How is conversion rate defined? What’s the baseline, and is 15% statistically significant or within normal variance? Then segment: break it down by traffic source, device type, geography, and user type (new vs. returning). Check for external factors: was there a site change, a broken page, a marketing campaign that shifted traffic mix? Look at the funnel step by step — did traffic increase (denominator change) or did completions decrease (numerator change)? Present your findings with data, not opinions: “Mobile conversion dropped 25% while desktop was flat, and this coincides with the checkout page redesign deployed on Tuesday.”
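The segmentation step can be sketched in a few lines of Python. The visit/conversion counts below are invented for illustration; they’re constructed so the overall rate drops 15% while the drop is concentrated entirely in mobile (a numerator change), mirroring the example finding above.

```python
# Hypothetical (visits, conversions) by device, prior week vs. last week.
baseline  = {"mobile": (10_000, 500), "desktop": (8_000, 400)}
last_week = {"mobile": (10_000, 365), "desktop": (8_000, 400)}

def rate(counts):
    """Overall conversion rate across all segments."""
    visits = sum(v for v, _ in counts.values())
    conversions = sum(c for _, c in counts.values())
    return conversions / visits

# Per-segment comparison: this is where the story emerges.
for segment in baseline:
    b = baseline[segment][1] / baseline[segment][0]
    l = last_week[segment][1] / last_week[segment][0]
    print(f"{segment}: {b:.2%} -> {l:.2%} ({(l - b) / b:+.0%})")

overall_drop = (rate(last_week) - rate(baseline)) / rate(baseline)
print(f"overall: {overall_drop:+.1%}")  # prints "overall: -15.0%"
```

The overall number hides the mechanism; segmenting reveals that desktop is flat and mobile fell 27%, which points you toward mobile-specific causes.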
Explain the difference between INNER JOIN, LEFT JOIN, and FULL OUTER JOIN with a practical example.
Fundamental SQL question. Clear explanation with real-world context is what they’re looking for.
INNER JOIN returns only rows that have matches in both tables — use it when you only want complete records (e.g., orders with matching customers). LEFT JOIN returns all rows from the left table and matching rows from the right, with NULLs where there’s no match — use it when you want all customers even if they haven’t placed orders. FULL OUTER JOIN returns all rows from both tables — useful for reconciliation tasks where you need to find mismatches between two data sources. Mention that you always think about the cardinality: does the join create duplicates? A one-to-many join can silently inflate your row count if you’re not careful.
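A toy demonstration of the INNER vs. LEFT JOIN difference (and the one-to-many row-inflation point), again via SQLite. Table names and data are invented; FULL OUTER JOIN is omitted because older SQLite versions don’t support it.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER, name TEXT);
CREATE TABLE orders (customer_id INTEGER, amount REAL);
INSERT INTO customers VALUES (1, 'Ana'), (2, 'Ben'), (3, 'Cal');  -- Cal: no orders
INSERT INTO orders VALUES (1, 100), (1, 250), (2, 80);            -- Ana: two orders
""")

inner = conn.execute("""
    SELECT c.name, o.amount
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
""").fetchall()

left = conn.execute("""
    SELECT c.name, o.amount
    FROM customers c
    LEFT JOIN orders o ON o.customer_id = c.id
""").fetchall()

print(len(inner), inner)  # 3 rows: Cal is dropped; Ana appears twice (one-to-many)
print(len(left), left)    # 4 rows: Cal is kept, with NULL (None) for amount
```

Note the row-count difference: 3 customers joined to 3 orders yields 3 or 4 result rows depending on the join type, and Ana’s duplication is exactly the silent inflation to watch for in one-to-many joins.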
How would you design a dashboard for the executive team to track business health?
Tests your ability to think about audience, hierarchy of information, and actionability.
Start with the audience: executives want high-level KPIs, not granular detail. Identify 5–7 key metrics (revenue, customer acquisition cost, retention rate, NPS, burn rate) and display them prominently with trend lines and comparisons against targets. Use a top-down layout: summary KPIs at top, then sections that drill down by business unit or product line. Include context: show YoY and MoM comparisons, not just raw numbers. Add a “what changed” callout section for anomalies. Discuss design principles: minimize clutter, use color intentionally (red/green for status, not decoration), and ensure it loads fast. The best dashboards answer “how are we doing?” in under 5 seconds.
What is the difference between correlation and causation? Give an example where confusing the two would lead to a bad business decision.
Tests statistical literacy and business judgment — both are critical for data analysts.
Correlation means two variables move together; causation means one actually drives the other. Example: you notice that users who use the mobile app have 2x higher retention than web-only users. A naive conclusion would be “push everyone to mobile to increase retention.” But the correlation might be driven by a confounding variable: highly engaged users are more likely to download the app and more likely to retain. Forcing less-engaged users to the app won’t fix their engagement. To establish causation, you’d need an A/B test: randomly encourage a subset of new users to download the app and measure retention vs. a control group. Always ask: is there a plausible mechanism, and have we controlled for confounders?
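The confounding mechanism can be made concrete with a small simulation. The probabilities below are invented: engagement drives both app adoption and retention, while the app itself has zero causal effect, yet app users still show markedly higher retention in aggregate.

```python
import random
random.seed(0)

# Simulate users: engagement raises both P(has app) and P(retained).
users = []
for _ in range(100_000):
    engaged = random.random() < 0.4
    has_app = random.random() < (0.7 if engaged else 0.2)
    retained = random.random() < (0.6 if engaged else 0.2)
    users.append((engaged, has_app, retained))

def retention(rows):
    return sum(r for _, _, r in rows) / len(rows)

app    = [u for u in users if u[1]]
no_app = [u for u in users if not u[1]]
print(f"app users:     {retention(app):.1%}")     # much higher in aggregate...
print(f"non-app users: {retention(no_app):.1%}")

# ...but conditioning on the confounder (engagement) makes the gap vanish:
for engaged in (True, False):
    grp = [u for u in users if u[0] == engaged]
    a = [u for u in grp if u[1]]
    n = [u for u in grp if not u[1]]
    print(f"engaged={engaged}: {retention(a):.1%} vs {retention(n):.1%}")
```

Within each engagement level, app and non-app users retain at nearly identical rates, which is why a randomized A/B test (or at minimum controlling for engagement) is needed before recommending “push everyone to mobile.”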

Behavioral and situational questions

Data analysis is fundamentally about helping people make better decisions. Behavioral questions assess how you communicate findings, handle ambiguity, and navigate stakeholder relationships. Use the STAR method (Situation, Task, Action, Result) for every answer.

Tell me about a time your analysis changed a business decision.
What they’re testing: Impact-driven thinking, ability to translate data into action, stakeholder influence.
Use STAR: describe the Situation (what decision was being made and what the initial assumption was), your Task (what analysis you were asked to do or proactively initiated), the Action (your methodology, data sources, and how you presented the findings), and the Result (the decision that changed and the business impact). Quantify the result if possible: “The analysis showed we were over-investing in a channel with 3x higher CAC, and redirecting that budget increased overall ROI by 20%.”
Describe a time you had to deal with messy or incomplete data.
What they’re testing: Data quality awareness, problem-solving, transparency about limitations.
Pick an example where the data wasn’t clean and you had to decide what to do about it. Explain the issue (missing values, duplicates, inconsistent formats, unclear definitions), your approach (how you assessed the impact, what cleaning steps you took, what assumptions you made), and the outcome. The key: show that you documented your assumptions and communicated data limitations to stakeholders rather than presenting uncertain results as definitive.
Tell me about a time a stakeholder disagreed with your analysis.
What they’re testing: Stakeholder management, humility, ability to defend your work constructively.
Describe the disagreement (what they challenged and why), how you responded (did you listen first? did you re-check your work?), and the resolution. The best answers show that you were open to being wrong — maybe they had context you didn’t, or maybe your methodology had a gap. If you were right, explain how you backed it up with data, not ego. If you were wrong, explain what you learned. Either way, the relationship with the stakeholder matters as much as the analysis.
Give an example of how you prioritized competing analysis requests.
What they’re testing: Prioritization, communication, ability to manage expectations.
Pick a situation where multiple teams needed your analysis simultaneously. Explain how you evaluated the requests (business impact, urgency, level of effort), how you communicated timelines and tradeoffs to stakeholders, and what you delivered. Show that you didn’t just work on whatever came in last or whoever was loudest. The best answers demonstrate that you aligned priorities with business objectives and kept stakeholders informed.

How to prepare (a 2-week plan)

Week 1: Build your foundation

  • Days 1–2: Practice SQL daily. Focus on JOINs, GROUP BY with HAVING, window functions (ROW_NUMBER, RANK, LAG/LEAD, running totals), CTEs, and subqueries. Use platforms like DataLemur, LeetCode (database section), or StrataScratch.
  • Days 3–4: Review statistics fundamentals: mean vs. median, standard deviation, hypothesis testing (p-values, confidence intervals), A/B testing methodology, correlation vs. causation. You don’t need PhD-level stats, but you need to reason about data correctly.
  • Days 5–6: Practice case studies. Take a business question (“Why did signups drop?” or “Should we launch in this new market?”) and practice structuring your analysis out loud: define the metric, segment the data, identify hypotheses, describe what data you’d need, and what you’d recommend.
  • Day 7: Rest. Review your notes lightly but don’t cram.
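As a warm-up for the Days 1–2 drills, the running-total and LAG patterns can be practiced in a few lines against SQLite. The daily_sales table is invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE daily_sales (day TEXT, amount REAL);
INSERT INTO daily_sales VALUES
  ('2024-01-01', 100), ('2024-01-02', 150), ('2024-01-03', 90);
""")

# SUM(...) OVER (ORDER BY day) gives a cumulative total;
# LAG(...) pulls the previous row's value for day-over-day deltas.
rows = conn.execute("""
    SELECT day,
           amount,
           SUM(amount) OVER (ORDER BY day)          AS running_total,
           amount - LAG(amount) OVER (ORDER BY day) AS day_over_day
    FROM daily_sales
    ORDER BY day
""").fetchall()
for r in rows:
    print(r)
```

The first row’s day_over_day is NULL (None) because LAG has no prior row there, the kind of edge case worth calling out in a live interview.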

Week 2: Simulate and refine

  • Days 8–9: Build or refresh a portfolio piece. If you have a take-home case study, do a practice run with a public dataset. Create a clean, well-structured analysis with clear visualizations and a concise summary of findings.
  • Days 10–11: Prepare 4–5 STAR stories from your resume. Map each to common themes: driving a decision with data, handling messy data, stakeholder disagreement, prioritizing requests, learning a new tool.
  • Days 12–13: Research the specific company. Understand their product, key metrics, and business model. If they’re e-commerce, review e-commerce metrics. If they’re SaaS, review SaaS metrics (ARR, churn, LTV). Prepare 3–4 specific questions.
  • Day 14: Light review only. Do 2–3 SQL problems to stay sharp, review your STAR stories, and get a good night’s sleep.

Your resume is the foundation of your interview story. Make sure it sets up the right talking points. Our free scorer evaluates your resume specifically for data analyst roles — with actionable feedback on what to fix.

Score my resume →

What interviewers are actually evaluating

Data analyst interviews evaluate candidates on a combination of technical skill and business thinking. Understanding these dimensions helps you focus your preparation.

  • SQL proficiency: Can you write correct, efficient queries? This is table stakes. Interviewers want to see that you can work with real-world data structures, not just textbook examples. Clean, readable SQL with good naming conventions also signals professionalism.
  • Analytical thinking: When given a business problem, can you break it down into answerable questions? Do you think about segmentation, confounding variables, and edge cases? The best candidates don’t just query the data — they think critically about what the data means.
  • Communication: Can you explain your findings to someone who doesn’t know SQL? Can you distill a complex analysis into a clear recommendation? This is often the differentiator between good and great data analysts.
  • Business acumen: Do you understand why a metric matters, not just how to calculate it? Can you connect your analysis to revenue, user experience, or operational efficiency? Data analysts who understand the business context produce more valuable insights.
  • Attention to data quality: Do you notice when something looks off? Do you validate your results before presenting them? Interviewers listen for whether you naturally check for duplicates, NULLs, outliers, and definition mismatches.

Mistakes that sink data analyst candidates

  1. Writing SQL that works but is unreadable. If your query is one massive block with single-letter aliases and no formatting, you’ve lost points even if it returns the right answer. Use CTEs, meaningful aliases, and proper indentation. Interviewers are imagining what it would be like to maintain your code.
  2. Jumping to conclusions without checking assumptions. If you’re given a metric drop and immediately blame one factor without segmenting the data, you’re showing a bias toward gut feel over analysis. Always ask clarifying questions and verify the data first.
  3. Presenting data without a recommendation. Stakeholders don’t want a spreadsheet — they want to know what to do. If your case study answer ends with “here are the numbers” without a clear recommendation, you’ve missed the point of the exercise.
  4. Ignoring data quality issues. If you’re given a dataset with obvious issues (NULLs, duplicates, outliers) and don’t mention them, interviewers will question your attention to detail. Always note data quality concerns and explain how they affect your analysis.
  5. Overcomplicating your analysis. Using advanced statistical methods when a simple bar chart tells the story suggests you’re showing off rather than solving the problem. Start simple. Complexity should be justified by the question, not by your desire to impress.
  6. Not asking clarifying questions. When given a case study, candidates who dive straight in without asking “How is this metric defined?” or “What time period are we looking at?” miss a chance to show analytical rigor and risk solving the wrong problem.

How your resume sets up your interview

Your resume is the foundation for most interview conversations. In data analyst interviews, interviewers will ask you to walk through specific analyses, tools, and business impacts listed on your resume — so every bullet needs to hold up under follow-up questions.

Before the interview, review each bullet on your resume and prepare to discuss:

  • What was the business question you were answering?
  • What data sources did you use, and how did you clean/prepare the data?
  • What methodology did you apply, and why that approach?
  • What was the recommendation, and did the stakeholder act on it?

A well-tailored resume creates natural segues into your strongest stories. If your resume says “Built a customer segmentation model that increased email campaign ROI by 30%,” be ready to discuss the segmentation methodology, the metrics you tracked, and how you validated the results.

If your resume doesn’t set up these conversations well, our data analyst resume template can help you restructure it before the interview.

Day-of checklist

Before you walk in (or log on), run through this list:

  • Review the job description and note which tools (SQL, Python, Tableau, Excel) and domains they emphasize
  • Prepare 3–4 STAR stories that demonstrate impact through data-driven decisions
  • Practice 5–10 SQL problems covering JOINs, window functions, and aggregations
  • Test your audio, video, and screen sharing setup if the interview is virtual
  • Prepare 2–3 thoughtful questions about the team’s data stack and analytics culture
  • Look up your interviewers on LinkedIn to understand their backgrounds
  • Have water and a notepad nearby
  • Plan to log on or arrive 5 minutes early