
Data Analyst (Mid-Level, SMB) Hiring Guide

Responsibilities, must-have skills, 30-minute assessment, 3 interview questions, and a scoring rubric for this role.

Role Overview

A Data Analyst (mid-level, 3-5 years experience) in a small-to-mid-sized business is responsible for turning raw data into actionable insights that inform business decisions. They collect, clean, and analyze data from various sources, then translate the numbers into clear findings and recommendations for stakeholders.

In an SMB environment, the mid-level analyst often handles end-to-end data tasks - from gathering data and managing reports to presenting results - working both independently and collaboratively. The role bridges the gap between data and decision-making by not only performing technical analysis but also communicating the "story" behind the data in plain language for non-technical teams. This position is typically hybrid (remote-friendly with some on-site days), so the analyst must be effective using online collaboration tools and also comfortable with face-to-face meetings when needed. Ultimately, a mid-level Data Analyst helps the company make evidence-based decisions by providing accurate analyses, dashboards, and insights that drive strategy and improve operations.

Core Responsibilities

  • Data Collection & Cleaning: Gather data from multiple internal and external sources (databases, spreadsheets, APIs, etc.) and clean or validate it to ensure accuracy and consistency before analysis. This includes handling missing values, removing duplicates, and resolving any inconsistencies in datasets.

  • Data Analysis & Pattern Identification: Explore and analyze large datasets to identify trends, correlations, and anomalies that align with business questions. Use statistical methods and critical thinking to interpret what the data is revealing about business performance or customer behavior.
  • Database Querying (SQL): Write and execute SQL queries or scripts to extract relevant data for analysis from relational databases. Optimize queries for efficiency and accuracy, ensuring the right data is pulled to answer specific business inquiries.
  • Reporting & Visualization: Develop reports and interactive dashboards to present data in a meaningful way, using tools like Tableau or Power BI (or equivalents). Summarize complex data into charts, graphs, and tables that highlight key metrics and insights for decision-makers.
  • Insights Presentation: Present and communicate key findings to stakeholders and leadership, translating technical results into clear business insights. This involves preparing summaries or slide presentations and emphasizing the implications of the data in terms that various business teams (e.g. marketing, finance, operations) can understand and act on.
  • Cross-Functional Collaboration: Work with cross-functional teams - for example, collaborating with business managers to clarify their data needs and with IT or data engineers to improve data pipelines and sources. Help define and refine key performance indicators (KPIs) and ensure everyone uses consistent metrics.
  • Continuous Improvement of Data Systems: Maintain and enhance the data analysis infrastructure, such as updating reporting systems or data models for new business requirements. Stay updated on industry trends and new analytics tools, suggesting improvements to current processes to increase efficiency or data quality over time.

Must-Have Skills

Soft Skills

  • Communication: Excellent written and verbal communication skills are a must. The analyst needs to explain complex data findings in simple terms, whether writing a summary email or presenting to a group. They should adjust their language to the audience, ensuring that even non-technical stakeholders grasp the insight and significance of the data.
  • Problem-Solving & Critical Thinking: A strong problem-solving mindset to approach data questions methodically. The analyst should be able to break down vague business problems into analytical steps, interrogate the data critically, and validate whether findings truly explain the issue. Critical thinking also involves being skeptical of initial results and double-checking for biases or errors.
  • Attention to Detail: A high level of attention to detail to avoid mistakes in analysis and reporting. This means carefully checking calculations, ensuring data integrity, and catching inconsistencies. Small errors can lead to faulty decisions, so a successful analyst double-checks their work (and has quality control steps) before delivering results.

  • Collaboration & Teamwork: Ability to work collaboratively with others, including technical teams and non-technical colleagues. In practice, this means being open to feedback, sharing knowledge with teammates, and being able to gather requirements or clarify needs through active listening. A mid-level analyst in an SMB might often serve as the liaison between data and various departments, so being approachable and cooperative is key.
  • Time Management & Organization: Strong organizational skills to handle multiple projects or data requests and meet deadlines. The analyst should be capable of prioritizing tasks based on business urgency and managing their time in a semi-autonomous hybrid work setting. This includes keeping track of regular reporting schedules while also tackling ad-hoc analysis requests.
  • Adaptability: Flexibility and adaptability in a fast-changing environment. SMBs often evolve quickly, and data needs can shift as the business grows or priorities change. The analyst should be comfortable adjusting their focus, learning new tools or techniques as needed, and handling some ambiguity. (For example, adapting from one software to another if the company adopts new technology.)
  • Presentation Skills: (Related to communication) - ability to present data insights confidently in meetings or via video calls. This includes using storytelling techniques to make the data memorable and using visuals effectively. While not every analyst is a formal presenter, mid-level roles often involve briefing managers or teams on what the numbers mean.

Hiring for Attitude

  • Curiosity and Inquisitiveness: A natural curiosity about data and the business is one of the hallmark traits of a great analyst. The ideal candidate enjoys digging into the "why" behind the numbers - treating anomalies or patterns as puzzles to solve, rather than just tasks. This investigative mindset drives them to explore data deeply and ask insightful questions that lead to meaningful discoveries.
  • Continuous Learning: A desire and commitment to continuously learn and improve is critical. Data tools and techniques evolve quickly; a strong candidate stays up-to-date with emerging technologies or analysis methods and is eager to broaden their skill set. They seek feedback on their work and view each project as an opportunity to learn something new.
  • Adaptability & Openness to Change: Comfort with change and the ability to adapt to new tools, requirements, or business priorities. In a growing SMB, processes and data systems can mature over time - the analyst should embrace new solutions (for example, adopting a new BI tool or adjusting to a new data source) rather than sticking rigidly to old ways. They remain curious and flexible in the pursuit of better answers.
  • Integrity and Ethical Approach: Strong sense of ethics in handling data - the candidate must value honesty and accuracy over telling people what they might want to hear. For instance, if data reveals an uncomfortable truth (e.g., a project isn't performing well), a good analyst reports it candidly rather than manipulating or cherry-picking the data. Since analysts may work with sensitive information, a commitment to data privacy and ethical use of data is non-negotiable.
  • Accountability & Ownership: Takes ownership of their work from start to finish. The analyst should demonstrate accountability by following through on tasks, double-checking their results, and addressing any mistakes proactively. This trait means they don't pass blame for data issues - instead, they strive to fix problems and learn from them. They reliably deliver what they promised and communicate early if a deadline is in jeopardy.
  • Business-Mindedness: An orientation toward business outcomes - i.e., always considering how the analysis connects to real business questions and goals. The candidate should show interest in understanding the "bigger picture" of the organization's strategy, ensuring their analytical work is aligned with what actually drives value. This might be demonstrated by asking clarifying questions about what a stakeholder will do with the data or by prioritizing projects that have the highest impact. An analyst who only focuses on numbers without context might miss the mark, so a keen business sense is an attitude to hire for.
  • Collaborative Attitude: (Complementing teamwork skill) - a willingness to help others and a positive approach to teamwork. For example, an analyst with a good attitude will happily assist a non-technical colleague in understanding a report, or will pair with IT to figure out a data pipeline issue, even if it's "not my job." They see sharing knowledge and working together as beneficial. They are also open to constructive feedback, seeing it as a means to improve rather than a personal attack.

Tools & Systems

Systems / Artifacts

Data & Analysis Tools: This role uses a typical SMB analytics tech stack. On the data side, that includes relational databases (such as MySQL, PostgreSQL, or SQL Server) for storing company data - the analyst should be comfortable querying these via SQL. For day-to-day analysis and data manipulation, spreadsheets like Microsoft Excel or Google Sheets are indispensable (e.g. for quick data cleanup, pivot tables, and ad-hoc calculations). In terms of analytics programming, the team might use Python (with libraries like pandas) or R for more complex analyses or automation, though these are often supplemental to core tools in many SMBs.

Business Intelligence & Visualization: The company will likely use a BI tool to create dashboards and visual reports. Common choices are Tableau or Power BI (in some cases Google Data Studio or Looker Studio for Google-centric shops). This dossier assumes Tableau as the default dashboard tool (a standard choice in many mid-sized businesses), but any similar platform could apply. The Data Analyst should be skilled in designing clear and effective dashboards, using features like filters, drill-downs, and calculated fields. They also might use visualization libraries (e.g., matplotlib or ggplot if coding) for custom charts when needed.

Collaboration & Communication Systems: Given the hybrid work setup, the analyst will rely on collaboration tools such as Slack or Microsoft Teams for daily communication and coordination with colleagues. Video conferencing (Zoom/Teams) is used for meetings, especially when presenting findings remotely. Project or task management software (Trello, Asana, Jira, or even Excel trackers) may be used to organize analytics projects and data requests. Documentation and knowledge sharing might occur in tools like Confluence or Google Docs, where the analyst maintains data dictionaries or report guides for others.

What to Assess

Situational Judgment Scenarios

To evaluate how candidates apply judgment in realistic situations, consider the following situational scenarios relevant to a Data Analyst. These scenarios are designed as Situational Judgment Tests (SJT) where the candidate must choose or describe the best course of action:

  • Scenario 1 - Conflicting Deadline vs. Data Quality: A department head requests an urgent analysis to be delivered by end of day, but the dataset available is incomplete or potentially inaccurate. This scenario tests whether the analyst will balance speed with quality - do they communicate the data limitations and negotiate the deadline, or deliver questionable results just to meet the time constraint?
  • Scenario 2 - Mistake Discovered in a Report: The analyst has sent out a weekly report to stakeholders, and later discovers that one of the metrics in the report was calculated incorrectly (a mistake in the data or formula). This scenario assesses the candidate's accountability and communication: how do they handle owning up to the error and correcting it? Do they inform stakeholders proactively and fix it with minimal disruption?
  • Scenario 3 - Ethical Data Dilemma: A manager suggests excluding or adjusting certain data points in an analysis because the true results look unfavorable (for example, removing a low customer satisfaction survey to lift the average). This scenario examines integrity and professional ethics. The candidate should decide how to respond - whether to comply, push back with an explanation of why that's misleading, or escalate the concern - all while maintaining professionalism.

Each scenario is meant to gauge how the candidate would navigate common challenges a data analyst might face, including prioritization, error management, and ethical decision-making. The best responses typically involve transparency, effective communication, and a balance of analytical thinking with understanding of business needs. These SJT scenarios will be presented as questions in the assessment to see if the candidate's judgments align with the company's values and best practices.

Assessment Tasks

Attention to Detail Tasks

Accuracy is critical in data analysis, so the assessment includes tasks to examine the candidate's attention to detail. Examples of such tasks:

  • Data Consistency Check: Present a small dataset or report snippet with deliberate inconsistencies or errors and ask the candidate to identify them. For instance, a table might show quarterly totals that do not actually add up from the individual monthly values, or an average that's been miscalculated. The candidate is expected to spot these errors. (e.g., "In the sales summary below, identify any incorrect figures or discrepancies." - where perhaps a total is wrong or a category is duplicated.)

  • Verification Task: Provide a summary statistic (like a mean or percentage) along with the raw numbers, and ask if the summary is correct. For example, show five data points and a stated average that is slightly off. A detail-oriented analyst will quickly recalc and catch the mistake. This tests whether the candidate double-checks results and notices small errors that could have big implications.

In these tasks, correctness and thoroughness are evaluated. An ideal candidate will methodically verify each figure, demonstrate their approach (e.g., re-adding numbers or cross-checking references), and clearly point out the discrepancies. These exercises ensure the person won't let subtle errors slip through in real business reports.
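The verification tasks above can be sketched as a tiny script. The figures, the stated (wrong) summary values, and the message wording here are all invented for illustration - they are not taken from the actual assessment materials:

```python
# Hypothetical sketch of the accuracy/verification task: recompute a stated
# total and average from raw figures and flag any discrepancies.
monthly_sales = {"Jan": 120, "Feb": 135, "Mar": 140}  # raw figures (invented)
stated_total = 400        # the total shown in the report snippet (wrong on purpose)
stated_average = 135      # the stated mean of the three months (also wrong)

actual_total = sum(monthly_sales.values())
actual_average = actual_total / len(monthly_sales)

discrepancies = []
if actual_total != stated_total:
    discrepancies.append(f"Total is {actual_total}, report says {stated_total}")
if round(actual_average, 2) != stated_average:
    discrepancies.append(f"Average is {actual_average:.2f}, report says {stated_average}")

for issue in discrepancies:
    print(issue)
```

A detail-oriented candidate does exactly this by hand: re-add the figures, recompute the mean, and state precisely which reported numbers disagree with the raw data.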


Communication Tasks

Effective communication is a core part of the Data Analyst role, so the assessment includes tasks to gauge how well candidates convey technical information to non-technical audiences:

  • Written Summary (Email Scenario): The candidate might be asked to draft a brief email or memo explaining a data finding to a stakeholder in plain language. For example: "Compose a short email to the Sales Director summarizing that last month's sales dropped by 10%, and your analysis suggests it was due to seasonal factors. Include a simple chart and a recommendation for next steps." This task checks if the candidate can tell a clear story from data: highlighting the key point (the 10% drop), explaining the likely cause in non-technical terms, and suggesting an action or reassurance. The tone should be professional and concise, avoiding jargon.
  • Presentation Outline: The assessment might also ask how the candidate would present complex data in a meeting. For instance, "Outline how you would present the findings of an analysis on customer churn to a team of managers." The candidate would need to describe how they'd organize the information, what visuals they might use, and how they'd make it understandable. This isn't an actual live presentation but tests whether they know how to focus on the "so what" of data when talking to others.

Key things being evaluated in communication tasks are clarity, correctness, and audience-appropriateness. The best responses will structure information logically (e.g. lead with the conclusion, then support with data), use simple language or analogies, and emphasize what the data means for the business. This ensures the analyst can bridge the gap between data and action in the real world.


Technical & Process Tasks

In this category, the assessment looks at the candidate's practical technical skills and their approach to solving data problems or designing analytical processes. It can include:

  • SQL Query Challenge: The candidate could be given a simple database schema (e.g., tables for Orders, Customers, etc.) and a business question, then asked to write an SQL query to retrieve the answer. For example: "Write a SQL query to find the total sales per region for the last quarter." This tests knowledge of SELECT, SUM, GROUP BY, JOINs if needed, etc. The expected answer is a correct query or close-to-correct syntax that would produce the intended result.

  • Data Analysis Process Scenario: Present a scenario requiring a plan or steps to approach it. For instance: "Your manager asks you to create a new Company KPI Dashboard from scratch. How would you go about this project?" The candidate should outline the process: from gathering requirements (identifying which KPIs and who will use the dashboard), to collecting the necessary data, choosing the right tool (e.g. deciding between Tableau or Excel based on complexity), designing the dashboard layout, iterating with stakeholder feedback, and finally deploying it. This measures the ability to structure and manage an analytics project.
  • Problem-Solving/Critical Thinking Task: Another example could be a brief case where a key metric changed unexpectedly (say, web traffic dropped 25% in a month) and the candidate must list what steps or analyses they would do to find out why. A strong answer would mention checking different data sources (analytics, campaigns), segmenting the data (by channel or demographic), looking for external factors, etc., showing a systematic investigative approach.

These tasks ensure the candidate not only has book knowledge but can apply it. Technical correctness (like writing a syntactically correct SQL query or accurately describing a method) is scored, as well as the quality of approach (do they hit all the important steps? do they consider edge cases or business context?). A mid-level analyst is expected to demonstrate both skill proficiency and good process thinking - meaning they know how to tackle a real-world data project from start to finish, not just answer theoretical questions.
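To illustrate what a reasonable answer to the SQL query challenge might look like, here is a self-contained sketch using SQLite. The table names (customers, orders), columns, dates, and amounts are assumptions invented for this example - the actual assessment schema may differ:

```python
# Hypothetical schema and data for the "total sales per region, last quarter"
# question, run in an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER,
                     amount REAL, order_date TEXT);
INSERT INTO customers VALUES (1, 'North'), (2, 'South');
INSERT INTO orders VALUES
  (1, 1, 100.0, '2024-10-05'),
  (2, 1, 250.0, '2024-11-20'),
  (3, 2, 300.0, '2024-12-01'),
  (4, 2,  50.0, '2024-06-15');  -- outside the quarter, should be excluded
""")

# One reasonable answer: join the tables, filter to the quarter, group by region.
query = """
SELECT c.region, SUM(o.amount) AS total_sales
FROM orders o
JOIN customers c ON c.id = o.customer_id
WHERE o.order_date BETWEEN '2024-10-01' AND '2024-12-31'
GROUP BY c.region
ORDER BY c.region;
"""
print(conn.execute(query).fetchall())  # [('North', 350.0), ('South', 300.0)]
```

A scorer can accept close variants (e.g., a date filter expressed with `>=`/`<`, or an extra column) as long as the JOIN, filter, and GROUP BY produce the intended per-region totals.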

Recommended Interview Questions

  1. Describe a significant data analysis project you have worked on. What was the business problem, how did you approach the analysis, and what impact did your findings have?

  2. Describe a time you had to explain a complex data insight or report to someone who isn't familiar with data (for example, a senior manager or a client). How did you approach it, and what was the result?

  3. Give an example of a challenging interaction with a stakeholder or team member in the context of a data project - perhaps a situation where they disagreed with your analysis or wanted something unrealistic. How did you handle it?

Scoring Guidance

To ensure fair and structured hiring, use the following scoring guidelines for both the assessment and the interview:

  • Assessment Scoring: Each section of the assessment is scored objectively using the answer keys provided. The total possible score in this assessment is 30 points (this can be scaled or adjusted as needed, but for example, 3 points from SJT + 5 from accuracy + 10 from communication + 10 from technical = 28, which could be normalized to 30 or 100). It's important that the scorer adheres strictly to the key:
  • Situational Judgment: Award 1 point for each scenario question where the candidate chose the "best" answer (as indicated in the key). No partial credit - answers are either correct or not, based on the provided ideal decision .
  • Accuracy Task: Use a point system for each identified error. For instance, if one major error was embedded and the candidate catches it, full points; if not, zero. If there are multiple issues, assign points per issue. The grading notes specify what's expected, so the scoring should be deterministic (e.g., "North total incorrect" = 5 points, each additional irrelevant issue mentioned = -1).
  • Communication Task: Utilize a simple rubric dividing the 10 points into components: did the candidate include the key fact (3 pts), provide a logical explanation (3 pts), use clear language/tone (2 pts), and maintain professionalism/format (2 pts). Two scorers can independently rate the written answer using this rubric and compare to ensure consistency. Minor grammar issues should not heavily penalize unless they impede understanding. The main focus is on content and clarity.
  • Technical/Process Task: As outlined, assign points for each key step or element present in the answer. This is essentially a checklist - the answer key breaks the ideal solution into parts (requirements, data, build, validation, etc.). The scorer checks which parts were mentioned and tallies points accordingly. This approach yields a deterministic score (for example, candidate mentioned 4 of 5 main steps correctly = 8/10 points). If the answer is disorganized but contains the points, they still get credit; we're not judging writing style here, but completeness and correctness of process.

Before the assessment, prepare a scoring sheet listing each expected point so the evaluator can tick off what the candidate did right. This will make the scoring faster and audit-proof - anyone reviewing can see exactly why a candidate got, say, 7/10 on a section (which points were missing).
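The checklist tally described above ("4 of 5 main steps = 8/10") can be made completely mechanical. In this sketch the step names and the 2-points-per-step weighting are examples, not the official answer key:

```python
# Hypothetical scoring sheet for the technical/process task: each expected
# step is worth a fixed number of points; the evaluator ticks off what the
# candidate mentioned and the script tallies the result.
expected_steps = ["requirements", "data collection", "tool choice",
                  "dashboard build", "stakeholder validation"]
points_per_step = 2  # 5 steps x 2 points = 10 points total

# Steps the evaluator ticked off for one candidate (example input).
steps_mentioned = {"requirements", "data collection",
                   "dashboard build", "stakeholder validation"}

score = sum(points_per_step for step in expected_steps if step in steps_mentioned)
max_score = len(expected_steps) * points_per_step
print(f"{score}/{max_score}")  # 8/10
```

Because the tally is just set membership against a fixed checklist, two evaluators working from the same ticks always produce the same score, which is what makes the result audit-proof.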

  • Interview Scoring: Use a structured interview scorecard with predetermined criteria for each question. Each interview question can be rated on a scale (for example, 1 to 5 or 1 to 3) based on how well the candidate"s answer demonstrated the target skill or trait:
  • Define anchors for scores. For instance, for the communication question, a "5 - Excellent" answer means the candidate gave a clear, relatable example explaining technical information to a non-tech person with great success; a "3 - Satisfactory" might mean they gave an example but lacked some detail or clarity; a "1 - Poor" would be an unclear or irrelevant answer. Do this for each question's competency (e.g., Q1: depth and impact of the project; Q2: clarity of explanation; Q3: conflict resolution skill).
  • Have at least two interviewers independently score the candidate on each question directly after the interview, then compare and discuss to reach a consensus. This reduces individual bias. Each question's score can be weighted equally or certain critical questions (for example, ethics or communication) can be weighted slightly more - but any weighting should be decided in advance.
  • The interview scorecard should also allow space for notes justifying each score. For audit purposes, note key remarks from the candidate that led to the score (e.g., "Q3: candidate described blaming a teammate for an error - gave score 2 (below average) for accountability"). This documentation ensures transparency in how evaluations are made.
  • Overall Decision Combining Assessment and Interview: Determine if the assessment is a pass/fail gate or if it contributes to an overall score. For instance, you might require a minimum assessment score (e.g., 70% of points) to move on to interview. Or if both are done for all candidates, you can combine scores. One approach: Convert both assessment and interview to a common scale (say each 50 points, total 100). A candidate's total score = assessment score + interview score. You might set a cutoff (e.g., 80/100) or take the top candidate by score. Alternatively, use the assessment to eliminate those who clearly lack fundamentals, then use interview performance for final selection among those who passed.
  • Red Flags and Qualitative Overrides: Incorporate a check for any red flags. If a candidate scored well numerically but exhibited a disqualifying red flag (for example, they chose unethical options in the SJT or said something very concerning in interview), the hiring team should have a policy for how to handle that. Typically, a serious red flag means the candidate is removed from consideration regardless of score. Ensure that this is noted (with reasons) in the evaluation record. For example: "Candidate scored high, but demonstrated disregard for data integrity in scenario question 3 - disqualified." This keeps the process fair and justified.
  • Calibration: Before finalizing any hiring using this kit, consider running a calibration session. That means having a team member (or a trial candidate) take the assessment and going through an interview dry-run, then having multiple evaluators score it. Compare results and interpretations of the rubric. Adjust the scoring guides if needed to tighten consistency. This upfront work helps make sure that when real candidates come through, the scoring is as objective and uniform as possible. Overall, maintain a paper or digital trail of all scores, notes, and decisions. This not only helps in justifying choices to any internal or external audit, but also improves the hiring team's ability to reflect and improve the process for next time.
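One possible implementation of the score-combination approach described above (each component scaled to 50 points, summed to a 0-100 total). The example inputs and the 80-point cutoff come from the text's own illustrations; they are not a fixed policy:

```python
# Minimal sketch: scale assessment and interview scores to a common
# 100-point scale and apply an example cutoff.
def combined_score(assessment_pts, assessment_max, interview_pts, interview_max):
    """Scale each component to 50 points and sum to a 0-100 total."""
    return (50 * assessment_pts / assessment_max
            + 50 * interview_pts / interview_max)

# Example: 24/30 on the assessment, 12/15 across the interview questions.
total = combined_score(assessment_pts=24, assessment_max=30,
                       interview_pts=12, interview_max=15)
passes_cutoff = total >= 80  # example cutoff from the text
print(total, passes_cutoff)  # 80.0 True
```

If the assessment is instead used as a pass/fail gate, the same helper can still normalize interview scores so candidates interviewed by different panels are compared on one scale.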

Red Flags

Disqualifiers

During the hiring process (both assessment and interview), watch out for the following red flags that could indicate a candidate is not the right fit for this Data Analyst role:

  • Poor Communication Skills: If the candidate cannot clearly articulate their thoughts or struggles to explain technical concepts in simple terms, that's a warning sign. A data analyst who "knows their stuff" but can't communicate it will have a hard time driving any action from their insights. For example, rambling, using excessive jargon, or failing to answer the question directly in the interview or written tasks could indicate weak communication.
  • Lack of Attention to Detail: Sloppy work or frequent mistakes in the assessment are a major red flag. An example is if the candidate overlooks obvious errors in the Accuracy task or submits an analysis with calculation mistakes. Data analysts must be detail-oriented - rushing through and delivering false results can have "dire consequences." If the candidate doesn't catch an inconsistency that was intentionally placed in the test, it calls into question the quality of work they'd produce on the job.
  • Ethical Concerns or Dishonesty: Any hint that a candidate is willing to misrepresent data or hide information to make results look better is disqualifying. In the SJT ethical scenario, for instance, if a candidate chooses to delete unfavorable data without valid justification or says they'd simply do whatever the boss asks even if it's misleading, that's a serious red flag. You need someone who will uphold integrity, even under pressure.
  • Outdated or Rigid Skillset: A candidate who shows no awareness of modern tools or refuses to learn new technologies may not thrive in a growing SMB. The data field evolves quickly; be cautious if the candidate only knows one tool and dismisses others, or has not updated their skills in years. For example, if their experience is only with a very old software and they haven't tried to learn a more current tool (like still using Excel 2003 techniques and unaware of visualization tools), they might struggle to adapt.

  • No Business Insight: If the candidate seems "nose-deep in data with blinders on" and fails to understand or care about the business context, that's problematic. Signs of this could be answers that are overly technical without addressing what the numbers mean for the company, or an inability to connect analysis work to business outcomes. A good analyst should demonstrate at least some curiosity about how their work impacts the organization. Lacking this perspective could mean missed opportunities to provide value.
  • Inability to Collaborate or Accept Feedback: While harder to gauge on a test, interview behavior can reveal this. Red flags include speaking negatively of past team experiences, showing a know-it-all attitude, or dismissing alternate approaches. A data analyst who can't work with others or learn from feedback will struggle, especially in a small team. If the candidate, for example, becomes defensive when gently challenged on a technical choice during discussion, it may indicate they're not open to collaboration.

Any one of these red flags should give pause. It's important to consider the whole picture (a candidate might be nervous and communicate poorly in one answer but better in others). However, multiple red flags or a severe issue (like an ethics lapse) are usually disqualifying. Document any red flag observations during scoring so they can be factored into the final hiring decision.

Assessment Blueprint (30 minutes, 5 sections)

The following is a 30-minute assessment divided into 5 sections, designed to test the candidate's skills and judgment in a structured, job-relevant way. For each question or task, an answer key or grading notes are provided to ensure deterministic scoring.

Section 1: Role-Specific Scenarios (Situational Judgment) - This section presents 3 situational judgment questions. Each scenario is described with multiple-choice actions (A, B, C, D). The candidate must choose the best response for an analyst. Scoring: 1 point for the correct/best choice per question (3 points total). No partial credit.

Q1: A marketing manager asks you for a detailed sales analysis to inform an urgent meeting at the end of the day. You realize that the data for the current month is incomplete and may lead to an inaccurate analysis. What do you do?

A. Proceed with the analysis using the data available and deliver the report by end of day without mentioning the data issues - speed is the priority.

B. Refuse the request, explaining that you cannot produce the analysis in time with the given data.

C. Explain to the manager that the data is incomplete and propose a solution: provide a preliminary analysis by end of day with clear caveats, and a more thorough report once full data is available.

D. Do the analysis quickly with the available data and include all data points except the ones you suspect are inaccurate, hoping it will be "good enough."

Answer Key: Correct answer is C. The best approach is to set expectations about data quality and suggest an interim solution. Option C demonstrates professionalism - communicating the limitation, maintaining transparency, and still trying to help by providing something useful under the deadline. Option A (ignoring data problems) could lead to a wrong decision; Option B is overly rigid and not helpful; Option D involves cherry-picking data (borderline unethical and risky). Award 1 point for selecting C; 0 points for any other choice.

Q2: Last week, you sent out a weekly performance report to the team. Today you discover that you made a mistake in one of the charts - the conversion rate was calculated incorrectly. What is your course of action?

A. Quietly correct the error in your copy of the report but do nothing else, hoping no one notices in the published report.

B. Immediately inform the team (via email or message) that there was an error, provide the corrected number/chart, and briefly explain the impact (while apologizing for the oversight and assuring them it's fixed).

C. Wait to see if someone asks about the number, and only explain if needed, to avoid drawing attention to the mistake.

D. Blame the mistake on a data issue or a colleague's input, so it doesn't reflect on your abilities.

Answer Key: Correct answer is B. The candidate should take accountability and correct the information proactively. Option B earns full credit because it shows honesty and responsibility - notifying stakeholders with the accurate data and a quick apology/clarification. Options A and C try to hide or minimize the issue, which is not acceptable as it erodes trust. Option D (blame-shifting) shows poor ownership and teamwork. Score 1 point for B; 0 for others. In evaluating an open explanation (if this were open-ended), look for key elements like "inform stakeholders," "corrected report," and ownership of the mistake. Anything short of that is not full credit.

Q3: You are preparing an analysis on customer survey feedback. The initial results show a decline in customer satisfaction. Your manager suggests dropping the lowest 5% of survey responses (the very unhappy customers) from the data to see if the scores improve, implying this might make the presentation to executives look better. How do you respond?

A. Agree and remove those low scores from the analysis to improve the overall metric - the manager is your boss, and you want to keep them happy.

B. Explain that excluding data without a valid methodological reason is misleading. You propose keeping all responses, but perhaps segmenting the data or investigating why those customers are unhappy. You emphasize that accurately representing customer sentiment (good or bad) is important for making genuine improvements.

C. Remove the low scores as asked, but keep a secret backup of the full data just in case it's needed, without telling the manager.

D. Say nothing to the manager but escalate the issue to the ethics hotline or a higher-up immediately, since you were asked to do something questionable.

Answer Key: The best answer is B. This response shows integrity and sound reasoning - the candidate stands up for ethical analysis by not misrepresenting the data, and offers an alternative approach (like deeper analysis of the unhappy customers). This demonstrates they prioritize truthful insights over making the numbers look "good." Option A is a red flag (willingness to fudge data). Option C, while preserving data privately, is still going along with a dishonest approach (no points). Option D shows concern for ethics but is an overreaction in most contexts; it's better to first have an open conversation with the manager (escalation might be step 2 if the manager insists on something unethical). Award 1 point for choosing B. In an explanation, credit answers that mention honesty, the importance of full data, and how to handle the situation professionally.

Section 2: Accuracy / Attention to Detail - This section has 1 task. The candidate is given a small data snippet and must identify errors. Scoring: 0-5 points, based on errors correctly identified.

Q4: Data Quality Check. Examine the following mini-report for accuracy and consistency, and list any errors you find:

| Region | Q1 Sales | Q2 Sales | Q3 Sales | Q4 Sales | Total Sales |
|--------|----------|----------|----------|----------|-------------|
| North  | 10       | 15       | 20       | 25       | 80          |
| South  | 8        | 12       | 18       | 22       | 60          |

The report claims that "North region total sales for Q1-Q4 = 80" and "South region total sales = 60". Check the math and consistency.

Answer Key: The North region Total Sales is incorrect. If we add Q1+Q2+Q3+Q4 for North: 10 + 15 + 20 + 25 = 70, not 80. This is a clear error. The South region's total (8+12+18+22) correctly equals 60. So the only discrepancy in the table is the North total, which appears overstated by 10 units. A full-credit answer (5/5 points) will explicitly identify that the North total should be 70 (and that 80 is wrong).

Grading notes: Deduct points if the candidate misses the error or reports incorrect issues. For example, if a candidate said "North total seems off" but didn't do the math, partial credit (e.g., 3/5) might be given for at least flagging the inconsistency. If they identify the exact mistake (North total mis-sum) and provide the correct total, that earns full points. Any additional errors the candidate writes that aren't actually present (false positives) should knock off a point, as that indicates a lack of accuracy in their checking. Overall, we expect an attentive analyst to catch the North total error quickly and nothing else, since South's data is fine.
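For graders, the arithmetic behind this check can be confirmed with a quick script. This is a minimal sketch in Python with the table's figures hard-coded; in practice the analyst would run the same comparison against the live data source:

```python
# Verify each region's reported total against the sum of its quarterly figures.
rows = [
    # (region, Q1, Q2, Q3, Q4, reported_total)
    ("North", 10, 15, 20, 25, 80),
    ("South", 8, 12, 18, 22, 60),
]

for region, q1, q2, q3, q4, reported in rows:
    actual = q1 + q2 + q3 + q4
    status = "OK" if actual == reported else f"MISMATCH (off by {reported - actual})"
    print(f"{region}: reported {reported}, actual {actual} -> {status}")
# North: reported 80, actual 70 -> MISMATCH (off by 10)
# South: reported 60, actual 60 -> OK
```

The same idea scales to a full dataset: recompute every derived figure from its inputs and flag any row where the two disagree.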

Section 3: Communication - This section has 1 task. It evaluates written communication. Scoring: 0-10 points, using a rubric (clarity, completeness, tone).

Q5: Communication Task - Email Summary. You have analyzed the monthly revenue and discovered that revenue in the last month dropped by 10% compared to the previous month, primarily due to a seasonal slowdown. Write a short email (approximately one paragraph) to the Sales Director explaining this finding. In your email, include: the key statistic (the 10% drop), a likely reason (seasonality), and a concise, reassuring or action-oriented closing (for example, a suggestion on next steps or that you will keep an eye on the trend). Assume the Sales Director is not very technical - keep the language clear and avoid technical jargon.

Answer Key: Grading will be based on the quality of the explanation:

A model answer would be:

  • Subject: Monthly Revenue Update - 10% Drop in December

    Hi [Sales Director Name],

    I wanted to update you on December's revenue. We saw about a 10% decline in revenue compared to November. After looking into it, the data suggests this dip was largely due to seasonal factors - December is historically slower post-holiday, and we had fewer billing days. Importantly, our year-to-date revenue is still on track. I recommend we monitor January's numbers closely to ensure this was just seasonal. I'm happy to discuss further or dive deeper into any product-specific trends if you need.

    Best regards, [Candidate]

We are looking for the following elements in the candidate's response:

- Accuracy: They mention the 10% drop clearly (and not, say, the wrong figure).
- Explanation/Reason: They cite a plausible reason like seasonality (given in the prompt) or another sensible cause such as "fewer sales days" or "end-of-year holidays" - something that shows they thought about why revenue fell, in terms the Sales Director cares about.
- Clarity and Tone: The writing should be clear and professional, understandable to a non-analyst. No heavy jargon or overly technical detail. The tone should be solution-oriented or at least not alarming - ideally a bit reassuring or focused on next steps (like monitoring or offering further help).
- Brevity: The email should be reasonably short (a few sentences, as requested).

Scoring rubric (10 points): Start at 10 and deduct for any missing element. For example, if the candidate fails to include any reason for the drop, subtract 3 points. If the wording is too technical or confusing, subtract 2 points. If the email is extremely short or lacks a polite tone (e.g., just "Revenue fell 10%. It's seasonal. - Analyst"), subtract points for tone/format. An excellent answer that hits all points gets 9-10. A satisfactory answer that conveys the drop and maybe a light reason but isn't very polished might get around 7-8. Anything that misses the key figure or is unclear would be below 5 (since that would not accomplish the goal of informing the director).

Section 4: Technical/Process Knowledge - This section has 1 multi-part question. It assesses the candidate's approach to a project and understanding of technical steps. Scoring: 0-10 points, broken into parts for coverage of key steps and considerations.

Q6: Process Planning Scenario. "Designing a New KPI Dashboard" - Imagine our company wants a new dashboard to track key performance indicators (KPIs) across departments. You, as the data analyst, are tasked with leading this project. Briefly outline the steps you would take to go from initial request to a delivered dashboard. In your answer, list the main phases or actions you would perform, and mention any tools or collaboration needed at each step. (For example, how will you gather requirements, what tool might you use to build it, how will you ensure it's accurate and useful, etc.)

Answer Key: We expect a logical sequence of steps covering Requirements → Data Preparation → Building → Validation → Deployment/Feedback. An outline of an ideal answer:

1. Gather Requirements: Meet with stakeholders (the department heads or the requestor) to understand which KPIs are needed on the dashboard, the frequency of updates, and who the audience is. Determine the key metrics (e.g., sales, costs, customer metrics) and the goals for each. Also clarify what data sources are available for these KPIs.

2. Assess Data & Tools: Identify where the data for each KPI comes from (databases, spreadsheets, etc.). Ensure you have access and that the data is reliable. Choose the dashboard tool - likely Tableau or Power BI since those are common (for this scenario, assume we use Tableau, which we're familiar with). Plan the data model: will you need to create any aggregated tables or run SQL queries to feed the dashboard? Decide on the update process (e.g., will it be an automated refresh from a database or manual data pulls?).

3. Build the Dashboard Iteratively: Start designing the dashboard layout. Create initial charts/visualizations for each KPI - for example, a line chart for the monthly sales trend, or a gauge or indicator for % to target, based on what was requested. Pay attention to clarity (good labels, color coding). Use the chosen tool to build these visuals, and bring them together on a dashboard page.

4. Validate Accuracy: Before releasing, double-check all numbers. Compare the dashboard figures against known reports or do spot checks with raw data (this ensures the calculations/formulas in the dashboard are correct). If possible, have a colleague or the requestor review for correctness. This QA step is essential so that the final product is trusted.

5. Feedback & Iterate: Present the draft dashboard to a few stakeholders (maybe in a quick meeting or by sending a test link). Gather feedback - are the metrics clear? Anything missing or not useful? Also ensure it's user-friendly (maybe they want a filter to select a date range, etc.). Incorporate the feedback and refine the dashboard.

6. Deployment & Training: Publish the dashboard in the appropriate system (e.g., on Tableau Server or via a shared Power BI link) and make sure the target users have access. Provide a short guide or walkthrough for users, especially if they are not used to the tool - explain how to read the figures, how often it updates, and who to contact (you) with questions. Set up scheduled data refreshes if applicable.

7. Maintenance Plan: Note that you will monitor the dashboard over the first few refresh cycles to ensure it's updating correctly. Also, be open to future changes as KPIs evolve.

Grading notes: The answer doesn't need to use exactly the above words, but it should touch on most of these phases:

- Requirement gathering / user needs
- Data source identification and selection of tool
- Building the visuals
- Verification of data
- Soliciting feedback and refining
- Deployment/sharing and maintenance

We will award points in segments:

- Requirement gathering & understanding KPIs: +2 points
- Data sourcing and choice of tool/technology: +2 points
- The build/design process (at least some detail on creating charts or structure): +2 points
- Testing/validation of data accuracy: +2 points
- User feedback or iteration and final rollout (training or documentation): +2 points

This gives a total of 10. Partial credit: if a candidate misses one of the segments, they lose those points. For example, if they outline steps but never mention checking data correctness, deduct the validation points. A very strong candidate might also mention time management or setting milestones, which isn't explicitly required but would show thoroughness. We would still cap at 10 points (an exceptional answer can be noted qualitatively, but for scoring consistency the max is 10). An answer in bullet-list or paragraph form is fine as long as the steps are clear. If the sequence is slightly out of order, that's okay - focus on whether all key considerations are present.
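The "Validate Accuracy" step can be made concrete for candidates or graders with a short sketch. This is an illustrative example only, not a prescribed method: the `raw_sales` records, the `dashboard_totals` figures, and the tolerance are all invented for the demonstration.

```python
# Sketch of a dashboard QA spot check: recompute a monthly aggregate straight
# from the raw records and compare it to the figure the dashboard displays.
raw_sales = [
    {"month": "2024-01", "amount": 120.0},
    {"month": "2024-01", "amount": 80.0},
    {"month": "2024-02", "amount": 95.0},
]

# Recompute monthly totals directly from the raw data.
expected = {}
for row in raw_sales:
    expected[row["month"]] = expected.get(row["month"], 0.0) + row["amount"]

# Figures as shown on the dashboard (illustrative values).
dashboard_totals = {"2024-01": 200.0, "2024-02": 95.0}

# Flag any month where the dashboard disagrees with the raw data.
for month, shown in sorted(dashboard_totals.items()):
    diff = shown - expected.get(month, 0.0)
    if abs(diff) > 0.01:
        print(f"{month}: dashboard shows {shown}, raw data sums to {expected.get(month, 0.0)}")
    else:
        print(f"{month}: OK ({shown})")
```

An answer that describes this kind of independent recomputation (in SQL, a spreadsheet, or a script) would earn the validation points in the rubric above.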

Interview Blueprint (30 minutes, 6 questions)

The structured interview is approximately 30 minutes, with six open-ended questions designed to further evaluate the candidate's experience, problem-solving approach, and culture fit. Each question targets specific competencies or traits identified for the role:

1. Project Experience: "Tell us about a data analysis project you worked on that you're particularly proud of. What was the goal, what did you do, and what was the outcome or impact on the business?"

Rationale: This question assesses the candidate's hands-on experience and ability to deliver results. A strong answer will highlight technical skills (tools used, analysis methods) and the business impact (e.g., insights that led to decisions or improvements).

2. Communication to a Non-Technical Audience: "Describe a time you had to explain a complex data insight or report to someone who isn't familiar with data (for example, a senior manager or a client). How did you approach it, and what was the result?"

Rationale: Tests the candidate's communication and data-storytelling skill. We want to hear whether they can distill complexity into clarity, and perhaps what techniques they use (analogies, focusing on conclusions, etc.). Look for mention of tailoring the message to the audience.

3. Dealing with a Mistake: "Everyone makes mistakes. Can you give an example of a mistake or error in your data work that you encountered? How did you handle it, and what did you learn from the experience?"

Rationale: This probes integrity, accountability, and learning attitude. Good candidates will openly admit a real example (not "I've never made a mistake") and will emphasize how they corrected it (and perhaps prevented it from happening again). It also indicates their attention to detail - ideally the mistake was caught and fixed by them.

4. Time Management / Prioritization: "In this role you may get multiple requests at the same time. How do you prioritize your tasks when you have several high-priority projects or ad-hoc requests due around the same time?"

Rationale: Assesses organization and the ability to handle pressure. A solid answer might mention setting clear priorities with managers, estimating effort, pushing back or negotiating deadlines when needed, and using tools or schedules to stay on track. It also gives insight into their work ethic and communication (do they keep stakeholders informed of timelines?).

5. Handling Stakeholder Challenges: "Give an example of a challenging interaction with a stakeholder or team member in the context of a data project - perhaps a situation where they disagreed with your analysis or wanted something unrealistic. How did you handle it?"

Rationale: This question examines interpersonal skills, diplomacy, and problem-solving in a collaborative setting. We're looking for an ability to handle conflict or pushback professionally - e.g., providing additional explanation, seeking compromise, or offering alternative solutions. It also reveals how they manage expectations and maintain integrity under pressure (for instance, not changing results to appease someone, but rather explaining or investigating further).

6. Continuous Learning & Tools: "The data field changes rapidly. What do you do to stay updated on new analytics tools or techniques? Can you give an example of something new you learned recently that helped in your work?"

Rationale: This targets continuous learning and adaptability. We want to hear that the candidate is proactive about keeping their skills fresh - for example, following industry blogs, taking an online course, or experimenting with a new tool (like learning a bit of Python if they mainly used SQL, or trying out a new visualization library). A great answer highlights a specific new skill and how they applied it to solve a problem or improve a process at work.

Each question is open-ended to encourage the candidate to tell stories from their experience. Interviewers will use a consistent rubric to score answers (see Scoring Guidance), focusing on clarity, relevance, and demonstration of the sought-after skills/traits in each response. Follow-up probing questions can be used if an answer is too vague, but in general, these six questions should allow a comprehensive evaluation within 30 minutes.

When to Use This Role

Data Analyst (Mid-Level, SMB) is a mid-level role in Data & Analytics. Choose this title when you need someone focused on the specific responsibilities outlined above.
