
Analyze

Answer data questions -- from quick lookups to full analyses. Use when looking up a single metric, investigating what's driving a trend or drop, comparing segments over time, or preparing a formal data report for stakeholders.

$ npx promptcreek add analyze

Auto-detects your installed agents and installs the skill to each one.

What This Skill Does

This skill answers data questions, ranging from quick lookups to full analyses and formal reports. It's designed for users who need to extract insights from data, regardless of their technical expertise.

When to Use

  • Determine the number of new users signed up last week.
  • Analyze the factors driving a drop in conversion rate.
  • Prepare a quarterly business review of subscription metrics.
  • Identify trends in customer behavior.
  • Compare performance across different segments.
  • Investigate anomalies in data.

Key Features

  • Understands the complexity level of the question.
  • Identifies the necessary data requirements.
  • Connects to data warehouses to gather data.
  • Handles data provided directly by the user.
  • Calculates relevant metrics and aggregations.
  • Identifies patterns, trends, and outliers.

Installation

Run in your project directory:
$ npx promptcreek add analyze

Auto-detects your installed agents (Claude Code, Cursor, Codex, etc.) and installs the skill to each one.

Full Skill Content

/analyze - Answer Data Questions

> If you see unfamiliar placeholders or need to check which tools are connected, see CONNECTORS.md.

Answer a data question, from a quick lookup to a full analysis to a formal report.

Usage

/analyze <natural language question>

Workflow

1. Understand the Question

Parse the user's question and determine:

  • Complexity level:
    - Quick answer: Single metric, simple filter, factual lookup (e.g., "How many users signed up last week?")
    - Full analysis: Multi-dimensional exploration, trend analysis, comparison (e.g., "What's driving the drop in conversion rate?")
    - Formal report: Comprehensive investigation with methodology, caveats, and recommendations (e.g., "Prepare a quarterly business review of our subscription metrics")

  • Data requirements: Which tables, metrics, dimensions, and time ranges are needed
  • Output format: Number, table, chart, narrative, or combination
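The complexity triage above can be sketched as a simple keyword heuristic. This is purely illustrative; the three levels come from this skill, but the keywords and the `classify_question` helper are hypothetical, not how the skill actually parses questions.

```python
# Hypothetical heuristic for the three complexity levels defined above.
# Keyword lists are illustrative only.
def classify_question(question: str) -> str:
    q = question.lower()
    if any(k in q for k in ("report", "review", "assessment")):
        return "formal report"
    if any(k in q for k in ("why", "driving", "causing", "compare", "trend")):
        return "full analysis"
    return "quick answer"

print(classify_question("How many users signed up last week?"))        # quick answer
print(classify_question("What's driving the drop in conversion rate?"))  # full analysis
```

In practice an agent classifies with far richer context, but the decision it makes maps onto this three-way split.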

2. Gather Data

If a data warehouse MCP server is connected:

  • Explore the schema to find relevant tables and columns
  • Write one or more SQL queries to extract the needed data
  • Execute the query and retrieve results
  • If the query fails, debug and retry (check column names, table references, syntax for the specific dialect)
  • If results look unexpected, run sanity checks before proceeding
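The query-then-debug loop above can be sketched with an in-memory SQLite database standing in for a warehouse. The `signups` table and `run_query` helper are hypothetical; the point is that a failed query surfaces the schema so it can be corrected and retried.

```python
import sqlite3

# In-memory SQLite as a stand-in for a connected warehouse.
# The `signups` table is made up for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE signups (user_id INTEGER, signup_date TEXT)")
conn.executemany("INSERT INTO signups VALUES (?, ?)",
                 [(1, "2024-12-02"), (2, "2024-12-09"), (3, "2025-01-03")])

def run_query(sql):
    try:
        return conn.execute(sql).fetchall()
    except sqlite3.OperationalError as exc:
        # On failure, surface the available schema to support the debug-and-retry step.
        schema = conn.execute(
            "SELECT name, sql FROM sqlite_master WHERE type='table'").fetchall()
        raise RuntimeError(f"Query failed ({exc}); available tables: {schema}")

rows = run_query(
    "SELECT COUNT(*) FROM signups WHERE signup_date LIKE '2024-12%'")
print(rows[0][0])  # 2
```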

If no data warehouse is connected:

  • Ask the user to provide data in one of these ways:
    - Paste query results directly
    - Upload a CSV or Excel file
    - Describe the schema so you can write queries for them to run

  • If writing queries for manual execution, use the sql-queries skill for dialect-specific best practices
  • Once data is provided, proceed with analysis
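Handling user-provided data can be as simple as parsing a pasted CSV. The column names below are invented for illustration:

```python
import csv
import io

# Sketch of ingesting user-pasted CSV data when no warehouse is connected.
# Column names (user_id, plan, mrr) are illustrative.
pasted = """user_id,plan,mrr
1,pro,49
2,free,0
3,pro,49
"""

rows = list(csv.DictReader(io.StringIO(pasted)))
total_mrr = sum(int(r["mrr"]) for r in rows)
print(len(rows), total_mrr)  # 3 98
```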

3. Analyze

  • Calculate relevant metrics, aggregations, and comparisons
  • Identify patterns, trends, outliers, and anomalies
  • Compare across dimensions (time periods, segments, categories)
  • For complex analyses, break the problem into sub-questions and address each
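The aggregate-and-compare step might look like the following sketch, which computes a month-over-month change per segment to localize a drop. The records and field names are made up for illustration:

```python
from collections import defaultdict

# Hypothetical conversion counts by month and segment.
records = [
    {"month": "2024-11", "segment": "mobile", "conversions": 120},
    {"month": "2024-11", "segment": "web", "conversions": 200},
    {"month": "2024-12", "segment": "mobile", "conversions": 90},
    {"month": "2024-12", "segment": "web", "conversions": 205},
]

# Aggregate the metric by (month, segment).
totals = defaultdict(int)
for r in records:
    totals[(r["month"], r["segment"])] += r["conversions"]

# Month-over-month % change per segment shows where the drop is concentrated.
for segment in ("mobile", "web"):
    before = totals[("2024-11", segment)]
    after = totals[("2024-12", segment)]
    print(segment, round((after - before) / before * 100, 1))
```

Here the mobile segment is down 25% while web is flat, which narrows the follow-up questions.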

4. Validate Before Presenting

Before sharing results, run through validation checks:

  • Row count sanity: Does the number of records make sense?
  • Null check: Are there unexpected nulls that could skew results?
  • Magnitude check: Are the numbers in a reasonable range?
  • Trend continuity: Do time series have unexpected gaps?
  • Aggregation logic: Do subtotals sum to totals correctly?

If any check raises concerns, investigate and note caveats.
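The five checks above translate naturally into assertions over the result set. The rows, thresholds, and column names below are illustrative:

```python
from datetime import date, timedelta

# Hypothetical daily signup counts returned by a query.
rows = [
    {"day": "2025-01-01", "signups": 42},
    {"day": "2025-01-02", "signups": 38},
    {"day": "2025-01-03", "signups": 40},
]

# Row count sanity: one row per day in the requested range.
assert len(rows) == 3, "unexpected row count"
# Null check: no missing metric values.
assert all(r["signups"] is not None for r in rows), "unexpected nulls"
# Magnitude check: values in a plausible range for this metric.
assert all(0 <= r["signups"] < 10_000 for r in rows), "implausible magnitude"
# Trend continuity: consecutive dates with no gaps.
days = [date.fromisoformat(r["day"]) for r in rows]
assert all(b - a == timedelta(days=1) for a, b in zip(days, days[1:])), "gap in series"
print("all checks passed")
```

A failed assertion is exactly the kind of concern that should be investigated and noted as a caveat rather than silently presented.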

5. Present Findings

For quick answers:

  • State the answer directly with relevant context
  • Include the query used (collapsed or in a code block) for reproducibility

For full analyses:

  • Lead with the key finding or insight
  • Support with data tables and/or visualizations
  • Note methodology and any caveats
  • Suggest follow-up questions

For formal reports:

  • Executive summary with key takeaways
  • Methodology section explaining approach and data sources
  • Detailed findings with supporting evidence
  • Caveats, limitations, and data quality notes
  • Recommendations and suggested next steps

6. Visualize Where Helpful

When a chart would communicate results more effectively than a table:

  • Use the data-visualization skill to select the right chart type
  • Generate a Python visualization or build it into an HTML dashboard
  • Follow visualization best practices for clarity and accuracy
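A minimal Python visualization of the kind this step produces might look like the following, using matplotlib with invented monthly data; the chart type, labels, and figures are illustrative, not part of the skill itself:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen so this works headless
import matplotlib.pyplot as plt

# Hypothetical monthly conversion totals for illustration.
months = ["Oct", "Nov", "Dec"]
conversions = [310, 320, 295]

fig, ax = plt.subplots(figsize=(6, 3))
ax.plot(months, conversions, marker="o")
ax.set_title("Conversions by month")
ax.set_ylabel("Conversions")
fig.tight_layout()
fig.savefig("conversions.png")
```

A line chart suits a short time series; the data-visualization skill would pick a different chart type for categorical comparisons or distributions.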

Examples

Quick answer:

/analyze How many new users signed up in December?

Full analysis:

/analyze What's causing the increase in support ticket volume over the past 3 months? Break down by category and priority.

Formal report:

/analyze Prepare a data quality assessment of our customer table -- completeness, consistency, and any issues we should address.

Tips

  • Be specific about time ranges, segments, or metrics when possible
  • If you know the table names, mention them to speed up the process
  • For complex questions, Claude may break them into multiple queries
  • Results are always validated before presentation -- if something looks off, Claude will flag it

Supported Agents

Claude Code, Cursor, Codex, Gemini CLI, Aider, Windsurf, OpenClaw

Details

License: MIT
Source: admin
Published: 3/18/2026


Related Skills

Senior Data Scientist

World-class senior data scientist skill specialising in statistical modeling, experiment design, causal inference, and predictive analytics. Covers A/B testing (sample sizing, two-proportion z-tests, Bonferroni correction), difference-in-differences, feature engineering pipelines (Scikit-learn, XGBoost), cross-validated model evaluation (AUC-ROC, AUC-PR, SHAP), and MLflow experiment tracking — using Python (NumPy, Pandas, Scikit-learn), R, and SQL. Use when designing or analysing controlled experiments, building and evaluating classification or regression models, performing causal analysis on observational data, engineering features for structured tabular datasets, or translating statistical findings into data-driven business decisions.

Alireza Rezvani
#engineering team

Instrument Data To Allotrope

Convert laboratory instrument output files (PDF, CSV, Excel, TXT) to Allotrope Simple Model (ASM) JSON format or flattened 2D CSV. Use this skill when scientists need to standardize instrument data for LIMS systems, data lakes, or downstream analysis. Supports auto-detection of instrument types. Outputs include full ASM JSON, flattened CSV for easy import, and exportable Python code for data engineers. Common triggers include converting instrument files, standardizing lab data, preparing data for upload to LIMS/ELN systems, or generating parser code for production pipelines.

anthropics
#bio research

Build Dashboard

Build an interactive HTML dashboard with charts, filters, and tables. Use when creating an executive overview with KPI cards, turning query results into a shareable self-contained report, building a team monitoring snapshot, or needing multiple charts with filters in one browser-openable file.

anthropics
#data