Senior Data Engineer

🌐 Community · by borghei · vlatest

Analyzes complex data pipelines, identifies bottlenecks, and suggests optimized solutions for senior data engineering teams.

Install on your platform

We auto-selected Claude Code based on this skill’s supported platforms.

1. Run in terminal (recommended)

claude mcp add borghei-senior-data-engineer npx -- -y @trustedskills/borghei-senior-data-engineer
2. Or manually add to ~/.claude/settings.json

~/.claude/settings.json
{
  "mcpServers": {
    "borghei-senior-data-engineer": {
      "command": "npx",
      "args": [
        "-y",
        "@trustedskills/borghei-senior-data-engineer"
      ]
    }
  }
}

Requires Claude Code (claude CLI). Run claude --version to verify your install.

About This Skill

What it does

This skill assists senior data engineering teams by analyzing and optimizing complex data pipelines. It generates configuration code for orchestration tools like Airflow, Prefect, and Dagster; validates data quality through profiling and anomaly detection; and provides actionable recommendations to improve the performance of SQL and Spark queries. The tool supports both generating pipeline configurations and validating existing ones.

When to use it

  • Generating initial pipeline configurations for moving data between systems (e.g., PostgreSQL to Snowflake).
  • Validating the quality of datasets against defined schemas, including detecting anomalies.
  • Profiling datasets to understand their characteristics and identify potential issues.
  • Optimizing slow-running SQL or Spark queries within a data warehouse environment.
  • Estimating query costs for resource planning and optimization purposes.
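
To illustrate the incremental-load pattern behind the first use case, here is a minimal Python sketch of a watermark-based extraction query. The table and column names (`orders`, `updated_at`) are hypothetical, and the skill's actual generated pipeline code will look different:

```python
from datetime import datetime, timezone

def incremental_extract_sql(table: str, watermark_col: str, last_loaded: datetime) -> str:
    """Build an extraction query that only pulls rows newer than the last
    successful load -- the core of a watermark-based incremental load."""
    ts = last_loaded.strftime("%Y-%m-%d %H:%M:%S")
    return (
        f"SELECT * FROM {table} "
        f"WHERE {watermark_col} > '{ts}' "
        f"ORDER BY {watermark_col}"
    )

# Extract only rows touched since the last run's high-water mark.
sql = incremental_extract_sql("orders", "updated_at",
                              datetime(2024, 1, 1, tzinfo=timezone.utc))
```

In a real pipeline the watermark would be read from (and written back to) pipeline state rather than hard-coded.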

Key capabilities

  • Pipeline Generation: Generates code for Airflow, Prefect, and Dagster pipelines.
  • Data Quality Validation: Validates data against schemas and detects anomalies using tools such as Great Expectations.
  • Data Profiling: Provides insights into dataset characteristics.
  • SQL/Spark Optimization: Analyzes and suggests improvements for SQL and Spark query performance.
  • Cost Estimation: Estimates the cost of executing queries in environments like Snowflake and BigQuery.
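
As a sketch of how scan-based cost estimation works in warehouses like BigQuery: on-demand pricing charges per volume of data scanned, so a rough estimate is bytes scanned divided by one TiB, times the per-TiB rate. The $6.25/TiB default below is an assumption; check your provider's current pricing:

```python
def estimate_scan_cost_usd(bytes_scanned: int, price_per_tib: float = 6.25) -> float:
    """Rough on-demand query cost: bytes scanned / 1 TiB * per-TiB price.
    The default rate is an assumption, not a quoted price."""
    TIB = 1024 ** 4
    return round(bytes_scanned / TIB * price_per_tib, 4)

# A query scanning 500 GiB:
cost = estimate_scan_cost_usd(500 * 1024 ** 3)
```

Real estimators also account for partition pruning, clustering, and cached results, which can shrink the bytes actually scanned.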

Example prompts

  • "Generate an Airflow DAG to incrementally load data from PostgreSQL into Snowflake, using tables 'orders' and 'customers'."
  • "Validate the quality of data.csv against the schema defined in schema.json, and detect any anomalies."
  • "Analyze this SQL query (query.sql) for performance bottlenecks and suggest optimizations for a Snowflake warehouse."
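
The second prompt (validating data.csv against schema.json) can be approximated with a small stdlib-only sketch. The column names and the string/int/float type vocabulary here are illustrative, not the skill's actual schema format:

```python
import csv
import io

def validate_csv(csv_text: str, schema: dict) -> list[str]:
    """Check that a CSV has the columns the schema requires and that each
    value parses under the declared type ('int', 'float', or 'str')."""
    casts = {"int": int, "float": float, "str": str}
    errors = []
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = set(schema) - set(reader.fieldnames or [])
    if missing:
        errors.append(f"missing columns: {sorted(missing)}")
        return errors
    for i, row in enumerate(reader, start=2):  # line 1 is the header
        for col, typ in schema.items():
            try:
                casts[typ](row[col])
            except ValueError:
                errors.append(f"line {i}: {col}={row[col]!r} is not {typ}")
    return errors

data = "order_id,amount\n1,19.99\n2,oops\n"
schema = {"order_id": "int", "amount": "float"}
errs = validate_csv(data, schema)
```

A production validator would add null-rate, uniqueness, and distribution checks on top of these type checks, which is where anomaly detection comes in.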

Tips & gotchas

  • The skill relies on command-line scripts (e.g., pipeline_orchestrator.py, data_quality_validator.py). Ensure these scripts are accessible within the agent's environment.
  • Subcommands support --json for machine-readable output and --output to write results to files.
  • The skill is designed for senior data engineering tasks; familiarity with concepts like Airflow DAGs, SQL query optimization, and Spark is beneficial.

🛡️ TrustedSkills Verification

Unlike other registries that point to live repositories, TrustedSkills pins every skill to a verified commit hash. This protects you from malicious updates — what you install today is exactly what was reviewed and verified.

Security Audits

Gen Agent Trust Hub: Pass
Socket: Pass
Snyk: Pass

Details

Version: vlatest
License: (not listed)
Author: borghei
Installs: 58
