initializing-warehouse
Skill from astronomer/agents
Sets up the initial database connection parameters and credentials for connecting to a data warehouse such as Snowflake, BigQuery, or Redshift within an Airflow project.
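As a rough illustration of what "initial connection parameters" means in an Airflow project, the sketch below builds a Snowflake connection URI and exposes it through the `AIRFLOW_CONN_<CONN_ID>` environment variable that Airflow reads at runtime. All names here (user, account, warehouse, connection id) are made-up placeholders, not values the skill itself produces.

```python
# Hypothetical sketch: exposing a Snowflake connection to Airflow via an
# AIRFLOW_CONN_* environment variable. Airflow parses these variables as
# connection URIs; the credential values below are placeholders.
import os
from urllib.parse import quote

def snowflake_conn_uri(user: str, password: str, account: str,
                       database: str, warehouse: str) -> str:
    # Extra Snowflake settings (account, database, warehouse) travel in
    # the URI query string.
    extra = f"account={account}&database={database}&warehouse={warehouse}"
    return f"snowflake://{quote(user)}:{quote(password)}@/?{extra}"

# Airflow resolves the conn_id "snowflake_default" from this variable name.
os.environ["AIRFLOW_CONN_SNOWFLAKE_DEFAULT"] = snowflake_conn_uri(
    user="analyst",
    password="s3cret",
    account="xy12345",
    database="ANALYTICS",
    warehouse="COMPUTE_WH",
)
```

In practice you would set this variable in your deployment environment (or use Airflow's Connections UI/CLI) rather than hard-coding credentials in Python.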
Same repository: astronomer/agents (23 items)
Installation
npx skills add https://github.com/astronomer/agents --skill initializing-warehouse

Need more details? View the full documentation on GitHub.
More from this repository (10)
Manages and troubleshoots Apache Airflow workflows by listing DAGs, testing pipelines, running tasks, and monitoring system health.
Guides developers in creating robust Apache Airflow DAGs using best practices and MCP tools.
Traces data origins by investigating DAGs, source tables, and external systems to map the complete upstream lineage of a data asset.
Systematically diagnoses Airflow DAG failures by performing deep root cause analysis, identifying error sources, and providing structured prevention recommendations.
Queries data warehouses to answer business questions by executing SQL, finding tables, and retrieving precise metrics and trends.
Triggers and monitors Airflow DAG runs, automatically waiting for completion and providing immediate feedback on success or failure.
Traces downstream data dependencies to reveal potential impacts and risks when modifying tables or data pipelines.
Guides users through migrating Apache Airflow 2.x projects to Airflow 3.x, addressing code changes, imports, operators, and compatibility issues.
Verifies data freshness by identifying timestamp columns, checking last update times, and assessing data currency across tables.
Manages a local Airflow environment using the Astro CLI, enabling start/stop, log viewing, container troubleshooting, and environment control.