
ddev lab: environment setup automation design #23682

Draft
dkirov-dd wants to merge 2 commits into master from dk/environment-setup-automation

Conversation

@dkirov-dd
Contributor

What does this PR do?

Adds a WIP design spec for ddev lab — an agentic framework for provisioning Datadog integration environments on remote infrastructure with a single command.

Motivation

Setting up environments for complex integrations (Oracle, IBM MQ, Kafka, Lustre) is one of the highest-friction steps in integration development. This design explores automating that entirely: an AI research phase reads vendor documentation and generates all provisioning artifacts, which a deterministic CLI then executes to spin up a fully seeded, load-generating environment with a configured Datadog Agent.

Design doc: docs/superpowers/specs/2026-05-05-environment-setup-automation-design.md

Status: Design in progress — open questions remain on Pulumi vs. Terraform, cloud-inventory placement, and Lustre cluster networking.

Review checklist

  • Design doc reviewed
  • Open questions flagged in Section 11 discussed with relevant teams

qa/skip-qa

Design for `ddev lab` — an agentic framework that provisions Datadog
integration environments on remote EC2/GCP with a single command.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
…opology

- Replace Terraform/cloud-inventory with Pulumi Automation API for
  multi-cloud (AWS + GCP) provisioning embedded directly in ddev
- Add tests/lab/RESEARCH.md as human-readable narrative; remove inline
  YAML comments from lab.yaml (machine-readable only)
- Extend cluster runtime schema with node roles, counts, instance types,
  and network.lnet_port for services like Lustre
- Add two-phase Lustre cluster provisioning: Pulumi emits IP map, then
  Ansible stages MGS → MDS → OSS → client with LNET NIDs from actual IPs
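The extended cluster runtime schema could look something like the fragment below. This is a hypothetical `lab.yaml` sketch only: the key names (`nodes`, `role`, `count`, `instance_type`, `network.lnet_port`) are illustrative, and the final schema is whatever the design doc specifies.

```yaml
# Hypothetical lab.yaml fragment for a Lustre cluster runtime.
# All keys and values are illustrative, not the spec's final schema.
runtime: cluster
cloud: aws
nodes:
  - role: mgs          # management server, provisioned first
    count: 1
    instance_type: m5.large
  - role: mds          # metadata server
    count: 1
    instance_type: m5.xlarge
  - role: oss          # object storage servers
    count: 2
    instance_type: i3.large
  - role: client       # load-generating client
    count: 1
    instance_type: m5.large
network:
  lnet_port: 988       # LNET's conventional TCP port
```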

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
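The two-phase flow above — the infrastructure layer emits an IP map, then configuration management derives LNET NIDs and stages roles in dependency order — can be sketched in plain Python. Everything here is illustrative: in the design, Pulumi would produce the IP map as stack outputs and Ansible would consume the resulting inventory; the role names and sample IPs are placeholders.

```python
# Sketch of phase 2: turn a provisioning-time IP map into LNET NIDs
# and an ordered staging plan. Sample IPs and role names are illustrative.

# Lustre nodes must come up in dependency order: MGS first, clients last.
STAGE_ORDER = ["mgs", "mds", "oss", "client"]


def lnet_nid(ip: str, network: str = "tcp") -> str:
    """Format an IP address as an LNET NID, e.g. '10.0.1.10@tcp'."""
    return f"{ip}@{network}"


def staging_plan(ip_map: dict[str, list[str]]) -> list[tuple[str, list[str]]]:
    """Order roles MGS -> MDS -> OSS -> client, attaching a NID per node."""
    return [
        (role, [lnet_nid(ip) for ip in ip_map[role]])
        for role in STAGE_ORDER
        if role in ip_map
    ]


# Example IP map as phase 1 (the infrastructure layer) might emit it.
ips = {
    "mgs": ["10.0.1.10"],
    "mds": ["10.0.1.11"],
    "oss": ["10.0.1.20", "10.0.1.21"],
    "client": ["10.0.2.5"],
}

for role, nids in staging_plan(ips):
    print(role, nids)
```

The point of the split is that the NIDs cannot be known until the cloud provider assigns actual addresses, so the configuration phase must be parameterized on phase-1 outputs rather than baked into static templates.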
@dd-octo-sts
Contributor

dd-octo-sts Bot commented May 12, 2026

Validation Report

All 20 validations passed.

| Validation | Description | Status |
| --- | --- | --- |
| agent-reqs | Verify check versions match the Agent requirements file | Passed |
| ci | Validate CI configuration and Codecov settings | Passed |
| codeowners | Validate every integration has a CODEOWNERS entry | Passed |
| config | Validate default configuration files against spec.yaml | Passed |
| dep | Verify dependency pins are consistent and Agent-compatible | Passed |
| http | Validate integrations use the HTTP wrapper correctly | Passed |
| imports | Validate check imports do not use deprecated modules | Passed |
| integration-style | Validate check code style conventions | Passed |
| jmx-metrics | Validate JMX metrics definition files and config | Passed |
| labeler | Validate PR labeler config matches integration directories | Passed |
| legacy-signature | Validate no integration uses the legacy Agent check signature | Passed |
| license-headers | Validate Python files have proper license headers | Passed |
| licenses | Validate third-party license attribution list | Passed |
| metadata | Validate metadata.csv metric definitions | Passed |
| models | Validate configuration data models match spec.yaml | Passed |
| openmetrics | Validate OpenMetrics integrations disable the metric limit | Passed |
| package | Validate Python package metadata and naming | Passed |
| readmes | Validate README files have required sections | Passed |
| saved-views | Validate saved view JSON file structure and fields | Passed |
| version | Validate version consistency between package and changelog | Passed |


