WIP: Migrate Job DataSource & Resource #327

Draft · wants to merge 2 commits into base `main`
12 changes: 6 additions & 6 deletions docs/data-sources/job.md
@@ -26,20 +26,20 @@ description: |-
- `deferring_job_id` (Number) ID of the job this job defers to
- `description` (String) Long description for the job
- `environment_id` (Number) ID of the environment the job is in
- `id` (String) The ID of this resource.
- `job_completion_trigger_condition` (Set of Object) Which other job should trigger this job when it finishes, and on which conditions. (see [below for nested schema](#nestedatt--job_completion_trigger_condition))
- `id` (String) The ID of this resource
- `job_completion_trigger_condition` (Block Set) Which other job should trigger this job when it finishes, and on which conditions. (see [below for nested schema](#nestedblock--job_completion_trigger_condition))
- `name` (String) Given name for the job
- `run_compare_changes` (Boolean) Whether the CI job should compare data changes introduced by the code change in the PR.
- `self_deferring` (Boolean) Whether this job defers on a previous run of itself (overrides value in deferring_job_id)
- `timeout_seconds` (Number) Number of seconds before the job times out
- `triggers` (Map of Boolean) Flags for which types of triggers to use, keys of github_webhook, git_provider_webhook, schedule, on_merge
- `triggers_on_draft_pr` (Boolean) Whether the CI job should be automatically triggered on draft PRs

<a id="nestedatt--job_completion_trigger_condition"></a>
<a id="nestedblock--job_completion_trigger_condition"></a>
### Nested Schema for `job_completion_trigger_condition`

Read-Only:

- `job_id` (Number)
- `project_id` (Number)
- `statuses` (Set of String)
- `job_id` (Number) The ID of the job that would trigger this job after completion.
- `project_id` (Number) The ID of the project the trigger job runs in.
- `statuses` (Set of String) List of statuses to trigger the job on.
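The documented attributes above map onto a data source usage like the following sketch (the IDs are placeholders, not values from this PR):

```terraform
# Look up an existing job by its ID and project (hypothetical IDs).
data "dbtcloud_job" "ci_job" {
  job_id     = 123
  project_id = 456
}

# The computed attributes documented above are then available, e.g.:
output "ci_job_environment" {
  value = data.dbtcloud_job.ci_job.environment_id
}
```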
16 changes: 8 additions & 8 deletions docs/resources/job.md
@@ -117,26 +117,26 @@ resource "dbtcloud_job" "downstream_job" {
- `dbt_version` (String) Version number of dbt to use in this job, usually in the format 1.2.0-latest rather than core versions
- `deferring_environment_id` (Number) Environment identifier that this job defers to (new deferring approach)
- `deferring_job_id` (Number) Job identifier that this job defers to (legacy deferring approach)
- `description` (String) Description for the job
- `description` (String) Long description for the job
- `generate_docs` (Boolean) Flag for whether the job should generate documentation
- `is_active` (Boolean) Should always be set to true as setting it to false is the same as creating a job in a deleted state. To create/keep a job in a 'deactivated' state, check the `triggers` config.
- `job_completion_trigger_condition` (Block Set, Max: 1) Which other job should trigger this job when it finishes, and on which conditions (sometimes referred as 'job chaining'). (see [below for nested schema](#nestedblock--job_completion_trigger_condition))
- `num_threads` (Number) Number of threads to use in the job
- `job_completion_trigger_condition` (Block Set) Which other job should trigger this job when it finishes, and on which conditions (sometimes referred as 'job chaining'). (see [below for nested schema](#nestedblock--job_completion_trigger_condition))
- `num_threads` (Number) Number of threads to use for the job
- `run_compare_changes` (Boolean) Whether the CI job should compare data changes introduced by the code changes. Requires `deferring_environment_id` to be set. (Advanced CI needs to be activated in the dbt Cloud Account Settings first as well)
- `run_generate_sources` (Boolean) Flag for whether the job should add a `dbt source freshness` step to the job. The difference between manually adding a step with `dbt source freshness` in the job steps or using this flag is that with this flag, a failed freshness will still allow the following steps to run.
- `schedule_cron` (String) Custom cron expression for schedule
- `schedule_days` (List of Number) List of days of week as numbers (0 = Sunday, 7 = Saturday) to execute the job at if running on a schedule
- `schedule_hours` (List of Number) List of hours to execute the job at if running on a schedule
- `schedule_cron` (String) Custom `cron` expression to use for the schedule
- `schedule_days` (Set of Number) List of days of week as numbers (0 = Sunday, 7 = Saturday) to execute the job at if running on a schedule
- `schedule_hours` (Set of Number) List of hours to execute the job at if running on a schedule
- `schedule_interval` (Number) Number of hours between job executions if running on a schedule
- `schedule_type` (String) Type of schedule to use, one of every_day/ days_of_week/ custom_cron
- `schedule_type` (String) Type of schedule to use, one of `every_day` / `days_of_week` / `custom_cron`
- `self_deferring` (Boolean) Whether this job defers on a previous run of itself
- `target_name` (String) Target name for the dbt profile
- `timeout_seconds` (Number) Number of seconds to allow the job to run before timing out
- `triggers_on_draft_pr` (Boolean) Whether the CI job should be automatically triggered on draft PRs

### Read-Only

- `id` (String) The ID of this resource.
- `id` (String) The ID of this resource

<a id="nestedblock--job_completion_trigger_condition"></a>
### Nested Schema for `job_completion_trigger_condition`
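The `job_completion_trigger_condition` block documented above enables what the description calls "job chaining". A hedged sketch of how it is wired up (resource names are placeholders, and `"success"` is an assumed status value, since the docs only say "list of statuses to trigger the job on"):

```terraform
# Placeholder resources; only the trigger-condition wiring is the point here.
resource "dbtcloud_job" "downstream_job" {
  name           = "downstream-job"
  project_id     = dbtcloud_project.my_project.id
  environment_id = dbtcloud_environment.prod.environment_id
  execute_steps  = ["dbt build"]
  triggers = {
    github_webhook       = false
    git_provider_webhook = false
    schedule             = false
  }

  # Run this job whenever the upstream job finishes with one of these statuses.
  job_completion_trigger_condition {
    job_id     = dbtcloud_job.upstream_job.id
    project_id = dbtcloud_project.my_project.id
    statuses   = ["success"]
  }
}
```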
137 changes: 137 additions & 0 deletions pkg/framework/objects/job/data_source.go
@@ -0,0 +1,137 @@
package job

import (
"context"

"github.com/dbt-labs/terraform-provider-dbtcloud/pkg/dbt_cloud"
"github.com/hashicorp/terraform-plugin-framework-validators/setvalidator"
"github.com/hashicorp/terraform-plugin-framework/datasource"
"github.com/hashicorp/terraform-plugin-framework/datasource/schema"
"github.com/hashicorp/terraform-plugin-framework/schema/validator"
"github.com/hashicorp/terraform-plugin-framework/types"
)

var (
_ datasource.DataSource = &jobDataSource{}
_ datasource.DataSourceWithConfigure = &jobDataSource{}
)

func JobDataSource() datasource.DataSource {
return &jobDataSource{}
}

type jobDataSource struct {
client *dbt_cloud.Client
}

func (d *jobDataSource) Metadata(
_ context.Context,
req datasource.MetadataRequest,
resp *datasource.MetadataResponse,
) {
resp.TypeName = req.ProviderTypeName + "_job"
}

// Configure implements datasource.DataSourceWithConfigure.
func (d *jobDataSource) Configure(ctx context.Context, req datasource.ConfigureRequest, resp *datasource.ConfigureResponse) {
switch c := req.ProviderData.(type) {
case nil: // do nothing
case *dbt_cloud.Client:
d.client = c
default:
resp.Diagnostics.AddError("Unexpected provider data type", "A *dbt_cloud.Client is required to configure the job data source")
}
}
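The nil / typed / default switch in `Configure` above is the usual provider-data pattern: `nil` means the provider is not configured yet and is silently ignored, a typed client is stored, and anything else is a diagnostic. A minimal self-contained sketch of that control flow (`client` here is a stand-in for `*dbt_cloud.Client`, not the real type):

```go
package main

import "fmt"

// client stands in for *dbt_cloud.Client in this sketch.
type client struct{ host string }

// configure mirrors the type switch in Configure: nil provider data is
// ignored, a *client is returned for storage, any other type is an error.
func configure(providerData any) (*client, error) {
	switch c := providerData.(type) {
	case nil:
		return nil, nil // provider not configured yet; do nothing
	case *client:
		return c, nil
	default:
		return nil, fmt.Errorf("unexpected provider data type %T", providerData)
	}
}

func main() {
	c, err := configure(&client{host: "https://cloud.getdbt.com"})
	fmt.Println(c.host, err)
	_, err = configure(42)
	fmt.Println(err)
}
```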

// Schema implements datasource.DataSource.
func (d *jobDataSource) Schema(ctx context.Context, req datasource.SchemaRequest, resp *datasource.SchemaResponse) {
resp.Schema = schema.Schema{
Attributes: map[string]schema.Attribute{
"job_id": schema.Int64Attribute{
Description: "ID of the job",
Required: true,
},
"project_id": schema.Int64Attribute{
Description: "ID of the project the job is in",
Required: true,
},
"id": schema.StringAttribute{
Description: "The ID of this resource",
Computed: true,
},
"environment_id": schema.Int64Attribute{
Description: "ID of the environment the job is in",
Computed: true,
},
"name": schema.StringAttribute{
Description: "Given name for the job",
Computed: true,
},
"description": schema.StringAttribute{
Description: "Long description for the job",
Computed: true,
},
"deferring_job_id": schema.Int64Attribute{
Description: "ID of the job this job defers to",
Computed: true,
},
"deferring_environment_id": schema.Int64Attribute{
Description: "ID of the environment this job defers to",
Computed: true,
},
"self_deferring": schema.BoolAttribute{
Description: "Whether this job defers on a previous run of itself (overrides value in deferring_job_id)",
Computed: true,
},
"triggers": schema.MapAttribute{
Description: "Flags for which types of triggers to use, keys of github_webhook, git_provider_webhook, schedule, on_merge",
Computed: true,
ElementType: types.BoolType,
},
"timeout_seconds": schema.Int64Attribute{
Description: "Number of seconds before the job times out",
Computed: true,
},
"triggers_on_draft_pr": schema.BoolAttribute{
Description: "Whether the CI job should be automatically triggered on draft PRs",
Computed: true,
},
// "job_completion_trigger_condition": schema.NestedSingleAttribute{

"run_compare_changes": schema.BoolAttribute{
Description: "Whether the CI job should compare data changes introduced by the code change in the PR.",
Computed: true,
},
},
Blocks: map[string]schema.Block{
"job_completion_trigger_condition": schema.SetNestedBlock{
Description: "Which other job should trigger this job when it finishes, and on which conditions.",
Validators: []validator.Set{
setvalidator.SizeAtMost(1),
},
NestedObject: schema.NestedBlockObject{
Attributes: map[string]schema.Attribute{
"job_id": schema.Int64Attribute{
Description: "The ID of the job that would trigger this job after completion.",
Computed: true,
},
"project_id": schema.Int64Attribute{
Description: "The ID of the project the trigger job runs in.",
Computed: true,
},
"statuses": schema.SetAttribute{
Description: "List of statuses to trigger the job on.",
Computed: true,
ElementType: types.StringType,
},
},
},
},
},
}
}

// Read implements datasource.DataSource.
func (d *jobDataSource) Read(context.Context, datasource.ReadRequest, *datasource.ReadResponse) {
panic("unimplemented")
}
76 changes: 76 additions & 0 deletions pkg/framework/objects/job/data_source_accepance_test.go
@@ -0,0 +1,76 @@
package job_test

import (
"fmt"
"testing"

"github.com/dbt-labs/terraform-provider-dbtcloud/pkg/framework/acctest_helper"
"github.com/hashicorp/terraform-plugin-testing/helper/acctest"
"github.com/hashicorp/terraform-plugin-testing/helper/resource"
)

func TestDbtCloudJobDataSource(t *testing.T) {

randomJobName := acctest.RandStringFromCharSet(5, acctest.CharSetAlphaNum)

config := jobs(randomJobName)

check := resource.ComposeAggregateTestCheckFunc(
resource.TestCheckResourceAttrSet("data.dbtcloud_job.test", "job_id"),
resource.TestCheckResourceAttrSet("data.dbtcloud_job.test", "project_id"),
resource.TestCheckResourceAttrSet("data.dbtcloud_job.test", "environment_id"),
resource.TestCheckResourceAttr("data.dbtcloud_job.test", "name", randomJobName),
resource.TestCheckResourceAttr("data.dbtcloud_job.test", "timeout_seconds", "180"),
resource.TestCheckResourceAttr("data.dbtcloud_job.test", "triggers_on_draft_pr", "false"),
resource.TestCheckResourceAttr(
"data.dbtcloud_job.test",
"job_completion_trigger_condition.#",
"0",
),
)

resource.ParallelTest(t, resource.TestCase{
ProtoV6ProviderFactories: acctest_helper.TestAccProtoV6ProviderFactories,
Steps: []resource.TestStep{
{
Config: config,
Check: check,
},
},
})
}

func jobs(jobName string) string {
return fmt.Sprintf(`
resource "dbtcloud_project" "test_project" {
name = "jobs_test_project"
}

resource "dbtcloud_environment" "test_environment" {
project_id = dbtcloud_project.test_project.id
name = "job_test_env"
dbt_version = "%s"
type = "development"
}

resource "dbtcloud_job" "test_job" {
name = "%s"
project_id = dbtcloud_project.test_project.id
environment_id = dbtcloud_environment.test_environment.environment_id
execute_steps = [
"dbt run"
]
triggers = {
"github_webhook" : false,
"schedule" : false,
"git_provider_webhook": false
}
timeout_seconds = 180
}

data "dbtcloud_job" "test" {
job_id = dbtcloud_job.test_job.id
project_id = dbtcloud_project.test_project.id
}
`, acctest_helper.DBT_CLOUD_VERSION, jobName)
}
4 changes: 2 additions & 2 deletions pkg/framework/objects/job/data_source_all_acceptance_test.go
@@ -14,7 +14,7 @@ func TestDbtCloudJobsDataSource(t *testing.T) {
randomJobName := acctest.RandStringFromCharSet(5, acctest.CharSetAlphaNum)
randomJobName2 := acctest.RandStringFromCharSet(5, acctest.CharSetAlphaNum)

config := jobs(randomJobName, randomJobName2)
config := jobsAll(randomJobName, randomJobName2)

check := resource.ComposeAggregateTestCheckFunc(
resource.TestCheckResourceAttrSet("data.dbtcloud_jobs.test", "project_id"),
@@ -66,7 +66,7 @@ func TestDbtCloudJobsDataSource(t *testing.T) {
})
}

func jobs(jobName string, jobName2 string) string {
func jobsAll(jobName string, jobName2 string) string {
return fmt.Sprintf(`
resource "dbtcloud_project" "test_project" {
name = "jobs_test_project"