Deploying models into multiple datasets within BigQuery #6341
-
Hello everyone.
For us this organization is crucial, since multiple teams access our project and we use dataset-level permissions. The problem is that in dbt (I'm using Core) we declare our target dataset, and that is used by every model. I know we could use … Has anyone tried to do something similar? Or does anyone know a way I could implement this?
-
Not an expert in the dbt-bigquery adapter, but as far as I understand from the adapter code: you should be able to use the `schema` configuration (which is interchangeable with `dataset`) to define datasets for individual sources and models, either in the config block or in `dbt_project.yml`.

Example (source):

```yaml
models:
  stacktonic_example_project:
    marts:
      core:
        materialized: table
        +dataset: core
        +tags:
          - "core"
          - "daily"
      marketing:
        materialized: table
        +dataset: marketing
        +tags:
          - "marketing"
          - "daily"
    staging:
      google_analytics:
        materialized: view
        +dataset: staging
        +tags:
          - "staging"
          - "daily"
```
-
Is @1cadumagalhaes's answer the latest and greatest? It would be nice to be able to name datasets freely without following this sort of pattern.
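As far as I understand dbt's defaults, the dataset name is built as `<target_schema>_<custom_schema>`; to have the configured `schema`/`dataset` value used verbatim, you can override the `generate_schema_name` macro. A minimal sketch, following the pattern shown in dbt's documentation:

```sql
-- macros/generate_schema_name.sql
-- Overrides dbt's built-in macro so a custom schema/dataset name is used
-- as written, instead of being prefixed with the target schema.
{% macro generate_schema_name(custom_schema_name, node) -%}
    {%- set default_schema = target.schema -%}
    {%- if custom_schema_name is none -%}
        {{ default_schema }}
    {%- else -%}
        {{ custom_schema_name | trim }}
    {%- endif -%}
{%- endmacro %}
```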
-
I am trying to follow the same pattern in my dbt_project.yml, since I want to write the model output into a different dataset. My profiles.yml structure is as below.
My dbt_project.yml structure is as below.
I'm new to the dbt world... not getting exactly where things are going wrong.
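For reference, a minimal pairing of profiles.yml and dbt_project.yml that routes model output to a different dataset might look like the sketch below (every project, profile, and dataset name is a placeholder, not taken from this thread):

```yaml
# profiles.yml  (placeholder values only)
my_bigquery_profile:
  target: dev
  outputs:
    dev:
      type: bigquery
      method: oauth
      project: my-gcp-project      # GCP project id
      dataset: analytics           # default/target dataset for the profile
      threads: 4
```

```yaml
# dbt_project.yml  (placeholder values only)
name: my_project
version: "1.0.0"
config-version: 2
profile: my_bigquery_profile

models:
  my_project:
    marts:
      +materialized: table
      +dataset: marts              # models under marts/ go to a separate dataset
```

With dbt's default schema-name behavior, models under marts/ would land in a dataset called `analytics_marts`; the `generate_schema_name` override sketched earlier is what makes dbt use `marts` verbatim.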