Can I use a dbt Python model to load S3 data as a DataFrame in Databricks? #9346
Unanswered
gaoshihang asked this question in Q&A
Hello, our requirement is as follows, and we would like to seek your advice.
We want to use dbt to connect to Databricks. The source data is on S3, and we want to use PySpark to load it into a DataFrame first (like the 'E' part of ETL), do some transformations on it, and then write it to a Databricks table.
The PySpark code looks something like this:
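Roughly (the S3 path, file format, column, and table names below are just placeholders):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# 'E' step: read the raw files from S3 (placeholder path and format).
raw_df = (
    spark.read
    .format("parquet")
    .load("s3://our-bucket/path/to/source-data/")
)

# 'T' step: an example transformation (placeholder column and filter).
transformed_df = raw_df.where(F.col("event_date") >= "2024-01-01")

# 'L' step: write the result to a Databricks table (placeholder name).
transformed_df.write.mode("overwrite").saveAsTable("analytics.stg_events")
```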
Can we do these things in a dbt Python model?
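For context, this is a minimal sketch of how we imagine it could look as a dbt Python model with the dbt-databricks adapter (again, the path, column, and model name are placeholders):

```python
# models/staging/stg_events.py  (placeholder model name)

def model(dbt, session):
    dbt.config(materialized="table")

    # "session" is the SparkSession the adapter provides, so the same
    # PySpark reader API is available inside the model.
    raw_df = (
        session.read
        .format("parquet")
        .load("s3://our-bucket/path/to/source-data/")  # placeholder path
    )

    # Same example transformation as above (placeholder column).
    transformed_df = raw_df.where(raw_df["event_date"] >= "2024-01-01")

    # dbt writes the returned DataFrame to the table it manages for this model.
    return transformed_df
```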