Replies: 1 comment
-
Unfortunately, we don't have a Spark plugin. The only way I see this is to use ArcadeDB's Java API. Alternatively, you could check the OrientDB Spark plugin and adapt it for ArcadeDB: https://github.com/orientechnologies/spark-orientdb.
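If it helps, a minimal sketch of that approach from Spark could look like the following. It assumes the Java client's `com.arcadedb.remote.RemoteDatabase` with a `command(language, sql, params...)` method and positional `?` parameters; please double-check the exact class and method names against the current ArcadeDB docs.

```scala
// Rough sketch only -- the RemoteDatabase class, its command(language, sql, params*)
// signature and positional '?' parameters are assumptions; verify against the docs.
import org.apache.spark.sql.DataFrame
import com.arcadedb.remote.RemoteDatabase

object ArcadeSparkWriter {
  /** Writes every row of `df` as a vertex of `vertexType`, one INSERT per row. */
  def writeVertices(df: DataFrame, host: String, port: Int, database: String,
                    user: String, password: String, vertexType: String): Unit = {
    val columns = df.columns
    df.rdd.foreachPartition { rows =>
      // One remote connection per partition, opened on the executor.
      val db = new RemoteDatabase(host, port, database, user, password)
      try {
        rows.foreach { row =>
          val setClause = columns.map(c => s"`$c` = ?").mkString(", ")
          val values    = columns.map(c => row.getAs[AnyRef](c))
          db.command("sql", s"INSERT INTO `$vertexType` SET $setClause", values: _*)
        }
      } finally db.close()
    }
  }
}
```

Opening one connection per partition keeps the writes parallel across executors; grouping each partition into a transaction (if the remote client exposes begin()/commit()) would likely make it noticeably faster than one command per row.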
-
Hi,
I'm looking to ingest data into ArcadeDB in a somewhat reusable way using Spark.
For context, I have a set of what I'll call wrappers: I read data, transform the DataFrames, create properties [RDF], etc., from arbitrary sets of data.
Essentially I want to be able to say dataframe.write(df, arcadeInfo: DatabaseName, URL, Schema / whatever); for any given source/target, Spark tends to have read/write formats or drivers.
GraphX / GraphFrames are libraries that create vertex and edge DataFrames and can write to various graph formats.
Obviously GraphX/GraphFrames probably aren't necessary to ingest into ArcadeDB, but something I'm also curious about is possibly using ArcadeDB as an HTTP sink.
If anyone has any concrete examples, that would be awesome.
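To make the HTTP-sink idea concrete, here is a rough sketch of roughly what I mean. It assumes ArcadeDB's HTTP command endpoint is POST /api/v1/command/{db} on port 2480 with Basic auth and a {"language", "command"} JSON body (I haven't verified that against the docs), and `writeToArcade` is just a hypothetical name for the wrapper.

```scala
// Rough sketch of an "HTTP sink": POST each row as an INSERT to ArcadeDB's
// HTTP command endpoint. Endpoint path, port, auth and payload shape are
// assumptions -- verify against your server version.
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}
import java.util.Base64
import org.apache.spark.sql.DataFrame

object ArcadeHttpSink {
  implicit class ArcadeWriter(df: DataFrame) {
    def writeToArcade(baseUrl: String, database: String, user: String,
                      password: String, vertexType: String): Unit = {
      val columns = df.columns
      val auth = Base64.getEncoder.encodeToString(s"$user:$password".getBytes("UTF-8"))
      df.rdd.foreachPartition { rows =>
        val client = HttpClient.newHttpClient() // one HTTP client per partition
        rows.foreach { row =>
          // Naive value handling for the sketch: only JSON escaping here;
          // production code should build the payload with a JSON library
          // and use SQL parameters instead of string concatenation.
          val setClause = columns.map { c =>
            val v = String.valueOf(row.getAs[AnyRef](c))
              .replace("\\", "\\\\").replace("\"", "\\\"")
            s"`$c` = '$v'"
          }.mkString(", ")
          val body =
            s"""{"language": "sql", "command": "INSERT INTO `$vertexType` SET $setClause"}"""
          val request = HttpRequest.newBuilder()
            .uri(URI.create(s"$baseUrl/api/v1/command/$database"))
            .header("Authorization", s"Basic $auth")
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build()
          client.send(request, HttpResponse.BodyHandlers.ofString())
        }
      }
    }
  }
}
```

With `import ArcadeHttpSink._` in scope that gives the `df.writeToArcade("http://localhost:2480", "mydb", "root", "secret", "Person")` call shape I'm after; a real Spark DataSource (`df.write.format("arcadedb")`) would be the fully reusable version, but that's a lot more plumbing.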