Aborted (core dumped) with hafenkran/duckdb-bigquery with nodejs API #104
Thanks for the report. Unfortunately it's difficult for me to diagnose, because I don't have access to a BigQuery environment, so I can't run your script. (I unsurprisingly get an error about Google credentials.) If there is a problem in Node Neo (as opposed to, say, the BigQuery extension), then it should be possible to create a repro without that extension. Do you still see the problem if you alter your example to avoid using that extension, perhaps by first exporting all or part of the data in a separate step? Also, it would be helpful to know which version of @duckdb/node-api you're using, and on which platform.
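For illustration, a minimal sketch of what such an extension-free repro could look like, assuming the table has first been exported to a local Parquet file in a separate step (the file name `model_evaluations.parquet` is hypothetical, and this uses the `DuckDBInstance.create` / `connect` / `runAndReadAll` surface of @duckdb/node-api):

```typescript
// repro-without-extension.ts — a sketch that reads a locally exported copy of the
// BigQuery table, so the duckdb-bigquery extension is not involved at all.
// 'model_evaluations.parquet' is a hypothetical file produced in a prior export step
// (e.g. with the DuckDB CLI or the Python API).
import { DuckDBInstance } from '@duckdb/node-api';

async function main() {
  const instance = await DuckDBInstance.create(':memory:');
  const connection = await instance.connect();

  // Run the same kind of query against the local Parquet file instead of BigQuery.
  const reader = await connection.runAndReadAll(
    "SELECT task_name, COUNT(*) AS n, AVG(primary_score) AS avg_score " +
    "FROM 'model_evaluations.parquet' GROUP BY task_name ORDER BY task_name"
  );
  console.log(reader.getRows());
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

If this runs cleanly on the same machine, that would point at the extension rather than the Node binding.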
@vwxyzjn: what platforms are those environments? Could you check what's the result of …? It's unclear what to make of the answer then, but it could help track this down.
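As a starting point, here is a hedged sketch of how the platform and DuckDB version information could be collected from the Node side (`process.version`, `process.platform`, `process.arch`, and `PRAGMA version` are standard Node/DuckDB features; the @duckdb/node-api calls are an assumption, not quoted from this thread):

```typescript
// environment-info.ts — collects the details typically asked for in this kind of report.
import { DuckDBInstance } from '@duckdb/node-api';

async function main() {
  console.log('node version :', process.version);
  console.log('platform/arch:', process.platform, process.arch);

  const instance = await DuckDBInstance.create(':memory:');
  const connection = await instance.connect();

  // PRAGMA version reports the DuckDB library version linked into the binding.
  const reader = await connection.runAndReadAll('PRAGMA version');
  console.log(reader.getRows());
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```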
Yeah, I feel like the only way to reproduce this is if you create a BigQuery table yourselves... Here is the command to create the table on BigQuery:

```sql
CREATE OR REPLACE TABLE `ai2-allennlp.deletable.model_evaluations`
AS
SELECT
  ROW_NUMBER() OVER() as id,
  CONCAT('project_', CAST(FLOOR(RAND() * 100) AS STRING)) as project,
  CONCAT('user_', CAST(FLOOR(RAND() * 1000) AS STRING)) as username,
  CONCAT('model_', CAST(FLOOR(RAND() * 1000) AS STRING)) as model_name,
  CONCAT('run_', CAST(FLOOR(RAND() * 1000) AS STRING)) as run_id,
  CASE CAST(FLOOR(RAND() * 5) AS INT64)
    WHEN 0 THEN 'gsm8k'
    WHEN 1 THEN 'ifeval'
    WHEN 2 THEN 'popqa'
    WHEN 3 THEN 'mmlu:cot::summarize'
    ELSE 'mmlu_abstract_algebra:mc'
  END as task_name,
  RAND() as primary_score
FROM
  UNNEST(GENERATE_ARRAY(1, 1000000)); -- 1 million rows
```

Below is the package.json and lock file.
I have detailed the issue over at hafenkran/duckdb-bigquery#58.
Using the DuckDB CLI works and the Python API also works, but the Node.js API fails... A sketch of the failing setup is shown below.
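For context, a sketch of the kind of Node.js script that hits the crash, under the assumption (based on the duckdb-bigquery extension's documented usage, not quoted from this thread) that the extension is installed from the community repository and attached with an `ATTACH 'project=...'` string; the project, dataset, and table names come from the SQL above:

```typescript
// repro-with-extension.ts — a hedged sketch of the failing path via the Node binding.
import { DuckDBInstance } from '@duckdb/node-api';

async function main() {
  const instance = await DuckDBInstance.create(':memory:');
  const connection = await instance.connect();

  // Load the duckdb-bigquery extension and attach the BigQuery project.
  // The INSTALL/LOAD/ATTACH syntax here follows the extension's documentation
  // and is an assumption, not taken from this issue.
  await connection.run("INSTALL bigquery FROM community");
  await connection.run("LOAD bigquery");
  await connection.run("ATTACH 'project=ai2-allennlp' AS bq (TYPE bigquery)");

  // Query the test table created by the SQL snippet above.
  const reader = await connection.runAndReadAll(
    'SELECT * FROM bq.deletable.model_evaluations LIMIT 10'
  );
  console.log(reader.getRows());
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```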