Successfully uploaded graph (.csv) from S3, but cannot access it via notebook


Dear respected team,

Hi, I successfully used the bulk loader to load data from S3 into a Neptune database for SPARQL. However, to compare SPARQL query performance against Gremlin, I uploaded .csv files for vertices and edges to one of my S3 buckets and loaded those files into the Neptune database from EC2, using the correct command. I did not get an error message, just a response like { "status" : "200 OK", ... }

Unfortunately, I get no results when I run the Gremlin query below:
%%gremlin
g.V().count()

I think that, although the loader returns a 200 status message, the .csv files are not being loaded into my database cluster.

What can I do to load the .csv files?

OHCH
Asked 4 years ago · 374 views
2 Answers

The /loader API returns a status 200 stating that the request has been received successfully. To check the status of the load job itself, take the loader ID that is also returned with the status 200 and send a GET request to the endpoint below. This will return the status of the bulk load job.

https://<cluster_endpoint>:8182/loader/<loader_job_id>

If you didn't grab the loader ID from the response, querying the /loader API with a GET request will return a list of all of the bulk load job IDs, ordered from newest to oldest:

https://<cluster_endpoint>:8182/loader
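As a minimal sketch (standard library only), here's one way to fetch and read that list in Python. The cluster endpoint and load IDs are placeholders, and the `loadIds` field follows the response shape documented for the Neptune loader status API; treat the sample payload as illustrative:

```python
import json
import urllib.request

def list_load_ids(cluster_endpoint):
    """GET /loader returns recent bulk load job IDs, newest first."""
    url = f"https://{cluster_endpoint}:8182/loader"
    # Requires network access to the cluster (e.g. from the EC2 instance in the same VPC).
    with urllib.request.urlopen(url) as resp:
        body = json.load(resp)
    return body["payload"]["loadIds"]

# Illustrative response shape, so the parsing can be shown without a live cluster:
sample = {"status": "200 OK", "payload": {"loadIds": ["0a1b-example", "9f8e-example"]}}
latest = sample["payload"]["loadIds"][0]
print(latest)  # the most recent bulk load job ID
```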

In addition, you can add the details and errors parameters to the loader status request to see further details of any errors that occurred during the bulk load job:

https://<cluster_endpoint>:8182/loader/<loader_job_id>?details=true&errors=true
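A quick sketch of building that status URL and interpreting the response in Python. The endpoint and load ID are placeholders, and the sample payload only illustrates a few of the `overallStatus` fields from the documented loader status response:

```python
from urllib.parse import urlencode

def loader_status_url(cluster_endpoint, load_id, details=True, errors=True):
    """Build the status URL for a Neptune bulk load job."""
    query = urlencode({"details": str(details).lower(), "errors": str(errors).lower()})
    return f"https://{cluster_endpoint}:8182/loader/{load_id}?{query}"

# Illustrative response shape; a real call would fetch this JSON from the URL above.
sample_response = {
    "status": "200 OK",
    "payload": {
        "overallStatus": {
            "status": "LOAD_FAILED",
            "totalRecords": 10000,
            "parsingErrors": 42,
            "insertErrors": 0,
        }
    },
}

def summarize(resp):
    """Return (load status, total error count) from a status response."""
    s = resp["payload"]["overallStatus"]
    return (s["status"], s["parsingErrors"] + s.get("insertErrors", 0))

print(loader_status_url("my-cluster-endpoint", "my-load-id"))
print(summarize(sample_response))
```

A status other than LOAD_COMPLETED, or a non-zero error count, would explain why `g.V().count()` returns nothing even though the initial request got a 200.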

Just to confirm, it is possible to load BOTH RDF and Property Graph datasets into the same Neptune cluster. There's nothing incorrect about your approach. You just need to check the bulk load job for the CSV files (Property Graph data) to see if there may have been some parsing errors or something else that caused the data not to be loaded properly.

Edited by: TaylorR-AWS on Oct 29, 2020 7:32 PM

AWS
Answered 4 years ago

Thank you for your fast and kind response!

The problem is caused by my organization's document security policy.

I got the hint I needed from your advice about printing the loader status details and error list.

Again, thank you!

OHCH

Edited by: OHCH on Oct 29, 2020 10:46 PM

OHCH
Answered 4 years ago
