Caching CSV
closed
Patrick Janeiro
I'm trying to do some performance testing on your application, which means I'm constantly swapping data in and out of our DB.
To do this I use a small Python script that runs the LOAD CSV command and points at a public CSV file stored in GCS.
However, trying it recently, it looks like AuraDB may be caching the file, because it keeps loading data that no longer exists.
My query looks something like this:
USING PERIODIC COMMIT 500
LOAD CSV WITH HEADERS FROM 'https://storage.googleapis.com/my-bucket/my-csv' AS row
WITH row
WHERE row.userID IS NOT NULL AND row.projectID IS NOT NULL AND row.environmentID IS NOT NULL
MERGE (user:User {userID: row.userID})
ON CREATE SET user.projectID = row.projectID, user.environmentID = row.environmentID;
This seems to be picking up old data when I check the graph afterwards. Are you doing some level of caching on your side?
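For reference, the script is roughly like the sketch below. This is a minimal version, assuming the official `neo4j` Python driver; the connection URI, credentials, and bucket path are placeholders, not my real values.

```python
CSV_URL = "https://storage.googleapis.com/my-bucket/my-csv"


def build_load_query(csv_url: str) -> str:
    """Assemble the LOAD CSV statement for a given source URL."""
    return (
        "USING PERIODIC COMMIT 500 "
        f"LOAD CSV WITH HEADERS FROM '{csv_url}' AS row "
        "WITH row "
        "WHERE row.userID IS NOT NULL "
        "AND row.projectID IS NOT NULL "
        "AND row.environmentID IS NOT NULL "
        "MERGE (user:User {userID: row.userID}) "
        "ON CREATE SET user.projectID = row.projectID, "
        "user.environmentID = row.environmentID"
    )


def load_csv(uri: str, user: str, password: str) -> None:
    """Run the LOAD CSV statement against the database."""
    # Imported here so the query-building helper above can be used
    # without the driver installed.
    from neo4j import GraphDatabase

    driver = GraphDatabase.driver(uri, auth=(user, password))
    with driver.session() as session:
        session.run(build_load_query(CSV_URL))
    driver.close()
```

I call `load_csv(...)` once per test run, right after uploading a fresh CSV to the bucket.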
Aman Singh
closed
John Kennedy
Patrick Janeiro, hello,
It's great to hear from you. We don't do any caching at the platform level.
There is a "Reset to Blank" function in the UI that could be used to troubleshoot the issue further, but at this point I think it's worth reaching out to our support team for more direct help.
You can open a ticket here: https://aura.support.neo4j.com/hc/en-us