python - Neo4j - Import very large CSV into existing Database


I'm quite new to Neo4j, and I'm lost among out-of-date documentation and commands whose effects and speed are unclear.

I'm looking for a way to import large amounts of data fast. The data is at the billion scale, all of one kind, split across multiple CSV files; I don't mind fusing them into one.

Doing a simple import (LOAD CSV ... CREATE (n:xxx {id: row.id})) takes ages; with a unique index in place it takes days. I stopped the operation, dropped the unique index, and restarted: roughly 2x faster, but still slow.
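For concreteness, the statement looks roughly like this (the file name is a placeholder, and the USING PERIODIC COMMIT batching is something I understand is recommended, though even with it the import is slow):

    USING PERIODIC COMMIT 10000
    LOAD CSV WITH HEADERS FROM 'file:///nodes.csv' AS row
    CREATE (n:xxx {id: row.id});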

I know about neo4j-import (although it is deprecated, and there is no documentation on the Neo4j website for its replacement, "neo4j-admin import"). It's extremely unclear how to do even simple things, like creating something conditionally. The biggest bummer is that it doesn't seem to work with an existing database.
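As far as I can tell, the invocation would be something like the following (Neo4j 3.x syntax; the file names are hypothetical, and the header file would contain a line such as id:ID), but it can only populate a brand-new database, not an existing one:

    bin/neo4j-admin import --mode=csv --database=new.db \
        --id-type=STRING \
        --nodes:xxx "nodes-header.csv,nodes-part1.csv,nodes-part2.csv"

You would then, as I understand it, have to point dbms.active_database in neo4j.conf at the new store.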

The main question is: is there any way to accelerate the import of large CSV files into Neo4j? The first concern is a simple CREATE statement, but MATCH matters as well. Right now, running a Cypher command such as MATCH (n:x {id: "y"}) RETURN n LIMIT 1 takes multiple minutes on 1B nodes.
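I suspect that lookup is doing a full label scan because I dropped the index; presumably it would have to be recreated after the bulk load, with something like this (3.x syntax, using the label and property from my query):

    CREATE CONSTRAINT ON (n:x) ASSERT n.id IS UNIQUE;
    // or, if uniqueness isn't required, a plain index:
    CREATE INDEX ON :x(id);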

(I'm running on a server with 200GB+ of RAM and 48 CPUs, so this is not a limitation from a hardware point of view.)
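In case it matters, I assume the relevant knobs are the heap and page cache sizes in neo4j.conf; the values below are guesses for this machine, not settings I know to be correct:

    dbms.memory.heap.initial_size=31g
    dbms.memory.heap.max_size=31g
    dbms.memory.pagecache.size=150g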

