Many customers want to integrate Kafka and other streaming solutions with Neo4j to ingest data into the graph from other sources.
Neo4j Streams can run in two modes:
- as a Neo4j plugin (deprecated): because AuraDB is a managed database service running on the public cloud, users cannot install plugins into an AuraDB instance, so this mode is not available for AuraDB.
- as a Kafka Connect plugin: a plugin for the Confluent Platform that ingests data into Neo4j from Kafka topics.
The documentation below provides an example of how to use the Neo4j Kafka Connect sink with the Cypher template strategy to stream data from Kafka topics to a Neo4j AuraDB instance on Linux (CentOS Linux release 7.9.2009).
1. Install Confluent Kafka and configure the required shell parameters. (This example uses Confluent Platform 7.0.0.)
mkdir ~/kafka && cd ~/kafka
curl -O https://packages.confluent.io/archive/7.0/confluent-7.0.0.zip && unzip confluent-7.0.0.zip
# edit your shell profile and add the lines below, then reload it:
export CONFLUENT_HOME=~/kafka/confluent-7.0.0
export KAFKA_HEAP_OPTS="-Xms2g -Xmx2g"
export PATH="${CONFLUENT_HOME}/bin:$PATH"
# reload the profile, e.g.:
source ~/.bashrc
2. Install a Java runtime, version 1.8 or later.
3. Install the Kafka Connect "datagen" and "neo4j" plugins. (We will use the datagen source plugin to generate demo data.)
confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:latest
confluent-hub install --no-prompt neo4j/kafka-connect-neo4j:2.0.2
4. Start the Confluent/Kafka services
confluent local services start
5. Open the Kafka admin console (Confluent Control Center) Web UI:
http://<hostname>:9021
6. Create an "orders" topic with default configuration
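If you prefer the command line to the Web UI, the topic can also be created with the kafka-topics CLI that ships with Confluent. This is a sketch: localhost:9092 is the default broker address used by `confluent local` and may differ in your environment.

```shell
# Create the "orders" topic with a single partition
# (assumes a broker listening on localhost:9092).
kafka-topics --bootstrap-server localhost:9092 \
  --create --topic orders --partitions 1 --replication-factor 1

# Confirm the topic exists.
kafka-topics --bootstrap-server localhost:9092 --describe --topic orders
```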
7. Create a datagen source connector for "orders" with the following configuration:
{
  "name": "DatagenOrders",
  "connector.class": "io.confluent.kafka.connect.datagen.DatagenConnector",
  "tasks.max": "1",
  "key.converter": "org.apache.kafka.connect.json.JsonConverter",
  "value.converter": "org.apache.kafka.connect.json.JsonConverter",
  "kafka.topic": "orders",
  "max.interval": "2000",
  "quickstart": "orders",
  "key.converter.schemas.enable": "false",
  "value.converter.schemas.enable": "false"
}
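As an alternative to the Web UI, the same connector can be registered through the Kafka Connect REST API. The sketch below saves the configuration to a file and validates it locally before submitting; the host localhost and REST port 8083 are the Connect worker defaults and may differ in your setup.

```shell
# Save the datagen connector configuration to a file.
cat > /tmp/datagen-orders.json <<'EOF'
{
  "name": "DatagenOrders",
  "connector.class": "io.confluent.kafka.connect.datagen.DatagenConnector",
  "tasks.max": "1",
  "key.converter": "org.apache.kafka.connect.json.JsonConverter",
  "value.converter": "org.apache.kafka.connect.json.JsonConverter",
  "kafka.topic": "orders",
  "max.interval": "2000",
  "quickstart": "orders",
  "key.converter.schemas.enable": "false",
  "value.converter.schemas.enable": "false"
}
EOF

# Check the file is well-formed JSON before submitting it.
python3 -m json.tool /tmp/datagen-orders.json > /dev/null && echo "valid JSON"

# Submit it to a running Kafka Connect worker (default REST port 8083):
#   curl -X PUT -H "Content-Type: application/json" \
#        --data @/tmp/datagen-orders.json \
#        http://localhost:8083/connectors/DatagenOrders/config
```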
8. Create a Neo4j sink connector with a Cypher template, using this configuration:
{
  "name": "Neo4jSinkConnector",
  "connector.class": "streams.kafka.connect.sink.Neo4jSinkConnector",
  "key.converter": "org.apache.kafka.connect.json.JsonConverter",
  "value.converter": "org.apache.kafka.connect.json.JsonConverter",
  "key.converter.schemas.enable": "false",
  "value.converter.schemas.enable": "false",
  "topics": "orders",
  "neo4j.topic.cypher.orders": "MERGE (r:Order {itemid: event.itemid}) SET r.itemid=event.itemid, r.ordertime=event.ordertime;",
  "neo4j.server.uri": "neo4j+s://abcd1234.databases.neo4j.io:7687",
  "neo4j.authentication.basic.username": "neo4j",
  "neo4j.authentication.basic.password": "*******************************************",
  "neo4j.connection.max.pool.size": "5",
  "neo4j.encryption.enabled": "true",
  "neo4j.database": "neo4j"
}
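The sink connector can likewise be registered through the Connect REST API instead of the Web UI. The sketch below writes the configuration to a file and validates it locally; the AuraDB URI is the example placeholder from this article, and `<your-aura-password>` must be replaced with your instance's credentials.

```shell
# Save the Neo4j sink connector configuration to a file.
cat > /tmp/neo4j-sink.json <<'EOF'
{
  "name": "Neo4jSinkConnector",
  "connector.class": "streams.kafka.connect.sink.Neo4jSinkConnector",
  "key.converter": "org.apache.kafka.connect.json.JsonConverter",
  "value.converter": "org.apache.kafka.connect.json.JsonConverter",
  "key.converter.schemas.enable": "false",
  "value.converter.schemas.enable": "false",
  "topics": "orders",
  "neo4j.topic.cypher.orders": "MERGE (r:Order {itemid: event.itemid}) SET r.itemid=event.itemid, r.ordertime=event.ordertime;",
  "neo4j.server.uri": "neo4j+s://abcd1234.databases.neo4j.io:7687",
  "neo4j.authentication.basic.username": "neo4j",
  "neo4j.authentication.basic.password": "<your-aura-password>",
  "neo4j.connection.max.pool.size": "5",
  "neo4j.encryption.enabled": "true",
  "neo4j.database": "neo4j"
}
EOF

# Check the file is well-formed JSON before submitting it.
python3 -m json.tool /tmp/neo4j-sink.json > /dev/null && echo "valid JSON"

# Submit it to a running Kafka Connect worker (default REST port 8083):
#   curl -X PUT -H "Content-Type: application/json" \
#        --data @/tmp/neo4j-sink.json \
#        http://localhost:8083/connectors/Neo4jSinkConnector/config
```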
9. Now, open the Neo4j AuraDB Browser to verify that data has been successfully streamed from Kafka to AuraDB.
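As a sketch, the same check can be done from the command line with cypher-shell (shipped with Neo4j); the URI below is the example placeholder used throughout this article, and the credentials must be replaced with your own.

```shell
# Count the Order nodes created by the sink connector
# (hypothetical URI and credentials -- substitute your own).
cypher-shell -a neo4j+s://abcd1234.databases.neo4j.io -u neo4j -p '<your-aura-password>' \
  "MATCH (o:Order) RETURN count(o) AS orders, max(o.ordertime) AS latest;"
```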