Here is a step-by-step guide on how to set up the Neo4j Connector for Apache Spark on Azure Databricks.
Step 1: In the Azure portal, click Create to create an Azure Databricks workspace
Step 2: Create the "demo1" workspace
Step 3: Launch the Workspace
Step 4: Create a Cluster
Step 5: Install the Neo4j Connector for Apache Spark library on the cluster from Maven Central (see the coordinate note after the code examples below)
Step 6: Create a notebook that reads from the Aura instance (right-click in the workspace file browser to create the notebook)
Step 7: Reading from Aura:
# Read every node with the Person label from the Aura instance into a DataFrame.
# Replace <dbid> and the credentials below with your own Aura connection details.
df = spark.read.format("org.neo4j.spark.DataSource")\
    .option("url", "neo4j+s://<dbid>.databases.neo4j.io")\
    .option("authentication.type", "basic")\
    .option("authentication.basic.username", "neo4j")\
    .option("authentication.basic.password", "password")\
    .option("labels", "Person")\
    .load()
display(df)
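If you need more control over what comes back than a single label, the connector can also run a Cypher query on read. The sketch below is illustrative only: it assumes the same Aura instance and credentials as the example above, and the query and the "name" column are hypothetical.
# A minimal sketch of a query-based read, assuming the same <dbid> and credentials as above;
# the Cypher query and the "name" column are illustrative only.
people_df = spark.read.format("org.neo4j.spark.DataSource")\
    .option("url", "neo4j+s://<dbid>.databases.neo4j.io")\
    .option("authentication.type", "basic")\
    .option("authentication.basic.username", "neo4j")\
    .option("authentication.basic.password", "password")\
    .option("query", "MATCH (p:Person) RETURN p.name AS name")\
    .load()
display(people_df)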
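For Step 5, the library to install is published to Maven Central. The coordinate below only indicates the general form; check the Neo4j Connector for Apache Spark documentation or Maven Central for the exact group, artifact, and version that match your cluster's Spark and Scala versions.
org.neo4j:neo4j-connector-apache-spark_<scala version>:<connector version>_for_spark_3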