Open
Description
Hi all, I'm running into a connection exception when trying to connect to our Redis cluster and load a DataFrame using the com.redislabs:spark-redis_2.12:3.1.0 package from a PySpark notebook in Databricks. I've tried setting spark.redis.host and the other config options in the cluster settings as mentioned in this issue[0], as well as specifying them directly in the code, but I still get the same error message. Has anyone run into something similar when running this through Databricks?
from pyspark.sql.types import IntegerType, StringType, StructField, StructType

df = spark.read.format(
    "org.apache.spark.sql.redis"
).schema(
    StructType(
        [
            StructField("classroom_cup_identifier", StringType(), True),
            StructField("cycle_id", StringType(), True),
            StructField("classroom_id", StringType(), True),
            StructField("total_answers_correct", IntegerType(), True),
        ]
    )
).option(
    "keys.pattern", "classroom-cup:classroom-versus-classroom-leaderboard:*"
).option(
    "key.column", "classroom_cup_identifier"
).load()
df.show()
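
For reference, this is roughly what I mean by specifying the connection in the code. It's only a sketch: the host, port, and auth values are placeholders, and I'm assuming the per-DataFrame host/port/auth reader options override the cluster-level spark.redis.* settings.

# Sketch only: host/port/auth values are placeholders, not our real settings.
# Assumes the same pyspark.sql.types imports as the snippet above.
schema = StructType(
    [
        StructField("classroom_cup_identifier", StringType(), True),
        StructField("cycle_id", StringType(), True),
        StructField("classroom_id", StringType(), True),
        StructField("total_answers_correct", IntegerType(), True),
    ]
)

df = (
    spark.read.format("org.apache.spark.sql.redis")
    .schema(schema)
    .option("host", "redis.example.com")   # placeholder endpoint
    .option("port", "6379")
    .option("auth", "<redis-password>")    # placeholder credential
    .option("keys.pattern", "classroom-cup:classroom-versus-classroom-leaderboard:*")
    .option("key.column", "classroom_cup_identifier")
    .load()
)
df.show()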
[0] #357