# SparkSession.sparkContext

Returns the underlying `SparkContext`.

## Syntax

```python
sparkContext
```

## Returns

`SparkContext`

## Examples

```python
spark.sparkContext
# <SparkContext master=... appName=...>

# Create an RDD from the Spark context.
rdd = spark.sparkContext.parallelize([1, 2, 3])
rdd.collect()
# [1, 2, 3]
```