How To Check the Apache Spark Version

In this article, we will discuss several ways to check the Apache Spark version installed on your system. You can use any of the following methods:

1. Command-Line Interface (CLI)

Open a terminal or command prompt and run the following command:

spark-submit --version

This command prints the Spark version banner, along with build details such as the Scala version Spark was built with.
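
If you need the version inside a script rather than on the console, here is a minimal Python sketch, assuming spark-submit is on your PATH. Because spark-submit typically writes its version banner to stderr, the sketch merges stderr into stdout before parsing:

import re
import subprocess

# Run "spark-submit --version", merging stderr (where the banner is
# usually written) into stdout.
result = subprocess.run(
    ["spark-submit", "--version"],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    text=True,
)

# The banner contains a line like "version 3.4.1"; the exact layout
# can vary between releases.
match = re.search(r"version\s+(\d+\.\d+\.\d+)", result.stdout)
print(match.group(1) if match else "Spark version not found")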

2. Spark Shell

If you have Spark installed locally, you can launch the Spark Shell by running the following command:

spark-shell

Once the shell starts, the Spark version appears in the startup banner.
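
You don't have to scan the banner: in Spark 2.0 and later, both spark-shell and pyspark create a SparkSession named spark at startup, so you can print the version directly at the prompt:

spark.version

In spark-shell, sc.version on the pre-built SparkContext returns the same string.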

3. Programmatically using Scala

If you have a Spark application written in Scala, you can check the Spark version programmatically by adding the following code snippet to your application:

import org.apache.spark.{SparkConf, SparkContext}

// A master URL and app name are required to start a context; "local[*]"
// runs Spark on this machine (adjust both for your environment).
val conf = new SparkConf().setAppName("VersionCheck").setMaster("local[*]")
val sparkContext = new SparkContext(conf)
val sparkVersion = sparkContext.version
println(s"Spark version: $sparkVersion")

This code configures and creates a SparkContext, then reads the version from its version method.

4. Programmatically using Python

If you have a Spark application written in Python, you can check the Spark version programmatically by adding the following code snippet to your application:

from pyspark.sql import SparkSession

# Reuse the active SparkSession if one exists, otherwise create one.
spark = SparkSession.builder.getOrCreate()
spark_version = spark.version
print("Spark version:", spark_version)

This code creates (or reuses) a SparkSession and retrieves the version from its version attribute.
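
If you only need the version of the PySpark package installed in your Python environment, you can skip starting a session entirely; note that this reflects the local package, which can differ from the Spark version running on a remote cluster:

import pyspark

# Version string of the locally installed PySpark package; no session
# (and no JVM) is started.
print("PySpark version:", pyspark.__version__)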

For a comprehensive guide covering every step, from preparing your Ubuntu 20.04 LTS environment to starting Apache Spark, see the How To Install Apache Spark on Ubuntu 20.04 LTS article.

Conclusion

Whether you use the spark-submit CLI, the interactive shells, or the version property in your own code, each of these methods tells you exactly which Apache Spark version you are running. Pick whichever fits your workflow.
