11. Can RDD be shared between SparkContexts?

Ans: No. When an RDD is created, it belongs to and is completely owned by the SparkContext it originated from. RDDs cannot be shared between SparkContexts.


12. In the Spark shell, which contexts are available by default?

Ans: SparkContext (available as sc) and SQLContext (available as sqlContext)

13. Give a few examples of how an RDD can be created using SparkContext

Ans: SparkContext allows you to create many different RDDs from input sources like:

· Scala collections: e.g. sc.parallelize(0 to 100)

· Local or remote filesystems: sc.textFile("README.md")

· Any Hadoop InputFormat: using sc.newAPIHadoopFile
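The three approaches above can be sketched as follows in a Spark shell session, where sc (the SparkContext) is predefined; the file paths here are illustrative, not from the original:

```scala
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat

// 1. From a Scala collection
val fromCollection = sc.parallelize(0 to 100)

// 2. From a local or remote file (path is an example)
val fromTextFile = sc.textFile("README.md")

// 3. From any Hadoop InputFormat (path and classes are examples)
val fromHadoop = sc.newAPIHadoopFile(
  "hdfs:///data/input",
  classOf[TextInputFormat],
  classOf[LongWritable],
  classOf[Text])
```

Each call returns an RDD tied to the sc that created it, which is why (per question 11) such RDDs cannot be handed to a different SparkContext.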

14. How would you broadcast a collection of values to the Spark executors?

Ans: Using sc.broadcast, e.g. sc.broadcast(Seq(1, 2, 3)); tasks read the broadcast value on executors via its .value method

15. What is the advantage of broadcasting values across Spark Cluster?

Ans: Spark transfers the value to each executor only once; all tasks running on that executor can then share it, avoiding repeated network transfers when the value is needed multiple times.
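A minimal sketch of this pattern, assuming a Spark shell with sc predefined (the lookup map and codes are illustrative data, not from the original):

```scala
// Broadcast a lookup table once per executor instead of shipping it
// with every task closure.
val countryNames = sc.broadcast(Map("US" -> "United States", "DE" -> "Germany"))

val codes = sc.parallelize(Seq("US", "DE", "US"))

// Tasks access the shared copy through .value; no repeated network
// transfer occurs, however many tasks reference it.
val names = codes.map(code => countryNames.value.getOrElse(code, "Unknown"))
names.collect()
```

Without the broadcast, the map would be serialized into every task's closure; with it, each executor holds one read-only copy shared by all of its tasks.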