[Solved] MapReduce to Spark
This is a very broad question, but the short of it is:

1. Create an RDD of the input data.
2. Call map with your mapper code, emitting key-value pairs.
3. Call reduceByKey with your reducer code.
4. Write the resulting RDD to disk.

Spark is more flexible than MapReduce: there is a great variety of methods that you …
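As a concrete illustration of those steps, here is a minimal word-count sketch using Spark's RDD API in Scala. The input path `input.txt`, the output directory `word-counts-output`, and the `local[*]` master are placeholders for this example only, not part of the original answer.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object WordCountRDD {
  def main(args: Array[String]): Unit = {
    // Local mode for illustration; on a cluster the master is normally set via spark-submit.
    val conf = new SparkConf().setAppName("MapReduceToSpark").setMaster("local[*]")
    val sc   = new SparkContext(conf)

    // 1. Create an RDD of the input data (placeholder path).
    val lines = sc.textFile("input.txt")

    // 2. "Mapper": emit key-value pairs, here (word, 1) for each word.
    val pairs = lines
      .flatMap(_.split("\\s+"))
      .filter(_.nonEmpty)
      .map(word => (word, 1))

    // 3. "Reducer": combine the values for each key with reduceByKey.
    val counts = pairs.reduceByKey(_ + _)

    // 4. Write the resulting RDD to disk (placeholder output directory).
    counts.saveAsTextFile("word-counts-output")

    sc.stop()
  }
}
```

One difference from a MapReduce reducer worth noting: reduceByKey also combines values map-side (much like a combiner), so the function you pass must be associative and commutative.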