This topic contains 1 reply, has 2 voices, and was last updated by Shahana syed 4 months ago.

#1984 Reply


    1) The new API uses Mapper and Reducer as classes, whereas the old API used Mapper and Reducer as interfaces.

    2) The new API lives in the org.apache.hadoop.mapreduce package, whereas the old API can still be found in org.apache.hadoop.mapred.

    3) In the new API, job control is done through the Job class; in the old API it was done through JobClient.

    4) In the new API, the reduce() method receives its values as a java.lang.Iterable, whereas in the old API the reduce() method received them as a java.util.Iterator.
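Point 4 is more than a spelling change: a plain Iterator is exhausted after one pass, while an Iterable can hand out a fresh iterator for each for-each loop. A minimal plain-JDK sketch of that difference (the list simply stands in for a reducer's value stream; no Hadoop classes are involved):

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

public class IterableVsIterator {
    public static void main(String[] args) {
        // Stand-in for the values a reducer receives for one key.
        List<Integer> values = Arrays.asList(1, 2, 3);

        // Old-API style: a single Iterator is spent after one pass.
        Iterator<Integer> it = values.iterator();
        int firstPass = 0;
        while (it.hasNext()) firstPass += it.next();
        System.out.println("first pass sum: " + firstPass);
        System.out.println("iterator has more: " + it.hasNext());

        // New-API style: an Iterable works directly with for-each
        // and can be asked for a new iterator each time.
        Iterable<Integer> iterable = values;
        int secondPass = 0;
        for (int v : iterable) secondPass += v;
        System.out.println("second pass sum: " + secondPass);
    }
}
```

Note this shows the Java types only; inside a real Hadoop reducer the framework may back the Iterable with a single underlying stream, so re-iteration there should not be relied upon.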

    #6902 Reply

    Shahana syed

    Differences between the old MapReduce API and the new API:

    1) Old API: the name of the output file doesn't tell us whose output it is, the mapper's or the reducer's. New API: the name of the output file tells us whose output it is (we can say it is the reducer's if the name is part-r-00000).

    2) Old API: uses OutputCollector and Reporter. New API: uses Context.

    3) Old API: Mapper and Reducer are interfaces (public class Shahana extends MapReduceBase implements Mapper). New API: Mapper and Reducer are classes (public class Shahana extends Mapper).

    4) Old API: requires extending MapReduceBase. New API: MapReduceBase is not needed.
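To make the table above concrete, here is a sketch of a word-count job written against the new API: the inner classes extend Mapper and Reducer directly (no MapReduceBase, no implements), emit through Context, the reducer receives an Iterable, and job control goes through the Job class. The class and job names are illustrative, and this compiles only with the Hadoop MapReduce client libraries on the classpath.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountNewApi {

    // New API: extend the Mapper class directly; no MapReduceBase, no interface.
    public static class TokenMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                word.set(token);
                context.write(word, ONE);  // Context replaces OutputCollector/Reporter
            }
        }
    }

    // New API: reduce() receives values as an Iterable, not an Iterator.
    public static class SumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) sum += v.get();
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        // New API: job control via the Job class (the old API used JobClient).
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCountNewApi.class);
        job.setMapperClass(TokenMapper.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // Reducer output lands in files named part-r-00000, part-r-00001, ...
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```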

Reply To: Difference between Hadoop old API and new API