• Around 5 years of IT experience as a developer, designer, and quality reviewer, with cross-platform integration experience using Hadoop, Java, J2EE, and SOA.
• Skilled in installing, configuring, and using Apache Hadoop ecosystem components such as MapReduce, Hive, Pig, Sqoop, Flume, YARN, Spark, Kafka, and Oozie.
• Strong understanding of Hadoop daemons and MapReduce concepts (see the mapper sketch after this list).
• Strong experience in importing and exporting data to and from HDFS.
• Experienced in developing UDFs for Hive using Java (see the UDF sketch after this list).
• Worked with Apache Falcon, a data governance engine that defines, schedules, and monitors data management policies.
• Hands-on experience with Hadoop, HDFS, MapReduce, and the Hadoop ecosystem (Pig, Hive, Oozie, Flume, and HBase).
• Strong knowledge of NoSQL databases such as HBase, MongoDB, and Cassandra.
• Experience working with Angular 4, Node.js, Bookshelf.js, Knex.js, and MariaDB.
• Understanding of data storage and retrieval techniques, ETL, and databases, including graph stores, relational databases, and tuple stores.
• Good skills in developing reusable solutions to maintain consistent coding standards across different Java projects.
• Good exposure to Python programming.
• Expertise in debugging and performance tuning of Oracle and Java applications, with strong knowledge of Oracle 11g and SQL.
• Ability to work effectively in cross-functional team environments and experience providing training to business users.
• Good experience using Sqoop for traditional RDBMS data pulls into HDFS.
• Good working knowledge of Flume.
• Worked with the Apache Ranger console to create and manage access policies for files, folders, databases, tables, and columns.
• Worked with the YARN Queue Manager to allocate queue capacities to different service accounts.
• Hands-on experience with Hortonworks and Cloudera Hadoop environments.
• Familiar with handling complex data processing jobs using Cascading.
• Strong database skills in IBM DB2 and Oracle; proficient in database development, including constraints, indexes, views, stored procedures, triggers, and cursors.
• Extensive experience in Shell scripting.
• Experience in component design using UML: use case, class, sequence, deployment, and component diagrams for the requirements.
• Excellent analytical and programming abilities in using technology to create flexible and maintainable solutions for complex development problems.
• Good communication and presentation skills; willing to learn and adapt to new technologies and third-party products.
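
To illustrate the Hive UDF work mentioned above, here is a minimal sketch of a string-upper-casing UDF in Java using Hive's classic UDF base class; the class name and registered function name are illustrative placeholders, not taken from a specific project.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Minimal Hive UDF sketch: upper-cases a string column value.
    // Registered in Hive with, e.g.:
    //   ADD JAR /path/to/udf.jar;
    //   CREATE TEMPORARY FUNCTION to_upper AS 'UpperCaseUDF';
    public class UpperCaseUDF extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null;  // NULL columns arrive as null and are passed through
            }
            return new Text(input.toString().toUpperCase());
        }
    }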
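
As a minimal sketch of the MapReduce concepts noted above, the classic word-count mapper in Java is shown below; the class name and whitespace tokenization are illustrative assumptions.

    import java.io.IOException;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Word-count mapper: emits (word, 1) for every whitespace-separated token in a line.
    public class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);  // the reducer sums the 1s per word
                }
            }
        }
    }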