Sunday, January 21, 2018

Satish

• Over 9 years of overall experience in financial, marketing, and enterprise application development across diverse industries, including hands-on experience with Big Data ecosystem technologies. 
• Three years of comprehensive experience as a Hadoop Developer. 
• Experience in writing Hadoop Jobs for analyzing data using Hive and Pig 
• Experience in installation, configuration, support, and management of Hadoop clusters using Apache and Cloudera (CDH3, CDH4) distributions and on Amazon Web Services (AWS). 
• Hands-on experience with major components of the Hadoop ecosystem, including Hive, HBase, HBase-Hive integration, Pig, Sqoop, Flume, MapReduce, Spark, Kafka, Storm, and Oozie. 
• Set up standards and processes for Hadoop based application design and implementation. 
• Extensive experience in working with NoSQL databases, including HBase, Cassandra, and MongoDB. 
• Experience in writing MapReduce programs with Apache Hadoop to process Big Data. 
• Experience in using Pig, Hive, Sqoop, HBase and Cloudera Manager. 
• Experience in importing and exporting data with Sqoop between HDFS and relational database systems (RDBMS). 
• In-depth understanding of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and MapReduce concepts. 
• Hands-on experience in application development using Java, RDBMS, and Linux shell scripting. 
• Extending Hive and Pig core functionality by writing custom UDFs. 
• Experience in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java. 
• Familiarity and experience with Data warehousing and ETL tools. 
• Experienced in job workflow scheduling and monitoring tools such as Oozie and ZooKeeper. 
• Experienced in analyzing/processing data using HiveQL, Storm, Kafka, Redis, Flume, Sqoop, Pig Latin, and custom MapReduce programs in Java. 
• Knowledge of importing and exporting data using Flume and Kafka. 
• Familiarity and experience with popular frameworks such as Struts 1.1, Hibernate 3.0, Spring IoC, Spring AOP, and Spring JDBC. 
• Experience with middleware architectures using Sun Java technologies such as J2EE, JSP 2.0, Servlets 2.4, JDBC, and JUnit, and with application servers such as WebSphere 7.1 and WebLogic 10.3. 
• Good understanding of XML methodologies (XML, XSL, XSD), including Web Services (JAX-WS specification) and SOAP. 
• Experience in Web Services using XML, HTML and SOAP. 
• Experience in component design using UML: Use Case, Class, Sequence, Deployment, and Component diagrams for the requirements. 
• Experience in Message based systems using JMS, TIBCO & MQ Series. 
• Experience in writing database objects like Stored Procedures, Triggers, SQL, PL/SQL packages and Cursors for Oracle, SQL Server, DB2 and Sybase. 
• Experienced in using CVS, SVN, and SharePoint for version management. 
• Proficient in unit testing applications with JUnit and MRUnit, and in application logging with Log4j. 
• Ability to blend technical expertise with strong conceptual, business, and analytical skills to deliver quality solutions, results-oriented problem solving, and leadership.
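The MapReduce experience listed above can be illustrated with a minimal in-memory sketch of the classic word-count pattern. This is plain Java with no Hadoop dependencies, and the class and method names are hypothetical, not taken from any project above:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Illustrative sketch of the map/shuffle/reduce data flow behind word count.
public class WordCountSketch {

    // Map phase: emit a (word, 1) pair for every token in a line.
    public static Stream<Map.Entry<String, Integer>> map(String line) {
        return Arrays.stream(line.toLowerCase().split("\\s+"))
                     .filter(w -> !w.isEmpty())
                     .map(w -> Map.entry(w, 1));
    }

    // Shuffle + reduce phases: group the pairs by key and sum the counts.
    public static Map<String, Integer> run(List<String> lines) {
        return lines.stream()
                    .flatMap(WordCountSketch::map)
                    .collect(Collectors.groupingBy(
                            Map.Entry::getKey,
                            Collectors.summingInt(Map.Entry::getValue)));
    }

    public static void main(String[] args) {
        System.out.println(run(List.of("big data big wins", "data wins")));
    }
}
```

On a real cluster the same logic would be split into `Mapper` and `Reducer` classes and submitted as a Hadoop job; the in-memory streams here only mirror the data flow.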
