Sunday, January 21, 2018

Sharath

• 7+ years of work experience in application and product development using Hadoop, Java/J2EE, Mainframe, and ETL technologies.
• Experience with major components of the Hadoop ecosystem, including HDFS, YARN, MapReduce, Pig, Hive, Sqoop, HBase, Kafka, Spark, Oozie, Flume, Hue, and Impala.
• Experience with the Cloudera CDH and Hortonworks HDP 2.x distributions.
• Experience in large-scale data processing on an Amazon EC2 cluster. 
• Experience with backend databases such as Oracle, DB2, MySQL, SQL Server, and mainframe data sources, and with NoSQL databases such as HBase, MongoDB, and Cassandra.
• Excellent programming skills, with experience in Java, C, Scala, SQL, and Python.
• Experience with the MapReduce framework, including the MR daemons, the sort-and-shuffle phase, and task execution.
• Experience in developing MapReduce jobs and Pig scripts within the Hadoop ecosystem.
• Experience in automating Hadoop installation and configuration and in maintaining Hadoop clusters and HDFS.
• Experience in coordinating cluster services through ZooKeeper.
• Experience in writing Pig and Hive UDFs in Java.
• Experience in developing Hive queries to process data and generate data cubes for visualization.
• Experience in bulk-loading large volumes of data into HBase using its MapReduce integration.
• Expertise in loading and transforming large sets of semi-structured and unstructured data using Pig Latin operations.
• Experienced in migrating MapReduce programs to Spark RDD transformations and actions to improve performance.
• Experience in developing Pig programs for loading and filtering streaming data ingested into HDFS via Flume.
• Worked with Sqoop to import and export data between RDBMSs such as Oracle and HDFS.
• Worked with Apache Flume for collecting, aggregating, and moving large amounts of data from application servers.
• Experience with job workflow scheduling and monitoring tools such as Oozie.
• Good knowledge of J2EE design patterns and Core Java Design Patterns. 
• Good experience in Core Java, J2EE, JavaScript, Servlets, Struts, Spring, Hibernate, JDBC, EJB, XML, HTML, XHTML, DHTML, AJAX, JSTL, and CSS.
• Good knowledge on ETL tools Informatica and DataStage. 
• Experience designing and scheduling ETL and data-integration workflows for IT and BI analysis.
• Experience using various Hadoop distributions (Cloudera, Hortonworks, MapR, etc.) to fully implement and leverage new Hadoop features.
• Familiar with data architecture, including data-ingestion pipeline design, Hadoop information architecture, data modeling, data mining, machine learning, and advanced data processing.
• Extensive experience in implementing SOAP- and REST-based web services.
• Experience in using IDEs like Eclipse and NetBeans. 
• Working knowledge of and hands-on experience with Agile and Waterfall methodologies.
• Exceptional ability to learn new technologies and deliver results under tight deadlines.
• Experienced in designing and analyzing business use cases to provide the right solutions for POCs in Hadoop projects.
• Quick learner with effective communication, motivation, and organizational skills, and an excellent track record of leading end-to-end implementation and integration projects to successful completion under deadlines.
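As a concrete illustration of the Hive UDF work listed above, here is a minimal sketch of the kind of logic such a UDF encapsulates (the class name, column semantics, and country-code mapping are hypothetical examples, not from an actual project). In a real Hive UDF this logic would live in the evaluate() method of a class extending org.apache.hadoop.hive.ql.exec.UDF, with hive-exec on the classpath; it is written here as plain Java so the snippet stands alone:

```java
// Hypothetical sketch: normalizing a free-text country field, the kind of
// cleanup a custom Hive UDF typically performs before aggregation.
// In Hive this would be registered with CREATE FUNCTION and called in HiveQL.
public class NormalizeCountry {
    public static String evaluate(String raw) {
        if (raw == null) {
            return null; // Hive passes NULLs through; UDFs must be null-safe
        }
        String s = raw.trim().toUpperCase();
        switch (s) {
            case "USA":
            case "U.S.A.":
            case "UNITED STATES":
                return "US";
            case "UK":
            case "UNITED KINGDOM":
                return "GB";
            default:
                return s; // pass through anything we don't recognize
        }
    }
}
```

Once packaged in a jar and registered, such a function would be invoked inline in a Hive query, e.g. `SELECT normalize_country(country), COUNT(*) FROM visits GROUP BY normalize_country(country);`.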
