ER - Hadoop Engineer

Date: 04-Mar-2023

Location: Bangalore, KA, IN, 560100

Company: Altimetrik

Company Overview

Altimetrik delivers outcomes for our clients by rapidly enabling digital business and culture and infusing speed and agility into enterprise technology and connected solutions. We are practitioners of end-to-end business and technology transformation. We tap into an organization’s technology, people, and assets to fuel fast, meaningful results for global enterprise customers across financial services, payments, retail, automotive, healthcare, manufacturing, and other industries. Founded in 2012 and with offices across the globe, Altimetrik makes industries, leaders and Fortune 500 companies more agile, empowered and successful.

Altimetrik helps companies get “unstuck”. We’re a technology company that gives organizations a process and context to solve problems in unconventional ways. We’re a catalyst for an organization’s talent and technology, helping teams push boundaries and challenge traditional approaches. We make delivery bolder, more efficient, more collaborative and even more enjoyable.

Role Definition

· Responsible for implementation and ongoing administration of Hadoop infrastructure.

· Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.

· Working with data delivery teams to set up new Hadoop users/applications. This includes onboarding activities such as setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig, Spark and MapReduce access for the new users/applications.

· Cluster maintenance as well as creation and removal of nodes using tools like Ambari and other home-grown tools.

· Performance tuning of Hadoop clusters and Hadoop workloads.

· Screen Hadoop cluster job performance and perform capacity planning at the application/queue level.

· Monitor Hadoop cluster connectivity and security.

· Manage and review Hadoop log files.

· File system management and monitoring.

· HDFS support and maintenance.

· Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability.

· Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.

· Optimize analytical jobs and queries against data in the HDFS/Hive environments.

· Develop bash or Python scripts and use Linux utilities and commands to ease day-to-day operations.

· Maintain central dashboards for all system, data, utilization and availability metrics.
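The onboarding responsibilities above (Linux users, Kerberos principals, access smoke tests) can be sketched as a shell dry-run. This is an illustrative assumption of what such a runbook might contain, not a prescribed procedure: the user name, realm, keytab path, and group are all placeholders, and the script prints the commands for review rather than executing them.

```shell
#!/bin/sh
# Hypothetical dry-run sketch of onboarding a new Hadoop user/application.
# APP_USER, REALM, and KEYTAB are placeholders; the script only prints the
# commands an administrator might review, it does not execute them.
APP_USER="newapp"
REALM="EXAMPLE.COM"
KEYTAB="/etc/security/keytabs/${APP_USER}.keytab"

cat <<EOF
useradd -m ${APP_USER}
kadmin -q "addprinc -randkey ${APP_USER}@${REALM}"
kadmin -q "ktadd -k ${KEYTAB} ${APP_USER}@${REALM}"
hdfs dfs -mkdir -p /user/${APP_USER}
hdfs dfs -chown ${APP_USER}:hadoop /user/${APP_USER}
# smoke tests with the new principal
kinit -kt ${KEYTAB} ${APP_USER}@${REALM}
hdfs dfs -ls /user/${APP_USER}
hive -e 'SHOW DATABASES;'
EOF
```

In practice the printed commands would be adapted to the cluster's realm, keytab conventions, and HDFS directory layout before being run.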
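The scripting and dashboard bullets above can likewise be illustrated with a small shell/awk sketch that derives a utilization figure of the kind a central dashboard might track. The heredoc below is embedded stand-in data imitating the summary lines of `hdfs dfsadmin -report` output; on a live cluster the real command's output would be piped in instead.

```shell
#!/bin/sh
# Illustrative sketch: compute HDFS utilization from dfsadmin-report-style
# output. sample_report is stand-in data; in production one would pipe the
# live output: hdfs dfsadmin -report | awk '...'
sample_report() {
cat <<'EOF'
Configured Capacity: 1000000000000 (1 TB)
Present Capacity: 950000000000 (950 GB)
DFS Remaining: 700000000000 (700 GB)
DFS Used: 250000000000 (250 GB)
EOF
}

UTIL=$(sample_report | awk '
  /^Configured Capacity:/ { cap  = $3 }
  /^DFS Used:/            { used = $3 }
  END { printf "utilization: %.1f%%", 100 * used / cap }')
echo "$UTIL"
```

A figure like this could be emitted on a cron schedule and fed to whatever dashboarding system the team maintains.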

Required skills

· Hadoop (preferably Hortonworks/open distribution), HDFS, Hive, Kafka, Spark, Oozie/Airflow, HBase

· Java debugging skills

· Strong knowledge of SQL and HQL

· Strong Linux knowledge; scripting skills would be an advantage

· Kerberos, TLS, Ranger, data encryption


· Minimum 3 years of work experience in developing, maintaining, and optimizing Hadoop clusters, resolving cluster issues, and supporting business users

· Experience with Linux/Unix OS services and administration, and with shell and awk scripting

· Experience in building scalable Hadoop applications

· Experience in Core Java and the Hadoop ecosystem (MapReduce, Hive, Pig, Spark, Kafka, HBase, HDFS, HCatalog, ZooKeeper and Oozie/Airflow)

· Experience in Hadoop security (Kerberos, Knox, TLS)

· Hands-on experience with SQL and NoSQL databases (HBase)

· Experience in building large scale real-world backend and middle-tier systems in Java

· Experience with tool integration, automation, and configuration management using Git and Jira

· Excellent oral and written communication and presentation skills; strong analytical and problem-solving skills

· Self-driven; able to work independently and as part of a team, with a proven track record of developing and launching products at scale

· Develop and enhance platform best practices and educate Visa developers on them