IBM Cloudera Hadoop Developer Jobs 2019:
About Company
International Business Machines Corporation (IBM) is an American multinational information technology company headquartered in Armonk, New York, with operations in over 170 countries. The company was founded in 1911 in Endicott, New York, as the Computing-Tabulating-Recording Company (CTR) and was renamed "International Business Machines" in 1924. IBM is incorporated in New York and has around 350,600 employees, revenue of US$79.59 billion, and total assets of US$123.38 billion.
Aspirants who are 2019 pass-outs looking to get a software job and settle in the software field can consider IBM as one of the options to build their future. Candidates who meet the educational qualification can apply, and the company offers one of the best salaries in the industry. For more jobs, you can regularly visit our site www.jobmela4u.com.
Company: IBM
Job Designation: Cloudera Hadoop Developer
Job Location: Bangalore
Salary: Best in Industry
Experience: Freshers/Experienced
Education Qualification: Bachelor's/Master's Degree
Website: www.ibm.com
Job Description
Responsibilities:
Demonstrate the ability to become an SME (subject-matter expert) in managing and monitoring enterprise CDH clusters across multiple sites/geographies with high availability.
Assess storage utilization over a period of time and keep management informed when the infrastructure needs to be scaled up (a minimal monitoring sketch follows this list).
Ensure that all nodes in the CDH clusters comply with ITSS security guidelines and that no outstanding vulnerabilities are present.
Evaluate research and external technologies through rapid proofs of concept with partners and customers.
Generate ideas for new features through innovation and market/industry expertise.
Engage in cross-company solution development task forces.
Generate reusable assets, whitepapers, and articles around standard methodologies of the Hadoop services.
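To illustrate the storage-utilization monitoring mentioned above, here is a minimal Python sketch. It assumes the `hdfs` CLI is available on the PATH; the log path and the 80% threshold are illustrative assumptions, not details from the posting.

```python
#!/usr/bin/env python3
"""Hypothetical sketch: log HDFS capacity usage over time.

Assumes the `hdfs` CLI is on the PATH; the log path and 80% threshold
are illustrative assumptions, not requirements from the posting.
"""
import re
import subprocess
from datetime import datetime, timezone

LOG_PATH = "/var/log/hdfs_capacity.log"   # assumed log location
THRESHOLD_PCT = 80.0                      # assumed scale-up trigger


def dfs_used_percent():
    """Parse the cluster-wide 'DFS Used%' figure from `hdfs dfsadmin -report`."""
    report = subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True,
    ).stdout
    match = re.search(r"DFS Used%:\s*([\d.]+)%", report)
    if not match:
        raise RuntimeError("Could not find 'DFS Used%' in the dfsadmin report")
    return float(match.group(1))


if __name__ == "__main__":
    used = dfs_used_percent()
    # Append a timestamped reading so usage trends can be reviewed over time.
    with open(LOG_PATH, "a") as log:
        log.write("{} DFS Used% = {:.2f}\n".format(
            datetime.now(timezone.utc).isoformat(), used))
    # Flag the need to scale up once usage crosses the assumed threshold.
    if used >= THRESHOLD_PCT:
        print("WARNING: HDFS usage {:.2f}% >= {:.1f}% - consider scaling up"
              .format(used, THRESHOLD_PCT))
```

Run from cron or a similar scheduler, this kind of script gives a simple usage history to back up scale-up recommendations to management.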
Eligibility
Education Qualification :
Any Graduate: Bachelor's/Master's Degree
Required Skills:
Expertise in Apache Spark and Scala programming.
Ability to install, configure, and maintain an enterprise Hadoop environment.
Hands-on experience with CDH cluster deployment and HDFS HA configuration.
Experience with cluster version upgrades and the ability to deploy new services in the CDH cluster using the CSD method.
Ability to add new users over time and remove redundant users smoothly (see the onboarding sketch after this list).
Proficiency in Linux scripting (Shell/Python/Ansible).
Maintain security and data privacy with Kerberos, Knox, and Sentry.
Experience with OpenStack and various virtualization techniques.
Basic understanding of TCP/IP, data networks (LAN/WAN) and IP tables/firewalls.
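As a loose illustration of the user-management and Kerberos items above, here is a minimal Python sketch that provisions an HDFS home directory for a new user on a secured cluster. The admin principal, keytab path, and username are hypothetical assumptions, not details from the posting.

```python
#!/usr/bin/env python3
"""Hypothetical sketch: provision an HDFS home directory for a new user.

Assumes the `kinit` and `hdfs` CLIs are on the PATH and that an admin
keytab exists at the (assumed) path below; none of this comes from the posting.
"""
import subprocess

ADMIN_PRINCIPAL = "hdfs-admin@EXAMPLE.COM"                 # assumed principal
ADMIN_KEYTAB = "/etc/security/keytabs/hdfs-admin.keytab"   # assumed keytab path


def run(cmd):
    """Run a command and raise immediately if it fails."""
    subprocess.run(cmd, check=True)


def provision_user(username):
    """Authenticate via Kerberos, then create and hand over /user/<username>."""
    # Obtain a ticket so hdfs commands succeed on a Kerberized cluster.
    run(["kinit", "-kt", ADMIN_KEYTAB, ADMIN_PRINCIPAL])
    home = "/user/{}".format(username)
    run(["hdfs", "dfs", "-mkdir", "-p", home])
    run(["hdfs", "dfs", "-chown", "{0}:{0}".format(username), home])
    run(["hdfs", "dfs", "-chmod", "750", home])


if __name__ == "__main__":
    provision_user("newhire01")  # illustrative username only
```

Wrapping the onboarding steps in a single script of this kind keeps user additions repeatable; the same commands could equally be expressed as an Ansible playbook.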