Hadoop and Cloudera Administrator - Very Large Amounts of Data - Job ID 32747




$$

Midtown



How to Apply


George Konetsky



(646) 876-9562



(212) 616-4800 ext-180




A full-time position at a global cybersecurity firm, helping defend businesses around the world against agile and well-financed cyber attackers by providing unparalleled visibility, insight, and responsiveness.

Pay Options: F/T Employee.

Contact George Konetsky at (646) 876-9562 or email george@sans.com with the Job Code MN32747, or click the Apply Now button.

Location: NYC.

Skills required for the position: HADOOP, CLOUDERA, LINUX, SCRIPTING.

Optional (not required): AGILE, DEVOPS

Detailed Info:

Hadoop/Cloudera Administrator to perform hands-on technical management of the Hadoop and Cloudera systems for a cybersecurity firm.

  • The system handles very large amounts of data, on the order of petabytes (far more than terabytes). Candidates should have excellent communication skills, expert-level knowledge, and a passion for Hadoop and Cloudera.

  • The firm is a global cybersecurity company providing advanced threat intelligence for companies around the world.

  • The firm monitors client networks, the dark web, and the internet in real time, providing external threat monitoring, predictive human- and machine-sourced intelligence, and managed security response (cyber defense).

Development/Computing Environment:

  • Senior-level, hands-on Hadoop administration / Cloudera administration / Linux / some Python for scripting.

  • Multi-environment and multi-cluster experience preferred; Kerberos/LDAP/AD configuration and troubleshooting experience; heavy Hadoop skills; performance tuning.

Qualifications:

  • Experience with the Cloudera Hadoop distribution: enabling high availability, installing services, and applying patches.

  • Unix/Linux knowledge, including the ability to understand hardware, operating system, and network settings.

  • Experience with Hadoop ecosystem components, including HDFS, YARN, Hive, Impala, Spark, Sqoop, Kafka, Flume, and Solr.

  • Unix shell, Perl, or Python scripting (an illustrative example follows this list).

  • Hadoop development or administration experience with CDH (Cloudera Distribution of Apache Hadoop).

  • Experience configuring and troubleshooting HDFS, Hive, YARN, and Impala.

  • Experience implementing Kerberos and Sentry security on CDH.

  • Development and/or DevOps experience with Hadoop a plus.

  • Good working experience in Linux/Unix; good communication skills and experience working on-site with clients. Knowledge of Agile and DevOps practices is an added advantage.
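To illustrate the scripting side of the role, here is a minimal sketch (not part of the job description) of the kind of small automation a Hadoop/Cloudera administrator typically writes. It assumes a CDH cluster where the standard `hdfs` CLI is on the PATH and the caller has rights to run `hdfs dfsadmin -report`; the 80% alert threshold and the script layout are hypothetical choices for illustration only.

  #!/usr/bin/env python3
  # Illustrative admin check: parse cluster-wide "DFS Used%" from
  # `hdfs dfsadmin -report` and exit non-zero if usage is too high,
  # so a cron job or monitoring hook can alert on it.
  import subprocess
  import sys

  USED_PCT_THRESHOLD = 80.0  # hypothetical alert threshold

  def hdfs_used_percent():
      # The dfsadmin report prints cluster capacity figures first,
      # including a "DFS Used%" summary line we can parse.
      report = subprocess.run(
          ["hdfs", "dfsadmin", "-report"],
          capture_output=True, text=True, check=True
      ).stdout
      for line in report.splitlines():
          if line.startswith("DFS Used%"):
              return float(line.split(":")[1].strip().rstrip("%"))
      raise RuntimeError("DFS Used% not found in dfsadmin report")

  if __name__ == "__main__":
      used = hdfs_used_percent()
      print(f"HDFS used: {used:.1f}%")
      sys.exit(1 if used > USED_PCT_THRESHOLD else 0)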

The position offers a competitive compensation package.


Job Id: 32747