Lead Big Data Development/Engineer - job id 33367



Compensation: Market rate. F/T Employee.


How to Apply


Recruiting Team: (212) 616-4800

A Full-time position at an Insurance Company

Pay Options: F/T Employee.

Contact Jason Vu: call (646) 876-9536 / (212) 616-4800 ext. 290, or email jason@sans.com with the Job Code JV33367.

Location: Cary, NC.

Skills required for the position: DATA ANALYSIS, DATA GOVERNANCE, HADOOP.

Optional (not required): CLOUD

Detailed Info:

The Global Data & Analytics organization is a team of expert technologists responsible for building big data platforms and data services with innovative technologies, enabling our businesses to generate insights and value for their customers. This team plays a central role in the design, build, and integration of all emerging technologies and cutting-edge products across our Data Hub and Enterprise Data Management systems.

The Lead Software Development Engineer - Emerging Platform Applied Engineering will contribute to platform design and development, with a focus on delivering end-to-end solutions and building reusable frameworks and utilities. The candidate should have prior hands-on experience with Big Data technologies and working knowledge of Big Data tools.

Key Responsibilities:

Contribute to all Applied Engineering activities, including idea generation, requirements gathering, project planning, framework development, roadmap building, etc.

Drive platform design, engineered systems, and delivery of data and analytics requirements with business groups

Publish and enforce technology best practices, configuration recommendations, usage designs/patterns, and cookbooks for the developer community

Engineer frameworks for reusable services, including any required customization/development and visualizations

Coordinate multiple offshore and onshore teams for development, setup, and framework rollout activities

Serve as platform SME and provide Level-3 technical support for troubleshooting

Development/Computing Environment:

Hands-on knowledge of Big Data technologies: Hadoop, Spark, Hive, HBase, Sqoop, Flume, Pig, Kafka, Python, shell scripts, etc.

Experience with solution architecture, design, and development, and an understanding of the data life cycle: data acquisition, data quality management, data governance, and metadata management

Experience with detailed data analysis using technical tools such as Alteryx, Arcadia, Qlik, Power BI, etc.

Experience with large-scale data warehouse implementations and knowledge of ETL technologies such as Informatica, Talend, etc.

Bachelor's degree required; master's degree preferred. 10+ years of experience in an IT environment; 2 to 3 years of experience with relevant Big Data technologies


Hands-on working experience with one or more Hadoop distributions from Hortonworks, Cloudera, or MapR.

Experience with one or more cloud frameworks such as Azure PaaS, AWS, etc.

Capable of building, articulating, and presenting new ideas to technical, non-technical, and business communities.

Possess creativity and innovation skills; demonstrate the ability to compare and analyze different tools and technologies and make appropriate recommendations

Excellent written and verbal communication skills.

The position offers a competitive compensation package.

Job Id: 33367