Data Engineer (ETL) - job id 33222


Your Way To Work™

Data Engineer (ETL)

Market Per Hour Rate

Lower Manhattan (NoHo)



How to Apply


Kyle Barlics



(732) 791-4723



(212) 616-4800 ext-580




A Contract position at Premier New York Education Institution.

Pay Options: IC - Self-Incorporated or W-2.

Contact Kyle Barlics: call (732) 791-4723 / (212) 616-4800 ext. 580, or email kyle@sans.com with the Job Code KTB33222, or click the Apply Now button.

Location: Lower Manhattan (NoHo).

Skills required for the position: ETL, DATASTAGE, DATA WAREHOUSING, DATA MART, RELATIONAL DATABASES, SQL, ORACLE.


Detailed Info: The primary focus will be the construction and maintenance of our data pipeline, ETL processes, and data warehouse. The Data Engineer will also be responsible for data quality and for understanding the needs of our various data sources in order to anticipate and scale our systems.

Roles & responsibilities may include:

Integrate data from a variety of data sources (Data warehouse, Data marts) utilizing on-premises or cloud-based data structures;

Develop and implement streaming, data lake, and analytics big data solutions

Integrate data from multiple data sources, applying knowledge of various ETL techniques and frameworks using Databricks

Create Applications using Change Data Capture Tools

Technical Support (includes troubleshooting, monitoring)

Technical Analyst and Project Management Support

Application Performance and System Testing Analyst

Design, develop, deploy, and support end-to-end ETL specifications based on business requirements and processes, such as source-to-target data mappings, integration workflows, and load processes, using IBM DataStage

Develop ETL jobs using various stages such as Sequential File, Dataset, Transformer, Copy, Lookup, Filter, Join, Merge, Funnel, Sort, Remove Duplicates, Modify, and Aggregator

Provide day-to-day support, troubleshooting production outages and issues using DataStage, and handle business requirements, enhancements, and service requests


Development/Computing Environment: "Ideal" candidates will have the following experience, knowledge, skills or abilities:

Minimum of 5-7 years of IT work experience focused in Data Acquisition and Data Integration using DataStage

Minimum 4-6 years of experience with Oracle SQL and PL/SQL packages

Experience working with flat files and XML transformations

Analyzing DataStage job statistics in Director and conducting performance tuning to reduce table load times and the load on the computing nodes involved

Knowledge of/experience with Data Warehousing applications; directly responsible for the extraction, staging, transformation, pre-loading, and loading of data from multiple sources into the data warehouse

Application development, including Cloud development experience, preferably using AWS (AWS Services, especially S3, API Gateway, Redshift, Lambda, etc.)

Working with different file formats (Hive, Parquet, CSV, JSON, Avro, etc.) and compression techniques

Knowledge of Business Intelligence Tools, Enterprise Reporting, Report Development, data modeling, data warehouse architecture, data warehousing concepts

Comfortable with AWS cloud (S3, EC2, EMR, Redshift, etc.)

Ability to collaborate with colleagues across different schools/locations

Python, Spark and AWS experience is a big plus.

The position offers a competitive rate.

