Cloud Data Engineer - AWS with Python - job id 33289


Your Way To Work™

Cloud Data Engineer - AWS with Python

$$$

Wall Street



How to Apply


George Konetsky



(646) 876-9562



(212) 616-4800 ext-180




A F/T position at a company that owns exchanges for financial and commodity markets and operates 23 regulated exchanges and marketplaces.

$$$ Pay Options: F/T Employee.

Contact George Konetsky: call (646) 876-9562 / (212) 616-4800 ext. 180, email george@sans.com with the Job Code GK33289, or click the Apply Now button.

Location: Wall Street.

Skills required for the position: DATA ENGINEER, AWS, PYTHON.

Optional (not required): JAVA, SHELL, SQL, HADOOP, SPARK

Detailed Info: Build massive reservoirs for Big Data. Design, develop, construct, test, and maintain architectures such as large-scale data processing systems. Handle tool selection and POC analysis. Gather and process raw data at scale to meet functional and non-functional business requirements (including writing scripts, REST API calls, SQL queries, etc.). Develop data set processes for data modeling, mining, and production. Integrate new data management technologies and software engineering tools into existing structures. Create custom software components (e.g., specialized UDFs) and analytics applications. Employ a variety of languages and tools (e.g., scripting languages) to marry systems together.

Install and update disaster recovery procedures. Recommend ways to improve data reliability, efficiency, and quality. Collaborate with data architects, modelers, and IT team members on project goals. Support business users with ad-hoc analysis and reports.

Build high-performance algorithms, prototypes, predictive models, and proofs of concept.

Research opportunities for data acquisition and new uses for existing data.

Development/Computing Environment: Master's degree in Computer Science, Software/Computer Engineering, Applied Math, Physics, Statistics, or a related field (or relevant work experience and technical expertise). Hadoop-based technologies (e.g., HDFS, Spark); Spark experience is a must. Strong SQL skills on multiple platforms (MPP systems preferred). Database architectures. Data modeling tools (e.g., Erwin, Visio). 5+ years of programming experience in Python and/or Java. Experience with continuous integration and deployment. Strong Unix/Linux skills. Experience in petabyte-scale data environments and integration of data from multiple diverse sources. Kafka, cloud computing, machine learning, text analysis, NLP, and web development experience is a plus. NoSQL experience (HBase, Cassandra) is a plus. Finance experience, most notably in equities and options trading and reference data, is a plus.




Must Have:

Key Points:

- SQL

- Languages:

  - Python experience

  - Shell

  - Java is a plus

- Hadoop, Spark experience

- Good understanding of architecture

- AWS experience is a big plus




The position offers a competitive compensation package.


Job Id: 33289