AWS Data Engineer | W2 Contract | Banking Domain
Job Description:
Role: Data Engineer
Location: St. Louis, MO
Duration: 1 year – possible extension
Work arrangement: Local candidates strongly preferred; onsite 2-3 days per week. Non-local candidates will be required to come onsite for our PI planning cycles, typically every ~2 months for 3 days at a time.
Description:
We are looking for a Data Engineer to design and build capabilities for a cutting-edge, cloud-based big data analytics platform. You will report to an engineering leader and be a part of an agile engineering team responsible for developing complex cloud-native data processing capabilities as part of an AWS-based data analytics platform. You will also work with data scientists, as users of the platform, to analyze and visualize data and develop machine learning/AI models.
Responsibilities:
- Develop, enhance, and troubleshoot complex data engineering, data visualization, and data integration capabilities using Python, R, Lambda, Glue, Redshift, EMR, QuickSight, SageMaker, and related AWS data processing and visualization services.
- Provide technical thought leadership and collaborate with software developers, data engineers, database architects, data analysts, and data scientists on projects to ensure data delivery and align data processing architecture and services across multiple ongoing projects.
- Perform other team contributions such as peer code reviews, database defect support, security enhancement support, vulnerability management, and occasional backup production support.
- Leverage DevOps skills to build and release Infrastructure as Code, Configuration as Code, software, and cloud-native capabilities, ensuring the process follows appropriate change management guidelines.
- In partnership with the product owner and engineering leader, ensure the team has a clear understanding of the business vision and goals and how that connects with technology solutions.
Qualifications:
- Bachelor's degree with a major or specialized courses in Information Technology or commensurate experience.
- 7+ years of proven experience with a combination of the following:
- Designing and building complex data processing pipelines, including streaming.
- Design of big data solutions and use of common tools (Hadoop, Spark, etc.).
- Relational SQL databases, especially Redshift
- IaC tools like Terraform, Ansible, AWS CDK.
- Containerization services like EKS, ECR.
- AWS cloud services: EC2, S3, RDS, Redshift, Glue, Lambda, Step Functions, SageMaker, QuickSight, Config, Security Hub, Inspector
- Designing, building, and implementing high-performance APIs and programs using architectural frameworks and guidelines.
- UNIX / Linux operating systems.
Apply to this Job