08 Data Engineer C2C Contract Jobs in USA – September 04, 2025

As of Sep 04, 2025, eight active Data Engineer C2C contract roles (remote, hybrid, onsite) require expertise in Microsoft Fabric (Synapse, Power BI, Data Factory), Databricks, AWS (S3, EC2, Redshift, Glue, Kinesis, Lambda), Snowflake, Spark, Python, SQL, PL/SQL, ETL/ELT, and CI/CD. Roles involve building and optimizing pipelines, APIs, scalable architectures, and AI/ML workflows using SageMaker, Rekognition, TensorFlow, and PyTorch. Domain knowledge in healthcare (data standards, video analytics) and financial services (trade lifecycle, securities, mutual funds, ETFs) is preferred.

Up next: 08 Java Developer C2C Contract Jobs.

08 Data Engineer C2C Contract Jobs in USA

1. Certified Fabric Support Lead (Data Engineer)

Skilled in Microsoft Fabric (Synapse, Power BI, Data Factory), Databricks, Azure Functions, Logic Apps, ETL/ELT, pipelines, SQL, Python. Role involves advanced support, troubleshooting, optimization, and consulting on Fabric solutions.

2. AWS Data Engineer

Looking for an AWS Data Engineer skilled in data hydration, manipulation using Python, and Snowflake. Must have strong hands-on experience with AWS services (S3, EC2, ECS, EKS, Redshift, RDS, Lambda, VPC, DMS, API Gateway, IAM, CloudFront), ETL pipelines (Airflow, Glue, Kinesis, Step Functions), and modern data frameworks (Spark, EMR, Databricks). Knowledge of Docker, Fargate, ECR, CI/CD, Jenkins, Git, and shell scripting preferred.

3. Lead Data Engineer

Strong AWS cloud experience, AWS certified, skilled in Spark, Glue, Redshift, Aurora, Snowflake, PySpark, Lambda, Python, SQL. Responsible for building data pipelines, ETL, APIs, and scalable architectures.

4. AWS Data Engineer 

Looking for an AWS Data Engineer with expertise in SageMaker, Rekognition, Kinesis, and scalable architectures for real-time and batch video analysis. Role involves ETL pipelines, integration with medical systems, dataset preparation for ML models, video segmentation/annotation, and data structures for analytics. Skills required include Python, SQL, big data technologies, computer vision, deep learning (TensorFlow, PyTorch), AWS AI services, and knowledge of healthcare data standards.

5. Data Operations Engineer

Looking for a DataOps Engineer with strong experience in data ingestion, ELT, reporting, and PL/SQL. Must have solid knowledge of investments and capital markets including trade lifecycle, securities, mutual funds, and ETFs. Responsibilities include managing pipelines, ensuring accurate data delivery, monitoring jobs, troubleshooting issues, collaborating with teams, and supporting data quality initiatives.

6. Data Operations Engineer

Skilled Data Engineer with expertise in financial instruments, trade lifecycle, PL/SQL, data pipelines, and reporting solutions. Strong in data ingestion, ELT, financial data structures, and regulatory requirements.

7. Data Scientist/ML Engineer

Seeking a Data Science & AI Specialist (contract-to-hire) to develop and deploy AI/ML models, perform advanced analytics, and build Power BI dashboards. Must be hands-on with data/coding and skilled in presenting insights and strategies to executive leadership.

8. Data Operations Engineer

Experience in financial instruments (securities, mutual funds, investments, capital markets), data operations/engineering, PL/SQL, ELT pipelines, reporting, trade lifecycle, financial data structures, and regulatory requirements.

The listed Data Engineer C2C roles are currently hiring, with a focus on data pipelines, cloud data platforms, and AI/ML workflows. Please review the requirements carefully to ensure fit before applying.

Disclaimer: These roles are from public listings — always verify recruiter and company authenticity before sharing personal information.

Posted by Sravanthi
