08 Data Engineer C2C Contract Jobs in USA – September 10, 2025

As of Sep 10, 2025, there are 8 active C2C Data Engineer contract openings (remote, hybrid, and onsite). Roles call for expertise in ETL/ELT pipelines, AWS (Glue, EMR, DMS, Redshift), Databricks (PySpark, Spark SQL, Delta Live Tables), CI/CD, Terraform, Python, SQL, Scala, and cloud platforms (AWS, Azure, GCP), along with Snowflake and Airflow. Domain knowledge of clinical data standards (CDISC, SDTM), GxP, and 21 CFR Part 11, plus experience with Power BI, time series databases, and schema optimization, is highly valued.

Up next: 08 Java Developer C2C Contract Jobs.

08 Data Engineer C2C Contract Jobs in USA

1. Data Engineer 

Hiring a Data Engineer (remote, with travel to FL as needed) responsible for designing ETL pipelines, migrating structured and unstructured data to a cloud-based EDMS, ensuring data quality, and supporting metadata mapping and document classification.

2. AWS Data Engineer

Looking for a Data Engineer skilled in AWS (EC2, S3, Glue, EMR, DMS, CDC, IAM) and Databricks (PySpark, Spark SQL, Delta Live Tables, Workflows). Must have experience in CI/CD (Azure Pipelines, Git), Terraform, and on-prem to AWS integration.
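
For candidates gauging fit, the following is a minimal, illustrative PySpark sketch of the kind of S3-to-Delta pipeline work a listing like this describes. The bucket paths, column names, and transformations are hypothetical (not taken from the posting), and the Delta write assumes a runtime with Delta Lake available, such as Databricks.

```python
# Illustrative sketch only: a small batch ETL job of the type described above.
# All paths and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl_sketch").getOrCreate()

# Read raw CSV files landed in S3 (placeholder path).
raw = spark.read.option("header", "true").csv("s3://example-bucket/raw/orders/")

# Basic cleansing: cast types, drop rows missing a key, derive a partition column.
clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("order_id").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
)

# Write curated data as Delta, partitioned by date (assumes Delta Lake on the cluster).
(
    clean.write.format("delta")
         .mode("overwrite")
         .partitionBy("order_date")
         .save("s3://example-bucket/curated/orders/")
)
```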

3. Sr. Data Engineer

We are seeking an experienced Sr. Data Engineer (data management specialist) to design, develop, and manage enterprise-scale data solutions. The role involves overseeing data platforms, ensuring performance and scalability, and delivering high-quality Snowflake-based solutions.

4. Senior Data Engineer 

We are looking for a Senior Data Engineer (Spark developer) with strong skills in Python, AWS, and SQL to design, develop, and optimize large-scale data pipelines. The role involves building ETL/ELT workflows, leveraging AWS services (S3, EMR, Glue, Lambda, Redshift, Kinesis), and ensuring data quality, scalability, and performance.

5. Clinical Data Engineer/Analyst

Seeking a Clinical Data Engineer/Analyst skilled in ETL/ELT, SQL, Python/Scala, cloud platforms (AWS, Azure, GCP), Databricks, Snowflake, and Airflow, with knowledge of clinical data standards (CDISC, SDTM), GxP, 21 CFR Part 11, and data privacy.

6. Lead AWS Glue Data Engineer

We are seeking a Lead AWS Glue Data Engineer with a strong background in banking or financial services. The ideal candidate will have extensive experience in AWS Glue, ETL pipeline development, and data engineering leadership, with proven ability to design scalable data solutions in enterprise environments.
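
For context on the kind of deliverable a Lead Glue engineer typically owns, here is a minimal AWS Glue job-script sketch in Python. The catalog database, table name, field mappings, and S3 path are hypothetical, and a production pipeline in a banking environment would add validation, partitioning, and job bookmarking on top of this.

```python
# Illustrative Glue job sketch only; database, table, mappings, and paths are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the raw table registered in the Glue Data Catalog.
txns = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="raw_transactions"
)

# Rename and cast fields into the curated layout.
mapped = ApplyMapping.apply(
    frame=txns,
    mappings=[
        ("txn_id", "string", "transaction_id", "string"),
        ("amt", "string", "amount", "double"),
        ("txn_ts", "string", "transaction_ts", "timestamp"),
    ],
)

# Write curated Parquet back to S3.
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/transactions/"},
    format="parquet",
)

job.commit()
```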

7. Data Engineer

Seeking a Data Engineer with strong skills in Python and Power BI to design and optimize data pipelines, build insightful dashboards, and support business intelligence initiatives.

8. Data Engineer

We are seeking a skilled Data Engineer to design, build, and optimize data pipelines, ensuring data quality, scalability, and performance. The ideal candidate will have strong expertise in Python, SQL, and time series databases, with the ability to review and improve existing database schemas.

  • Vendor: Empower Professionals
  • Location: Snoqualmie, WA
  • Salary: Depends on Experience
  • Recruiter email: [email protected]
  • Link: Apply Now
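
The last listing above asks for the ability to review and improve existing database schemas for time series workloads. As a toy, self-contained illustration of that kind of review, the sketch below uses Python's built-in sqlite3 module to compare query plans before and after adding a composite index; the table, columns, and data are hypothetical, and a real engagement would target an actual time series database rather than SQLite.

```python
# Toy schema-review sketch: hypothetical sensor-readings table in SQLite.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A naive readings table with no secondary index.
cur.execute(
    """
    CREATE TABLE readings (
        sensor_id INTEGER NOT NULL,
        ts        TEXT    NOT NULL,  -- ISO-8601 timestamp
        value     REAL    NOT NULL
    )
    """
)
cur.executemany(
    "INSERT INTO readings (sensor_id, ts, value) VALUES (?, ?, ?)",
    [(i % 10, f"2025-09-{(i % 28) + 1:02d}T00:00:00", float(i)) for i in range(1000)],
)

query = """
    SELECT ts, value FROM readings
    WHERE sensor_id = ? AND ts BETWEEN ? AND ?
    ORDER BY ts
"""
params = (3, "2025-09-01T00:00:00", "2025-09-07T23:59:59")

# Before: the plan reports a full table scan of readings.
print(cur.execute("EXPLAIN QUERY PLAN " + query, params).fetchall())

# Schema improvement: composite index matching the access pattern
# (equality on sensor_id, range scan and ordering on ts).
cur.execute("CREATE INDEX idx_readings_sensor_ts ON readings (sensor_id, ts)")

# After: the plan uses the index for the filter and satisfies ORDER BY without a sort.
print(cur.execute("EXPLAIN QUERY PLAN " + query, params).fetchall())

conn.close()
```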

These Data Engineer C2C roles are actively hiring, with a strong focus on cloud data platforms, pipeline automation, and modern deployment tooling. Please review the requirements carefully to confirm fit before applying.

Disclaimer: These roles are sourced from public listings. Always verify recruiter and company authenticity before sharing personal information.

Posted by Sravanthi
