08 Data Engineer C2C Contract Jobs in USA – August 27, 2025

Eight in-demand Data Engineer C2C contract roles (as of Aug 27, 2025) – remote/hybrid/onsite – require skills in SQL, Python, PySpark, AWS (Glue, S3, Lambda, EventBridge, Kafka, Kinesis, CloudWatch, IAM, SNS, SQS, CDK), real-time/event-driven systems, GitLab, CI/CD, TDD, Azure Data Services (ADF, Databricks, Synapse, Data Lake, SQL DB), ETL/ELT pipeline design, cloud architecture, team leadership, the Telecom domain, Big Data/ETL, Hive, Teradata, Spark SQL, Palantir Foundry, JavaScript/TypeScript, data modeling, cloud platforms (AWS/Azure/GCP), data governance/security, the healthcare domain, predictive analytics/ML, Delta Lake, Snowflake, Git, Matillion ETL, Tibco Data Virtualization, ETL migration, NoSQL, Generative AI, LLMs, and RAG.

Up Next: 08 Java Developer C2C Contract Jobs.

08 Data Engineer C2C Contract Jobs in USA

1. AWS Data Engineer

Experience in SQL, Python, PySpark, AWS (Glue, S3, Lambda, EventBridge, Kafka, Kinesis, CloudWatch, IAM, SNS, SQS, CDK), real-time streaming/event-driven systems, GitLab, CI/CD, TDD, and building resilient active/active or active/passive applications.
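
For a flavor of the event-driven AWS work this role describes, here is a minimal sketch (not from the posting) of a Lambda handler that reads a Kinesis batch and forwards well-formed records to SQS with boto3; the queue URL, environment variable, and event_id field are hypothetical placeholders.

```python
import base64
import json
import os

import boto3

# SQS client reused across invocations; the queue URL below is a placeholder.
sqs = boto3.client("sqs")
QUEUE_URL = os.environ.get(
    "TARGET_QUEUE_URL",
    "https://sqs.us-east-1.amazonaws.com/123456789012/example-queue",
)


def handler(event, context):
    """Decode a batch of Kinesis records and forward valid JSON events to SQS."""
    forwarded = 0
    for record in event.get("Records", []):
        payload = base64.b64decode(record["kinesis"]["data"])
        try:
            message = json.loads(payload)
        except json.JSONDecodeError:
            continue  # malformed record; a real pipeline would dead-letter it
        if "event_id" in message:  # hypothetical required field
            sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(message))
            forwarded += 1
    return {"forwarded": forwarded}
```

In production this kind of handler would typically route malformed records to a dead-letter queue and surface delivery metrics through CloudWatch.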

2. Lead Azure Data Engineer

Expertise in Azure Data Services (ADF, Databricks, Synapse, Data Lake, SQL DB), Python/SQL, ETL/ELT pipeline design, CI/CD (Azure DevOps), real-time data processing, cloud architecture, and team leadership.
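
As an illustration of the ELT side of this role, below is a minimal PySpark sketch assuming a Databricks workspace where ADF has landed raw JSON in a data lake; the storage path, column names, and table name are made-up examples.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-elt").getOrCreate()

# Raw JSON landed by an ADF copy activity (path is a made-up example).
raw = spark.read.json("abfss://landing@examplelake.dfs.core.windows.net/orders/")

# Deduplicate and derive a partition column, then publish as a Delta table.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
)

(clean.write.format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .saveAsTable("analytics.orders"))
```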

3. Data Engineer

Strong in PySpark (must), Databricks, Rundeck, Spark clusters, and SQL (must); Scala a plus. Background in Data Engineering/Data Science, the Telecom domain (must), Big Data/ETL, AWS (S3, Athena), and SQL (Hive, Teradata, Spark SQL).

4. Data Engineer 

Expert in Palantir Foundry, Python, SQL, JavaScript/TypeScript, ETL, data modeling, cloud platforms (AWS/Azure/GCP), data governance, and security; healthcare domain and predictive analytics/ML a plus.

5. Data Engineer 

AWS-focused role: skilled in PySpark, Python, Spark, SQL, and AWS data services.

6. Cloud Data Engineer

Expert in PySpark, Databricks, Python, SQL, cloud (Azure/AWS/GCP); build/optimize ETL pipelines, ensure data quality/security; plus: ADF, Delta Lake, Snowflake, data modeling, CI/CD, Git.
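
A minimal sketch of the pipeline-plus-data-quality pattern this role calls for, assuming PySpark with Delta Lake; the table names and the 1% null threshold are illustrative assumptions, not requirements from the posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-dq-gate").getOrCreate()

events = spark.read.table("raw.events")  # hypothetical source table

# Fail the load if more than 1% of rows are missing a customer_id.
total = events.count()
missing = events.filter(F.col("customer_id").isNull()).count()
if total and missing / total > 0.01:
    raise ValueError(f"Data quality gate failed: {missing}/{total} rows missing customer_id")

# Otherwise drop the stragglers and append to the curated Delta table.
(events.dropna(subset=["customer_id"])
       .write.format("delta")
       .mode("append")
       .saveAsTable("curated.events"))
```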

7. Data Engineer 

Skilled in Matillion ETL, Snowflake, SQL/stored procedures, ETL development, data integration, dimensional modeling, Tibco Data Virtualization (plus), healthcare insurance (plus), ETL migration (plus), NoSQL (plus).
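
For the Snowflake side of this role, here is a minimal Python sketch (assumed, not taken from the posting) that upserts a staging table into a dimension table with a MERGE via snowflake-connector-python; the connection parameters, tables, and columns are placeholders.

```python
import snowflake.connector

# Placeholder connection parameters; in practice these come from a secrets manager.
conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="DW",
)

# Upsert staged member rows into a dimension table (tables/columns are invented).
MERGE_SQL = """
MERGE INTO dim_member tgt
USING stg_member src
  ON tgt.member_id = src.member_id
WHEN MATCHED THEN UPDATE SET
  plan_code = src.plan_code,
  updated_at = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN INSERT (member_id, plan_code, updated_at)
  VALUES (src.member_id, src.plan_code, CURRENT_TIMESTAMP())
"""

try:
    cur = conn.cursor()
    cur.execute(MERGE_SQL)
    cur.close()
finally:
    conn.close()
```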

8. Data Engineer With Gen AI and LLM

Strong in Python, Generative AI patterns, LLMs, and RAG; some exposure to query optimization, indexing, performance profiling, ETL/ELT tools (Spark, Airflow, dbt), large-scale pipelines, cloud warehouses (Snowflake, BigQuery, Redshift), and data lineage/metadata management.
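
Since RAG is the core pattern here, below is a minimal, self-contained Python sketch of retrieval-augmented prompting; the embed() function is a toy stand-in for a real embedding model, and the documents are invented for illustration.

```python
import numpy as np


def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy stand-in for a real embedding model: hash tokens into a fixed-size vector."""
    vec = np.zeros(dim)
    for token in text.lower().split():
        vec[hash(token) % dim] += 1.0
    return vec


documents = [
    "Delta Lake adds ACID transactions on top of a data lake.",
    "Airflow schedules and monitors batch pipelines.",
    "Snowflake separates storage from compute.",
]


def retrieve(query: str, k: int = 2) -> list[str]:
    """Rank documents by cosine similarity to the query embedding."""
    q = embed(query)
    scored = []
    for doc in documents:
        d = embed(doc)
        score = float(np.dot(q, d) / (np.linalg.norm(q) * np.linalg.norm(d) + 1e-9))
        scored.append((score, doc))
    return [doc for _, doc in sorted(scored, reverse=True)[:k]]


def build_prompt(query: str) -> str:
    """Assemble an augmented prompt: retrieved context plus the user question."""
    context = "\n".join(retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"


print(build_prompt("Which tool schedules batch pipelines?"))
```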

The Data Engineer C2C roles listed above are actively hiring and call for skills in data pipeline development, cloud data platforms (AWS/Azure/GCP), and modern ETL/ELT and deployment tooling. Review each role to ensure alignment with your skills before applying.

Disclaimer: All jobs are sourced from public listings—verify recruiter and company authenticity before sharing personal details.

Posted by Sravanthi
