08 Data Engineer C2C Jobs of the Day USA [September 16, 2025]

As of September 16, 2025, there are 8 active Data Engineer openings (remote, hybrid, and onsite) requiring expertise in Databricks, Unity Catalog, Privacera, Collibra, AWS, Spark, Python, SQL, Snowflake, APIs, K2View, GCP, and related tools, with a strong focus on building scalable pipelines, data governance, ETL/ELT, and cloud-based data engineering.

Up Next: 08 Java Developer C2C Contract Jobs.

08 Data Engineer C2C Jobs of the Day in USA

1. Data Platform Engineer

Hiring Data Platform Engineer with strong expertise in Databricks, Unity Catalog, Privacera, and Collibra. Must design/optimize data pipelines, implement governance, and operationalize ML models.

2. Data Engineer/Lead

Hiring Data Engineer with expertise in AWS (EMR/Glue, S3, Lambda, CloudWatch), Python (for scripting & pipelines), Spark, Databricks, SQL, Bash, Java, Snowflake, and APIs. Must have experience in data warehousing and working in Agile/JIRA environments.

3. Data Fabric K2View Engineer

Hiring K2View Data Fabric Engineer with expertise in K2View (LU design, mDB, ingestion, orchestration), SQL/Java/Python, ETL tools, real-time streaming (Kafka/MQ), APIs, and cloud platforms. Must have experience in data governance/compliance and working in Agile/DevOps environments.

4. Data Engineer

Looking for an experienced Data Engineer skilled in AWS (EMR/Glue, S3, Lambda, CloudWatch), Spark, Databricks, SQL, Bash, Python (scripting & pipelines), Java, Snowflake, and APIs. Must have data warehousing experience and work in Agile/JIRA environments.

5. Data Engineer

Hiring Data Engineer with expertise in Python, AWS Glue, Iceberg, Redshift, S3, Lambda, Terraform, Spark, and API Gateway. Must have strong knowledge of data modeling, ETL, and pipeline design, with AWS certifications a plus.

6. Sr. AWS Data Engineer

Hiring Sr. Data Engineer with expertise in Databricks, AWS Glue, Spark, PySpark, Parquet, and Iceberg. Must have strong skills in data pipelines, automation, CI/CD, SDLC, and financial services domain experience.

7. Data Engineer

Hiring Data Engineer with strong expertise in Scala/PySpark, Spark (batch & streaming), Databricks/Hadoop, SQL, and ETL/ELT. Must have experience building scalable pipelines on cloud (AWS/Azure/GCP) with knowledge of data governance, lineage, and performance tuning.

8. Data Engineer

Hiring Data Engineer with strong expertise in Python, GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage, Airflow/Composer), ETL/ELT, and data modeling. Must have experience building scalable pipelines, ML-ready data models, and shared frameworks/tools, with bonus skills in Scala, Java, APIs, and microservices.


These active C2C Data Engineer openings underscore strong demand for cloud data engineering, pipeline automation, and data governance tooling. Review job requirements carefully to ensure alignment before applying.
Disclaimer: These opportunities are sourced from publicly available listings; always verify recruiter and company authenticity before sharing personal information.

Posted by Sravanthi
