Description:
The ideal candidate will have a proven track record of delivering end-to-end ETL/ELT pipelines across Databricks, Azure, and AWS environments.

Key Responsibilities:
- Design scalable data lake and data architectures using Databricks and cloud-native services.
- Develop metadata-driven, parameterized ingestion frameworks and multi-layer data architectures.
- Optimize data workloads and performance.
- Define data governance frameworks for CHP.
- Design and develop robust data pipelines.
- Architect AI systems,
Dec 16, 2025
from: dice.com
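
The posting itself contains no code, but as a rough illustration of what a "metadata-driven, parameterized ingestion framework" on Databricks typically involves, the sketch below loops over a list of source definitions and lands each one in a bronze-layer table. The source names, landing paths, and target tables are hypothetical placeholders; a real framework would read this metadata from a control table or config store rather than a hard-coded list.

```python
# Minimal sketch of a metadata-driven ingestion loop (PySpark).
# All source names, paths, and target tables below are hypothetical.
from dataclasses import dataclass
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

@dataclass
class SourceConfig:
    name: str    # logical source name
    path: str    # landing-zone location (placeholder)
    fmt: str     # input format, e.g. "json" or "csv"
    target: str  # bronze-layer table to append into (placeholder)

# In practice this metadata would come from a control table or config file,
# not be hard-coded here.
sources = [
    SourceConfig("orders", "/mnt/landing/orders", "json", "bronze.orders"),
    SourceConfig("customers", "/mnt/landing/customers", "csv", "bronze.customers"),
]

def ingest(cfg: SourceConfig) -> None:
    """Read one source as described by its metadata and append it to bronze."""
    df = (
        spark.read.format(cfg.fmt)
        .option("header", "true")  # used by the CSV reader; ignored for JSON
        .load(cfg.path)
    )
    df.write.mode("append").saveAsTable(cfg.target)

for cfg in sources:
    ingest(cfg)
```

Adding a new source then means adding one metadata row rather than writing a new pipeline, which is the usual motivation behind this kind of parameterized design.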