Accenture Digital Core
At Accenture Digital Core, we help clients continuously innovate at speed and at scale, combining deep industry expertise with cutting-edge technology to deliver meaningful transformation. You will bring innovation, intelligence and experience together to shape robust, secure and scalable digital solutions.
What You’ll Do
- Design and build scalable ETL/ELT data pipelines;
- Optimize data storage and performance in cloud platforms;
- Ensure data quality, validation and lineage;
- Operate pipelines in production environments;
- Collaborate with analytics and AI teams;
- Support incident resolution and platform stability;
- Lead and oversee Python and SQL development in production environments;
- Oversee Spark / PySpark data processing workloads to ensure reliability and performance;
- Ensure Lakehouse and Delta Lake concepts are applied correctly across data pipelines;
- Supervise and apply data partitioning strategies, schema evolution and change data capture (CDC) patterns;
- Coordinate and manage the use of Azure services, including Synapse, Data Lake and data pipelines;
- Oversee monitoring and alerting, and support the use of DevOps pipelines to ensure operational stability.
What You Bring
- University degree in Computer Science, Software Engineering, Information Technology, Engineering, or a comparable technical field;
- 3-5 years of relevant work experience;
- Understanding of distributed systems;