Senior Data Engineer / Architect
Experience Required: 8+ years
Mode of work: Remote
Skills Required: Azure Databricks, Azure Event Hubs, Kafka, Architecture, Azure Data Factory, PySpark, Python, SQL, Spark
Notice Period: Immediate joiners; permanent or contract role (must be able to join by July 4, 2025)
Responsibilities
- Design, develop, and maintain scalable and robust data solutions in the cloud using Apache Spark and Databricks.
- Gather and analyse data requirements from business stakeholders and identify opportunities for data-driven insights.
- Build and optimize data pipelines for data ingestion, processing, and integration using Spark and Databricks.
- Ensure data quality, integrity, and security throughout all stages of the data lifecycle.
- Collaborate with cross-functional teams to design and implement data models, schemas, and storage solutions.
- Optimize data processing and analytics performance by tuning Spark jobs and leveraging Databricks features.
- Provide technical guidance and expertise to junior data engineers and developers.
- Stay up to date with emerging trends and technologies in cloud computing, big data, and data engineering.
- Contribute to the continuous improvement of data engineering processes, tools, and best practices.
Requirements
- Bachelor’s or master’s degree in computer science, engineering, or a related field.
- 10+ years of experience as a Data Engineer, Software Engineer, or similar role, with a focus on building cloud-based data solutions.
- Strong knowledge of and hands-on experience with the Azure cloud platform, Databricks, Event Hubs, solution architecture, Spark, Kafka, ETL pipelines, Python/PySpark, and SQL.
- Strong experience with cloud platforms such as Azure.
- Proficiency in Apache Spark and Databricks for large-scale data processing and analytics.
- Experience in designing and implementing data processing pipelines using Spark and Databricks.
- Strong knowledge of SQL and experience with relational and NoSQL databases.
- Experience with data integration and ETL processes using tools like Apache Airflow or cloud-native orchestration services.
- Good understanding of data modelling and schema design principles.
- Experience with data governance and compliance frameworks.
- Excellent problem-solving and troubleshooting skills.
- Strong communication and collaboration skills to work effectively in a cross-functional team.
Interested in joining our team? Send your resume to Pavithra.tr@enabledata.com to explore this opportunity further.