Big Data Software Engineer

OREDATA is a Digital Transformation & IT Consulting firm with 10+ years of proven expertise and hundreds of successfully implemented projects across the EMEA region. When you join the OREDATA team, you’ll work hand-in-hand with experts focused on tackling digital, operational, analytical, and data science challenges with the greatest impact. We foster close collaboration, an agile and autonomous approach, and adherence to best practices and guiding principles.

Apply and be part of our exciting journey!

Key Responsibilities

  • Design, develop, and maintain scalable and efficient big data infrastructure, including data storage, processing, and retrieval systems.
  • Develop algorithms, scripts, and pipelines for processing, cleaning, and analyzing large volumes of data from various sources.
  • Implement distributed computing frameworks and technologies (e.g., Hadoop, Apache Sqoop, Kafka, Apache Spark, Airflow) to process and analyze data in parallel across clusters of machines.
  • Develop data visualization tools and dashboards to present insights and findings in a clear and actionable manner for stakeholders.
  • Monitor the health and performance of big data systems, troubleshoot issues, and perform routine maintenance tasks to ensure system reliability and availability.
  • Collaborate with data scientists, analysts, and business stakeholders to understand requirements, gather feedback, and deliver solutions that meet business needs.
  • Stay informed about emerging technologies and trends in big data and contribute to research efforts to explore new techniques and tools for data processing and analysis.
  • Prepare comprehensive technical documentation for developed systems and provide ongoing technical support and guidance to team members as needed.


Qualifications

  • Bachelor’s degree in Computer Engineering/Science, or equivalent practical experience
  • 2+ years of big data engineering experience
  • In-depth knowledge of Hadoop, Apache Sqoop, Kafka, Apache Spark, Airflow, and similar frameworks
  • Good knowledge of big data querying tools such as Hive and HBase
  • Minimum 1 year of experience with Java
  • Minimum 1 year of experience with Python
  • Knowledge of scripting languages, including shell scripting and Python
  • Experience with Cloudera CDH/CDP installation, configuration, monitoring, cluster security, cluster resource management, maintenance, and performance tuning
  • Experience designing the architecture of a big data platform and monitoring and maintaining the environment using best practices
  • Good knowledge of relational databases, industry practices, techniques, and standards
  • Passion for learning big data, new technologies, and open-source technologies
  • Creative and innovative problem-solving skills
  • Strong team player with a results-oriented attitude and an analytical mind
  • Strong multitasking, time-management, and stress-management skills
  • Advanced level of English

Get to know us

If you want to know more about us and what we do, visit our website.

Why Oredata?

  • Remote working and flexible time off
  • Opportunity to earn company-paid professional certificates (Google Cloud Platform, Confluent Kafka, etc.)
  • Access to Online Training Platforms (Udemy, Pluralsight, A Cloud Guru, Coursera, etc.)
  • Opportunity to work on international projects
  • Private Health Insurance
  • Birthday Leave Policy
  • Dynamic work ecosystem where you can take initiative and responsibility
  • Open communication, flexibility and start-up spirit
  • Learning & Development opportunities for both personal and professional growth