
MongoDB Experienced Spark Developer

Job Type

Contract to hire (3 months)


Experience

3 to 8 years



Job Description

We are seeking a skilled and experienced Spark Developer with strong MongoDB expertise to join our
dynamic team. As a key member of our development team, you will design, implement, and maintain
scalable data processing solutions using Apache Spark and MongoDB. The ideal candidate has a
strong background in Spark development, MongoDB, and data engineering, with a focus on delivering
efficient, high-performance solutions.

Key Responsibilities

Data Processing and Analysis: 

  • Develop, implement, and optimize scalable data processing solutions using Apache Spark for large-scale datasets.

  • Design and implement algorithms for data analysis, transformation, and enrichment.

MongoDB Database Management:

  • Work with MongoDB to design and implement efficient database schemas. 

  • Optimize and troubleshoot MongoDB database performance.

Integration and ETL: 

  • Integrate Spark applications with various data sources and destinations.

  • Develop and maintain efficient ETL (Extract, Transform, Load) processes.

Performance Tuning:

  • Monitor and optimize the performance of Spark jobs and MongoDB queries.

  • Identify and resolve performance bottlenecks in the data processing pipeline.

Code Review and Collaboration: 

  • Collaborate with the development team to conduct code reviews and ensure best practices are followed.

  • Mentor junior developers and share knowledge about Spark and MongoDB development. 

Data Security and Compliance: 

  • Implement and enforce data security measures to ensure the confidentiality and integrity of data. 

  • Ensure compliance with relevant data protection and privacy regulations.


Documentation:

  • Create and maintain comprehensive documentation for Spark applications, MongoDB schemas, and data processing workflows.


Qualifications

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

  • Proven experience working as a Spark Developer with a focus on MongoDB. 

  • In-depth knowledge of Apache Spark, Spark SQL, and the Spark ecosystem. 

  • Strong proficiency in MongoDB database design, query optimization, and administration. 

  • Experience with data modeling, ETL processes, and integration of data from multiple sources. 

  • Proficiency in programming languages such as Scala or Python. 

  • Solid understanding of distributed computing principles and big data technologies. 

  • Familiarity with cloud platforms (e.g., AWS, Azure) and containerization (e.g., Docker, Kubernetes) is a plus.

  • Strong analytical and problem-solving skills with attention to performance and scalability.
