Mid-Level (4 to 6 years)
36L–48L per year

About the Job



Who is Mangtas?

Mangtas is a marketplace for B2B services, disrupting a $700B industry by making outsourcing reliable.

Our mission is to connect businesses with the global vendor ecosystem. We facilitate the process of finding, contracting and engaging outsourcing services around the globe for corporations small and large. To that end, we have built a vendor marketplace to provide a connection point for all vendor-related opportunities. We are currently focusing on services that help our clients create “Tech of the Future” – applying artificial intelligence & analytics, virtual reality & metaverses, gaming & gamification, blockchain & NFTs, IoT & robotics, cybersecurity and scaled microservices, to name a few.

We are looking for world-class talent to join a crack team of engineers, product managers and designers. We want people who are passionate about creating software that makes a difference in the world. We like people who are brimming with ideas and who take initiative rather than wait to be told what to do. We prize a team-first mentality, personal responsibility and the tenacity to solve hard problems and meet deadlines. As part of a small and lean team, you will have a very direct impact on the success of the company.

As a data engineer, you will:

·        Design, develop and maintain infrastructure for streaming, processing and storing data. Build tools for effective maintenance and monitoring of the data infrastructure.

·        Contribute to key data pipeline architecture decisions and lead the implementation of major initiatives.

·        Work closely with stakeholders to develop scalable and performant solutions for their data requirements, including extraction, transformation and loading of data from a range of data sources.

·        Develop the team’s data capabilities - share knowledge, enforce best practices and encourage data-driven decisions.

·        Develop data retention policies and backup strategies, and ensure that the firm’s data is stored redundantly and securely.

Job requirements

●       Solid Computer Science fundamentals, excellent problem-solving skills and a strong understanding of distributed computing principles.

●       At least 3 years of experience in a similar role, with a proven track record of building scalable and performant data infrastructure.

●       Expert SQL knowledge and deep experience working with relational and NoSQL databases.

●       Advanced knowledge of Apache Kafka and demonstrated proficiency in Hadoop v2, HDFS, and MapReduce.

●       Experience with stream-processing systems (e.g. Storm, Spark Streaming), big data querying tools (e.g. Pig, Hive, Spark) and data serialization frameworks (e.g. Protobuf, Thrift, Avro).

●       Bachelor’s or Master’s degree in Computer Science or related field from a top university.

●       Able to work within the GMT+8 time zone.

About the company

We provide recruitment services to IT and non-IT companies in India and overseas, and offer contractual staffing on our payroll.


Staffing and Recruiting

Company Size

11-50 Employees
