Main responsibilities will include:
• Performance-tune a big data system based on Confluent Kafka and Kafka Streams.
• Offer technical solutions, innovate, and improve the quality, performance, and usability of the software products in our company’s portfolio.
• This role is ideal for a scientifically minded software developer who enjoys working on large-scale big data problems.
• Monitor performance and advise on any necessary infrastructure changes
• Integrate Big Data tools and frameworks as required
• Implement the framework’s ETL process (data ingestion)
• Collaborate closely with the development team to deliver improvements identified as necessary
• Create, debug, maintain, and optimize software solutions to the highest standard of quality and velocity, using modern tech stacks and tools
• Work with the company’s core systems, quickly learning and improving them
• Apply experience with NoSQL and streaming technologies such as Kafka, Elasticsearch, and Redis
What we offer:
• Multinational, friendly, dynamic, and professional working environment;
• The opportunity to demonstrate and cultivate your skills while working with global leaders in their field as partners and clients;
• Free medical services through a private healthcare network;
• Life and accident insurance, and meal vouchers.