About The Position
The Engineering group at Zooz brings together software design, hardware, infrastructure management and design, and operations engineering (DevOps). You'll be part of a rapidly growing team working across a technical organization composed of engineers, designers, and product managers.
As a team, we challenge and mentor each other to do the best work of our careers. You'll have the opportunity to make a truly global impact on merchants' commerce platforms, from a developer adding payments to the next big app to global e-commerce platforms and financial processing partners.
We are looking for a brilliant backend engineer to join our Engineering team and work on the company's major data system. As part of the team, you will help design and build our next-generation cloud-based data solutions. You will face challenges of performance, scalability, and availability, all in a cloud-based environment with the newest tools and services. This is a truly unique engineering position that puts you in charge of every aspect of the system and the development life cycle.
What You'll Do:
- Work in a highly agile team in a dynamic environment
- Design and implement complex data pipelines, integrating cutting-edge technologies such as Druid, Spark, and Cassandra, to deliver real-time insights critical for cross-company products
- Design infrastructure and plan and develop server-side systems under high load and at scale in a distributed environment (Mesos, DC/OS)
- Develop 100% cloud-based systems on AWS and distributed data centers
Who You Are:
- You enjoy working with the wider Engineering organization, as well as Product Managers and Support team members, to deliver the best customer experience possible.
- You love to build user interfaces with the latest technologies.
- You love data and know what can be done with it.
- 2+ years development experience in Scala/Node.js/Python
- Knowledge of the container ecosystem and distributed data centers
- Familiarity with distributed data technology concepts like parallel processing, replication, and scalability.
- Experience with CI/CD pipelines - a plus
- Big Data experience - Kafka, Cassandra, Spark or Druid - a big plus
- Passion for 100% uptime systems