As an exclusive IT consultancy, we provide innovative IT services and powerful software technology solutions for market-leading enterprise clients. We specialize in projects built on the most advanced tech stacks and enjoy a reputation for consistently providing our clients with the competitive edge they need to shape their digital future. beON is characterized by a flat and agile structure and an international, modern corporate culture. The company is headquartered in Kiel, Germany, with additional offices, project and/or training centres in Düsseldorf, Munich, Vienna, Lisbon and Hyderabad (India). We value collaboration with our employees. No matter where you are, you will be part of an engaged and diverse tech community with plenty of development and learning opportunities.
Benefits: you earn & learn
If you’re ready to take on responsibility, you’ve come to the right place. Your experience matters at beON. You will receive above-average financial compensation in the form of salary and benefits, job stability and motivating incentives.
There will be plenty of professional development and learning opportunities. Not only will you keep up to date with the latest tech stack, but you will also have the opportunity to develop individually through the guidance of your own team of experts and by learning from external sources. In your career with us, you can constantly expand your own area of responsibility and work creatively and independently. You will have access to the tools and resources you need to succeed. A continuous learning curve is guaranteed.
In addition to exciting career prospects, we offer you the flexibility of a hybrid working model to create a work-life balance tailored to your needs. You will be immersed in an appreciative environment that focuses on transparency, fairness and enjoyment at work. The positive atmosphere in the company is reinforced by voluntary team events such as joint lunches and after-work drinks, sailing, lake festivals, Oktoberfest and ski trips. During the pandemic, the option of working remotely from home protects your health and safety.
- Type of employment: permanent contract
- Workload: full-time
- Places of work: remote, Düsseldorf, Hamburg, Cologne, Frankfurt am Main, Munich, Vienna, Berlin
- Languages: German and English
- Start: as soon as possible
- Salary: The salary range is open-ended upwards, with additional bonus opportunities and salary increases depending on the candidate’s experience
- As an Apache Kafka Consultant with profound distributed systems experience, you will be responsible for the administration and deployment of customized, advanced event streaming platforms based on Apache Kafka, following current industry standards and using the latest tools and methods.
- You will be in direct contact with your customers and responsible for the preparation, planning, migration, control, monitoring and implementation of highly scalable event streaming platforms and Kafka projects, as well as for comprehensive customer consulting on the current state of these technologies.
- As a Consultant for Big Data Management and Stream Processing, your goal is to implement the design and architectures for streaming platforms and stream processing use cases using open source and cloud tools.
- Completed studies or comparable training with a technical background
- Sound experience and knowledge in Java
- Experience with Apache Kafka or similar large-scale enterprise distributed data technologies (e.g. Spark, CockroachDB, HDFS, Hive)
- Experience in software development and automation to run big data systems
- Experience with implementing complex solutions for Big Data and Data Analytics applications
- Knowledge of system deployment and container technology: building, managing, deploying and release-managing containers and container images based on Docker, OpenShift and/or Kubernetes
- Knowledge of developing resilient, scalable distributed systems and microservices architectures
- Experience with at least one distributed technology (e.g. Kafka, Spark, CockroachDB, HDFS, Hive)
- Experience with at least one stream processing framework (e.g. Kafka Streams, Spark Streaming, Flink, Storm)
- Knowledge of Continuous Integration / Continuous Delivery (CI/CD) using Jenkins, Maven, Automake, Make, Grunt, Rake, Ant, Git, Subversion, Artifactory and Nexus
- Understanding of SDLC processes (Agile, DevOps), Cloud Operations and Support (ITIL) Service Delivery
- Knowledge of authentication mechanisms such as OAuth; familiarity with Vert.x and Spring Boot
- Knowledge of SQL Azure and AWS development, and of cloud migration to AWS, Azure, Google Cloud Platform and/or hybrid/private cloud, as well as cloud-native end-to-end solutions, especially their key building blocks, workload types, migration patterns and tools
- Experience with monitoring tools and logging systems such as New Relic, ELK, Splunk, Prometheus and Graylog
- Ability to communicate technical ideas in a business-friendly language
- Interest in modern organizational structures and an agile working environment (Scrum)
- Customer orientation and enjoyment of working in an international environment in German and English
If we have sparked your interest, we look forward to receiving your application with an up-to-date CV. We know that you are very busy and therefore do not expect a cover letter. Please use the job title “Apache Kafka Engineer & Consultant” in the subject line of your application.
We look forward to receiving your application to the following email address: email@example.com