Altenar is an international IT company founded in 2011, with offices in Malta, Greece, Georgia, the Isle of Man, and Uruguay. We specialize in high-load software development and provide one of the best technology solutions for the iGaming industry worldwide.
Altenar is looking to recruit a Data Infrastructure Engineer to work closely with the Technology team, focusing on the deployment, configuration, and ongoing maintenance of the Altenar Intelligence Platform, as well as actively participating in its architectural roadmap. The candidate should have experience working with huge datasets at scale, preferably using SQL and languages such as Python, Scala, Java, or similar for implementing pipelines and building data marts.
The candidate should also have an understanding of data warehousing principles. This role involves working closely with the Data Analyst and DBA roles at Altenar as well as with the development teams. The candidate will contribute to system and software architecture decisions on a continuous basis across Altenar products that interface with the Altenar Intelligence Platform. This process also involves working closely with the Infrastructure and Environments teams and requires strong communication and collaboration skills and the motivation to achieve results in a dynamic business environment.
Responsibilities:
The activities of a Data Infrastructure Engineer include but are not limited to:
- Continuously updating core knowledge and skills in the infrastructure and performance aspects of the streaming, warehousing, and big data technologies that Altenar harnesses in the operation of its business
- Helping design, develop, and support data pipelines, data models, and reporting systems that achieve business milestones and enable data-driven decision making
- Working with Data Analysts, Infrastructure Engineers, and Services Engineers to implement infrastructure for the collection, movement, and storage of data, along with the associated instrumentation and performance metrics
- Managing the availability of the data and the platform by developing, monitoring, and troubleshooting the backup and recovery strategy
- Reviewing log files and metrics and providing feedback and alerting to data analysts and developers
- Taking responsibility for the data security of a chosen data solution
- Participating in the research, testing, and design of new data service tools for online or warehoused data
- Working in a fast-paced, dynamic, multinational, multicultural environment
- Creating and maintaining documentation for the solutions provided
Experience/Skills Required:
- Experience with Python and SQL for data engineering applications
- Experience with Greenplum and ClickHouse databases
- Experience with various messaging systems and distributed logs, such as Kafka and RabbitMQ
- Experience with DevOps practices and tools (practical use of Git and Infrastructure-as-Code practices is a must)
- Prior direct or indirect involvement in data warehouse/data mart design, tool selection, and support
- Strong systems background with a deep understanding of I/O performance and network protocols
- Strong research and validation skills
- Proficiency in spoken and written English
- Strong presentation skills
Additionally:
- Experience with Scala/Java and/or C# is considered an advantage
- Experience in implementing real-time streaming architecture is a big advantage
- Experience with streaming technologies such as Spark Streaming is a plus
- Experience with Docker, Kubernetes, and the use of Helm charts is a plus
- Experience with Grafana and Prometheus is a plus
- Graduate-level education in Computer Science, Engineering, or another relevant field is an advantage
Benefits:
- Good compensation
- Health insurance
- Gym discount
- Corporate English lessons
- Modern, comfortable offices in Moscow, Saint Petersburg, or Vladimir, or remote work
- Monthly compensation for electricity and Internet costs
- Multinational team with experts around the world
- Participation in professional and corporate events