HomeToGo is one of NFQ's best-known clients: a dedicated team of 90 of our software engineers and data science professionals develops the innovations behind its business success. As HomeToGo's technology partner, we are expanding our team in Lithuania and are ready to hire the brightest talent on the IT market.
About HomeToGo: a company built around the idea of finding the ideal accommodation for every trip. In just six years since its founding, the company has expanded and now operates 23 localized websites across Europe, America, and the Asia-Pacific region. It has become the world's largest vacation rental search engine and includes brands such as CASAMUNDO and Tripping. With over 18 million vacation rental offers worldwide from more than 1,300 travel websites, including Airbnb, Booking.com, HomeAway, and TripAdvisor, we make finding and booking a vacation rental easier than ever!
As a Data Engineer, you will join our ambitious and forward-thinking colleagues on the Data Engineering team. At HomeToGo we capture, process, and store hundreds of gigabytes of new data every day using technologies such as Apache Kafka and Apache Spark. Our data lake holds hundreds of terabytes of data in AWS S3, which different teams use in various ways by running data jobs in Apache Airflow: from building self-service analytics dashboards with AWS Redshift, Apache Druid, Redash, and Tableau to training ML models that make thousands of decisions per second on our websites. You will contribute to HomeToGo's data platform by developing our Data Warehouse, which has to meet challenging scalability, performance, and usability requirements. Your work will be critical to ensuring that everyone at HomeToGo can make data-driven decisions efficiently and reliably.
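To give a flavour of the day-to-day work, here is a minimal sketch of the kind of daily Airflow batch flow you might develop and schedule in this role. The DAG name, task names, and schedule are illustrative assumptions, not our actual pipeline:

```python
# Hypothetical example: a daily extract-transform-load batch flow in Apache Airflow.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw event files from the data lake (e.g. AWS S3).
    print(f"Extracting raw data for {context['ds']}")


def transform(**context):
    # Placeholder: clean and aggregate the extracted data (e.g. with Spark).
    print(f"Transforming data for {context['ds']}")


def load(**context):
    # Placeholder: load the aggregates into the Data Warehouse (e.g. AWS Redshift).
    print(f"Loading data for {context['ds']}")


with DAG(
    dag_id="daily_dwh_batch",  # illustrative name
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # A simple linear dependency: extract, then transform, then load.
    extract_task >> transform_task >> load_task
```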
In this role, you will
- Manage large volumes of data
- Develop and optimize data structures and batch processing flows
- Integrate external data sources with our DWH and Data Lake
- Ensure high quality, durability and scalability of our DWH processes
- Evaluate new technologies and tools to improve data handling across the organization
What you’ll bring
- Excellent knowledge of SQL and relational databases
- Experience in developing complex Data Warehouses
- Experience in creating and scheduling batch data processing flows (e.g. Apache Airflow)
- At least basic coding skills in Python
- Experience working with AWS or other cloud platforms is a plus
- Effective communication skills and the ability to work in a team
- Fluency in English
What we offer
- Career: growth opportunities and promotions, 360° feedback, a performance evaluation system, and mentoring from an international and distinguished team.
- Culture: empowerment, trust, recognition, autonomy, and quarterly transparency about company goals. A focus on work-life balance, combined with consistent collaboration and support from a team of 30+ nationalities.
- Compensation: an attractive salary, health insurance, additional holidays, flexible working hours, a training budget, language courses, employee-led workshops, office perks, frequent team-building and company events, and business trips to Berlin. Salary commensurate with expertise and experience: 3,000-5,000 EUR gross.