
Salary statistics overview for the profession "Data Security Engineer in Kyiv"



25 000 ₴ average monthly salary

Average salary level over the last 12 months: "Data Security Engineer in Kyiv"

[Histogram: change in the average salary level for the profession Data Security Engineer in Kyiv over the last 12 months; currency: UAH, year: 2024]

Recommended vacancies

Data Collection Specialist (Data Analysis Specialist (DTM)) - International Organization for Migration (IOM), Mission in Ukraine
INTERNATIONAL ORGANIZATION FOR MIGRATION (IOM) - MISSION IN UKRAINE, Kyiv
Classification: General Service Staff, Grade G7
Type of Appointment: Fixed-term, one year with the possibility of extension
Estimated Start Date: As soon as possible
Closing Date: 18 July 2023
Context: Almost eight years after the onset of the crisis in the East, following the Russian invasion in February 2022, Ukraine now experiences a full-scale war. The consequences of violence have now spread country-wide, and over 5 million people are estimated to have been internally displaced. Displacement is likely to become protracted as fighting continues, including the targeting of civilian infrastructure. New waves of displacement are expected, and continued deterioration of living conditions among the most vulnerable is likely.
Under the overall supervision of the Chief of Mission and the direct supervision of the Project Officer (DTM), the successful candidate will be responsible and accountable for data analysis activities within the Data & Analytics Unit. These activities will focus on the development and implementation of workflows for processing, analysing, and monitoring the quality of data collected through the Displacement Tracking Matrix field operations as well as through other data collection programmes in the Data & Analytics unit.
Core Functions / Responsibilities: In coordination with the Project Officer (DTM), design, implement, and monitor data flows to ensure that the analysis of information collected by the Data & Analytics unit is timely, effective and of the highest quality. Coordinate with the Project Officer (Reporting) the timely preparation and dissemination of analytical reports in accordance with IOM procedures and donor requirements. Contribute to the development and upkeep of data quality control monitoring mechanisms. Design and create data visualizations and dashboards for both internal use and external dissemination. Participate in the development and adjustment of methodologies, tools and standard operating procedures within the Data and Analytics Unit. Respond to internal and external data requests by providing timely and relevant analysis input, ensuring that the findings are clearly and understandably disseminated to all technical and non-technical stakeholders. Attend relevant conferences, workshops, working groups, interagency coordination meetings, and other forums. Track relevant national developments pertaining to displacement and priority needs monitoring across Ukraine. Contribute to the planning, development, organization and delivery of capacity building activities targeting IOM staff, government and civil society partners, implementing partners and communities. Support training activities and technical guidance to project staff on data processing and analysis. Promote and build capacity of staff and partners on IOM's Data Protection Principles. Keep the supervisor informed on the status of programme implementation, identify gaps and suggest actions to improve implementation. Support preparation of project proposals and a diverse range of communication products, concept notes and other planning documents. Plan, lead, and coordinate data analysis efforts, including monitoring of implementation of analytical activities to ensure work is proceeding according to established plans, assessing implementation difficulties and providing recommendations for adjusting implementation modalities and work plans to best reflect the changing environment in the field.
Undertake duty travel relating to project implementation, monitoring visits, project assessments, liaison with counterparts, etc. Perform such other duties as may be assigned.
Required Qualifications and Experience
Education: High School degree with seven years of relevant experience; OR Bachelor's degree or equivalent in Sociology, Statistics, IT or Data Science, Political or other Social Sciences, International Relations, Development Studies, Migration Studies, Human Rights, Law or related fields from an accredited academic institution with five years of relevant professional experience.
Experience and Skills: Experience with implementation of quantitative analysis on complex datasets, including time series, is necessary; experience managing automated data pipelines is a distinct advantage; experience working in humanitarian or development organizations is welcome; experience in working with migrants, refugees, internally displaced persons, victims of trafficking and other vulnerable groups is a distinct advantage; and prior work experience with international humanitarian organizations, non-government or government institutions/organizations in a multi-cultural setting is an advantage. In-depth knowledge of and ability to select and independently lead implementation of analytical methodologies is required. Reliable ability to use and train other staff on the use of at minimum one statistical software package (Python, SPSS, R or STATA) is required. Attention to detail and ability to navigate complex data sets and databases is required. Understanding of the data requirements in humanitarian and recovery programming and related methodological frameworks and tools (IASC, IRIS, DTM and others) is a distinct advantage.
Other: Any offer made to the candidate in relation to this vacancy notice is subject to funding confirmation. Appointment will be subject to certification that the candidate is medically fit for appointment and security clearances. A prerequisite for taking up the position is legal residency in the country of the duty station and a work permit, as applicable.
How to apply: Interested candidates are invited to submit their applications by filling in the IOM Personal History Form and sending it to [email protected] by 18 July 2023 at the latest, referring to this advertisement in the subject line of your message. Only shortlisted candidates will be contacted.
Project Assistant (Data Management, Housing/Shelter) - International Organization for Migration (IOM), Mission in Ukraine
INTERNATIONAL ORGANIZATION FOR MIGRATION (IOM) - MISSION IN UKRAINE, Kyiv
Open to Internal and External Candidates
Position Title: Project Assistant (Data Management, Housing/Shelter)
Duty Station: Kyiv, Ukraine
Classification: General Service Staff, Grade G4
Type of Appointment: Fixed Term, one year with the possibility of extension
Estimated Start Date: As soon as possible
Closing Date: 10 July 2023
Established in 1951, IOM is a Related Organization of the United Nations, and as the leading UN agency in the field of migration, works closely with governmental, intergovernmental and non-governmental partners. IOM is dedicated to promoting humane and orderly migration for the benefit of all. It does so by providing services and advice to governments and migrants. IOM is committed to a diverse and inclusive environment. Internal and external candidates are eligible to apply to this vacancy. For the purpose of the vacancy, internal candidates are considered as first-tier candidates.
Context: The International Organization for Migration (IOM) is the only international inter-governmental agency with a specific mandate for migration and is dedicated to promoting humane and orderly migration for the benefit of all. It does so by providing technical expertise to governments, migrants, and host communities through a wide range of sustainable solutions contributing to support populations affected by forced migration and improve living conditions of Internally Displaced Persons (IDPs).
Under the overall supervision of the Chief of Mission and Programme Coordinator (Shelter and Housing), the direct supervision of the Information Management and GIS Officer, and in close coordination with the Shelter and Housing Unit and other units' Information Management Teams, the successful candidate will provide support to the implementation of IOM Ukraine's efforts to increase community resilience and cohesion. The incumbent will be responsible for monitoring the unit's activities regarding data collection/management, cleaning data, and providing guidance to relevant colleagues in different hubs. S/he will support the unit and the supervisor regarding data management and reporting needs.
Core Functions / Responsibilities: Provide support to the Activity Reporting Tool (ART) through data cleaning and dataset preparation. Support the delivery of training on data collection and data management on ART and other integrated information management systems. Assist the Information Management (IM) and GIS Officer with preparing draft reports for IOM internal reporting through PDMS, donor reporting, cluster reporting, etc. Support with activity mapping for the Shelter and Housing Unit to highlight coverage, gaps, and needs by overlaying vulnerability information. Contribute to inputs/notes for sitreps, address data requests and maintain relevant datasets updated, and coordinate with IM/Data Management counterparts at IOM as may be needed. Support strengthening existing monitoring and reporting mechanisms to improve data collection tools and analysis. Perform such other duties as may be assigned.
Required Qualifications and Experience
Education: High school diploma/certificate or equivalent with at least four years of relevant work experience; OR Bachelor's degree or equivalent in Computer Science, Geographic Information Systems, Geography, Mathematics or a relevant area from an accredited academic institution with two years of relevant experience.
Experience: Experience in the management and coordination of information flows, and data management, including collecting, storing, processing and analysing data to generate information products; in-depth knowledge of the latest technological developments in information technology and information systems; experience with handling confidential data and personal data; experience in carrying out user needs analysis and scoping for the development of databases; previous experience in conflict/post-conflict areas is desirable. Proven skills in analysing statistical information; ability to formulate IM-related technical requirements and Standard Operating Procedures; ability to translate planning specifications into technical briefs for data capture and analysis, and vice versa; ability to compile and holistically analyse diverse datasets; team-building and information management skills; demonstrated understanding of different data collection methodologies; understanding of relational data theory.
Skills: Advanced data visualisation and information design skills; advanced Power Query, Power Apps, and MS Excel skills; experience using data visualisation and design tools such as Power BI and Adobe Illustrator/Photoshop; Kobo Toolbox, Survey123 or ODK design and implementation for data collection; and Photoshop editing for the development of infographics.
Other: Any offer made to the candidate in relation to this vacancy notice is subject to funding confirmation. Appointment will be subject to certification that the candidate is medically fit for appointment and security clearances. A prerequisite for taking up the position is legal residency in the country of the duty station and a work permit, as applicable.
How to apply: Interested candidates are invited to submit their applications by filling in the IOM Personal History Form and sending it to [email protected] by 10 July 2023 at the latest, referring to this advertisement in the subject line of your message. Only shortlisted candidates will be contacted.
National Project Officer for Data Exchange (National Data Liaison Officer) - International Organization for Migration (IOM), Mission in Ukraine
INTERNATIONAL ORGANIZATION FOR MIGRATION (IOM) - MISSION IN UKRAINE, Kyiv
Duty Station: Kyiv, Ukraine
Classification: National Officer, Grade NO-B
Type of Appointment: Fixed-term, one year with possibility of extension
Estimated Start Date: As soon as possible
Closing Date: 10 August 2023
Core Functions / Responsibilities: Oversee the implementation and further programmatic development of activities falling under the "evidence-based governance" pillar of IOM Ukraine's Data and Analytics Unit. In close coordination with IOM colleagues overseeing the partnership with the Government of Ukraine, liaise and coordinate with government entities (nationally, regionally and locally), implementing partners, United Nations agencies, civil society organisations, donors and other stakeholders on issues related to evidence-based policy-making and programming. Foster partnerships with key official statistics stakeholders in Ukraine, providing support and technical advice in relation to international standards and best practices on mobility, displacement, and demographic data. Contribute towards the planning, development and delivery of capacity building activities for IOM staff, partner organizations, government officials and other actors, in order to strengthen data collection, data analysis and evidence-based policy-making and programming. Participate in relevant conferences, workshops, steering committees, working groups, and other forums. Contribute to information management of the Data and Analytics Unit's activities and outputs, including awareness raising and visibility, press releases, website updates and other relevant information-sharing materials. Contribute to the overall implementation of activities of the Data and Analytics Unit, including oversight of the financial, logistical, administrative and technical aspects. This must be conducted in accordance with IOM's policies, practices and global standards, as well as relevant requirements, guidelines and grant agreements. Monitor the implementation of projects according to the work plan; document and evaluate results; identify the causes of deviations and bottlenecks; and recommend and implement corrective actions. Promote and contribute to the integration and mainstreaming of gender, protection, human rights and other pertinent cross-cutting issues into programme implementation. Identify potential areas for project development and contribute to the development of new projects by selecting and summarizing background information, assessing the local context, and drafting segments of project proposals. Participate in the development and adjustment of methodologies, contingency plans, approaches and standard operating procedures to respond to emerging challenges in Ukraine through a consultative process with other relevant parties. Coordinate the elaboration and dissemination of reports for donors, government and other relevant stakeholders, ensuring timely submission and compliance with donor and IOM requirements. Undertake duty travel as required in relation to project implementation and monitoring.
Perform other related duties as assigned.
Required Qualifications and Experience
Education: Bachelor's degree in Political or Social Sciences, International Relations, Development Studies, Migration Studies, Human Rights, Law, Statistics, Information Technology, Computer/Data Science or related fields from an accredited academic institution with four years of relevant professional experience; or Master's degree in one of the above-mentioned fields with two years of relevant professional experience.
Experience: Experience in liaising with national, regional and/or local governmental authorities in Ukraine, national/international institutions, United Nations agencies and non-governmental organizations; experience in coordinating high-level policy consultations among national and international stakeholders; prior experience in developing and/or coordinating international and/or national policy on data and population statistics; experience in working in development or humanitarian programmes will be considered advantageous; experience in working with migrants, refugees, internally displaced persons, victims of trafficking and other vulnerable groups will be considered advantageous; and prior work experience with international humanitarian organizations, non-government or government institutions in a multi-cultural setting is an advantage.
Skills: In-depth knowledge of the global efforts aimed at improving national and international refugee, internally displaced persons and statelessness statistics through the development of international recommendations on how to produce, compile and disseminate statistics on these populations; keen understanding of key issues in national and international population statistics and public governance; statistical background and ability to analyze and interpret data collected through quantitative and qualitative methods; knowledge of UN and bilateral donor programming; high level of written expression in both English and Ukrainian, including formal correspondence.
Languages: For all applicants, fluency in English and Ukrainian is required (oral and written).
Other: Any offer made to the candidate in relation to this vacancy notice is subject to funding confirmation. Appointment will be subject to certification that the candidate is medically fit for appointment and security clearances. A prerequisite for taking up the position is legal residency in the country of the duty station and a work permit, as applicable.
How to apply: Interested candidates are invited to submit their applications using the IOM Personal History Form and sending it to [email protected] by 10 August 2023 at the latest, referring to this advertisement in the subject line of your message. Only shortlisted candidates will be contacted.
Senior Data Engineer || Big Data Engineer
DCV Technologies, Kyiv, Kyiv city, ua
Senior Data Engineer:
Experienced in Scala, Python or Java and a Unix/Linux on-premises environment
Experience with developing RESTful APIs using Java Spring Boot
Hands-on experience building data pipelines using Hadoop components (Apache Hadoop, Scala, Apache Spark, YARN, Hive, SQL)
Experience optimizing Spark jobs
Experience with time-series/analytics DBs such as Elasticsearch
Experience with industry standard version control tools (Git, GitHub) and automated deployment tools (Ansible & Jenkins)
Basic shell-scripting knowledge
Understanding of big data modelling using relational and non-relational techniques
Needs to be a self-starter, proactive and with a team-focused attitude
Willingness to learn and quick to adapt to changing requirements
Experience and understanding of the SDLC (Software Development Lifecycle)
Good to have: create/update/maintain a shared Jenkins library; understanding or experience of cloud design patterns; great communication skills
Big Data Engineer (Hadoop):
Experience in Python, PySpark and a Unix/Linux on-premises environment
Hands-on experience building data pipelines using Hadoop components (Apache Hadoop, Apache Spark, YARN, Hive, SQL)
Experience with industry standard version control tools (Git, GitHub) and automated deployment tools (Ansible & Jenkins)
Experience optimizing Spark jobs
Basic shell-scripting knowledge
Understanding of big data modelling using relational and non-relational techniques
Needs to be a self-starter, proactive and with a team-focused attitude
Willingness to learn and quick to adapt to changing requirements
Experience and understanding of the SDLC (Software Development Lifecycle)
Ability to work with business users and team stakeholders such as project managers and architects
Excellent data analysis skills with experience dealing with large and complex data sets
Good knowledge of Structured Query Language (SQL)
Highly analytical with excellent attention to detail
Experienced in analysing business requirements and turning them into effective functional solutions
Excellent communication, both written and spoken (in English)
Demonstrable track record of delivery
Strong interpersonal skills and ability to both lead and contribute within a small team
Nice to have: experience with time-series/analytics DBs such as Elasticsearch; experience in Scala; experience with developing RESTful APIs using Java Spring Boot; familiarity with cloud technology components (preferred: GCP)
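Both roles above centre on building and tuning Spark pipelines over Hadoop components. As a hedged illustration only (the file paths, column names and target table are hypothetical placeholders, not taken from the listing), a minimal PySpark job of this shape might look like:

```python
# Minimal PySpark sketch: read raw events, aggregate, write to a Hive table.
# All paths, columns and table names are invented for the example.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("daily-events-aggregation")
    .enableHiveSupport()          # assumes a Hive metastore is configured
    .getOrCreate()
)

# Read one partition of raw data from HDFS (Parquet assumed).
events = spark.read.parquet("hdfs:///data/raw/events/dt=2024-01-01")

# Small dimension table: broadcast it to avoid a shuffle-heavy join,
# one of the common Spark optimizations the requirements allude to.
countries = spark.read.parquet("hdfs:///data/dim/countries")
daily = (
    events.join(F.broadcast(countries), on="country_code", how="left")
          .groupBy("country_name", "event_type")
          .agg(
              F.count("*").alias("events"),
              F.countDistinct("user_id").alias("users"),
          )
)

# Persist the aggregate as a managed table for downstream consumers.
daily.write.mode("overwrite").saveAsTable("analytics.daily_event_summary")
spark.stop()
```

In practice, optimization work on jobs like this usually revolves around partition pruning, join strategy and shuffle sizing rather than the transformation logic itself.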
Mid Senior Data Analytics Engineer
TransferGo, Kyiv, Kyiv city, ua
TransferGo is a growing fintech scale-up on a mission to make the world a fairer place for migrants. We strive to provide tailored, more affordable financial services that make people's hard-earned money go further. Now in our 12th year, with over 370 employees in offices across Europe, we're crafting a brilliant, relevant product that makes a difference in people's lives and the well-being of their families. We've come this far by building a talented, diverse workforce on a fair culture and our strong values. Having this strong team of employees, we can serve those who really need our product to make their lives better. The Data team plays a critical role in shaping our decisions based on data. To ensure we're doing that in the best way possible, we're now looking for a talented Mid/Senior Data Analytics Engineer to join our team.
Here's what you'll be doing as a Data Analytics Engineer: Develop new and support existing ETL processes using dbt and Redshift. Build data validation rules and data quality monitors for data quality assurance. Optimize data workflows for performance, scalability, and reliability. Ensure compliance with data governance and security standards.
Here's what we'd love from our new Data Analytics Engineer: Very good SQL knowledge. Familiarity with database and data modeling techniques. Experience in performance tuning on analytical databases (Redshift, BigQuery, Snowflake, etc.). Experience with dbt and Git.
And if you can also do this stuff, even better: You are a good communicator and are able to use English effectively at work. You can champion engineering innovation and best practices. You enjoy sharing your experience with other engineers and helping others to succeed.
Here are some of our core tools helping us to get the job done: Data storage: AWS Redshift, S3, MySQL, Elasticsearch. Data modeling and transformation: dbt, Glue, Python, Lambda. CI/CD: GitHub.
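The "data validation rules and data quality monitors" this listing mentions can be as simple as assertion queries run against the warehouse. Purely as an illustrative sketch (the schema, table, columns and connection details are hypothetical, not taken from the posting), such a check in Python against Redshift might look like:

```python
# Hypothetical data-quality check: fail loudly if a staging table violates
# basic expectations. Connection details and table names are placeholders.
import os
import psycopg2  # Redshift speaks the PostgreSQL wire protocol

CHECKS = {
    "no_null_transfer_ids":
        "SELECT COUNT(*) FROM staging.transfers WHERE transfer_id IS NULL",
    "no_negative_amounts":
        "SELECT COUNT(*) FROM staging.transfers WHERE amount < 0",
    "no_duplicate_transfer_ids":
        "SELECT COUNT(*) FROM (SELECT transfer_id FROM staging.transfers "
        "GROUP BY transfer_id HAVING COUNT(*) > 1) d",
}

def run_checks() -> None:
    conn = psycopg2.connect(
        host=os.environ["REDSHIFT_HOST"],
        port=5439,
        dbname=os.environ["REDSHIFT_DB"],
        user=os.environ["REDSHIFT_USER"],
        password=os.environ["REDSHIFT_PASSWORD"],
    )
    failures = []
    with conn, conn.cursor() as cur:
        for name, sql in CHECKS.items():
            cur.execute(sql)
            bad_rows = cur.fetchone()[0]
            if bad_rows:
                failures.append(f"{name}: {bad_rows} offending rows")
    if failures:
        raise AssertionError("Data quality checks failed: " + "; ".join(failures))

if __name__ == "__main__":
    run_checks()
```

In a dbt-based stack like the one described, these rules would more typically be expressed as built-in schema tests (not_null, unique, accepted_values); the sketch just shows the underlying idea.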
Senior Data Integration Engineer (Python, Databricks, Spark SQL)
Epam, Kyiv, Kyiv city, ua
Description: Our client is a global, privately owned company that connects people with ideas, data with insights, supply with demand, restaurants with deliveries and, ultimately, people with the products they love.
Responsibilities: Implementation of DWH and Data Hubs, including the full ETL process. Implementation of data models. Unit testing.
Requirements: Experience building data ingestion pipelines with tools like Databricks, SSIS, Talend, Informatica, etc. Experience implementing data models that have been designed by someone else. Strong SQL. The plan is to have the data engineers create Databricks notebooks using either SQL or Python, but SQL is preferable. Experience unit testing code. Upper-Intermediate level of English, both spoken and written (B2+).
Nice to have: Azure/other cloud storage technologies, SSAS Tabular, Azure Data Factory, Azure DevOps.
We Offer: Competitive compensation depending on experience and skills. Individual career path. Unlimited access to LinkedIn Learning solutions. Sick leave and regular vacation. English classes with certified English teachers. Flexible work hours.
Senior Security Engineer
Remitly, Kyiv, Kyiv city, ua
Remitly’s vision is to transform lives with trusted financial services that transcend borders. Since 2011, we have been tirelessly delivering on our promises to people who send money around the world. Today, we are reimagining global financial services and building products that extend beyond traditional barriers to give customers access to more of the services they need, no matter where they call home. Join over 2,700 employees worldwide who are growing their careers with purpose and connection with our customers while having a positive impact on millions of people around the globe. The Role We're searching for an experienced Security Engineer to join Remitly's Intrusion Detection & Response Team. This role will help the team build out and own tools and capabilities and help advance the D&R program at Remitly. The role reports to the Director of Detection & Response. You'll accomplish this with a "detection as code" engineering mindset and partner closely with other team members and stakeholders in external teams. Your work will directly impact the security of Remitly data and help to safeguard our users. What You'll Do Design and build systems to detect and investigate potentially malicious activity Create and tune analytics to proactively detect threats with high quality ATT&CK coverage and low false positive rates Investigate and triage interesting or suspicious events Drive incident response efforts across cross-functional teams Help define and execute threat detection and response strategy Participate in the team "on-call" service rotation You Have 5+ years of experience in security or systems engineering 3+ years of experience of those in threat detection or threat response, preferably in a cloud-first environment (IaaS, PaaS, Saas) Bachelor's degree in a related discipline OR equivalent practical experience Ability to lead in complex operating environments, sometimes in high stress situations Experience building and automating threat detection analytics and threat hunting methodologies Know what the MITRE ATT&CK framework is and how to apply it Strong alignment to our mission and values Attention to detail, operates with a high degree of discretion Strong written and verbal communication skills in English Our Benefits Employee Stock Purchase Plan (ESPP) Equity in the company (RSUs) min. 26 days paid holidays + additional Remitly days off Royalties (KUP) Hybrid work arrangements with an office in a Kraków City Centre Commuting to work expenses reimbursement Health/Dental Coverage - LUX MED VIP for employee and family Life Insurance Travel insurance for employee and family Sodexo Lunch Card/Multisport Education / Conferences Budget Equipment of your choice Mental health program for employee and their dependents Family planning program Employee Pension Plan (PPK) Headphones Reimbursement Referral bonus scheme
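The "detection as code" mindset this posting describes means expressing detection logic as versioned, testable code rather than ad hoc console rules. As a hedged, generic sketch only (the event fields, threshold and rule are invented for illustration and are not Remitly's actual tooling), a detection written this way might look like:

```python
# Illustrative "detection as code" rule: flag bursts of failed console logins
# followed by a success from the same source IP. Field names, threshold and
# window are placeholders; a real pipeline would map this to MITRE ATT&CK
# T1110 (Brute Force) and ship alerts to a SIEM or alert queue.
from dataclasses import dataclass
from collections import defaultdict
from datetime import datetime, timedelta
from typing import Iterable, List

@dataclass
class Event:
    timestamp: datetime
    source_ip: str
    user: str
    action: str      # e.g. "ConsoleLogin"
    outcome: str     # "Success" or "Failure"

FAILURE_THRESHOLD = 5
WINDOW = timedelta(minutes=10)

def detect_bruteforce_then_success(events: Iterable[Event]) -> List[dict]:
    """Return one alert per source IP that fails at least FAILURE_THRESHOLD
    times within WINDOW and then logs in successfully."""
    failures = defaultdict(list)
    alerts = []
    for ev in sorted(events, key=lambda e: e.timestamp):
        if ev.action != "ConsoleLogin":
            continue
        if ev.outcome == "Failure":
            failures[ev.source_ip].append(ev.timestamp)
            # keep only failures inside the sliding window
            failures[ev.source_ip] = [
                t for t in failures[ev.source_ip] if ev.timestamp - t <= WINDOW
            ]
        elif ev.outcome == "Success" and len(failures[ev.source_ip]) >= FAILURE_THRESHOLD:
            alerts.append({
                "rule": "bruteforce_then_success",
                "attack_technique": "T1110",
                "source_ip": ev.source_ip,
                "user": ev.user,
                "time": ev.timestamp.isoformat(),
            })
            failures[ev.source_ip].clear()
    return alerts
```

Because the rule is plain code, it can live in version control, be unit-tested against replayed log samples, and be reviewed like any other change, which is the point of the approach.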
Junior Data Backend Engineer
TantusData, Kyiv, Kyiv city, ua
Junior Data/Backend Engineer – minimal experience required  No prior professional experience as data engineers is needed, backend engineers looking to develop new skills are welcome.  We offer: The opportunity to learn from seasoned developers with experience from top global companies  Flexible hours and a flexible work environment  Training and upskilling  B2B salary: 50-90 PLN/h  Hands-on experience with technologies like Spark, Databricks, Cloud (GCP),  Top equipment and software  You will: Be collaborating with and learning from your teammates  Be learning whilst on projects – with full support from your mentors  Be building scalable solutions focusing on value delivery for customers  Be designing, building, deploying, monitoring, and maintaining various products processing vast amount of data  Work on multiple projects – this lets you to experience various approaches and technologies  You must: Have a solid understanding of object-oriented and functional programming  Have good knowledge of algorithms and data structures  Be able to communicate fluently in English  Be willing to learn from teammates and on your own  Be a quick learner and problem-solver (not a cliche, we mean it)  Nice if you: Have commercial experience in software development (especially Java, Scala, Python)  Are willing to travel abroad every now and then  Have experience (can be academic) with distributed NoSQL databases like: Cassandra, HBase  Have experience (can be academic) with technologies like: Spark, Hadoop, Kafka, Tensorflow Have experience (can be academic) with cloud technologies (AWS, GCP, Azure)  Fancy being a part of a company focused on data related technologies?  We process terabytes of data, build machine learning models and deploy them to production.  But most importantly we champion expertise. So we will help you become a top data engineer. 
Data Azure Engineer
Devapo Sp. z o. o., Kyiv, Kyiv city, ua
Our company is seeking a skilled Data Azure Engineer to join our team and be part of a project for a large financial institution. As a Data Azure Engineer, you will develop cutting-edge data platforms using Azure technology. You will have the opportunity to lead important projects, participate in decision-making processes and collaborate with a diverse team of experts. Your expertise will drive breakthrough solutions for financial institutions and beyond. As part of our team, you will also receive a multitude of additional benefits to enjoy outside of work.
What we expect: 3+ years of experience in data engineering. Python programming knowledge. Deep knowledge and strong experience with Azure Synapse, Databricks, Azure Data Factory and PySpark. Strong, proven knowledge of working with data lakes, lakehouses, and data warehouses in the cloud (Databricks, Snowflake, Azure). Advanced proficiency with SQL. Strong experience with ETL/ELT processes, data ingestion, data transformation and data modeling. Experience with code repositories (Git).
Nice to have: dbt knowledge and experience. Knowledge of Kubernetes.
Responsibilities: Developing data platforms, data lakehouses and data warehouses. End-to-end development of ETL/ELT processes. Building and designing data platform components to enable clients to produce and consume data. Developing, implementing, and maintaining change control and testing processes. Researching and implementing best practices and new approaches for our current data stack and systems. Data modeling.
What we offer: Salary: 90 - 120 PLN + VAT (B2B contract). Co-financing of trainings and certificates, with assured time for learning within working hours. Private medical care and Multisport card. Language classes (English). Flexible working hours. Meetings and integration events. Referral bonus for recommending a new hire. Individually tailored path for your career development. The ability to work in a hybrid form from our Warsaw office.
GCP Data Platform Engineer
HRO Digital, Kyiv, Kyiv city, ua
HRO Digital is an international company providing recruitment support within the #Fintech, #Finance and #Banking market in EMEA. We connect the most innovative organizations with the best people in the market. We conduct systematic market research, which allows our Digital Teams to be a step ahead of the competition. Do you want to work for one of the world's largest global banks? Want to be part of its exciting digital transformation? Do you want to engineer incredible products for millions of customers? Well, our Client offers just that. It's a leader in the digital transformation of banking services, and Cracow is one of the most important technological centers - the majority of projects are delivered from Poland.
We are looking for an experienced Data / Platform Engineer to support various tasks related to the development, analysis and maintenance of data processes on top of data platforms maintained by the team in GCP. The job tasks could be related to analysis of data engineering pipelines, refactoring of pipelines, migration of processes between platforms and technologies, and following or establishing best practices for data engineering. An ideal candidate will also have a very good understanding of CI/CD processes and will be able to support Data Operations (DataOps).
Skills: experience with Spark (either Python or Scala); general programming skills; data analytics skills; very good general analytical skills; experience with GCP data services (especially BigQuery and Dataproc); understanding of cost and performance within GCP; experience with reporting tools development; communication and collaboration skills (co-operation with various technical and business teams is required on a daily basis) and problem solving.
Nice to have: understanding of Airflow (DAG development and system administration); experience with development in CI/CD tools and IaC; understanding of SRE, Service Management and application life-cycle management.
We offer: Prestigious position at one of the world's largest banks. Competitive salary with a B2B contract. Almost fully remote work (1 day per 2 - 3 months from the office in Cracow) and flexible working hours. Working with cutting-edge IT technologies. Personal growth and development opportunities within the organization. Private healthcare coverage and Multisport card. Referral program and company events. Convenient parking, relaxation and game rooms, bicycle racks and showers for cyclists.
Our recruitment process comprises two meetings with hiring managers, followed by an initial phone screening with our recruiter.
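The Airflow DAG development listed as nice-to-have is about orchestrating pipelines like the GCP ones this role maintains. Purely as an illustrative sketch (the project, dataset, table and SQL are hypothetical placeholders, not the client's pipelines), a small DAG submitting a daily BigQuery job might look like:

```python
# Illustrative Airflow DAG: run a daily BigQuery aggregation job on GCP.
# Project, dataset, table and query are invented for the example.
from datetime import datetime
from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_aggregation",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",   # "schedule=" in newer Airflow releases
    catchup=False,
    tags=["gcp", "bigquery", "example"],
) as dag:

    aggregate_orders = BigQueryInsertJobOperator(
        task_id="aggregate_orders",
        configuration={
            "query": {
                "query": """
                    SELECT order_date, country, SUM(amount) AS total_amount
                    FROM `example-project.raw.orders`
                    WHERE order_date = '{{ ds }}'
                    GROUP BY order_date, country
                """,
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "analytics",
                    "tableId": "daily_orders${{ ds_nodash }}",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
        location="EU",
    )
```

A refactoring or migration task of the kind the listing mentions would typically move logic like this between Dataproc/Spark and BigQuery while keeping the DAG interface and schedule stable.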
Senior Software Engineer - Database Platform
Snowflake, Kyiv, Kyiv city, ua
Build the future of data. Join the Snowflake team. A massive new market opportunity is being created at the intersection of Cloud and Data, and the Snowflake Data Cloud is leading the way, all powered by the database engine we are building from the ground up.  Key to Snowflake’s Database Engine is our large scale distributed transactional Key-Value store - called FDB - which powers all of Snowflake’s products and services and is rapidly evolving to meet Snowflake’s future needs. FDB runs on multiple cloud providers including Amazon Web Services, Microsoft Azure and Google Cloud. The elastic infrastructure FDB runs on is being built from the ground up and is envisioned to be a cloud agnostic, fully automated manageability platform that provides: Autoscaling and auto-balancing of clusters based on utilization, traffic and workloads Auto-provisioning of new clusters with zero manual intervention Self-healing capabilities that prevent, mitigate and resolve any production impact Built-in configuration management that guarantees FDB runs correctly and on the intended topologies Self-optimizing COGS efficiency, ensuring we run our clusters at optimal utilization We are looking for an outstanding Senior Software Engineer with a passion for large scale databases and distributed systems to help us take the FDB platform to the next level.  AS A SENIOR SOFTWARE ENGINEER ON THIS TEAM, YOU WILL: Design and implement scalable distributed system solutions for our cloud agnostic platform. Analyze fault-tolerance and high availability issues, performance and scale challenges, and solve them. Own the end to end delivery of your projects, from identifying a solution, to design, implementation, test and safe production rollout Understand trade-offs between consistency, durability and costs to build solutions which can meet the demands of rapidly growing services. Build the next generation transaction system, caching, storage engine and multi tenant capabilities Evangelize best practices in database usage and end-to-end architecture.  Pinpoint problems, instrument relevant components as needed, and ultimately implement solutions. AN IDEAL CANDIDATE WILL HAVE: 5+ years industry experience designing, building and supporting large scale infrastructure in production. Experience designing, building, and operating large-scale distributed systems infrastructure supporting stateful services Experience in container orchestration, cluster management, or autoscaling. Excellent understanding of operating systems concepts including multi-threading, memory management, networking and storage, performance and scale. Systems programming skills including multi-threading, concurrency, etc. Fluency in Java, C++, or C is preferred. Solid understanding of the internals of Kubernetes, Mesos, OpenShift, or other container platforms Experience with scalable Key-Value stores such as FoundationDB, RocksDB/LevelDB, DynamoDB, Redis, etc. a plus. Track record of delivering highly complex projects in the distributed systems space Intense curiosity, willingness to question and passion for making systems better Experience with one or more of the following highly desired:  Big Data storage technologies and their applications (HDFS, Cassandra, Columnar Databases, etc.) Scalable Key-Value stores such as FoundationDB, RocksDB/LevelDB, DynamoDB, Redis, Cassandra, etc. BS in Computer Science; Masters or PhD Preferred. 
About Snowflake: Snowflake SIGMOD 2016 paper About FoundationDB:  FDB SIGMOD 21 Paper FoundationDB Summit 2018  and  FoundationDB Summit 2019 How FDB powers Snowflake Metadata Forward! SALARY We believe all Snowflake employees have an impact in the long-term success of Snowflake, which is why new hire equity is designed to be a considerable part of your annual compensation. When the price of Snowflake stock rises, we are all rewarded. At Snowflake, equity is an important part of our total compensation package which is comprised of: Base salary  Bonus target or sales commission target Equity in the form of Restricted Stock Units (RSUs) The total target monthly compensation range for this job is 40,000 PLN – 80,000 PLN The final compensation offered will vary based on individual experience, skills, and job-based knowledge.  BENEFITS Snowflake is excited to offer a variety of benefits for our employees in Poland. For all details on benefits and perks you're eligible for as well as resources to help you understand your coverage, please review the following: Medical & Dental Insurance Mental Health Support Employee Capital Plan (PPK) Life Insurance Gym reimbursement / Multisport Phone reimbursement Modern Family Benefits Family Planning, Maternity/Paternity and Parenting Support with Maven Rethink: Parenting and family support for children with developmental disabilities or learning, social, or behavioral challenges. Adoption and surrogacy reimbursement Global Parental Leave And also: free snacks & coffee in the office Internal trainings, parties. Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact?
Security Engineer
Remitly, Kyiv, Kyiv city, ua
Remitly’s vision is to transform lives with trusted financial services that transcend borders. Since 2011, we have been tirelessly delivering on our promises to people who send money around the world. Today, we are reimagining global financial services and building products that extend beyond traditional barriers to give customers access to more of the services they need, no matter where they call home. Join over 2,700 employees worldwide who are growing their careers with purpose and connection with our customers while having a positive impact on millions of people around the globe. The Role We're searching for a Security Engineer to join Remitly's Intrusion Detection & Response Team. This role will help the team build out and own tools and capabilities and help advance the D&R program at Remitly. The role reports to the Director of Detection & Response. You'll accomplish this with a "detection as code" engineering mindset and partner closely with other team members and stakeholders in external teams. Your work will directly impact the security of Remitly data and help to safeguard our users. What You'll Do Design and build systems to detect and investigate potentially malicious activity Create and tune analytics to proactively detect threats with high quality ATT&CK coverage and low false positive rates Investigate and triage interesting or suspicious events Drive incident response efforts across cross-functional teams Help define and execute threat detection and response strategy Participate in the team "on-call" service rotation You Have 3+ years of experience in security or systems engineering 2+ years of experience of those in threat detection or threat response, preferably in a cloud-first environment (IaaS, PaaS, SaaS) Bachelor's degree in a related discipline OR equivalent practical experience Ability to work independently in complex operating environments, sometimes in high stress situations Experience building and automating threat detection analytics and threat hunting methodologies Familiarity with the MITRE ATT&CK framework and how to apply it Strong alignment to our mission and values Attention to detail, operates with a high degree of discretion Strong written and verbal communication skills in English Our Benefits Employee Stock Purchase Plan (ESPP) Equity in the company (RSUs) min. 26 days paid holidays + additional Remitly days off Royalties (KUP) Hybrid work arrangements with an office in a Kraków City Centre Commuting to work expenses reimbursement Health/Dental Coverage - LUX MED VIP for employee and family Life Insurance Travel insurance for employee and family Sodexo Lunch Card/Multisport Education / Conferences Budget Equipment of your choice Mental health program for employee and their dependents Family planning program Employee Pension Plan (PPK) Headphones Reimbursement Referral bonus scheme
Senior Database Reliability Engineer
ActiveCampaign, Kyiv, Kyiv city, ua
ActiveCampaign is seeking a Senior Database Engineer to lead the design, implementation, and optimization of our database systems. In this role, you will play a key role in architecting scalable and high-performance database solutions to support our growing data infrastructure needs. You will collaborate with cross-functional teams to ensure our database systems meet business requirements and performance goals. What your day could consist of: Lead the design, implementation, and optimization of database systems to support data warehousing and analytics requirements. Work closely with software engineers, data scientists, and analysts to understand data needs and design efficient database schemas and structures. Optimize database performance through query tuning, indexing strategies, and partitioning techniques. Establish and enforce best practices for database design, data modeling, and data governance. Monitor database health and performance metrics, and implement proactive measures to ensure system reliability and availability. Evaluate and recommend new technologies and tools to improve database scalability, performance, and efficiency. Mentor junior team members and provide technical guidance and expertise. Collaborate with cross-functional teams to troubleshoot and resolve database-related issues and challenges. Stay abreast of industry trends and advancements in database technologies, and drive innovation and continuous improvement. What is needed: 5 years of experience designing, implementing, and optimizing NoSQL database systems, such as DynamoDB, Cassandra, Vertica, Amazon Redshift, Google BigQuery, or Snowflake. Strong proficiency in SQL query optimization, database tuning, and performance monitoring. Hands-on experience with ETL processes, data modeling, and data pipeline orchestration. Experience deploying and managing services hosted on Kubernetes.  Solid understanding of distributed systems, data replication, and high availability architectures. Proficiency in programming languages such as Python, Java, or Scala. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills, with the ability to work effectively in a fast-paced, Agile environment. Experience leading database architecture and design projects is a plus.
Senior Data Integration Engineer (Azure, Databricks, SQL)
Epam, Kyiv, Kyiv city, ua
Description: Our client is a German multinational pharmaceutical and biotechnology company and is one of the largest pharmaceutical and biomedical companies in the world.
Responsibilities: Create data models and related data pipelines in Azure Databricks and Data Factory for analytical dashboards, integrating multiple data assets. Support the architectural decisions and participate in the elaboration of new implementation proposals for our customers, e.g. providing high-level estimations and helping establish the right assumptions. Drive, lead and coach other BE engineers to implement data pipelines following best practices and influencing the customer requirements. Strive to understand the problems to solve and proactively make suggestions on the best way to address them (performance, data volume, data discrepancies or mismatches, operational costs, etc.). Understand, having had experience of working with sales and consumer goods analytics, the most important metrics and aggregations to provide and the challenges they often have. Work with and support several analytical teams (frontend developers, product owners, QAs, solution architect).
Requirements: 5+ years of relevant development experience and practice with data management, data storage, data modeling, data analytics, data migration, and database design. Expert hands-on experience with Databricks. Expert-level knowledge of SQL. Experience with Azure Data Factory. Production coding experience with one of the data-oriented programming languages. Delta Lake. Upper-Intermediate level of English, both spoken and written (B2+).
We Offer: Competitive compensation depending on experience and skills. Individual career path. Unlimited access to LinkedIn Learning solutions. Sick leave and regular vacation. English classes with certified English teachers. Flexible work hours.
Data Center Engineer
Experis, Kyiv, Kyiv city, ua
As a Data Center Network Engineer you will join a team of network professionals, where you will be responsible for maintaining stable, reliable, and secure data center network services, as well as their development and improvement.
Requirements (not everything is required): Experience with Cisco Nexus 3K/5K/9K. Knowledge of dynamic routing protocols: OSPF, BGP. Understanding of NGFW firewalls (Palo Alto, Fortigate, Check Point). Understanding of IP networking, L2/L3 network protocols, TCP/IP, VLAN, VRRP, LACP, MC-LAG, EVPN with VXLAN, DHCP, DNS. Good understanding of Application Delivery Controllers (F5). Excellent analytical skills to troubleshoot complex technical problems.
Not required, but it's an advantage if you are experienced with: Network automation for infrastructure deployment. Python programming. VMware NSX-V/NSX-T technologies. Public cloud network technologies.
Your personality: Strong organizational skills. Team player. Great time management skills. Strong goal-oriented mindset and focus on high quality. Strong sense of ownership of the network. Proactive problem solver. Fluent in English, written and spoken.
Offer: 100% remote work. B2B via Experis. MultiSport Plus. Group insurance. Medicover. E-learning platform.
Information Security Engineering Team Lead
GR8 Tech, Kyiv, Kyiv city, ua
This role is responsible for overseeing the implementation and maintenance of security policies, procedures, and tools to protect the network, cloud and data center infrastructure, endpoints, and overall data of the company. This includes defining the security objectives, standards, and best practices, as well as identifying the security risks and vulnerabilities to protect corporate infrastructure and corporate perimeter from threats. About your key responsibilities and impact: 1. Network Security — Architecting, implementing, and managing network security controls, such as firewalls, intrusion detection/prevention systems (IDS/IPS), network segmentation, and secure VPNs; — Defining and supporting security profiles for corporate VPN; — Supporting and securing 3d party connections to the corporate infrastructure; — Collaborating with network engineering teams to design and deploy secure network architectures and configurations that mitigate security risks and ensure data confidentiality, integrity, and availability. 2. Endpoint Security — Designing, implementing, and maintaining endpoint security solutions, including endpoint detection and response (EDR), and endpoint management platforms, DLP agents; — Developing and enforcing endpoint security policies, configurations, and standards to protect laptops from malware, unauthorized access, and data breaches; — Conducting regular vulnerability assessments and patch management activities to address security vulnerabilities and ensure endpoint compliance with security standards; 3. Cloud & Infrastructure Security — Designing and architecting secure cloud solutions based on industry best practices and security principles; — Designing and implementing security controls for AWS cloud environments and data center infrastructure, ensuring alignment with security best practices and compliance requirements; — Designing a set of requirements to harden infrastructure components. 
Essential professional experience: In-depth knowledge of cybersecurity concepts, tools, principles, best practices and technologies; Ability to develop and execute long-term security strategies to address evolving threats and risks; Knowledge of common security threats, vulnerabilities, attack vectors, and mitigation strategies across application, infrastructure, and network layers; Deep knowledge of implementing security controls and configurations as code using tools such as Terraform, Ansible; In-depth understanding of endpoint security technologies and OS security configuration best practices (Linux, Windows, macOS); Proficiency in antivirus software, endpoint detection and response (EDR), endpoint management platform, DLP for laptops and mobile devices; Strong understanding of firewall technologies, including packet filtering, and stateful inspection; Expertise in application layer filtering; Experience with designing and implementing firewall rules to enforce security policies; Expertise in configuring and managing Intrusion Detection/Prevention Systems (IDS/IPS) to detect and block malicious activities on the network; Proficiency in Virtual Private Network (VPN) technologies for securing remote access and site-to-site communications, including the configuration of VPN concentrators, authentication methods, and encryption protocols; Knowledge of device authentication mechanisms and protocols; Strong understanding of security controls and skills of their administration in one of the popular cloud providers (AWS, GCP, Azure); Knowledge of vulnerability management processes, including vulnerability scanning, prioritization, remediation, and tracking using tools like Nessus, Qualys, or OpenVAS; Expertise in securing containerized applications and orchestrators like Docker, Kubernetes, and Docker Swarm, including container image scanning, runtime security, and access control; Experience in defining, provisioning, and managing infrastructure resources using code, ensuring consistent and secure deployment environments; Familiarity with security governance frameworks, policies, standards, and regulatory requirements (e.g., GDPR, PCI DSS, ISO/IEC 27001); English: B2. What we offer: Benefits Cafeteria: Sports compensation; Medical coverage; Psychological support; Home-office coverage. Work-life: Remote work, Coworking compensation; Childcare budget; Maternity leave; Paternity leave; Additional 2 days for family events. Our GR8 Culture: Open feedback and transparent direct communications; Growth and development: better every day; High tolerance to experiment and mistakes; Supportive friendly environment.
Senior Software Engineer for Database Service
Samsung R&D Institute Poland, Kyiv, Kyiv city, ua
Senior Software Engineer for Managed Database Service About our Team On a daily basis, we build and maintain Samsung Private Cloud, the platform for internal company needs. We are looking for a Senior Software Engineer responsible for the development of cloud database and cloud data warehouse solutions. Role and Responsibilities Delivering microservices into a private cloud environment (Python, Go) Integration, maintenance, and monitoring of cloud database and cloud data warehouse solutions (MySQL, StarRocks, Redis) Implementing operational requests and providing support for the client Technologies in use Python, Go MySQL, StarRocks, Redis Amazon Web Services (AWS) Skills and Qualifications 5+ years experience in backend services development Proficient knowledge of Python or Go Experience in deployment, configuration, and maintenance of cloud-based distributed systems Practical knowledge of SQL Fluency in English Nice to have Experience in maintenance and monitoring of relational databases, data warehouses, or cache storage Experience in building and managing large, highly available enterprise-grade applications Working knowledge of public cloud systems (e.g. AWS, Azure, or GCP certification) Practical knowledge of software architecture, data modeling, and object-oriented programming concepts We offer Team: Friendly working atmosphere Wide range of trainings Opportunity to work on multiple projects Working with the latest technologies on the market Monthly integration budget Possibility to attend local and foreign conferences Start of work between 7 a.m. and 10 a.m. Equipment: PC workstation/Laptop + 2 external monitors Benefits: Private medical care (possibility to add family members for free) Multisport card Life insurance Lunch card Partial reimbursement of the cost of an English language course Possibility to learn Korean for free Variety of discounts (Samsung products, theaters, restaurants) Unlimited free access to Copernicus Science Center for you and your friends Possibility to test new Samsung products Location: Office in Warsaw Spire near metro station Attractive relocation package Hybrid work system (3 times per week from the office)
Senior Software Engineer - Data Clean Rooms
Snowflake, Kyiv, Kyiv city, ua
There is only one Data Cloud. Snowflake’s founders started from scratch and designed a data platform built for the cloud that is effective, affordable, and accessible to all data users. But it didn’t stop there. They engineered Snowflake to power the Data Cloud, where thousands of organizations unlock the value of their data with near-unlimited scale, concurrency, and performance. This is our vision: a world with endless insights to tackle the challenges and opportunities of today and reveal the possibilities of tomorrow. The Data Clean Rooms team builds services, systems, and product features to enable secure multi-party collaboration on sensitive data while preserving the privacy of the data. At Snowflake, we are aiming to provide customers with an integrated set of easy-to-use features allowing them to develop, deploy and monitor data clean rooms at scale. As a senior software engineer in Data Clean Rooms, you will Design and implement data privacy features and services to enable secure multi-party collaboration, including query constraints, data clean room construction, deployment and monitoring at scale. Work closely with PMs to drive projects from idea formulation to design and implementation. Our ideal senior software engineer will have: 8+ years of industry experience including designing, building, and supporting large-scale distributed systems in production. Strong fundamentals in computer science skills and system design. Experience with transactional and analytical database systems. Fluency in Java and familiarity with at least one of the following: Python, TypeScript. A strategic mindset and a strong sense of ownership. Excellent interpersonal communication skills. Great written and verbal English language skills. Bonus points for experience with the following: Experience in data privacy, privacy preserving analytics, ML. Industry experience in building large scale cloud applications. SALARY We believe all Snowflake employees have an impact in the long-term success of Snowflake, which is why new hire equity is designed to be a considerable part of your annual compensation. When the price of Snowflake stock rises, we are all rewarded. At Snowflake, equity is an important part of our total compensation package which is comprised of: Base salary  Bonus target or sales commission target Equity in the form of Restricted Stock Units (RSUs) The total target monthly compensation range for this job is 30,000 PLN – 60,000 PLN The final compensation offered will vary based on individual experience, skills, and job-based knowledge.  BENEFITS Snowflake is excited to offer a variety of benefits for our employees in Poland. For all details on benefits and perks you're eligible for as well as resources to help you understand your coverage, please review the following: Medical & Dental Insurance Mental Health Support Employee Capital Plan (PPK) Life Insurance Gym reimbursement / Multisport Phone reimbursement Modern Family Benefits Family Planning, Maternity/Paternity and Parenting Support with Maven Rethink: Parenting and family support for children with developmental disabilities or learning, social, or behavioral challenges. Adoption and surrogacy reimbursement Global Parental Leave And also: free snacks & coffee in the office Internal trainings, parties.
Data Quality Engineer
NeoGames, Kyiv, Kyiv city, ua
/Only for candidates based in Warsaw or Cracow who are comfortable working on Umowa o Pracę - Employment Contract (no possibility to work on B2B)/

Project description:
Aristocrat Interactive is a leader in the Online Real Money Gaming space, offering solutions spanning game content and aggregation, lotteries, online casino, sportsbook, game systems, and managed services delivered on industry-leading platforms. The Data & BI team owns the group's Data & Analytics platforms spanning Data Engineering, Analytical Engineering, and Business Intelligence to lead the group's data-driven modernization both internally and for its clients.

The Data Quality Engineer will play a vital role in developing and owning automated and manual testing of data pipelines, reports, and data products for customers and internal stakeholders. In this role, the chosen candidate will drive the implementation of Data Quality automation frameworks, tools, and processes to deliver scalable and robust data-focused test frameworks and scripts. The ideal candidate will have experience using modern data stacks including Apache Airflow, Data Build Tool (dbt), Snowflake, and Kafka, along with a deep understanding of Python.

Responsibilities:
Establish Data Quality automation frameworks, tools, and processes
Design and implement scalable and robust data-focused test frameworks and tools across all stages of the data lifecycle, from ingestion through transformation to consumption
Develop high-quality, automated data-centric tests using Airflow and dbt (an illustrative sketch follows this listing)
Create thorough test strategies, test plans, and test cases for all Data features and products
Participate in the data development lifecycle from research and technical design to implementation and maintenance
Collaborate with the DevOps team to integrate automation test frameworks and tools into CI/CD
Collaborate with the other development and product teams to define high-value test strategies and test cases on large datasets and transformations to determine data quality and data integrity
Rigorously test data infrastructure components to comply with privacy regulations including the EU GDPR, PCI, etc.
Participate in security audits and implement remedial actions
Learn to identify opportunities to compound growth and efficiency in testing and automation
Drive innovation by constantly learning about new automation technologies and methodologies, along with experimentation

Required Skills and Experience:
3+ years of work experience in test automation, preferably in a role related to data warehouses, databases, ETL/ELT and/or data visualisation
Experience in testing applications using frameworks like Cypress, Selenium, and Appium
Prior use of Apache Airflow, Data Build Tool (dbt), Snowflake, Kafka, Ansible, and other data-specific tools and applications
Proficient in advanced SQL and Python
Experienced with data warehouses like Snowflake, BigQuery, Redshift, or equivalent
Experienced with databases like Microsoft SQL Server, PostgreSQL
Experienced with data visualization tools like Power BI, Metabase, Qlik Sense, Tableau, Sisense, or equivalent
Understanding of software application testing best practices and philosophies, with an emphasis on data integrity and data quality
Familiarity with data streaming and/or event-driven data pipelines will be considered an asset
Clear English written and spoken business communication skills

What we offer:
High-level compensation on an employment contract and regular performance-based salary and career development reviews
Medical insurance (health), employee assistance program
Multisport Card
English classes with native speakers, trainings, conference participation
Referral program
Team buildings, corporate events

How many interview stages do we have?
HR interview
Technical interview with the manager

Send your CV to our email ASAP, because we can't wait to start working with you and create cool projects together! LET'S MOVE THE WORLD TOGETHER!
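As a hedged illustration of an automated, data-centric test scheduled with Airflow, a minimal DAG might look like the sketch below. The connection id, table, and rule are placeholders rather than the team's actual pipeline, and it assumes Airflow 2.x with the Snowflake provider installed; in dbt the same rule would typically be declared as a not_null test in a model's schema.yml.

```python
# Minimal sketch of a scheduled data-quality check (illustrative names only).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook

def assert_no_null_keys():
    # "snowflake_dwh", the schema, and the table are hypothetical placeholders.
    hook = SnowflakeHook(snowflake_conn_id="snowflake_dwh")
    bad_rows = hook.get_first(
        "SELECT COUNT(*) FROM analytics.orders WHERE order_id IS NULL"
    )[0]
    if bad_rows > 0:
        raise ValueError(f"{bad_rows} rows in analytics.orders have a NULL order_id")

with DAG(
    dag_id="dq_orders_checks",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="orders_not_null_key", python_callable=assert_no_null_keys)
```

Failing the task surfaces the data-quality breach in the same alerting and CI/CD channels the rest of the pipeline already uses.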
MySQL DBRE Database Reliability Engineer
Sporty Group, Kyiv, Kyiv city, ua
We consistently top the charts as one of, if not the, most used Sports Betting websites in the countries we operate in. With millions of weekly active users, we strive to be the best in the industry for our users.

Sporty Group is a consumer internet and technology business with an unrivalled sports media, gaming, social and fintech platform which serves millions of daily active users across the globe via technology and operations hubs across more than 10 countries and 3 continents. The recipe for our success is to discover intelligent and energetic people who are passionate about our products and serving our users, and to attract and retain them with a dynamic and flexible work life which empowers them to create value and rewards them generously based upon their contribution. We have already built a capable and proven team of 300+ high achievers from a diverse set of backgrounds, and we are looking for more talented individuals to drive further growth and to contribute their grit and innovation to the creativity and hard work that currently serves our users.

Our Stack
Database: MySQL, MongoDB
PaaS: AWS RDS, Redshift
Monitoring: Grafana, Prometheus, PMM
Infra management: Terraform, Jenkins
Programming: Python, shell script
Cloud services: AWS EC2, CloudWatch, etc.
Server operating system: CentOS
ETL pipeline tools: Airflow

Key Responsibilities
Monitor existing database infrastructure via automated alerts and dashboards
Slow query monitoring and proactive adjustment of database system capacity (an illustrative sketch follows this listing)
Automating deployments, config management, and building infrastructure with Terraform
Enhance our existing dashboards as well as develop new dashboards and alert mechanisms
Aid in reconfiguring existing architecture and database structure to allow for rapid deployment to new countries
On-call responsibilities on a rotating pattern

Requirements
3+ years of experience within a relevant domain
Advanced MySQL and MongoDB troubleshooting ability
Strong skills in Python and general programming
Experience working with Grafana/Prometheus
Hands-on experience with PaaS offerings such as AWS RDS, GCP Cloud SQL, Atlas
Experienced and keen in delivering quality documentation and operational runbooks
Open mind and willingness to take on new challenges in a rapidly growing organisation

Benefits
Quarterly and flash bonuses
We have core hours of 10am-3pm in a local timezone, but flexible hours outside of this
Top-of-the-line equipment
Referral bonuses
28 days paid annual leave
Annual company retreat
Highly talented, dependable co-workers in a global, multicultural organisation
Payment via DEEL, a world-class online wallet system
Our teams are small enough for you to be impactful
Our business is globally established and successful, offering stability and security to our Team Members

Our Mission
Our mission is to be an everyday entertainment platform for everyone.

Our Operating Principles
1. Create Value for Users
2. Act in the Long-Term Interests of Sporty
3. Focus on Product Improvements & Innovation
4. Be Responsible
5. Preserve Integrity & Honesty
6. Respect Confidentiality & Privacy
7. Ensure Stability, Security & Scalability
8. Work Hard with Passion & Pride

Interview Process
Online HackerRank Test (max time of 90 minutes)
Remote video screening with our Talent Acquisition Team
Remote video interview with 3 x Team Members (45 mins each, not on separate days)
24-72 hour feedback loops throughout the process

Post Interview Process
Feedback call on successful interview
Offer released followed by contract
ID check via Zinc & 2 references from previous employers

Working at Sporty
The top-down mentality at Sporty is high-performance based, meaning we trust you to do your job with an emphasis on support to help you achieve, grow and de-block any issues when they're in your way. Generally employees can choose their own hours, as long as they are collaborating and doing stand-ups etc. The emphasis is really on results. As we are a highly structured and established company, we are able to offer the security and support of a global business with the allure of a startup environment. Sporty is independently managed and financed, meaning we don't have arbitrary shareholder or VC targets to cater to. We literally build, spend and make decisions based on the ethos of building THE best platform of its kind. We are truly a tech company to the core and take excellent care of our Team Members.
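To illustrate the slow-query monitoring responsibility using the tools the posting lists (MySQL, Prometheus, Python), here is a minimal sketch that counts statements exceeding a latency threshold in performance_schema and exposes the count as a Prometheus gauge. The host, credentials, port, and threshold are placeholders, not Sporty's setup, and it assumes the events_statements_history_long consumer is enabled on the server.

```python
# Hedged sketch: export a count of slow MySQL statements for Prometheus to scrape.
import time

import pymysql
from prometheus_client import Gauge, start_http_server

SLOW_THRESHOLD_SECONDS = 2  # illustrative threshold
slow_queries = Gauge("mysql_slow_statements", "Statements slower than the threshold")

def count_slow_statements(conn):
    with conn.cursor() as cur:
        cur.execute(
            "SELECT COUNT(*) FROM performance_schema.events_statements_history_long "
            "WHERE TIMER_WAIT > %s * 1e12",  # TIMER_WAIT is recorded in picoseconds
            (SLOW_THRESHOLD_SECONDS,),
        )
        return cur.fetchone()[0]

if __name__ == "__main__":
    start_http_server(9105)  # scrape target for Prometheus
    # Placeholder connection details; a real deployment would use managed secrets.
    conn = pymysql.connect(host="db.example.internal", user="monitor", password="...")
    while True:
        slow_queries.set(count_slow_statements(conn))
        time.sleep(60)
```

A Grafana panel or alert rule on the resulting metric would then cover both the dashboarding and alerting duties the role describes.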