
Overview of salary statistics for the profession "Data Security" in Kyiv Oblast



Average monthly salary: 25,000 ₴

Average salary level over the last 12 months: "Data Security" in Kyiv Oblast

The histogram shows how the average salary for the profession Data Security in Kyiv Oblast has changed over time.

Distribution of "Data Security" vacancies across Kyiv Oblast

As the chart shows, the largest number of Data Security vacancies in Kyiv Oblast is open in Kyiv, with Brovary in second place and Irpin in third.



Recommended vacancies

Senior Graphic Designer (Data and Analytics) - International Organization for Migration (IOM), Mission in Ukraine
INTERNATIONAL ORGANIZATION FOR MIGRATION (IOM) - MISSION IN UKRAINE, Kyiv
Classification: General Service Staff, Grade G7
Type of Appointment: Special Short-Term, nine months with the possibility of extension
Estimated Start Date: As soon as possible
Closing Date: 16 June 2023

Context: Since 2014, the IOM Data and Analytics team has been collecting, analysing and reporting on humanitarian needs in Ukraine. The scale of displacement and humanitarian needs across Ukraine remains extremely high. As a result, there is high demand for timely and credible information on the location, intentions, needs and situation of vulnerable communities across the country. The IOM Data and Analytics team supports all organisations engaged in delivering services and support to those communities with data presented in a format that is both engaging and actionable. Under the overall supervision of the Chief of Mission and Programme Manager (IM & MEL), and the direct supervision of the Reporting Officer, the successful candidate will support the Data and Analytics (D&A) Unit's efforts to deliver graphic design work for all D&A products and needs, including those related to reports capturing the results of the Displacement Tracking Matrix (DTM) and other types of assessments.

Core Functions / Responsibilities:
- Design and develop layouts and templates for research studies, reports, infographics, dashboards, public information materials, social media images, etc. to support the strategic communication of the D&A Unit on the data collected in Ukraine to monitor the displacement and mobility of the population, as well as the priority needs and the main socio-economic conditions resulting from the ongoing war.
- Assist the D&A Reporting team in the production of analytical and reporting products and the D&A Unit staff in preparing presentations, including in Adobe InDesign, PowerPoint, Publisher, and Prezi, based on internal/external requests, as well as designing wireframes for internal and external websites and portals using Adobe XD.
- Ensure compliance of all materials with IOM, IOM Global Migration Data Analysis Centre (GMDAC), DTM, and donor brand guidelines.
- Act as a focal point for the production, editing and adaptation of visual text and images for all D&A creative content for online campaigns, print ads, websites, videos, and other visibility materials, using professional software such as Adobe InDesign, Photoshop and Illustrator, and other online applications such as Canva.
- Maintain and verify copyright-free or properly licensed content, cultural sensitivities, a human rights approach and a gender focus in all communications materials.
- Support the D&A IM Officer and reporting team in ensuring the implementation and adaptation, as needed, of information visualization tools that meet the needs of partners and other humanitarian actors.
- In coordination with Programme Support staff, facilitate coordination of the work with external vendors, printing companies, etc.
- Perform technical trainings to strengthen graphic design skills for IOM and counterparts' staff in basic concepts of graphic design, use and combination of colors, development of infographics, and graphic design software such as Adobe Photoshop and Illustrator. Additionally, support the organization of project-related meetings and workshops.
- Perform such other duties as may be assigned.

Required Qualifications and Experience

Education:
- Bachelor's Degree in Graphic Design, Communications or a related field from an accredited academic institution with at least five years of relevant professional experience; or
- High School Degree/Certificate with seven years of relevant professional experience.

Experience:
- In-depth knowledge of and expertise in a wide range of Adobe Creative Suite applications such as InDesign, Illustrator, Photoshop and Dreamweaver is required.
- Relevant and demonstrable academic and practical experience in the field of graphic design is required.
- Experience and skill in data visualization will be considered a significant advantage.
- Experience with additional visualization software such as Tableau and Power BI is an advantage.
- A portfolio demonstrating knowledge of graphic design layout and reporting products, including photographic requirements and procedures.
- Ability to identify and interpret graphic and web design needs and develop creative and responsive design concepts.
- Ability to develop complex, integrated design, printing, and/or reproduction specifications.
- Ability to compile and prepare graphic production budgets, schedules, and workplans.
- Knowledge of available external graphic design, printing, publication, and associated resources.
- Proactive team player who can work with web developers, web designers and production colleagues under pressure to tight deadlines.
- Ability to multi-task, work independently and think strategically.
- Good organizational skills with the ability to prioritize and an eye for detail.
- Knowledge of principles and practices of graphic design.
- Ability to create and produce graphic materials using a range of media, methods, techniques, and equipment.
- Ability to communicate effectively, both orally and in writing.
- Ability to supervise and train assigned staff.

Other: Any offer made to the candidate in relation to this vacancy notice is subject to funding confirmation. Appointment will be subject to certification that the candidate is medically fit for appointment and security clearances. A prerequisite for taking up the position is legal residency in the country of the duty station and a work permit, as applicable.

How to apply: Interested candidates are invited to submit their applications by filling in the IOM Personal History Form and sending it to [email protected] by 16 June 2023 at the latest, referring to this advertisement in the subject line of the message. Only shortlisted candidates will be contacted.
Data Analysis Specialist (DTM) - International Organization for Migration (IOM), Mission in Ukraine
INTERNATIONAL ORGANIZATION FOR MIGRATION (IOM) - MISSION IN UKRAINE, Kyiv
Classification: General Service Staff, Grade G7
Type of Appointment: Fixed-term, one year with the possibility of extension
Estimated Start Date: As soon as possible
Closing Date: 18 July 2023

Context: Almost eight years after the onset of the crisis in the East, following the Russian invasion in February 2022, Ukraine now experiences a full-scale war. The consequences of violence have now spread country-wide, and over 5 million people are estimated to have been internally displaced. Displacement is likely to become protracted as fighting continues, including the targeting of civilian infrastructure. New waves of displacement are expected, and continued deterioration of living conditions among the most vulnerable is likely. Under the overall supervision of the Chief of Mission and the direct supervision of the Project Officer (DTM), the successful candidate will be responsible and accountable for data analysis activities within the Data & Analytics Unit. These activities will focus on the development and implementation of workflows for processing, analysing, and monitoring the quality of data collected through Displacement Tracking Matrix field operations as well as through other data collection programmes in the Data & Analytics Unit.

Core Functions / Responsibilities:
- In coordination with the Project Officer (DTM), design, implement, and monitor data flows to ensure that the analysis of information collected by the Data & Analytics Unit is timely, effective and of the highest quality.
- Coordinate with the Project Officer (Reporting) on the timely preparation and dissemination of analytical reports in accordance with IOM procedures and donor requirements.
- Contribute to the development and upkeep of data quality control monitoring mechanisms.
- Design and create data visualizations and dashboards for both internal use and external dissemination.
- Participate in the development and adjustment of methodologies, tools and standard operating procedures within the Data and Analytics Unit.
- Respond to internal and external data requests by providing timely and relevant analysis input, ensuring that findings are disseminated clearly and understandably to all technical and non-technical stakeholders.
- Attend relevant conferences, workshops, working groups, interagency coordination meetings, and other forums.
- Track relevant national developments pertaining to displacement and priority needs monitoring across Ukraine.
- Contribute to the planning, development, organization and delivery of capacity building activities targeting IOM staff, government and civil society partners, implementing partners and communities.
- Support training activities and technical guidance to project staff on data processing and analysis.
- Promote and build capacity of staff and partners on IOM's Data Protection Principles.
- Keep the supervisor informed on the status of programme implementation, identify gaps and suggest actions to improve implementation.
- Support preparation of project proposals and a diverse range of communication products, concept notes and other planning documents.
- Plan, lead, and coordinate data analysis efforts, including monitoring the implementation of analytical activities to ensure work proceeds according to established plans, assessing implementation difficulties and providing recommendations for adjusting implementation modalities and work plans to best reflect the changing environment in the field.
- Undertake duty travel relating to project implementation, monitoring visits, project assessments, liaison with counterparts, etc.
- Perform such other duties as may be assigned.

Required Qualifications and Experience

Education:
- High School degree with seven years of relevant experience; or
- Bachelor's degree or equivalent in Sociology, Statistics, IT or Data Science, Political or other Social Sciences, International Relations, Development Studies, Migration Studies, Human Rights, Law or related fields from an accredited academic institution with five years of relevant professional experience.

Experience and Skills:
- Experience with implementation of quantitative analysis on complex datasets, including time series, is necessary (a minimal illustration follows this listing).
- Experience managing automated data pipelines is a distinct advantage.
- Experience working in humanitarian or development organizations is welcome.
- Experience in working with migrants, refugees, internally displaced persons, victims of trafficking and other vulnerable groups is a distinct advantage.
- Prior work experience with international humanitarian organizations, non-government or government institutions/organizations in a multi-cultural setting is an advantage.
- In-depth knowledge of, and ability to select and independently lead the implementation of, analytical methodologies is required.
- Reliable ability to use, and train other staff on the use of, at minimum one statistical software package (Python, SPSS, R or STATA) is required.
- Attention to detail and ability to navigate complex datasets and databases is required.
- Understanding of the data requirements in humanitarian and recovery programming and related methodological frameworks and tools (IASC, IRIS, DTM and others) is a distinct advantage.

Other: Any offer made to the candidate in relation to this vacancy notice is subject to funding confirmation. Appointment will be subject to certification that the candidate is medically fit for appointment and security clearances. A prerequisite for taking up the position is legal residency in the country of the duty station and a work permit, as applicable.

How to apply: Interested candidates are invited to submit their applications by filling in the IOM Personal History Form and sending it to [email protected] by 18 July 2023 at the latest, referring to this advertisement in the subject line of the message. Only shortlisted candidates will be contacted.
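As a flavour of the quantitative time-series work this listing describes, here is a minimal pandas sketch (Python being one of the statistical tools named above). The file and column names (dtm_rounds.csv, survey_date, idp_estimate) are hypothetical placeholders invented for illustration, not details from the vacancy:

```python
import pandas as pd

# Hypothetical monitoring dataset: one row per location per survey round.
df = pd.read_csv("dtm_rounds.csv", parse_dates=["survey_date"])

# Aggregate to monthly national totals, then smooth with a 3-month rolling mean
monthly = (
    df.set_index("survey_date")
      .groupby(pd.Grouper(freq="MS"))["idp_estimate"]
      .sum()
)
print(monthly.rolling(window=3).mean().tail())
```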
Project Assistant (Data Management, Housing/Shelter) - International Organization for Migration (IOM), Mission in Ukraine
INTERNATIONAL ORGANIZATION FOR MIGRATION (IOM) - MISSION IN UKRAINE, Kyiv
Open to Internal and External Candidates
Position Title: Project Assistant (Data Management, Housing/Shelter)
Duty Station: Kyiv, Ukraine
Classification: General Service Staff, Grade G4
Type of Appointment: Fixed Term, one year with the possibility of extension
Estimated Start Date: As soon as possible
Closing Date: 10 July 2023

Established in 1951, IOM is a Related Organization of the United Nations, and as the leading UN agency in the field of migration, works closely with governmental, intergovernmental and non-governmental partners. IOM is dedicated to promoting humane and orderly migration for the benefit of all. It does so by providing services and advice to governments and migrants. IOM is committed to a diverse and inclusive environment. Internal and external candidates are eligible to apply to this vacancy. For the purpose of the vacancy, internal candidates are considered first-tier candidates.

Context: The International Organization for Migration (IOM) is the only international inter-governmental agency with a specific mandate for migration and is dedicated to promoting humane and orderly migration for the benefit of all. It does so by providing technical expertise to governments, migrants, and host communities through a wide range of sustainable solutions contributing to support for populations affected by forced migration and improved living conditions of Internally Displaced Persons (IDPs). Under the overall supervision of the Chief of Mission and the Programme Coordinator (Shelter and Housing), the direct supervision of the Information Management and GIS Officer, and in close coordination with the Shelter and Housing Unit and other units' Information Management Teams, the successful candidate will provide support to the implementation of IOM Ukraine's efforts to increase community resilience and cohesion. The incumbent will be responsible for monitoring the unit's activities regarding data collection/management, cleaning data, and providing guidance to relevant colleagues in different hubs. S/he will support the unit and the supervisor regarding data management and reporting needs.

Core Functions / Responsibilities:
- Provide support to the Activity Reporting Tool (ART) through data cleaning and dataset preparation.
- Support the delivery of training on data collection and data management in ART and other integrated information management systems.
- Assist the Information Management (IM) and GIS Officer with preparing draft reports for IOM internal reporting through PDMS, donor reporting, cluster reporting, etc.
- Support activity mapping for the Shelter and Housing Unit to highlight coverage, gaps, and needs by overlaying vulnerability information.
- Contribute inputs/notes for sitreps, address data requests, keep relevant datasets updated, and coordinate with IM/Data Management counterparts at IOM as may be needed.
- Support strengthening existing monitoring and reporting mechanisms to improve data collection tools and analysis.
- Perform such other duties as may be assigned.

Required Qualifications and Experience

Education:
- High school diploma/certificate or equivalent with at least four years of relevant work experience; or
- Bachelor's degree or equivalent in Computer Science, Geographic Information Systems, Geography, Mathematics or a relevant area from an accredited academic institution with 2 years of relevant experience.
Experience:
- Experience in the management and coordination of information flows and data management, including collecting, storing, processing and analysing data to generate information products.
- In-depth knowledge of the latest technological developments in information technology and information systems.
- Experience with handling confidential data and personal data.
- Experience in carrying out user needs analysis and scoping for the development of databases.
- Previous experience in conflict/post-conflict areas is desirable.
- Proven skills in analysing statistical information.
- Ability to formulate IM-related technical requirements and Standard Operating Procedures.
- Ability to translate planning specifications into technical briefs for data capture and analysis, and vice versa.
- Ability to compile and holistically analyse diverse datasets.
- Team-building and information management skills.
- Demonstrated understanding of different data collection methodologies.
- Understanding of relational data theory.

Skills:
- Advanced data visualisation and information design skills.
- Advanced Power Query, Power Apps, and MS Excel skills.
- Experience using data visualisation and design tools such as Power BI and Adobe Illustrator/Photoshop.
- Kobo Toolbox, Survey123 or ODK design and implementation for data collection.
- Photoshop editing for the development of infographics.

Other: Any offer made to the candidate in relation to this vacancy notice is subject to funding confirmation. Appointment will be subject to certification that the candidate is medically fit for appointment and security clearances. A prerequisite for taking up the position is legal residency in the country of the duty station and a work permit, as applicable.

How to apply: Interested candidates are invited to submit their applications by filling in the IOM Personal History Form and sending it to [email protected] by 10 July 2023 at the latest, referring to this advertisement in the subject line of the message. Only shortlisted candidates will be contacted.
National Data Liaison Officer - International Organization for Migration (IOM), Mission in Ukraine
INTERNATIONAL ORGANIZATION FOR MIGRATION (IOM) - MISSION IN UKRAINE, Kyiv
Duty Station: Kyiv, Ukraine
Classification: National Officer, Grade NO-B
Type of Appointment: Fixed-term, one year with the possibility of extension
Estimated Start Date: As soon as possible
Closing Date: 10 August 2023

Core Functions / Responsibilities:
- Oversee the implementation and further programmatic development of activities falling under the "evidence-based governance" pillar of IOM Ukraine's Data and Analytics Unit.
- In close coordination with IOM colleagues overseeing partnership with the Government of Ukraine, liaise and coordinate with government entities (nationally, regionally and locally), implementing partners, United Nations agencies, civil society organisations, donors and other stakeholders on issues related to evidence-based policy-making and programming.
- Foster partnerships with key official statistics stakeholders in Ukraine, providing support and technical advice in relation to international standards and best practices on mobility, displacement, and demographic data.
- Contribute to the planning, development and delivery of capacity building activities for IOM staff, partner organizations, government officials and other actors, in order to strengthen data collection, data analysis and evidence-based policy-making and programming.
- Participate in relevant conferences, workshops, steering committees, working groups, and other forums.
- Contribute to information management of the Data and Analytics Unit's activities and outputs, including awareness raising and visibility, press releases, website updates and other relevant information-sharing materials.
- Contribute to the overall implementation of the Data and Analytics Unit's activities, including oversight of the financial, logistical, administrative and technical aspects, conducted in accordance with IOM's policies, practices and global standards, as well as relevant requirements, guidelines and grant agreements.
- Monitor the implementation of projects according to the work plan; document and evaluate results; identify the causes of deviations and bottlenecks; and recommend and implement corrective actions.
- Promote and contribute to the integration and mainstreaming of gender, protection, human rights and other pertinent cross-cutting issues into programme implementation.
- Identify potential areas for project development and contribute to the development of new projects by selecting and summarizing background information, assessing the local context, and drafting segments of project proposals.
- Participate in the development and adjustment of methodologies, contingency plans, approaches and standard operating procedures to respond to emerging challenges in Ukraine through a consultative process with other relevant parties.
- Coordinate the elaboration and dissemination of reports for donors, government and other relevant stakeholders, ensuring timely submission and compliance with donor and IOM requirements.
- Undertake duty travel as required in relation to project implementation and monitoring.
- Perform other related duties as assigned.

Required Qualifications and Experience

Education:
- Bachelor's degree in Political or Social Sciences, International Relations, Development Studies, Migration Studies, Human Rights, Law, Statistics, Information Technology, Computer/Data Science or related fields from an accredited academic institution with four years of relevant professional experience; or
- Master's degree in one of the above-mentioned fields with two years of relevant professional experience.

Experience:
- Experience in liaising with national, regional and/or local governmental authorities in Ukraine, national/international institutions, United Nations agencies and non-governmental organizations.
- Experience in coordinating high-level policy consultations among national and international stakeholders.
- Prior experience in developing and/or coordinating international and/or national policy on data and population statistics.
- Experience in working in development or humanitarian programmes will be considered advantageous.
- Experience in working with migrants, refugees, internally displaced persons, victims of trafficking and other vulnerable groups will be considered advantageous.
- Prior work experience with international humanitarian organizations, non-government or government institutions in a multi-cultural setting is an advantage.

Skills:
- In-depth knowledge of the global efforts aimed at improving national and international refugee, internally displaced persons and statelessness statistics through the development of international recommendations on how to produce, compile and disseminate statistics on these populations.
- Keen understanding of key issues in national and international population statistics and public governance.
- Statistical background and ability to analyze and interpret data collected through quantitative and qualitative methods.
- Knowledge of UN and bilateral donor programming.
- High level of written expression in both English and Ukrainian, including formal correspondence.

Languages: For all applicants, fluency in English and Ukrainian is required (oral and written).

Other: Any offer made to the candidate in relation to this vacancy notice is subject to funding confirmation. Appointment will be subject to certification that the candidate is medically fit for appointment and security clearances. A prerequisite for taking up the position is legal residency in the country of the duty station and a work permit, as applicable.

How to apply: Interested candidates are invited to submit their applications using the IOM Personal History Form and sending them to [email protected] by 10 August 2023 at the latest, referring to this advertisement in the subject line of the message. Only shortlisted candidates will be contacted.
Data Engineer
Beesafe, Kyiv, Kyiv city, ua
A company founded in 2020 to revolutionize the insurance market in Europe and create one of the most innovative insurance products in Central Eastern Europe. We felt proud that it was Poland that started the revolution! Since then, we have developed a lot. Now we have several dozen people on board; some work from our office in Warsaw, some remotely. We do our best to be as flexible as possible. We have an agile mindset so that we can transform our product quickly, and at the same time we are a stable place to work, as we belong to Vienna Insurance Group, the leading insurance corporate group in Central-Eastern Europe. Do you know Compensa, our major company in Poland?

Why is it worth working with us?
- You'll be using state-of-the-art tools and technologies in the Azure Cloud
- You'll own your work and decide on the project's direction
- You'll be improving data availability by acting as a liaison between analytics and IT teams
- You'll be joining a team of customer-focused Data Analysts and Data Engineers to drive one of the most exciting products on the Polish insurance market

What you need to start the adventure with us:
- Commercial experience in building Data or Analytics Platforms in the Cloud
- Practical knowledge of open-source Big Data tools (e.g. Kafka, Airflow, Presto and Spark)
- 3+ years of programming experience (we mostly use Python, but experience with Java/Scala and willingness to learn Python is fine too)
- Experience in database development and data model design
- Solid business and collaboration skills, responsive to service needs and operational demands
- Understanding of the principles of Agile and Scrum (we work in Scrum, by the way)

Nice to have:
- Data analysis and preparation skills, including experience with large data sets and unstructured data
- Practical knowledge of the Databricks platform and BI tools
- DevOps and Kubernetes skills
- Hands-on experience with microservices architecture
- Enthusiastic approach to coffee breaks (we love informal discussions over a cup of favourite coffee or tea)

What makes us so special?
- We have an office close to the city center (Al. Jerozolimskie 162)
- You can work from our office or discuss a fully remote way of working with us
- We offer a wide range of benefits (each month you can choose from massages, various vouchers, concert tickets, etc., or stay with the usual: private medical care & MultiSport card)
- A culture of learning and the benefit of 1 day a month fully dedicated to your self-growth
- Regular team meet-ups
- During spring, summer and autumn you can grab nice rewards for participating in sports challenges
- We organize integration events to get to know each other after work
Data Engineer
Travcorp Poland Sp z o.o., Kyiv, Kyiv city, ua
Travcorp Poland Sp. z o.o. We are a company that develops software for a global travel company. We work on projects that make vacationing easier and more pleasant. If you join us, you will get the opportunity to shape the experiences of users from all over the world. A varied pool of clients from all walks of life, presenting different online behaviours, guarantees that working on our projects gives you a chance to constantly learn and develop in your field. Apart from that, we are a team of people who love looking for new solutions and working closely with other teams. We value openness and transparent communication and always welcome creative ideas.

About the project: We are seeking a highly skilled and motivated Data Engineer to join our team. The ideal candidate will have a strong foundation in SQL, proficiency in Python, an understanding of financial and operational business metrics, and excellent communication skills in English. As a Data Engineer, you will play a critical role in transforming raw data into actionable insights, facilitating smooth data operations for our business intelligence and analytical needs.

Location: Katowice

Your responsibilities:
- Develop and maintain ETL data pipelines in DBT and Airflow to transform raw data from multiple sources into structured datasets aligned with business logic (see the sketch after this listing).
- Prepare source datasets for reporting in Power BI and other analytical tools, ensuring data accuracy and consistency.
- Manage the Snowflake data warehouse, including query profiling, user management and optimization.

Our requirements:
- Proficiency in SQL with the ability to write complex queries and optimize database performance.
- Strong experience in Python for data manipulation, scripting, and automation.
- Experience with AWS and IaC, preferably CloudFormation.
- Experience in developing and managing data pipelines, preferably using Airflow and DBT.
- Familiarity with the Snowflake data warehouse, including management, query optimization, and user access control.
- Excellent English communication skills.

Optional:
- Experience in developing pipelines for AI use cases such as processing data for chatbots, forecasting models, and predictive analytics.
- Knowledge of AWS services such as S3, EC2, Lambda, SNS, SQS, EKS.
- Good understanding of financial and operational metrics, processes and KPIs.
- Ability to translate business requirements into actionable data solutions.
- Familiarity with DBT (Data Build Tool) for data modeling and transformation.
- Experience in implementing data governance and security best practices.

Development opportunities we offer: conferences abroad and in Poland, a development budget, external training, industry-specific e-learning platforms, intracompany training, mentoring, soft skills training, space for experimenting, substantive support from technological leaders, support of IT events, technical knowledge exchange within the company, support for open source projects, and time for the development of your own ideas.

What we offer:
- Opportunities for constant development and work on exciting projects.
- Working in an international environment.
- English lessons.
- Attractive remuneration in Euro.
- Flexible working hours.
- Fully or partially paid training and development.
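A minimal sketch of the DBT-plus-Airflow pattern this listing describes: an Airflow DAG that runs dbt transformations on a schedule and then tests the resulting models. The DAG id, schedule and project path are hypothetical placeholders, not details from the posting:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_transform",       # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Run dbt models against the warehouse (e.g. Snowflake) defined in profiles.yml
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )
    # Validate the transformed datasets before they feed Power BI reports
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )
    dbt_run >> dbt_test
```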
Data Engineer
The Codest, Kyiv, Kyiv city, ua
Hello World! We are The Codest, an international tech software company with tech hubs in Poland delivering global IT solutions and projects. Our core values lie in a "Customers and People First" approach that prioritises the needs of our customers and a collaborative environment for our employees, enabling us to deliver exceptional products and services. Our expertise centers on web development, cloud engineering, DevOps and quality. After many years of developing our own product, Yieldbird, which was honored as a laureate of the prestigious Deloitte Top25 awards, we arrived at our mission: to help tech companies build impactful products and scale their IT teams by boosting IT delivery performance. Through our extensive experience with product development challenges, we have become experts in building digital products and scaling IT teams. But our journey does not end here; we want to continue our growth. If you're goal-driven and looking for new opportunities, join our team! What awaits you is an enriching and collaborative environment that fosters your growth at every step.

We are currently looking for a: Senior Data Engineer

Project description: In this role, you will have the opportunity to specialize in leveraging artificial intelligence to optimize clinical and commercial workflows within the pharmaceutical, biotechnology, and life sciences industries. You'll be at the forefront of utilizing AI-driven insights for trial design, indication selection, competitive intelligence, and market landscaping. Your focus in this project will be on simplifying complex data into actionable intelligence, ultimately enhancing efficiency and effectiveness across these vital sectors.

Your responsibilities:
- Design and build robust, scalable data pipelines to support AI models
- Ensure smooth data flow from various sources for analysis and deployment
- Utilize state-of-the-art techniques for processing diverse data types
- Oversee database management and optimization
- Focus on optimizing data storage, retrieval, and security
- Implement optimal data structuring techniques for vector retrieval and LLM behavior

Key requirements:
- Extensive experience with the Python programming language and with data engineering projects (5+ years of commercial experience)
- Experience with data orchestration tools such as Airflow and Prefect
- Good familiarity with PostgreSQL
- Proficiency in web scraping techniques
- Strong skills in data manipulation with the Pandas library
- Understanding of parallelization techniques for efficient processing
- Knowledge of anti-botting measures for secure operations
- Proficiency in data orchestration tools for managing workflows effectively
- Strong skills in designing efficient database schemas
- Understanding of how to ensure data integrity and reliability for robust systems
- Advanced English, spoken and written (you will be working with international clients only)

Nice to have:
- Experience with AI, LLMs, and embeddings
- Familiarity with developing multimodal models
- Knowledge of database optimization techniques
- Full-stack experience with NextJS/React
- Experience with Agile methodologies, particularly the Scrum framework

Our promise (what you can expect from us):
- 24-28k on a B2B contract
- 100% remote work (but we have offices in Krakow and Warsaw and we're happy to meet there from time to time)
- 1800 PLN per year for your self-development
- 1800 PLN per year for all of your home office needs
- Our B2B contract contains provisions that allow you to obtain IP BOX support
- Integration events, education opportunities and much more…
- A unique opportunity to take your career to the next level; we're looking for people who want to create an impact. You have ideas, we want to hear them!

Recruitment process:
- 30-minute online screening call with our recruiter
- 30-45 minute technical interview with our developer
- 1h client call
- Offer

Questions, insights? Feel free to reach out to our recruiting team: [email protected] In the meantime, feel free to visit our website, where you can find key facts about us.
Data Engineer
Upwork, Kyiv, Kyiv city, ua
Job Description: We are looking for a Data Engineer with database, data migration, ETL and Salesforce experience (a minimal migration sketch follows this listing).

Skillset:
- Database
- Data Migration
- ETL
- Salesforce
- AWS (Athena, Spark, EMR, S3, roles and policies)

Responsibilities and requirements:
- Evaluate business needs and objectives
- Select, prepare, extract, and transform data from multiple sources
- Analyze information and perform data extraction and transformation to improve data quality
- Analyze raw data; develop and maintain datasets
- Build data systems and pipelines
- Conduct complex data analysis and report on results
- Build algorithms and prototypes
- Combine raw information from different sources
- Explore ways to enhance data quality and reliability
- Identify opportunities for data acquisition
- Develop analytical tools and programs
- Collaborate with data, integration and technical architects
- Previous experience as a data engineer or in a similar role
- Technical expertise with data models, data mining, and segmentation techniques
- Knowledge of programming languages (e.g. Java and Python)
- Hands-on experience with SQL, PL/SQL and database design
- Great numerical and analytical skills
- Prior experience with data migration to Salesforce
- Good communication and negotiation skills to influence engineering teams
- Experience automating migrations to Salesforce is desired
- Prior Salesforce migration experience: understanding of the Salesforce object model, SF migration challenges, etc.
- Advanced SQL programming and data analysis
- Prior experience leading large migration efforts by clearly defining data cleanup strategies and migration paths
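A minimal sketch of one ETL step in the kind of Salesforce migration this listing describes: extract from a legacy SQL database, map fields, and bulk-insert into Salesforce. It assumes the simple-salesforce library; the database, table, field names and credentials are hypothetical placeholders:

```python
import sqlite3

from simple_salesforce import Salesforce

# Extract: read contacts from a hypothetical legacy database
conn = sqlite3.connect("legacy_crm.db")
rows = conn.execute("SELECT first_name, last_name, email FROM contacts").fetchall()

# Transform: map legacy columns onto Salesforce Contact fields,
# with basic cleanup (skip rows without an email)
records = [
    {"FirstName": first, "LastName": last, "Email": email}
    for first, last, email in rows
    if email
]

# Load: bulk-insert into Salesforce (credentials are placeholders)
sf = Salesforce(username="user@example.com", password="...", security_token="...")
results = sf.bulk.Contact.insert(records, batch_size=10000)
print(sum(r["success"] for r in results), "records inserted")
```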
Azure Data Engineer
Devapo Sp. z o. o., Kyiv, Kyiv city, ua
Our company is seeking a skilled Azure Data Engineer to join our team and be part of a project for a large financial institution. As an Azure Data Engineer, you will develop cutting-edge data platforms using Azure technology. You will have the opportunity to lead important projects, participate in decision-making processes and collaborate with a diverse team of experts. Your expertise will drive breakthrough solutions for financial institutions and beyond. As part of our team, you will also receive a multitude of additional benefits to enjoy outside of work.

What we expect:
- 3+ years of experience in data engineering
- Python programming knowledge
- Deep knowledge of and strong experience in key technologies (Azure Synapse, Databricks, Azure Data Factory, PySpark)
- Strong, proven knowledge of working with data lakes, lakehouses, and data warehouses in the cloud (Databricks, Snowflake, Azure)
- Advanced proficiency in SQL
- Strong experience with ETL/ELT processes, data ingestion, data transformation and data modeling
- Experience with code repositories (Git)

Nice to have:
- dbt knowledge and experience
- Knowledge of Kubernetes

Responsibilities:
- Developing data platforms, data lakehouses and data warehouses
- End-to-end development of ETL/ELT processes
- Building and designing data platform components to enable clients to produce and consume data
- Developing, implementing, and maintaining change control and testing processes
- Researching and implementing best practices and new approaches for our current data stack and systems
- Data modeling

What we offer:
- Salary: 90-120 PLN + VAT (B2B contract)
- Co-financing of trainings and certificates, with assured time for learning within working hours
- Private medical care and a Multisport card
- Language classes (English)
- Flexible working hours
- Meetings and integration events
- A referral bonus for recommending a new hire
- An individually tailored path for your career development
- The ability to work in hybrid form from our Warsaw office
Data Engineer
Experis Manpower Group, Kyiv, Kyiv city, ua
We are seeking someone who can assist in establishing an infrastructure foundation on the Google Cloud Platform (GCP) and migrating the current legacy data lake to a manageable, scalable, and secure data lake in the cloud. This data lake should provide the data analytics capabilities needed to meet our growing needs. We require assistance in assessing, designing, planning, and migrating the existing on-premise or cloud data lake to Google BigQuery.

Work model: 100% remote
Methodology: Agile
Project duration: 06.2024 - 11.2024, extension possible

Main responsibilities: Assess the current system, including flows, data pipelines, schemas and reports, focusing on the following areas:
- Data governance leading practices, definitions, guidelines, processes, and recommendations for products available in GCP: Data Catalog, data lineage, data quality, data masking, data classification
- Datasets and analytics infrastructure
- Database and application technology
- Extract, transform and load (ETL) or extract, load, and transform (ELT) workloads
- Job orchestration and scheduling needs
- Tooling supply plan (including supporting end of life)
- Plans for continuous integration between applications
- Business units and other teams using data solutions
- As-is technical state for the areas analysed above
- Desired future state for data transformation, data lake, or data analytics

Main requirements:
- Bachelor's degree in business, computer science, or a related field
- Minimum 3 years of experience in the technologies below:
  - BigQuery
  - Dataflow
  - SQL
  - Python
- Experience in migrating data warehouses to BigQuery (a minimal load-job sketch follows this listing)
- English: C1

Our offer:
- B2B via Experis
- MultiSport Plus
- PZU group insurance
- Medicover
- E-learning platform
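To give a flavour of the BigQuery migration work described above, here is a minimal sketch using the google-cloud-bigquery Python client to batch-load Parquet files from Cloud Storage into a BigQuery table. The project, bucket, dataset and table names are invented for illustration:

```python
# Requires `pip install google-cloud-bigquery` and GCP credentials;
# all resource names below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,  # replace table contents
)

load_job = client.load_table_from_uri(
    "gs://legacy-lake-export/orders/*.parquet",   # hypothetical export of the legacy lake
    "my-analytics-project.curated.orders",        # destination table
    job_config=job_config,
)
load_job.result()  # wait for completion; raises on error

table = client.get_table("my-analytics-project.curated.orders")
print(f"Loaded {table.num_rows} rows")
```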
Data Engineer
Stonex Poland, Kyiv, Kyiv city, ua
Connecting clients to markets, and talent to opportunity. With 4,300 employees and over 400,000 retail and institutional clients from more than 80 offices spread across five continents, we're a Fortune 100, Nasdaq-listed provider, connecting clients to the global markets, focusing on innovation, human connection, and providing world-class products and services to all types of investors. Whether you want to forge a career connecting our retail clients to potential trading opportunities, or ingrain yourself in the world of institutional investing, the StoneX Group is made up of four segments that offer endless potential for progression and growth.

Business segment overview: With boots-on-the-ground authenticity at the heart of everything we do, our comprehensive array of commercial products and services enables you to work directly with our clients, across hedging, risk management, execution and clearing, OTC products, commodity finance and more.

Position purpose: The Data Engineer is responsible for empowering the Data team to achieve its primary objectives: ingesting, transforming and exposing real-time, event-driven data streams pertaining to the firm's data assets. The ideal candidate will exhibit passion for continuous improvement and a dedicated focus on enabling consumers to achieve their goals by making data-driven decisions.

Responsibilities (primary duties will include):
- Prioritizes and executes rapid collection of raw data from source systems, implements efficient storage, and employs fast and reliable access patterns.
- Understands system protocols, how systems operate and how data flows; stays aware of current and emerging technology tools and their benefits.
- Expected to independently develop a full software stack; understands the building blocks, interactions, dependencies, and tools required to complete software and automation work. Independent study of evolving technology is expected.
- Drives engineering projects by developing software solutions, conducting tests and inspections, and building reports and calculations.
- Maintains a strong focus on innovation and enablement; contributes to designs that implement new ideas to improve existing and new systems, processes and services.
- Understands and can apply new industry perspectives to our existing business and data models; reviews existing designs and processes to highlight more efficient ways to complete existing workloads.
- Maintains knowledge of existing technology documents; writes basic documentation on how technology works using collaboration tools like Confluence; creates clear documentation for new code and systems; documents system designs, presentations, and business requirements for consumption and consideration at the manager level.
- Collaborates with technical teams and utilizes system expertise to deliver technical solutions.
- Continuously learns and teaches others existing and new technologies; contributes to the development of others through mentoring or in-house workshops and learning sessions.
- Drives team practices and procedures to achieve repeatable success and a defined expectation of services.
- Plays a significant collaborative role in long-term department planning, with a focus on initiatives achieving data empowerment, operational efficiency and sustainability.
- Monitors and evaluates the overall strategic data infrastructure; tracks system efficiency and reliability; identifies and recommends efficiency improvements and mitigates operational vulnerabilities.
Qualifications. To land this role you will need:
- Proficiency in programming in Python and SQL; willingness to learn and adopt new languages as necessary.
- Proficiency in ETL process design and implementation in a cloud-based environment, preferably using Databricks.
- Problem-solving skills: able to work through a problem, analyze the root cause and propose a solution.
- Familiarity with API-based data distribution.
- Understanding of enterprise architecture patterns, object-oriented and service-oriented principles, design patterns, and industry best practices.
- Foundational knowledge of data structures, algorithms, and designing for performance.
- Experience with database technology like MSSQL and caching services like Redis.
- Exposure to containers, microservices, distributed systems architecture, orchestrators and cloud computing.
- Excellent communication skills and the ability to work with subject matter experts to extract critical business concepts.
- Ability to work, and potentially lead, in an Agile methodology environment.

What makes you stand out:
- 3-5 years of experience developing software in a professional environment (preferably financial services, but not required).
- 3 years of hands-on data-driven enterprise application development, preferably in the financial industry.
- Comfortable with core programming concepts and techniques (e.g., concurrency, memory management).

Education / certification requirements: Bachelor's degree or relevant work experience in Computer Science, Mathematics, Electrical Engineering or a related technical discipline.
Data Engineer
Webellian Sp. z o.o., Kyiv, Kyiv city, ua
About Webellian: Webellian is a well-established Digital Transformation and IT consulting company committed to creating a positive impact for our clients. We strive to make a meaningful difference in diverse sectors such as insurance, banking, healthcare, retail, and manufacturing. Our passion for cutting-edge and disruptive technologies, as well as our shared values and strong principles, are what motivate us. We are a community of engineers and senior advisors who work with our clients across industries, playing a deep and meaningful role in accelerating and realizing their vision and strategy.

About the position: We are looking for a Regular Data Engineer to work on a project for one of our key customers in the insurance industry. You will work in hybrid mode with your teammates based in Poland and other stakeholders located worldwide, and you will be in direct contact with business users of the solution.

Goals and challenges:
- Leverage a global data platform and enrich it with additional data capabilities
- Design and implement solutions for complete use-case data pipelines: from data ingestion and storage, through data processing and implementation of business rules, to data consumption (e.g. reporting)
- Define and apply best practices for development and maintenance of the platform
- Keep up with trends and evolving technology in the big data and analytics world
- Look for opportunities to improve performance, reliability and automation

Hard skills we are looking for:
- Experience improving and refining data ingestion and transformation pipelines (ETL/ELT)
- Proficiency in Python with a focus on PySpark
- Experience with Azure Synapse Analytics would be a plus
- Knowledge of Azure or other cloud technology, especially data solutions
- Continuous integration, deployment, and delivery practitioner
- General understanding of infrastructure, orchestration and IT security principles (especially at an enterprise level)

Soft skills:
- Experience in data engineering
- Bachelor (BSc), Master (MSc) or equivalent experience in a technical field (for example, Computer Science or Engineering)
- Fluent English (written and spoken) is a must; other languages (e.g. German, French, Italian) are a plus
- DevOps mindset (you build it, you run it)
- Experience in the insurance domain is a strong plus
- Capability to understand complex requirements and break them down into actionable implementation tasks with attention to business logic
- Capability for result-oriented communication with people from different departments with different skill sets
- Writing clear technical specifications
- Excellent verbal communication skills
- Leadership, autonomy and drive to grow and learn new technologies and tools

What we offer:
- Contract under Polish law: B2B or Umowa o Pracę
- Benefits such as private medical care, group insurance, and a Multisport card
- English classes
- Hybrid work (at least 2 days/week on-site) in Warsaw (Mokotów)
- The opportunity to work with excellent professionals
- High standards of work and a focus on code quality
- New technologies in use
- Continuous learning and growth
- An international team
- Pinball, PlayStation & much more (on-site)

Join a growing team of dedicated professionals! We love to pass on knowledge to grow excellence, speak our minds without playing politics, and just enjoy hanging around together. If you share our passions, we want to meet you! So go ahead and apply. Please include the following statement: "I hereby authorize Webellian Poland Sp. z o.o. to process and store my personal data included in my job application for the needs of the current and future recruitment processes (in accordance with the Personal Data Protection Act of 29.08.1997, no. 133, item 883)."
Data Center Engineer
Experis, Kyiv, Kyiv city, ua
As a Data Center Network Engineer you will join a team of network professionals, where you will be responsible for maintaining stable, reliable, and secure data center network services, and for their development and improvement.

Requirements (not everything is required):
- Experience with Cisco Nexus 3K/5K/9K
- Knowledge of dynamic routing protocols: OSPF, BGP
- Understanding of NGFW firewalls (Palo Alto, Fortigate, Check Point)
- Understanding of IP networking and L2/L3 network protocols: TCP/IP, VLAN, VRRP, LACP, MC-LAG, EVPN with VXLAN, DHCP, DNS
- Good understanding of Application Delivery Controllers (F5)
- Excellent analytical skills for troubleshooting complex technical problems

Not required, but it's an advantage if you have experience with:
- Network automation for infrastructure deployment
- Python programming
- VMware NSX-V/NSX-T technologies
- Public cloud network technologies

Your personality:
- Strong organizational skills
- Team player
- Great time management skills
- Strong goal-oriented mindset and focus on high quality
- Strong sense of ownership of the network
- Proactive problem solver
- Fluent in English, written and spoken

Offer:
- 100% remote work
- B2B via Experis
- MultiSport Plus
- Group insurance
- Medicover
- E-learning platform
Data Engineer
PwC Polska, Kyiv, Kyiv city, ua
The area of data management at PwC Poland is critical to all digital projects, which is why the Data Solutions department is made up of a group of super-specialists who fully understand our clients and their business needs. We specialize in elevating clients' data capabilities, both on-premise and in the cloud. From building advanced data platforms and engineering services to offering comprehensive data analytics, BI, and integration, we ensure seamless data migration, effective visualization, and insightful reporting. Our expertise in data modeling and integration empowers your business with actionable insights, driving informed decisions and strategic growth.

We are looking for: Data Engineer

Your future role:
- Participate in Data Platform & Engineering delivery for clients from a variety of industries.
- Work with teammates to come up with the most efficient technological solutions to address clients' needs. In particular: deliver advanced scalable data processing architectures; design and implement custom data processing tools (Python, PySpark etc.; see the sketch after this listing); ensure efficient data movement including data cleansing, transformation and continuous monitoring; enable real-time analytics and stream processing.
- Write unit tests for delivered scripts, notebooks etc.
- Share your knowledge with less experienced teammates.
- Adhere to the company's Delivery Excellence framework; contribute to Data Solutions Practice ways of working and internal delivery guidelines.

Apply if you have:
- Professional experience in a Data Engineer role.
- At least 2 years of hands-on experience in Databricks/PySpark/Python.
- Good knowledge of SQL.
- Hands-on experience with a data-related tech stack from at least one of the three major cloud providers.
- Good knowledge of at least one ETL tool (e.g. Azure Data Factory, Informatica, Cloud Data Fusion, AWS Glue, Apache NiFi etc.).
- Experience with BI tools and reporting models (Power BI, Tableau, Alteryx, Looker etc.).
- Practical knowledge of unit testing.
- A team-player attitude (experience working in an international environment is a plus).
- Strong analytical skills.
- Very good communication skills (both oral and written).
- Good command of English (C1 level).
- Minimum communicative knowledge of Polish.

Nice to have:
- Hands-on experience with Delta Lake
- Hands-on experience with the Azure cloud
- Hands-on experience with Gen AI tools supporting data engineering activities
- Hands-on experience with Snowflake
- Hands-on experience with software development principles and standards (e.g. PEP, DRY, SOLID, design patterns)
- Experience with CI/CD tools
- Certification in Azure / AWS / Google Cloud (data engineering paths)
- Certification in Databricks

By joining us you gain:
- Work flexibility: hybrid working model, flexible start of the day, workation, sabbatical leave.
- Development and upskilling: our full support during the onboarding process, mentoring from experienced colleagues, training sessions, workshops, certification co-financed by PwC and conversations with a native speaker.
- A wide medical and wellbeing program: medical care package (incl. dental care, freedom of treatment, physiotherapy), coaching, mindfulness, psychological support, education through dedicated webinars and workshops, financial and legal counseling.
- The possibility to create your individual benefits package (among others: lunch pass, insurance packages, concierge, veterinary package for a pet, massages) and access to a cafeteria with vouchers and discounts on IT equipment and car purchase.
- 3 paid hours for volunteering per month.
- An additional paid Birthday Day off.
- And when you start enjoying PwC as much as we do, you may recommend your friend to work with us.

Recruitment process:
- Complete the form and send your resume.
- Talk to our Recruiter on a short HR screening call and complete the technical test.
- Get to know each other better at a recruitment interview!
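A minimal sketch of the kind of custom PySpark processing tool mentioned above, covering a basic cleansing-and-transformation step before data is consumed by BI tools. The input/output paths and column names (order_id, amount, order_date) are hypothetical placeholders:

```python
from pyspark.sql import SparkSession, functions as F

# Minimal cleansing/transformation job; paths and columns are invented for illustration.
spark = SparkSession.builder.appName("orders_cleansing").getOrCreate()

raw = spark.read.option("header", True).csv("/data/raw/orders/")

clean = (
    raw.dropDuplicates(["order_id"])                      # de-duplicate on the business key
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("order_date").isNotNull())           # drop rows with no event date
)

# Write a curated, partitioned dataset for downstream reporting
clean.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/orders/")
```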
Data Engineer
Commerzbank, Kyiv, Kyiv city, ua
Join our team as a Data Engineer!

About the project: In your role as a Data Engineer you will be working in the Data as a Service (DaaS) cell, which builds products and services designed to provide barrier-free, easy-to-use information and test data for the whole bank. This includes data anonymization services, user interfaces and a consolidated data layer. On a daily basis you will develop products based on database technology which run in the cloud (GCP) and on-prem. Your responsibilities will encompass the full software development lifecycle, including analysis, architecture, testing and also support for production issues during normal working hours. The Big Data cluster is the enabler for data scientists and provides a huge collection of data and a data science workbench in one place.

- BI technology within the lake infrastructure
- Establish a stable, state-of-the-art technology base with on-prem and cloud solutions
- Set up the data lake as a single data and analytics hub and effectively ingest the most important data sources
- Establish data quality and metadata management
- Provide data marts and sandboxes for segments and functions with the most important combinations of data sources

What you will be doing:
- Build and enhance automated data pipelines that transform data into usable information
- Software development, defect and requirements analysis, testing, and bug fixing of applications
- Take care of proper, up-to-date system documentation
- 3rd-level support for our daily processes; site reliability engineering
- Drive technical discussions and implement new features to enable new business cases
- Participate in regular Scrum ceremonies (Daily, Planning, Review, Retro)

Which technologies and skills are important to us?
- Strong knowledge of SQL
- Basic knowledge of data warehouse, relational and non-relational database solutions
- Basic knowledge of UNIX/Linux (Red Hat) RH 7.x
- Google BigQuery knowledge
- Good knowledge of Hadoop Cloudera Data Platform (CDP)
- Basic knowledge of Git
- Basic knowledge of cloud solutions (GCP)

How? Hybrid at Wersalska 6 street (work from the office in Łódź twice a week).

Below you can find more information about Commerzbank. Commerzbank is a leading international commercial bank with branches and offices in almost 50 countries. The world is changing, becoming digital, and so are we. We are leaving the traditional bank behind us and are choosing to move forward as a digital enterprise. This is exactly why we need talented people who will join us on this journey. We work in inter-locational and international teams using agile methodologies.

What we offer:
- Development plans for employees
- Life insurance, flexible working hours, integration events and much more

Important! Please add the clause to your CV. You can find it at the end of the advert.

* * *

Please add the following clause to your application: 1. I consent to the processing of personal data contained in this document by Commerzbank Aktiengesellschaft with its registered office at Kaiserstrasse 16, 60311 Frankfurt am Main, Germany, operating through the Branch in Poland with its registered office in Łódź, 91-203 Łódź, ul.
Wersalska 6, KRS 0000631053, for the implementation of the current recruitment process and for future recruitment for a period of 6 months, in accordance with Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (GDPR/RODO), and in accordance with the Act of 10 May 2018 on the protection of personal data (Journal of Laws of 2018, item 1000). I provided my personal data voluntarily and I declare that it is truthful. I have the right to withdraw this consent at any time. The withdrawal of consent shall not affect the lawfulness of processing based on consent before its withdrawal. 2. I have read the content of the information clause, including information about the purpose and methods of processing personal data, the right to access my personal data, and the right to correct, rectify, and delete it.
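For illustration, a pipeline step like the ones this role describes might look roughly as follows. This is a minimal sketch using the google-cloud-bigquery client; the project, dataset, and table names are hypothetical placeholders, not anything from the posting.

# Minimal sketch of one automated pipeline step: transform raw events into
# a usable daily summary table in BigQuery. Dataset and table names are
# hypothetical placeholders.
from google.cloud import bigquery

def build_daily_summary() -> None:
    client = bigquery.Client()  # picks up the ambient GCP project/credentials

    # CREATE OR REPLACE keeps the step idempotent, so scheduled reruns are safe.
    sql = """
    CREATE OR REPLACE TABLE analytics.daily_summary AS
    SELECT
      DATE(event_ts)          AS event_date,
      COUNT(*)                AS events,
      COUNT(DISTINCT user_id) AS users
    FROM raw.events
    GROUP BY event_date
    """
    client.query(sql).result()  # .result() blocks until the job finishes

if __name__ == "__main__":
    build_daily_summary()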
Data Engineer
Asana, Kyiv, Kyiv city, ua
The Enterprise Data & Insights (EDI) team at Asana is tasked with building powerful decision-making data products, integrations, process automation, tools, and analytical reports. We are looking for a driven Data Engineer to add to our growing team who will be foundational to the company's operations by supporting the business and finance teams. You will accelerate the business by connecting systems and data seamlessly. The right candidate has a unique mix of technical and strategic skills and a deep understanding of our business and what is important, but can also proactively develop and maintain tools, automation, and analytics to help us achieve our ambitious goals.

This role is based in our Warsaw office with an office-centric hybrid schedule. The standard in-office days are Monday, Tuesday, and Thursday. Most Asanas have the option to work from home on Wednesdays. Working from home on Fridays depends on the type of work you do and the teams with which you partner. If you're interviewing for this role, your recruiter will share more about the in-office requirements.

What you'll achieve
- Design, build, and deliver scalable data pipelines based on modern cloud-based architectures, and build out new API integrations
- Build analytical solutions that unlock actionable insights through data analysis, investigation, and visualization
- Collaborate with business teams and analysts to build solutions for complex systems and data platforms that support the growth of our revenue
- Ensure your pipelines and code adhere to engineering best practices by incorporating data quality, accuracy, and security principles
- Establish trust and strong relationships across the technical and business teams
- Work cross-functionally with different teams and on different projects as required, and with other Enterprise Technology teammates to support shared systems, data, and infrastructure
- Act as an owner and ultimate escalation point for the data solutions and integrations you build
- Create accurate and clear technical documentation; develop support processes and procedures; and support hand-off to peers/organizational units

About you
- Minimum 6 years of experience designing and building integration and data solutions for businesses handling large volumes of data and building scalable systems
- Experience with enterprise iPaaS platforms: SnapLogic, Dell Boomi, MuleSoft, etc. (we use SnapLogic!)
- Data visualization skills using business intelligence solutions, including at least one of Looker, Tableau, Periscope, Pentaho, or MicroStrategy
- Experience designing and building analytical data models optimized for performance, scalability, and analytical consumption
- Experience troubleshooting, optimizing, and performance-tuning SQL scripts for efficient compute and storage in a Snowflake data warehouse
- Strong SQL skills, with the ability to write complex SQL, do cohort analysis, comparative analysis, and ELT transformations (a sketch of a simple cohort query follows this listing)
- Fluency in at least one modern language useful for data processing (e.g. Python, Scala)
- Hands-on experience building solutions with AWS S3, Redshift, Snowflake (Snowflake certification desired but not required)
- Experience building integration solutions with enterprise cloud applications such as, but not limited to, Salesforce, Marketo, NetSuite, and Zendesk
- Experience supporting business teams and a good understanding of their processes and systems, such as CPQ, marketing automation, and lead-to-opportunity
At Asana, we're committed to building teams that include a variety of backgrounds, perspectives, and skills, as this is critical to helping us achieve our mission. If you're interested in this role and don't meet every listed requirement, we still encourage you to apply.

What we'll offer
Our comprehensive compensation package plays a big part in how we recognize you for the impact you have on our path to achieving our mission. We believe that compensation should be reflective of the value you create relative to the market value of your role. To ensure pay is fair and not impacted by biases, we're committed to looking at market value, which is why we check ourselves and conduct a yearly pay equity audit.

For this role, the estimated base salary range is between 224,000 PLN and 358,000 PLN (gross yearly). The actual base salary will vary based on various factors, including market and individual qualifications objectively assessed during the interview process. The listed range above is a guideline, and the base salary range for this role may be modified. In addition to base salary, your compensation package may include additional components such as equity, sales incentive pay (for most sales roles), and benefits.

We strive to provide equitable and competitive benefits packages that support our employees worldwide and include:
- Mental health, wellness & fitness benefits
- Career coaching & support
- Inclusive family building benefits
- Long-term savings or retirement plans
- In-office culinary options to cater to your dietary preferences
These are just some of the benefits we offer, and benefits may vary based on role, country, and local regulations. If you're interviewing for this role, speak with your Talent Acquisition Partner to learn more about the total compensation and benefits for this role.

About us
Asana helps teams orchestrate their work, from small projects to strategic initiatives. Millions of teams around the world rely on Asana to achieve their most important goals, faster. Asana has been named a Top 10 Best Workplace for 5 years in a row, is Fortune's #1 Best Workplace in the Bay Area, and one of Glassdoor's and Inc.'s Best Places to Work. After spending more than a year physically distanced, Team Asana is safely and mindfully returning to in-person collaboration, incorporating flexibility that adds hybrid elements to our office-centric culture. With 11+ offices all over the world, we are always looking for individuals who care about building technology that drives positive change in the world and a culture where everyone feels that they belong.

We believe in supporting people to do their best work and thrive, and building a diverse, equitable, and inclusive company is core to our mission. Our goal is to ensure that Asana upholds an inclusive environment where all people feel that they are equally respected and valued, whether they are applying for an open position or working at the company. We provide equal employment opportunities to all applicants without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by law. We also comply with the San Francisco Fair Chance Ordinance and similar laws in other locations.
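As an illustration of the kind of cohort analysis named above, here is a minimal sketch using the snowflake-connector-python package. The connection parameters and the signups/orders tables are placeholder assumptions, not Asana's actual schema.

# Minimal sketch of a monthly signup-cohort retention query run against
# Snowflake. Connection parameters and table names (signups, orders) are
# hypothetical placeholders.
import snowflake.connector

COHORT_SQL = """
WITH cohorts AS (
  SELECT user_id, DATE_TRUNC('month', signup_date) AS cohort_month
  FROM signups
),
activity AS (
  SELECT user_id, DATE_TRUNC('month', order_date) AS activity_month
  FROM orders
)
SELECT
  c.cohort_month,
  DATEDIFF('month', c.cohort_month, a.activity_month) AS months_since_signup,
  COUNT(DISTINCT a.user_id) AS active_users
FROM cohorts c
JOIN activity a USING (user_id)
GROUP BY 1, 2
ORDER BY 1, 2
"""

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***",
    warehouse="ANALYTICS_WH", database="ANALYTICS", schema="PUBLIC",
)
try:
    # Each row is (cohort_month, months_since_signup, active_users).
    for row in conn.cursor().execute(COHORT_SQL):
        print(row)
finally:
    conn.close()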
Data Quality Engineer
NeoGames, Kyiv, Kyiv city, ua
/Only for candidates based in Warsaw or Cracow who are comfortable working on Umowa o Pracę - Employment Contract (no possibility to work on B2B)/

Project description: Aristocrat Interactive is a leader in the online real-money gaming space, offering solutions spanning game content and aggregation, lotteries, online casino, sportsbook, game systems, and managed services delivered on industry-leading platforms. The Data & BI team owns the group's data and analytics platforms, spanning data engineering, analytical engineering, and business intelligence, to lead the group's data-driven modernization both internally and for its clients. The Data Quality Engineer will play a vital role in developing and owning automated and manual testing of data pipelines, reports, and data products for customers and internal stakeholders. In this role, the chosen candidate will drive the implementation of data-quality automation frameworks, tools, and processes to deliver scalable and robust data-focused test frameworks and scripts. The ideal candidate will have experience using modern data stacks including Apache Airflow, Data Build Tool (dbt), Snowflake, and Kafka, along with a deep understanding of Python.

Responsibilities:
- Establish data-quality automation frameworks, tools, and processes
- Design and implement scalable and robust data-focused test frameworks and tools across all stages of the data lifecycle, from ingestion through transformation to consumption
- Develop high-quality, automated data-centric tests using Airflow and dbt (a minimal sketch follows this listing)
- Create thorough test strategies, test plans, and test cases for all data features and products
- Participate in the data development lifecycle, from research and technical design to implementation and maintenance
- Collaborate with the DevOps team to integrate automation test frameworks and tools into CI/CD
- Collaborate with other development and product teams to define high-value test strategies and test cases on large datasets and transformations to determine data quality and data integrity
- Rigorously test data infrastructure components to comply with privacy regulations, including the EU GDPR, PCI, etc.
- Participate in security audits and implement remedial actions
- Learn to identify opportunities to compound growth and efficiency in testing and automation
- Drive innovation by constantly learning about new automation technologies and methodologies, along with experimentation

Required Skills and Experience:
- 3+ years of work experience in test automation, preferably in a role related to data warehouses, databases, ETL/ELT, and/or data visualization
- Experience testing applications using frameworks like Cypress, Selenium, and Appium
- Prior use of Apache Airflow, Data Build Tool (dbt), Snowflake, Kafka, Ansible, and other data-specific tools and applications
- Proficiency in advanced SQL and Python
- Experience with data warehouses like Snowflake, BigQuery, Redshift, or equivalent
- Experience with databases like Microsoft SQL Server and PostgreSQL
- Experience with data visualization tools like Power BI, Metabase, Qlik Sense, Tableau, Sisense, or equivalent
- Understanding of software application testing best practices and philosophies, with an emphasis on data integrity and data quality
- Familiarity with data streaming and/or event-driven data pipelines will be considered an asset
- Clear English written and spoken business communication skills

What we offer:
- High-level compensation on an employment contract and regular performance-based salary and career development reviews
- Medical (health) insurance and an employee assistance program
- Multisport card
- English classes with native speakers, trainings, and conference participation
- Referral program
- Team buildings and corporate events

How many interview stages do we have?
- HR interview
- Technical interview with the manager
Send your CV to our email ASAP, because we can't wait to start working with you and create cool projects together! LET'S MOVE THE WORLD TOGETHER!
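For context on the automated data-centric tests mentioned above, here is a minimal sketch of a scheduled data-quality check in Airflow. The dag_id, the checked table, and the query-runner stub are hypothetical placeholders; a real deployment would query the warehouse through a provider hook (e.g. a Snowflake hook) rather than the stub shown here.

# Minimal sketch of a daily data-quality check as an Airflow task. Names and
# the run_count_query stub are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def run_count_query(sql: str) -> int:
    # Placeholder stub: swap in a real warehouse call here.
    return 0

def check_no_null_keys():
    null_rows = run_count_query(
        "SELECT COUNT(*) FROM orders WHERE order_id IS NULL"
    )
    if null_rows > 0:
        # Raising fails the task, which surfaces the breakage in Airflow.
        raise ValueError(f"data quality failure: {null_rows} NULL order_ids")

with DAG(
    dag_id="dq_orders_checks",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="check_no_null_keys",
        python_callable=check_no_null_keys,
    )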
Data Engineer
airSlate, Kyiv, Kyiv city, ua
At airSlate, our journey began in Boston, USA, in 2008. What started as a single product with 3,000 customers has grown into an influential tech company with 1,000+ team members across six offices worldwide. In 2022, airSlate reached a total valuation of $1.25 billion and became a 'unicorn'. But even as we scale, team members remain our most valuable asset. That's why we've built a company that excites people about their work. We develop products that serve over 100 million users with no-code workflow automation, electronic signature, and document management solutions. The company's portfolio of award-winning products, signNow, pdfFiller, DocHub, WorkFlow, Instapage, and US Legal Forms, empowers teams to digitally transform the way their organizations run.

About the AI Data Engineering team: We're passionate about AI and its potential and are now in the process of creating a new dynamic team led by our CAIO. Data is crucial to our journey, driving our AI and ML efforts. We're seeking a talented Data Engineer to join us and help us become self-reliant in our AI endeavors. This role offers a unique opportunity to dive into the heart of AI development, a crucial focus for our company's growth. You will spearhead the design, implementation, and upkeep of our data ecosystem. From crafting resilient data pipelines to fine-tuning workflows, your expertise will ensure seamless data access and rock-solid reliability for our diverse stakeholders. Collaborate with a dynamic team of data scientists, analysts, and visionaries to fuel data-driven strategies and power transformative machine learning endeavors. Seize this opportunity to drive the future of data architecture and make an impact that resonates across our organization!

What you'll be working on:
- Develop and oversee data models adhering to best practices to ensure data quality and accessibility
- Construct, maintain, and support data pipelines (a minimal sketch of one AWS-based loading step follows this listing)
- Write maintainable, performant code, and create and maintain systems documentation
- Continually keep up with advancements in data engineering practices
- Solve technical problems of the highest scope and complexity
- Implement the DataOps philosophy in everything you do
- Build trust in all interactions and with trusted data development
- Collaborate with analytics engineers, data analysts, data scientists, and ML engineers to drive efficiencies in their work
- Collaborate with other functions to ensure data needs are addressed

What we expect from you:
- 5+ years of experience in a Data Engineer role
- Python/Scala/Java
- Proficiency in data modeling (dimensional data modeling, Data Vault, SCD, OBT, etc.)
- Experience building data processing pipelines utilizing AWS services such as S3, EMR, Redshift, and MWAA is preferred
- Message brokers and distributed streaming platforms
- Databases (SQL/NoSQL/open formats)
- Dev tools (Git/Jira/etc.)
- Data privacy/security
- Data development principles/processes
- Writing requirements/documentation
- Practical familiarity with DataOps, Data Lake, Data Mart, and Data Mesh/Data Hub principles
- English B2

What we offer:
— Flexible work environment — We value the advantages of in-person collaboration and prioritize work from our offices in Wroclaw or Bialystok. However, we also provide flexible work arrangements to accommodate remote or hybrid options and flexible scheduling.
— Professional growth opportunities — We are committed to ongoing improvement and welcome those passionate about learning.
We cover professional development courses, conferences, literature, English classes, and more for each team member.
— Health and well-being — We prioritize the health and well-being of our team. This is why we provide a Luxmed subscription, a Multisport card for every team member, access to the office's massage room, free lunches, and healthy in-office snacks to sustain your energy.
— Bonuses and compensation — On top of a competitive base salary, our team members are eligible for monthly performance bonuses of up to 10%, determined by their achievements, time commitment, and dedication.
— Stock options — At airSlate, our team members are more than employees; they're business partners. We issue stock options that grant ownership in the company, allowing everyone to share in its growth.
— Open communication — We encourage transparent communication from all team members at airSlate. Feel free to share your thoughts, ideas, and concerns with our management team, CEO, any member of our leadership team, or any team lead at any time.

We are proud of:
— airSlate Care for Ukraine — With a significant number of our team members in Ukraine, our foremost concern was ensuring their safety by providing both financial and logistical assistance to them and their families. What started as an immediate response has evolved into a cornerstone of the airSlate charity program. We match donations contributed by our team members, offer humanitarian aid to those affected by the conflict, distribute food packages to seniors, and support animal shelters. Our commitment remains steadfast in working towards restoring peace to Ukraine.
— airSlate Junior Club — Our sense of family extends beyond our team. All team members with children gain access to the airSlate Junior Club, featuring engaging events such as cooking classes, creative activities, and educational online games.
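To illustrate the AWS-based pipelines this role describes, here is a minimal sketch of a common S3-to-Redshift loading step using boto3. The bucket, cluster, database, and IAM role names are hypothetical placeholders, and credentials are assumed to come from the ambient AWS configuration.

# Minimal sketch: stage a raw file in S3, then ask Redshift to COPY it via
# the Redshift Data API. All resource names are hypothetical placeholders.
import boto3

def stage_and_load(local_path: str = "events.csv") -> None:
    # 1) Stage the raw file in S3.
    s3 = boto3.client("s3")
    s3.upload_file(local_path, "my-data-bucket", "staging/events.csv")

    # 2) Ask Redshift to COPY the staged file (runs asynchronously).
    rsd = boto3.client("redshift-data")
    rsd.execute_statement(
        ClusterIdentifier="my-cluster",
        Database="analytics",
        DbUser="etl_user",
        Sql=(
            "COPY raw.events FROM 's3://my-data-bucket/staging/events.csv' "
            "IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy' CSV"
        ),
    )

if __name__ == "__main__":
    stage_and_load()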
Data Engineer
ITDS, Kyiv, Kyiv city, ua
Join us, and innovate with cutting-edge cloud technologies! Krakow-based opportunity with the possibility to work 90% remotely!

As a Data Engineer, you will be working for our client, a leading global banking and finance institution, in the Global Banking and Markets sector. You will play a pivotal role in the GBI Transformation project, facilitating data integration across MSS Ops globally.

Your main responsibilities:
- Onboarding new data sources; designing, building, testing, and deploying cloud data ingest pipelines and data models using GCP technologies (CloudStore, BigQuery, Data Fusion); a minimal ingest sketch follows this listing
- Implementing appropriate authorization, authentication, encryption, and other security measures
- Developing procedures and scripts for data migration and initialization
- Reviewing and refining business and technical requirements, prioritizing tasks in Jira
- Managing code artifacts and CI/CD processes
- Ensuring applications meet non-functional requirements and IT standards
- Writing well-commented, maintainable code and providing support during the development lifecycle

You're ideal for this role if you have:
- Experience with Data Fusion, CDAP, or Spark data pipelines
- Proficiency in Java or Python for custom plugin development
- Strong SQL/T-SQL skills and expertise in database administration
- Expertise in the administration and development of on-prem or cloud databases, warehouses, and lakes
- An excellent understanding of GCP architecture and solution design
- Excellent knowledge of DevOps tools: Ansible, Jenkins, Puppet, Chef, Google Secret Manager, GitHub
- Knowledge of Agile/Scrum, DevOps, and ITIL principles
- A BS/MS degree in Computer Science or a related field
- Fluent English
- An enthusiastic willingness to learn and develop technical skills independently
- Strong organizational and multitasking abilities

It is a strong plus if you have:
- Experience with application monitoring and production support
- Broad experience with IT development and collaboration tools
- An understanding of IT security and application development best practices
- An interest in investment products and the investment banking business
- Experience working in global teams with diverse cultures

We offer you:
ITDS Business Consultants is involved in many varied, innovative, and professional IT projects for international companies in the financial industry in Europe. We offer an environment for professional, ambitious, and driven people. The offer includes:
- Stable and long-term cooperation with very good conditions
- The chance to enhance your skills and develop your expertise in the financial industry
- Work on the most strategic projects available in the market
- The chance to define your career roadmap and develop yourself in the best and fastest possible way by delivering strategic projects for different clients of ITDS over several years
- Participation in social events and training, and work in an international environment
- Access to an attractive medical package
- Access to the Multisport program

Internal number #4879
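As a rough illustration of a cloud data-ingest step on GCP like the ones above, here is a minimal sketch that loads a file landed in Cloud Storage into a BigQuery table. The bucket URI and table names are hypothetical placeholders; a Data Fusion pipeline would wrap an equivalent step in its visual designer.

# Minimal sketch of a GCS -> BigQuery ingest step. URI, dataset, and table
# names are hypothetical placeholders.
from google.cloud import bigquery

def ingest_trades(uri: str = "gs://my-landing-bucket/trades/2024-01-01.csv") -> None:
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,  # infer the schema; pin it explicitly in production
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    load_job = client.load_table_from_uri(
        uri, "mss_ops.trades_raw", job_config=job_config
    )
    load_job.result()  # block until the load job completes

if __name__ == "__main__":
    ingest_trades()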
Data Engineer
Arx City, Kyiv, Kyiv city, ua
Job Title: Data Engineer (mid)
Location: Remote

About Arx
Arx is on a mission to catalyze the development of an equitably built world by empowering real estate professionals to instantly understand the regulatory and market forces impacting a region, drastically improving their ability to deliver housing where it is needed most. Arx is building an AI-driven real estate analytics platform that automatically underwrites the future potential of millions of properties in advance, enabling builders and developers to source and evaluate optimal investment and development opportunities in seconds.

The Role of Data at Arx
Our product is fully data-driven, with processing outputs significantly influencing our clients' business operations. Your work will be impactful, with results visible through a fast feedback loop. We make the most of our data to provide viable information and, most importantly, to support in-house-developed advanced analytics and machine learning models. We handle real and constantly evolving data, requiring robust methods for monitoring and improving its quality. This is a challenging yet rewarding process, as the data and how we use it prove to have tremendous value for our clients. Our data comes from multiple sources, comprising millions of records. Efficient extraction, transformation, and loading processes require distributed processing to ensure scalability (a minimal PySpark sketch follows this listing).

Qualifications
- Minimum of 1 year of experience developing scalable data processing pipelines using PySpark in production environments
- Proficiency in SQL and Python programming
- Skill in ETL/ELT implementation in cloud-based environments, preferably AWS
- Strong knowledge of data structures, OOP, algorithms, and performance-oriented design
- Exposure to containers, microservices, distributed systems architecture, and cloud computing
- Understanding of Infrastructure as Code (IaC), preferably using Terraform
- Experience in exploratory data analysis and Tableau is a plus
- Proficiency in English

Key Responsibilities
- Develop and maintain scalable data processing pipelines following industry standards
- Understand current and new data sources and come up with optimal solutions
- Monitor data changes and perform root-cause analysis when necessary
- Collaborate closely with other technical teams and the Product Manager to understand requirements
- Build embedded analytics using Tableau dashboards

Benefits
- Salary range: $35k - $45k/year, in addition to equity compensation commensurate with experience
- Flexible vacation policy; Arx observes Polish holidays
- Flexible working hours and a remote work environment
- Professional development opportunities and continuous learning support
- A collaborative and inclusive work environment
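For illustration of the distributed ETL this role describes, here is a minimal PySpark sketch: read raw property records, normalize a few fields, and write a partitioned Parquet dataset. The paths and column names are hypothetical placeholders, not Arx's actual schema.

# Minimal PySpark sketch of a distributed ETL step. Bucket paths and column
# names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("property-etl").getOrCreate()

raw = spark.read.option("header", True).csv("s3://raw-bucket/properties/")

clean = (
    raw
    .withColumn("price", F.col("price").cast("double"))
    .withColumn("region", F.lower(F.trim(F.col("region"))))
    .dropna(subset=["parcel_id", "price"])  # basic quality gate
    .dropDuplicates(["parcel_id"])
)

# Partitioning by region keeps downstream regional queries cheap.
clean.write.mode("overwrite").partitionBy("region").parquet(
    "s3://curated-bucket/properties/"
)

spark.stop()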