
Overview of salary statistics for the profession "Data Science in the Kyiv region"

Unfortunately, there are no statistical data for this query. Try changing the job title or region.

Recommended vacancies

Data Collection Specialist (Data Analysis Specialist (DTM)) - International Organization for Migration (IOM), Mission in Ukraine
INTERNATIONAL ORGANIZATION FOR MIGRATION (IOM) - MISSION IN UKRAINE, Kyiv
Classification: General Service Staff, Grade G7
Type of Appointment: Fixed-term, one year with the possibility of extension
Estimated Start Date: As soon as possible
Closing Date: 18 July 2023

Context: Almost eight years after the onset of the crisis in the East, following the Russian invasion in February 2022, Ukraine now experiences a full-scale war. The consequences of violence have spread country-wide, and over 5 million people are estimated to have been internally displaced. Displacement is likely to become protracted as fighting continues, including the targeting of civilian infrastructure. New waves of displacement are expected, and continued deterioration of living conditions among the most vulnerable is likely.

Under the overall supervision of the Chief of Mission and the direct supervision of the Project Officer (DTM), the successful candidate will be responsible and accountable for data analysis activities within the Data & Analytics Unit. These activities will focus on the development and implementation of workflows for processing, analysing, and monitoring the quality of data collected through Displacement Tracking Matrix field operations as well as through other data collection programmes in the Data & Analytics Unit.

Core Functions / Responsibilities:
- In coordination with the Project Officer (DTM), design, implement, and monitor data flows to ensure that the analysis of information collected by the Data & Analytics Unit is timely, effective and of the highest quality.
- Coordinate with the Project Officer (Reporting) the timely preparation and dissemination of analytical reports in accordance with IOM procedures and donor requirements.
- Contribute to the development and upkeep of data quality control monitoring mechanisms.
- Design and create data visualizations and dashboards for both internal use and external dissemination.
- Participate in the development and adjustment of methodologies, tools and standard operating procedures within the Data and Analytics Unit.
- Respond to internal and external data requests by providing timely and relevant analysis input, ensuring that findings are disseminated clearly and understandably to all technical and non-technical stakeholders.
- Attend relevant conferences, workshops, working groups, interagency coordination meetings, and other forums.
- Track relevant national developments pertaining to displacement and priority needs monitoring across Ukraine.
- Contribute to the planning, development, organization and delivery of capacity-building activities targeting IOM staff, government and civil society partners, implementing partners and communities.
- Support training activities and technical guidance to project staff on data processing and analysis.
- Promote and build the capacity of staff and partners on IOM's Data Protection Principles.
- Keep the supervisor informed on the status of programme implementation, identify gaps and suggest actions to improve implementation.
- Support preparation of project proposals and a diverse range of communication products, concept notes and other planning documents.
- Plan, lead, and coordinate data analysis efforts, including monitoring the implementation of analytical activities to ensure work proceeds according to established plans, assessing implementation difficulties and recommending adjustments to implementation modalities and work plans to best reflect the changing environment in the field.
- Undertake duty travel relating to project implementation, monitoring visits, project assessments, liaison with counterparts, etc.
- Perform such other duties as may be assigned.

Required Qualifications and Experience

Education:
- High school degree with seven years of relevant experience; OR
- Bachelor's degree or equivalent in Sociology, Statistics, IT or Data Science, Political or other Social Sciences, International Relations, Development Studies, Migration Studies, Human Rights, Law or related fields from an accredited academic institution with five years of relevant professional experience.

Experience and Skills:
- Experience with implementation of quantitative analysis on complex datasets, including time series, is necessary.
- Experience managing automated data pipelines is a distinct advantage.
- Experience working in humanitarian or development organizations is welcome.
- Experience in working with migrants, refugees, internally displaced persons, victims of trafficking and other vulnerable groups is a distinct advantage.
- Prior work experience with international humanitarian organizations, non-governmental or governmental institutions/organizations in a multicultural setting is an advantage.
- In-depth knowledge of, and the ability to select and independently lead the implementation of, analytical methodologies is required.
- Reliable ability to use, and to train other staff on the use of, at minimum one statistical software package (Python, SPSS, R or STATA) is required.
- Attention to detail and the ability to navigate complex datasets and databases are required.
- Understanding of data requirements in humanitarian and recovery programming and related methodological frameworks and tools (IASC, IRIS, DTM and others) is a distinct advantage.

Other: Any offer made to the candidate in relation to this vacancy notice is subject to funding confirmation. Appointment will be subject to certification that the candidate is medically fit for appointment and security clearances. A prerequisite for taking up the position is legal residency in the country of the duty station and a work permit, as applicable.

How to apply: Interested candidates are invited to submit their applications by filling in the IOM Personal History Form and sending it to [email protected] by 18 July 2023 at the latest, referring to this advertisement in the subject line of your message. Only shortlisted candidates will be contacted.
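The posting asks for quantitative analysis of time-series data in a statistical package such as Python. As a purely illustrative sketch (the data, threshold, and function name are invented here, not part of IOM's actual workflow), a simple rolling-baseline outlier check of the kind used in data quality monitoring might look like:

```python
# Hypothetical sketch of a time-series quality check: flag points that
# deviate strongly from a short rolling baseline. Data are illustrative.
from statistics import mean, pstdev

def flag_outliers(series, window=3, z=3.0):
    """Flag points more than z std devs away from the rolling mean."""
    flags = []
    for i, value in enumerate(series):
        lo = max(0, i - window)
        baseline = series[lo:i] or [value]  # fall back for the first point
        mu = mean(baseline)
        sigma = pstdev(baseline) or 1.0     # avoid division-free zero spread
        flags.append(abs(value - mu) > z * sigma)
    return flags

weekly_idp_counts = [5.1, 5.3, 5.2, 5.4, 9.9, 5.5]  # millions, invented
print(flag_outliers(weekly_idp_counts))
# → [False, False, False, False, True, False]  (only the spike is flagged)
```

In practice a real pipeline would use pandas or R with proper seasonal baselines; the point is only the shape of the check, not the method itself.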
Project Assistant (Data Management, Housing/Shelter) - International Organization for Migration (IOM), Mission in Ukraine
INTERNATIONAL ORGANIZATION FOR MIGRATION (IOM) - MISSION IN UKRAINE, Kyiv
Open to Internal and External Candidates
Position Title: Project Assistant (Data Management, Housing/Shelter)
Duty Station: Kyiv, Ukraine
Classification: General Service Staff, Grade G4
Type of Appointment: Fixed-term, one year with the possibility of extension
Estimated Start Date: As soon as possible
Closing Date: 10 July 2023

Established in 1951, IOM is a Related Organization of the United Nations and, as the leading UN agency in the field of migration, works closely with governmental, intergovernmental and non-governmental partners. IOM is dedicated to promoting humane and orderly migration for the benefit of all. It does so by providing services and advice to governments and migrants. IOM is committed to a diverse and inclusive environment. Internal and external candidates are eligible to apply to this vacancy. For the purpose of the vacancy, internal candidates are considered first-tier candidates.

Context: The International Organization for Migration (IOM) is the only international inter-governmental agency with a specific mandate for migration and is dedicated to promoting humane and orderly migration for the benefit of all. It does so by providing technical expertise to governments, migrants, and host communities through a wide range of sustainable solutions, supporting populations affected by forced migration and improving the living conditions of internally displaced persons (IDPs).

Under the overall supervision of the Chief of Mission and the Programme Coordinator (Shelter and Housing), the direct supervision of the Information Management and GIS Officer, and in close coordination with the Shelter and Housing Unit and other units' Information Management teams, the successful candidate will support the implementation of IOM Ukraine's efforts to increase community resilience and cohesion. The incumbent will be responsible for monitoring the unit's activities regarding data collection and management, cleaning data, and providing guidance to relevant colleagues in different hubs. S/he will support the unit and the supervisor with data management and reporting needs.

Core Functions / Responsibilities:
- Provide support to the Activity Reporting Tool (ART) through data cleaning and dataset preparation.
- Support the delivery of training on data collection and data management in ART and other integrated information management systems.
- Assist the Information Management (IM) and GIS Officer with preparing draft reports for IOM internal reporting through PDMS, donor reporting, cluster reporting, etc.
- Support activity mapping for the Shelter and Housing Unit to highlight coverage, gaps, and needs by overlaying vulnerability information.
- Contribute inputs/notes for sitreps, address data requests, keep relevant datasets updated, and coordinate with IM/Data Management counterparts at IOM as needed.
- Support strengthening existing monitoring and reporting mechanisms to improve data collection tools and analysis.
- Perform such other duties as may be assigned.

Required Qualifications and Experience

Education:
- High school diploma/certificate or equivalent with at least four years of relevant work experience; OR
- Bachelor's degree or equivalent in Computer Science, Geographic Information Systems, Geography, Mathematics or a relevant area from an accredited academic institution with two years of relevant experience.

Experience:
- Experience in the management and coordination of information flows and data management, including collecting, storing, processing and analysing data to generate information products.
- In-depth knowledge of the latest technological developments in information technology and information systems.
- Experience with handling confidential and personal data.
- Experience in carrying out user needs analysis and scoping for the development of databases.
- Previous experience in conflict/post-conflict areas is desirable.
- Proven skills in analysing statistical information.
- Ability to formulate IM-related technical requirements and Standard Operating Procedures.
- Ability to translate planning specifications into technical briefs for data capture and analysis, and vice versa.
- Ability to compile and holistically analyse diverse datasets.
- Team-building and information management skills.
- Demonstrated understanding of different data collection methodologies.
- Understanding of relational data theory.

Skills:
- Advanced data visualisation and information design skills.
- Advanced Power Query, Power Apps, and MS Excel skills.
- Experience using data visualisation and design tools such as Power BI and Adobe Illustrator/Photoshop.
- Kobo Toolbox, Survey123 or ODK design and implementation for data collection.
- Photoshop editing for the development of infographics.

Other: Any offer made to the candidate in relation to this vacancy notice is subject to funding confirmation. Appointment will be subject to certification that the candidate is medically fit for appointment and security clearances. A prerequisite for taking up the position is legal residency in the country of the duty station and a work permit, as applicable.

How to apply: Interested candidates are invited to submit their applications by filling in the IOM Personal History Form and sending it to [email protected] by 10 July 2023 at the latest, referring to this advertisement in the subject line of your message. Only shortlisted candidates will be contacted.
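The role's first duty is data cleaning and dataset preparation. As a minimal illustrative sketch only (the field names `site_id`, `date`, and `shelters_repaired` are invented, not ART's actual schema), a basic cleaning pass that strips whitespace, normalises empty strings, and drops duplicate records might look like:

```python
# Hypothetical data-cleaning sketch: dedupe by an assumed (site_id, date)
# key, trim whitespace, and turn empty strings into None. Field names are
# invented for illustration.
def clean_records(records):
    """Return records with duplicates dropped and string fields normalised."""
    seen, cleaned = set(), []
    for rec in records:
        norm = {k: (v.strip() or None) if isinstance(v, str) else v
                for k, v in rec.items()}
        key = (norm.get("site_id"), norm.get("date"))
        if key not in seen:           # keep only the first record per key
            seen.add(key)
            cleaned.append(norm)
    return cleaned

raw = [
    {"site_id": "KYV-01", "date": "2023-06-01", "shelters_repaired": " 12 "},
    {"site_id": "KYV-01", "date": "2023-06-01", "shelters_repaired": "12"},
    {"site_id": "KYV-02", "date": "2023-06-01", "shelters_repaired": ""},
]
print(clean_records(raw))  # 2 records remain; "" becomes None
```

Real ART/Kobo exports would of course need tool-specific handling; this only shows the generic shape of a cleaning step.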
National Data Liaison Officer - International Organization for Migration (IOM), Mission in Ukraine
INTERNATIONAL ORGANIZATION FOR MIGRATION (IOM) - MISSION IN UKRAINE, Kyiv
Duty Station: Kyiv, Ukraine
Classification: National Officer, Grade NO-B
Type of Appointment: Fixed-term, one year with the possibility of extension
Estimated Start Date: As soon as possible
Closing Date: 10 August 2023

Core Functions / Responsibilities:
- Oversee the implementation and further programmatic development of activities falling under the "evidence-based governance" pillar of IOM Ukraine's Data and Analytics Unit.
- In close coordination with IOM colleagues overseeing partnership with the Government of Ukraine, liaise and coordinate with government entities (nationally, regionally and locally), implementing partners, United Nations agencies, civil society organisations, donors and other stakeholders on issues related to evidence-based policy-making and programming.
- Foster partnerships with key official statistics stakeholders in Ukraine, providing support and technical advice in relation to international standards and best practices on mobility, displacement, and demographic data.
- Contribute to the planning, development and delivery of capacity-building activities for IOM staff, partner organizations, government officials and other actors, in order to strengthen data collection, data analysis and evidence-based policy-making and programming.
- Participate in relevant conferences, workshops, steering committees, working groups, and other forums.
- Contribute to information management of the Data and Analytics Unit's activities and outputs, including awareness raising and visibility, press releases, website updates and other relevant information-sharing materials.
- Contribute to the overall implementation of the Data and Analytics Unit's activities, including oversight of financial, logistical, administrative and technical aspects, in accordance with IOM's policies, practices and global standards, as well as relevant requirements, guidelines and grant agreements.
- Monitor the implementation of projects according to the work plan; document and evaluate results; identify the causes of deviations and bottlenecks; and recommend and implement corrective actions.
- Promote and contribute to the integration and mainstreaming of gender, protection, human rights and other pertinent cross-cutting issues into programme implementation.
- Identify potential areas for project development and contribute to the development of new projects by selecting and summarizing background information, assessing the local context, and drafting segments of project proposals.
- Participate in the development and adjustment of methodologies, contingency plans, approaches and standard operating procedures to respond to emerging challenges in Ukraine through a consultative process with other relevant parties.
- Coordinate the elaboration and dissemination of reports for donors, government and other relevant stakeholders, ensuring timely submission and compliance with donor and IOM requirements.
- Undertake duty travel as required in relation to project implementation and monitoring.
- Perform other related duties as assigned.

Required Qualifications and Experience

Education:
- Bachelor's degree in Political or Social Sciences, International Relations, Development Studies, Migration Studies, Human Rights, Law, Statistics, Information Technology, Computer/Data Science or related fields from an accredited academic institution with four years of relevant professional experience; or
- Master's degree in one of the above-mentioned fields with two years of relevant professional experience.

Experience:
- Experience in liaising with national, regional and/or local governmental authorities in Ukraine, national/international institutions, United Nations agencies and non-governmental organizations.
- Experience in coordinating high-level policy consultations among national and international stakeholders.
- Prior experience in developing and/or coordinating international and/or national policy on data and population statistics.
- Experience in working in development or humanitarian programmes will be considered advantageous.
- Experience in working with migrants, refugees, internally displaced persons, victims of trafficking and other vulnerable groups will be considered advantageous.
- Prior work experience with international humanitarian organizations, non-governmental or governmental institutions in a multicultural setting is an advantage.

Skills:
- In-depth knowledge of the global efforts aimed at improving national and international refugee, internally displaced persons and statelessness statistics through the development of international recommendations on how to produce, compile and disseminate statistics on these populations.
- Keen understanding of key issues in national and international population statistics and public governance.
- Statistical background and ability to analyze and interpret data collected through quantitative and qualitative methods.
- Knowledge of UN and bilateral donor programming.
- High level of written expression in both English and Ukrainian, including formal correspondence.

Languages: For all applicants, fluency in English and Ukrainian (oral and written) is required.

Other: Any offer made to the candidate in relation to this vacancy notice is subject to funding confirmation. Appointment will be subject to certification that the candidate is medically fit for appointment and security clearances. A prerequisite for taking up the position is legal residency in the country of the duty station and a work permit, as applicable.

How to apply: Interested candidates are invited to submit their applications using the IOM Personal History Form and sending them to [email protected] by 10 August 2023 at the latest, referring to this advertisement in the subject line of your message. Only shortlisted candidates will be contacted.
Data Analysis Practice Lead
Viseven Group, Kyiv
Viseven Group is an international MarTech company that has specialised in interactive content and cloud solutions for global pharmaceutical companies since 2009. Constant growth and self-development are embedded in our corporate DNA. Our unique developments and approaches are actively used in more than 50 countries. Viseven's solutions are presented at major industry events in Barcelona, Philadelphia, London and elsewhere. The fast-growing team includes more than 700 highly qualified technical and non-technical experts: front-end and back-end developers, BA specialists and managers who create, localise and configure applications in 8 offices around the world.

Role mission: Viseven is looking for an experienced specialist for the role of Data Analysis Practice Lead. In this position, you will define metrics to track team performance, optimise request turnaround times, and improve data management processes. You will drive business growth by identifying new opportunities, implementing advanced solutions, and fostering a collaborative team environment. Your role involves applying agile methodologies to ensure timely project delivery and prioritising requests for effective resource allocation.

Responsibilities:
- Develop a comprehensive data strategy and roadmap for growing the data analytics service as a corporate asset, with the aim of increasing revenue and optimising costs.
- Lead the development and implementation of data management and data governance policies and guidelines.
- Act as a consultant in client communication and participate in pre-sales processes, providing professional support and ensuring high-quality delivery of data analysis services.
- Collaborate with external and internal clients (omnichannel, marketing, finance, etc.) to understand needs, define approaches to data processing and management, and improve services.
- Strategically plan resources (in-house/outsourced), including their selection and engagement, to meet business needs.
- Standardise data management methods, tools, processes and practices to increase efficiency and enable reuse across projects.
- Support the development of the data analyst team; mentor and provide learning opportunities.
- Motivate the team to engage fully in their work, explore their capabilities, and strive for innovation in data management and analysis.
- Ensure training and knowledge sharing within the team to strengthen collaboration and skills development.
- Gradually expand the service offering towards more complex solutions using Data Science and Machine Learning (personalisation, recommendations, etc.).

Experience and skills:
- 5+ years of experience in data analysis services, including the design, development and delivery of projects.
- Commercial experience in leadership positions, with influence on the technological development of a company's data analytics function/services and successful launches of data products and project management, from minor updates to strategic changes.
- Knowledge of best practices and standards in the data management industry, with a deep understanding of the data lifecycle.
- Experience working with large volumes of data and with Big Data platforms and tools.
- A track record of growing small teams into company-level functions or services.
- A goal-oriented leader with experience building high-performing teams.
- Motivated and results-oriented, with team planning and organisation skills.
- Excellent communication skills; English at Upper-Intermediate level or higher.
- Experience with major cloud services (AWS, Google Cloud or Azure).
- Experience in the pharmaceutical/FMCG industry (a plus).

Nice to have:
- Understanding of omnichannel strategies and approaches, enabling collaboration and more comprehensive solutions.
- Adobe/SFMC campaign management.
- Work with third-party media data and operations.
- Google Analytics (GA) data.
- Experience with innovative technologies and capabilities, including IoT, Data Science, and AI (data models and related services).
- Customer Data Platform (CDP).

What we offer: The team matters greatly to us, so we value it and give everyone the opportunity to share their vision, implement their own ideas, and grow in a professional environment while maintaining work-life balance. By joining Viseven, you will get:
- Competitive compensation and regular salary reviews
- Professional and career growth
- Paid vacation: 18 working days per year (20 working days after 2 years of cooperation)
- Sick leave without supporting documents: 4 working days per year
- Documented sick leave: 20 working days per year
- Family leave: 3 paid working days (in case of marriage, childbirth or bereavement)
- Comprehensive medical insurance (including massage and physiotherapy courses)
- English language courses
- Opportunities to participate in professional forums and conferences
- Regular corporate events and team-building activities
- An experienced team and friendly atmosphere
- A pleasant working environment: a comfortable, fully equipped office and the option to work from home
Senior Scala Software Engineer (Data Science team)
GlobalLogic, Ukraine, Kyiv
Description: Founded in 2007, Rubicon Project's pioneering technology created a new model for the advertising industry. Today, our automated advertising platform is used by the world's leading publishers and applications to transact with top brands around the globe, enabling them to reach more than 1 billion consumers. Rubicon Project operates the largest independent Display Advertising Exchange and Supply Side Platform, automating the buying and selling of display advertising across all formats (banner, video) on all devices (desktop browsers, mobile devices, billboards). Rubicon Project auctions over 20 billion ads daily in real time, each in less than half a second. Rubicon Project is a publicly traded company (NYSE: RUBI) headquartered in Los Angeles, California, USA.

Requirements:
- Expertise in Scala and Spark
- 4+ years of production coding experience on the server side
- Expertise in service-oriented architectures, microservices, advanced database schemas, relational and non-relational databases, and highly scalable and available web services
- Readiness to work on different types of projects, from POCs and MVPs based on very high-level requirements to production-ready systems
- Work will include very close collaboration with the Data Science team
- Working knowledge of basic probability and statistics
- A strong understanding of algorithms and data structures, and an ability to recognize the business and technical trade-offs between different solutions
- Experience with development and CI tools: Maven, Git, Jenkins, Puppet, Crucible, Jira
- Experience working in a Linux environment
- Expertise in building software in an agile development environment
- Demonstrated strong English verbal and written communication skills (at least intermediate level, by client request)

Responsibilities:
- Be a creative problem-solver who can draw on an array of expertise and technology to design and implement reliable, scalable and maintainable solutions to challenging problems
- Write production-ready code and unit tests that meet both system and business requirements
- Design software and prepare technical specifications
- Respond to feature requests, bug reports, performance issues and ad-hoc questions
- Work collaboratively with multiple teams to deliver quality software
- Perform code reviews and design reviews

What We Offer
Empowering Projects: With 500+ clients spanning diverse industries and domains, we provide an exciting opportunity to contribute to groundbreaking projects that leverage cutting-edge technologies. As a team, we engineer digital products that positively impact people's lives.
Empowering Growth: We foster a culture of continuous learning and professional development. Our dedication is to provide timely and comprehensive assistance for every consultant through our dedicated Learning & Development team, ensuring their continuous growth and success.
DE&I Matters: At GlobalLogic, we deeply value and embrace diversity. We are dedicated to providing equal opportunities for all individuals, fostering an inclusive and empowering work environment.
Career Development: Our corporate culture places a strong emphasis on career development, offering abundant opportunities for growth. Regular interactions with our teams ensure their engagement, motivation, and recognition. We empower our team members to pursue their career goals with confidence and enthusiasm.
Comprehensive Benefits: In addition to equitable compensation, we provide a comprehensive benefits package that prioritizes the overall well-being of our consultants. We genuinely care about their health and strive to create a positive work environment.
Flexible Opportunities: At GlobalLogic, we prioritize work-life balance by offering flexible opportunities tailored to your lifestyle. Explore relocation and rotation options for diverse cultural and professional experiences in different countries with our company.

About GlobalLogic
GlobalLogic is a leader in digital engineering. We help brands across the globe design and build innovative products, platforms, and digital experiences for the modern world. By integrating experience design, complex engineering, and data expertise, we help our clients imagine what's possible and accelerate their transition into tomorrow's digital businesses. Headquartered in Silicon Valley, GlobalLogic operates design studios and engineering centers around the world, extending our deep expertise to customers in the automotive, communications, financial services, healthcare and life sciences, manufacturing, media and entertainment, semiconductor, and technology industries. GlobalLogic is a Hitachi Group Company operating under Hitachi, Ltd. (TSE: 6501), which contributes to a sustainable society with a higher quality of life by driving innovation through data and technology as the Social Innovation Business.
Senior Data Engineer
GlobalLogic, Ukraine, Kyiv
Description: Our client is a worldwide enterprise company. The product you will be working with provides management and data processing/handling capabilities for networks of the client's scientific lab equipment, such as microscopes. The main goals are:
- Collection and centralized management of data outputs (measurement results, etc.) provided by the client's devices
- Utilization of outdated data
- Managing large volumes of data acquired from measurement devices in the cloud securely and reliably
- Seamless sharing of measurement data with collaborators
- The ability to share measurement results and accelerate customer service

Requirements: We are looking for a Senior Data Engineer with at least 4 years of commercial experience in the development of data platforms for enterprise applications, and with the experience to lead a team of engineers and take responsibility for the technical solution.
- Proficiency in Airflow for workflow orchestration, dbt for data transformation, and SQL for data querying and manipulation
- Experience in data modeling, ETL (Extract, Transform, Load) processes, and data warehousing concepts
- Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services
- Excellent analytical and problem-solving skills with meticulous attention to detail
- Strong communication and collaboration skills with the ability to lead and motivate cross-functional teams
- Ability to participate in onsite meetings is good to have
Responsibilities:
- Implement new solutions into the current system, both by refactoring and from scratch
- Prepare technical documentation
- Participate in client meetings to understand business and user requirements and estimate tasks
- Collaborate closely with other engineers, product owners and testers to identify and solve challenging problems
- Take part in defect investigation, bug fixing and troubleshooting
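The posting asks for experience with ETL processes (alongside Airflow and dbt, which are not shown here). As a minimal stand-in sketch of the extract-transform-load steps themselves, using only Python's built-in sqlite3 (table and field names are invented, not the client's actual schema):

```python
# Illustrative ETL sketch: filter invalid device readings (transform), load
# them into an in-memory table, and produce a per-device summary. Names are
# hypothetical; a real pipeline would use Airflow/dbt against a warehouse.
import sqlite3

def run_etl(raw_rows):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE measurements (device TEXT, value REAL)")
    # Transform: keep only present, non-negative readings.
    valid = [(d, v) for d, v in raw_rows if v is not None and v >= 0]
    conn.executemany("INSERT INTO measurements VALUES (?, ?)", valid)
    # Load/report: per-device averages, like a warehouse summary table.
    cur = conn.execute(
        "SELECT device, AVG(value) FROM measurements "
        "GROUP BY device ORDER BY device")
    return cur.fetchall()

rows = [("microscope-a", 1.0), ("microscope-a", 3.0),
        ("microscope-b", -5.0), ("microscope-b", 2.0), ("microscope-b", None)]
print(run_etl(rows))
# → [('microscope-a', 2.0), ('microscope-b', 2.0)]
```

The same three-step shape (extract raw rows, transform/validate, load and aggregate) is what Airflow would orchestrate and dbt would express as SQL models in the role described.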
Senior/Lead Data/BI Engineer
GlobalLogic, Ukraine, Kyiv
Description:
The client has asked GlobalLogic to build a Data Portal that enables data discovery, data domains, metadata management, workflows, enterprise glossary management, data access management, operational dashboards (e.g. data quality, data usage), and more. The Enterprise Data Portal is expected to be a strategic, leading-edge, unified data ecosystem that provides the framework, processes, tools, and services enabling consistent and frictionless access to trusted data across on-premise, cloud, other SaaS, and external partners. It maximizes the value of our data and provides flexible self-service and advanced capabilities to support a wide range of use cases, including enterprise, research, and analytical needs.

Requirements:
• Minimum of 3-4 years working with Enterprise Data Warehouse technologies, including multi-dimensional data modeling, data architectures, or other work related to the construction of enterprise data assets
• Extensive experience developing microservices in Python
• Experience analyzing data using Python
• Data modeling and schema design
• Strong SQL programming background, with the ability to troubleshoot and tune code
• Proven understanding and demonstrable implementation experience of cloud data platform technologies

Skill set (must):
1. BI dashboards
2. AWS QuickSight, Amazon Q, AWS Athena
3. BI portal
4. Angular, AWS Neptune, API Gateway, AWS Lambda
5. Data pipelines/DevOps
6. PySpark, Iceberg, Terraform

Previous experience and a substantial understanding of the following would be a benefit:
• Kafka
• AWS pipeline orchestration technologies, such as Data Pipeline (in use), Step Functions, and Lambda (in use)
• AWS data access/warehousing tools, such as Athena (in use) and Aurora
• AWS Lake Formation
• Strong problem-solving, troubleshooting, and analysis skills
• Able to mentor junior-grade engineers
• Good knowledge of Agile Scrum
• Good communication skills
• English: Upper-Intermediate

Responsibilities:
• Develop and manage large-scale data systems, ingestion capabilities, and infrastructure
• Support the design and development of solutions for deploying dashboards and reports to various stakeholders
• Work directly with business teams to rapidly prototype analytics solutions based on business requirements
• Architect data pipelines and ETL processes to connect to various data sources
• Design and maintain enterprise data warehouse models
• Manage a cloud-based data and analytics platform
• Deploy updates and fixes, and assist technical support

#LI-AB8 #LI-Remote
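The posting above mentions operational data-quality (DQ) dashboards. As a generic illustration only (the field names and records below are invented, not part of the role description), a minimal completeness check that such a dashboard might be fed by could look like this in Python:

```python
# Minimal data-quality check: split records into passed/failed based on
# required fields. Field names and sample records are purely illustrative.

REQUIRED_FIELDS = ("id", "amount", "currency")

def completeness_report(records):
    """Return (passed, failed) record lists based on required fields."""
    passed, failed = [], []
    for rec in records:
        if all(rec.get(f) is not None for f in REQUIRED_FIELDS):
            passed.append(rec)
        else:
            failed.append(rec)
    return passed, failed

rows = [
    {"id": 1, "amount": 9.5, "currency": "USD"},
    {"id": 2, "amount": None, "currency": "EUR"},  # missing amount -> fails
]
ok, bad = completeness_report(rows)
```

In a real pipeline a check like this would run per batch, with the pass/fail counts pushed to the dashboarding layer (QuickSight, in this posting's stack).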
Data Engineer
The Codest, Kyiv, Kyiv city, ua
Hello World! We are The Codest, an international tech software company with tech hubs in Poland, delivering global IT solutions and projects. Our core values lie in a "Customers and People First" approach that prioritizes the needs of our customers and a collaborative environment for our employees, enabling us to deliver exceptional products and services. Our expertise centers on web development, cloud engineering, DevOps, and quality.

After many years of developing our own product, Yieldbird, which was honored as a laureate of the prestigious Deloitte Top 25 awards, we arrived at our mission: to help tech companies build impactful products and scale their IT teams by boosting IT delivery performance. Through our extensive experience with product development challenges, we have become experts in building digital products and scaling IT teams. But our journey does not end here: we want to continue our growth. If you’re goal-driven and looking for new opportunities, join our team! What awaits you is an enriching and collaborative environment that fosters your growth at every step.

We are currently looking for a Senior Data Engineer.

Project description: In this role, you will specialize in leveraging artificial intelligence to optimize clinical and commercial workflows within the pharmaceutical, biotechnology, and life sciences industries. You'll be at the forefront of applying AI-driven insights to trial design, indication selection, competitive intelligence, and market landscaping. Your focus in this project will be on distilling complex data into actionable intelligence, ultimately enhancing efficiency and effectiveness across these vital sectors.
Your Responsibilities:
• Design and build robust, scalable data pipelines to support AI models
• Ensure smooth data flow from various sources for analysis and deployment
• Utilize state-of-the-art techniques for processing diverse data types
• Oversee database management and optimization
• Focus on optimizing data storage, retrieval, and security
• Implement optimal data structuring techniques for vector retrieval and LLM behavior

Key Requirements:
• Extensive experience with the Python programming language and with data engineering projects (5+ years of commercial experience)
• Experience with data orchestration tools such as Airflow and Prefect
• Good familiarity with PostgreSQL
• Proficiency in web scraping techniques
• Strong data manipulation skills with the Pandas library
• Understanding of parallelization techniques for efficient processing
• Knowledge of anti-bot measures for secure operations
• Proficiency in data orchestration tools for managing workflows effectively
• Strong skills in designing efficient database schemas
• Understanding of how to ensure data integrity and reliability for robust systems
• Advanced English, spoken and written (you will be working with international clients only)

Nice to have:
• Experience with AI, LLMs, and embeddings
• Familiarity with developing multimodal models
• Knowledge of database optimization techniques
• Full-stack experience with Next.js/React
• Experience with Agile methodologies, particularly the Scrum framework

Our Promise (what you can expect from us):
• 24-28k on a B2B contract
• 100% remote work (but we have offices in Krakow and Warsaw, and we’re happy to meet there from time to time)
• 1800 PLN per year for your self-development
• 1800 PLN per year for all of your home office needs
• Our B2B contract contains provisions that allow you to obtain IP Box support
• Integration events, education opportunities, and much more
• A unique opportunity to take your career to the next level: we’re looking for people who want to create an impact.
You have ideas, we want to hear them!

Recruitment process:
• 30-minute online screening call with our recruiter
• 30-45-minute technical interview with our developer
• 1-hour client call
• Offer

Questions or insights? Feel free to reach out to our recruiting team: [email protected] In the meantime, feel free to visit our website, where you can find key facts about us.
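The requirements above call out parallelization techniques for efficient processing of scraped data. A minimal sketch using only Python's standard library (the transform and the sample records are invented for illustration) might look like:

```python
# Fan a lightweight per-record transform out over a thread pool.
# ThreadPoolExecutor is stdlib; the transform itself is a placeholder
# standing in for real scraping post-processing.
from concurrent.futures import ThreadPoolExecutor

def normalize(record):
    """Placeholder transform: trim and lowercase a scraped title field."""
    return {"url": record["url"], "title": record["title"].strip().lower()}

scraped = [
    {"url": "https://example.com/a", "title": "  First Page "},
    {"url": "https://example.com/b", "title": "SECOND page"},
]

with ThreadPoolExecutor(max_workers=4) as pool:
    cleaned = list(pool.map(normalize, scraped))
```

Threads suit I/O-bound scraping work; for CPU-bound transforms a `ProcessPoolExecutor` (same interface) is the usual stdlib alternative.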
Data Engineer with .NET skills
CSHARK, Kyiv, Kyiv city, ua
We are currently looking for an engineer experienced with .NET as well as in the field of data engineering. Your responsibilities will be divided between application development in C# and .NET, and data analysis and processing using Python and PySpark. The project you will be involved in is for our client in the manufacturing industry and entails developing an application to visualize the amount of CO2 saved during the production of materials.

In Short
• 100% REMOTE, or from one of our 2 offices
• 140-170 PLN/h + VAT
• B2B contract

Your Daily Missions
• Developing a scalable, high-performance backend system using C# and .NET
• Handling a broad range of tasks, from data aggregation and database schema definition to the implementation of APIs and ETL processes
• Mapping business processes to data models and/or algorithms
• Ensuring code quality and maintainability through comprehensive unit testing, integration testing, and code reviews
• Writing code that is scalable, resilient, testable, efficient, and easily maintainable
• Developing large, low-latency distributed systems capable of handling high volumes of operations

The Essentials We’re Seeking
• More than 4 years of programming experience
• Experience with C# and .NET (latest versions)
• Experience with data engineering frameworks in Python/PySpark (such as Databricks, Synapse, or Microsoft Fabric)
• Hands-on experience with Azure Data Factory, Power Automate, Azure Functions, REST APIs, Dataverse, and Dynamics 365 (OData API)
• Experience with Kafka
• Knowledge of algorithms and data structures
• Good practical knowledge of SQL and relational databases: PostgreSQL, SQL Server
• Familiarity with software development methodologies (e.g., Agile, Scrum) and best practices in code organization, documentation, and code review
• Willingness to continuously learn and stay up to date with the latest trends, technologies, and best practices in software engineering
• Fluency in English (minimum B2): we work in an international team

Nice-to-haves
• Experience in data science with Python
• Experience with and passion for Web3 technologies such as blockchain, smart contracts, Solidity, and NFTs

Reasons Why You Would Enjoy Working With Us
• We work with the latest technologies and international clients, and our projects are polished from < to />; learn about some of the projects we have completed.
• We organize TechTalks and meet-ups and create guilds where we exchange knowledge; you can learn a lot from your teammates.
• We have a close-knit team and make sure to hold regular integration events; we often go out to celebrate together.
• You can work remotely or from our offices in Wroclaw or Bielsko-Biala. However, we count on your openness to occasionally visit the office for team meetings or client visits, or for occasional trips to the client's headquarters.
• You can adjust your working hours to suit your needs, starting your day between 7:00 and 10:00 am.
• You can work with great specialists in their fields who also have a sense of humor and after-hours hobbies. It is the people who create the unique, relaxed atmosphere at CSHARK.
• We have a flat structure: we are not a corporation and don't want to be one.
• We offer bonuses for employee referrals (from 3000 PLN to 7000 PLN).
• We organize remote English classes and provide access to the company library. You can also become an author of articles published on our technical blog: https://cshark.com/blog/
Data Engineer
Experis Manpower Group, Kyiv, Kyiv city, ua
We are seeking someone who can help establish an infrastructure foundation on the Google Cloud Platform (GCP) and migrate the current legacy data lake to a manageable, scalable, and secure data lake in the cloud. This data lake should provide the data analytics capabilities needed to meet our growing needs. We require assistance in assessing, designing, planning, and migrating the existing on-premise or cloud data lake to Google BigQuery.

Work model: 100% remote
Methodology: Agile
Project duration: 06.2024 - 11.2024, extension possible

Main responsibilities:
Assess the current system, including flows, data pipelines, schemas, and reports, focusing on the following areas:
• Data governance leading practices, definitions, guidelines, processes, and recommendations for products available in GCP: Data Catalog, data lineage, data quality, data masking, data classification
• Datasets and analytics infrastructure
• Database and application technology
• Extract, transform, and load (ETL) or extract, load, and transform (ELT) workloads
• Job orchestration and scheduling needs
• Tooling supply plan (including supporting end of life)
• Plans for continuous integration between applications
• Business units and other teams using data solutions
• The as-is technical state for the areas analysed above
• The desired future state for data transformation, the data lake, and data analytics

Main requirements:
• Bachelor's degree in business, computer science, or a related field
• Experience (minimum 3 years) with the following technologies: BigQuery, Dataflow, SQL, Python
• Experience migrating data warehouses to BigQuery
• English: C1

Our offer:
• B2B via Experis
• MultiSport Plus
• PZU group insurance
• Medicover
• e-learning platform
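The responsibilities above distinguish ETL from ELT workloads. The ELT idea (load raw data first, transform inside the warehouse with SQL) can be sketched with sqlite3 as a stand-in for BigQuery; all table and column names here are invented for the sketch:

```python
import sqlite3

# ELT: land the raw rows first...
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [(1, 10.0), (1, 5.0), (2, 7.5)],
)

# ...then transform in-warehouse with SQL, as one would in BigQuery.
conn.execute(
    "CREATE TABLE user_totals AS "
    "SELECT user_id, SUM(amount) AS total FROM raw_events GROUP BY user_id"
)
totals = dict(conn.execute("SELECT user_id, total FROM user_totals ORDER BY user_id"))
```

In classic ETL the aggregation would instead happen in application code (or a tool like Dataflow) before loading; ELT pushes that work down to the warehouse engine.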
Data Engineer
Stonex Poland, Kyiv, Kyiv city, ua
Connecting clients to markets, and talent to opportunity.

With 4,300 employees and over 400,000 retail and institutional clients from more than 80 offices spread across five continents, we’re a Fortune 100, Nasdaq-listed provider, connecting clients to the global markets and focusing on innovation, human connection, and providing world-class products and services to all types of investors. Whether you want to forge a career connecting our retail clients to potential trading opportunities, or ingrain yourself in the world of institutional investing, the StoneX Group is made up of four segments that offer endless potential for progression and growth.

Business Segment Overview: With boots-on-the-ground authenticity at the heart of everything we do, our comprehensive array of commercial products and services enables you to work directly with our clients across hedging, risk management, execution and clearing, OTC products, commodity finance, and more.

Position Purpose: The Data Engineer is responsible for empowering the Data team to achieve its primary objectives: ingesting, transforming, and exposing real-time, event-driven data streams pertaining to the firm’s data assets. The ideal candidate will exhibit a passion for continuous improvement and a dedicated focus on enabling consumers to achieve their goals by making data-driven decisions.

Responsibilities (primary duties will include):
• Prioritizes and executes rapid raw-data collection from source systems, implements efficient storage, and employs fast and reliable access patterns
• Understands system protocols, how systems operate, and how data flows
• Is aware of current and emerging technology tools and their benefits
• Is expected to independently develop a full software stack; understands the building blocks, interactions, dependencies, and tools required to complete software and automation work; independent study of evolving technology is expected
• Drives engineering projects by developing software solutions, conducting tests and inspections, and building reports and calculations
• Maintains a strong focus on innovation and enablement; contributes to designs that implement new ideas to improve existing and new systems, processes, and services
• Understands and can apply new industry perspectives to our existing business and data models; reviews existing designs and processes to highlight more efficient ways to complete existing workloads
• Maintains knowledge of existing technology documents; writes basic documentation on how technology works using collaboration tools like Confluence; creates clear documentation for new code and systems; documents system designs, presentations, and business requirements for consumption and consideration at the manager level
• Collaborates with technical teams and utilizes system expertise to deliver technical solutions
• Continuously learns, and teaches others existing and new technologies; contributes to the development of others through mentoring, in-house workshops, and learning sessions
• Drives team practices and procedures to achieve repeatable success and defined expectations of service
• Plays a significant collaborative role in long-term department planning, with a focus on initiatives achieving data empowerment, operational efficiency, and sustainability
• Monitors and evaluates the overall strategic data infrastructure; tracks system efficiency and reliability; identifies and recommends efficiency improvements and mitigates operational vulnerabilities

Qualifications (to land this role you will need):
• Proficiency in programming in Python and SQL; willingness to learn and adopt new languages as necessary
• Proficiency in ETL process design and implementation in a cloud-based environment, preferably using Databricks
• Problem-solving skills: able to work through a problem, analyze the root cause, and propose a solution
• Familiarity with API-based data distribution
• Understanding of enterprise architecture patterns, object-oriented and service-oriented principles, design patterns, and industry best practices
• Foundational knowledge of data structures, algorithms, and designing for performance
• Experience with database technology like MSSQL and caching services like Redis
• Exposure to containers, microservices, distributed systems architecture, orchestrators, and cloud computing
• Excellent communication skills and the ability to work with subject matter experts to extract critical business concepts
• Ability to work, and potentially lead, in an Agile methodology environment

What makes you stand out:
• 3-5 years of experience developing software in a professional environment (preferably financial services, but not required)
• 3 years of hands-on data-driven enterprise application development, preferably in the financial industry
• Comfortable with core programming concepts and techniques (e.g., concurrency, memory management)

Education / Certification Requirements: Bachelor’s degree, or relevant work experience, in Computer Science, Mathematics, Electrical Engineering, or a related technical discipline.
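The caching requirement above (Redis in this posting) rests on one idea: pay for an expensive lookup once, then serve repeats from the cache. That idea can be shown in-process with the standard library; this is a memoization sketch, not a Redis client, and the lookup and symbols are placeholders:

```python
from functools import lru_cache

# Counter to make the cache behavior observable in this sketch.
CALLS = {"count": 0}

@lru_cache(maxsize=128)
def get_reference_price(symbol: str) -> float:
    """Placeholder for an expensive database or service lookup."""
    CALLS["count"] += 1
    return {"AAPL": 190.0, "MSFT": 410.0}.get(symbol, 0.0)

first = get_reference_price("AAPL")   # cache miss: performs the lookup
second = get_reference_price("AAPL")  # cache hit: lookup body not re-run
```

A Redis-backed cache follows the same miss/hit pattern but shares entries across processes and adds expiry (TTL), which `lru_cache` does not provide.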
Data Engineer
Webellian Sp. z o.o., Kyiv, Kyiv city, ua
About Webellian
Webellian is a well-established Digital Transformation and IT consulting company committed to creating a positive impact for our clients. We strive to make a meaningful difference in diverse sectors such as insurance, banking, healthcare, retail, and manufacturing. Our passion for cutting-edge and disruptive technologies, as well as our shared values and strong principles, are what motivate us. We are a community of engineers and senior advisors who work with our clients across industries, playing a deep and meaningful role in accelerating and realizing their vision and strategy.

About the position
We are looking for a Regular Data Engineer to work on a project for one of our key customers in the insurance industry. You will work in hybrid mode with teammates based in Poland and other stakeholders located worldwide, and you will be in direct contact with the business users of the solution.

Goals and challenges
• Leverage a global data platform and enrich it with additional data capabilities
• Design and implement solutions for complete use-case data pipelines: from data ingestion and storage, through data processing and implementation of business rules, to data consumption (e.g. reporting)
• Define and apply best practices for development and maintenance of the platform
• Keep up with trends and evolving technology in the big data and analytics world
• Look for opportunities to improve performance, reliability, and automation

Hard skills we are looking for
• Ability to improve and refine data ingestion and transformation pipelines (ETL/ELT)
• Proficiency in Python, with a focus on PySpark
• Experience with Azure Synapse Analytics would be a plus
• Knowledge of Azure or another cloud technology, especially data solutions
• Continuous integration, deployment, and delivery practitioner
• General understanding of infrastructure, orchestration, and IT security principles (especially at an enterprise level)

Soft skills
• Experience in data engineering
• Bachelor (BSc), Master (MSc), or equivalent experience in a technical field (for example, Computer Science or Engineering)
• Fluent English (written and spoken) is a must; other languages (e.g. German, French, Italian) are a plus
• DevOps mindset (you build it, you run it)
• Experience in the insurance domain is a strong plus
• Capability to understand complex requirements and break them down into actionable implementation tasks, with attention to business logic
• Capability for result-oriented communication with people from different departments and different skill sets
• Writing clear technical specifications
• Excellent verbal communication skills
• Leadership, autonomy, and the drive to grow and learn new technologies and tools

What we offer
• Contract under Polish law: B2B or Umowa o Pracę
• Benefits such as private medical care, group insurance, and a Multisport card
• English classes
• Hybrid work (at least 2 days/week on-site) in Warsaw (Mokotów)
• Opportunity to work with excellent professionals
• High standards of work and a focus on code quality
• New technologies in use
• Continuous learning and growth
• An international team
• Pinball, PlayStation & much more (on-site)

Join a growing team of dedicated professionals! We love to pass on knowledge to grow excellence, speak our minds without playing politics, and just enjoy hanging around together. If you share our passions, we want to meet you! So go ahead and apply.

Please include the following statement: “I hereby authorize Webellian Poland Sp. z o.o.
to process and store my personal data included in my job application for the needs of current and future recruitment processes (in accordance with the Personal Data Protection Act of 29.08.1997, Journal of Laws no. 133, item 883)”.
Data Scientist (hours: 08:30 am - 2:30 pm Pacific Time)
Crestt, Kyiv, Kyiv city, ua
Hello! We are seeking a highly skilled Senior Data Scientist to drive applied AI/ML strategy, optimizing commercial lines insurance operations. This role is crucial in leveraging data to enhance underwriting methods, pricing frameworks, risk evaluations, and customer experience initiatives.

Location: 100% remote (anywhere, but you must be able to work 8:30 am - 2:30 pm Pacific Time)
Rate: B2B, 140-175 PLN net plus VAT per hour

Main Responsibilities:
• Lead data-driven strategies to enhance underwriting, pricing, risk evaluation, and customer initiatives
• Analyze extensive datasets to uncover patterns, trends, and potential risks
• Develop statistical models and predictive analytics for precise insurance risk evaluations and pricing decisions
• Collaborate with actuaries, underwriters, data engineers, and IT experts to advance data-focused projects
• Implement machine learning algorithms to automate and refine processes like prospect identification, policy review, claims processing, and client insights presentation
• Mentor and guide junior team members and other stakeholders
• Educate leaders and colleagues on ML/AI applications in their fields
• Stay up to date with industry trends, new technologies, and best practices in data science and insurance analytics

Technical Requirements (must have):
• Higher degree in Data Science, Statistics, Computer Science, or a related field
• Fluent English
• Minimum 6 years of experience as a Data Scientist
• Expertise in statistical modeling, predictive analytics, machine learning, and data mining
• Proficiency in Python, with experience in data manipulation and visualization libraries
• Experience with large-scale data processing frameworks and databases (Snowflake, SQL, vector DBs, knowledge graphs)
• Advanced problem-solving skills and the ability to derive meaningful insights from complex datasets
• Strong communication and collaboration abilities
• Ability to present ideas and solutions to both technical and non-technical stakeholders
• Leadership qualities and the ability to mentor junior team members and senior stakeholders

Nice to Have:
• Passion for early-stage startups or high-growth environments
• A background in insurance

Required Technical Skills: Data Science, Machine Learning, Python, Snowflake, SQL, vector databases, PostgreSQL, AWS, Airflow, Kafka

General Project Tech Stack:
• Backend: TypeScript, Python, Prisma, NestJS, Ruby on Rails, SQL, Terraform, Spark
• Frontend: React, GraphQL, Swagger
• Databases & Data Processing: PostgreSQL, Snowflake, Airflow, dbt, Mode, Stitch
• Testing: Jest, Vitest, Playwright, RSpec
• CI/CD & Deployment: CircleCI, Nomad, Docker, Docker Compose
• Monitoring & Logging: LogRocket, Datadog, Rollbar, LaunchDarkly, AWS CloudWatch
• Cloud & Infrastructure: AWS (Lambdas, SQS, ECS, IAM), Kafka, Okta, AWS Athena (nice to have)
• Collaboration & Design: GitHub, Figma, Atlassian stack (Jira, Confluence, Opsgenie)
• AI & Machine Learning: OpenAI, Llama

If you are passionate about data science and have the skills and experience to excel in this role, we encourage you to apply. Join us and play a critical role in shaping the future of our AI/ML strategy in the insurance industry. Apply now! :)
Data Engineer
Datumo, Kyiv, Kyiv city, ua
Datumo specializes in providing Big Data and Cloud consulting services to clients from all over the world, primarily in Western Europe, Poland, and the USA. The core industries we support include e-commerce, telecommunications, and life sciences. Our team consists of exceptional people whose commitment allows us to conduct highly demanding projects. Our team members tend to stick around for more than 3 years, and when a project wraps up, we don't let them go: we embark on a journey to discover exciting new challenges for them. It's not just a workplace; it's a community that grows together!

What we expect

Must-have:
• at least 3 years of commercial experience in Big Data
• a proven record with a selected cloud provider (GCP preferred; Azure or AWS)
• good knowledge of Scala/Java/JVM
• good knowledge of Python
• understanding of Spark or a similar distributed data processing framework
• experience with BigQuery, Snowflake, Hive, or a similar distributed datastore
• designing and implementing Big Data systems following best practices
• ensuring solution quality through automated tests, CI/CD, and code review
• proven collaboration with businesses
• English proficiency at B2 level; communicative Polish

Nice to have:
• experience with the Snowflake/Databricks platform
• familiarity with Airflow or a similar pipeline orchestrator
• knowledge of Apache Kafka, Docker, and Kubernetes
• experience in Machine Learning projects
• experience in Flink
• willingness to share knowledge (conferences, articles, open-source projects)

What’s on offer:
• 100% remote work, with a workation opportunity
• 20 free days
• onboarding with a dedicated mentor
• project switching possible after a certain period
• an individual budget for training and conferences
• benefits: Medicover private medical care, co-financing of the Medicover Sport card
• opportunity to learn English with a native speaker
• regular company trips and informal get-togethers

Development opportunities at Datumo:
• participation in industry conferences
• establishing Datumo's
online brand presence
• support in obtaining certifications (e.g. GCP, Azure, Snowflake)
• involvement in internal initiatives, like building technological roadmaps
• training budget
• access to internal technological training repositories

Discover our exemplary projects:

IoT data ingestion to the cloud
The project integrates data from edge devices into the cloud using Azure services. The platform supports data streaming either via the IoT Edge environment with Java or Python modules, or via a direct connection to Event Hubs using the Kafka protocol. It also facilitates batch data transmission to ADLS. Data transformation from raw telemetry to structured tables is done through Spark jobs in Databricks, or through data connections and update policies in Azure Data Explorer.

Petabyte-scale data platform migration to Google Cloud
The goal of the project is to improve the scalability and performance of the data platform by transitioning over a thousand active pipelines to GCP. The main focus is on rearchitecting existing Spark applications to either Cloud Dataproc or BigQuery SQL, depending on the client’s requirements, and automating them using Cloud Composer.

Data analytics platform for an investment company
The project centers on developing and overseeing a data platform for an asset management company focused on ESG investing, with Databricks as the central component. The platform, built on the Azure cloud, integrates various Azure services for diverse functionalities. The primary task involves implementing and extending complex ETL processes that enrich investment data, using Spark jobs in Scala. Integrations with external data providers, as well as solutions for improving data quality and optimizing cloud resources, have also been implemented.

Real-time Consumer Data Platform
The initiative involves constructing a consumer data platform (CDP) for a major Polish retail company. Datumo has participated actively from the project’s start, contributing to the planning of the platform’s architecture.
The CDP is built on Google Cloud Platform (GCP), utilizing services like Pub/Sub, Dataflow, and BigQuery. Open-source tools, including a Kubernetes cluster with Apache Kafka, Apache Airflow, and Apache Flink, are used to meet specific requirements. This combination offers significant possibilities for the platform.

Recruitment process:
• Quiz: 15 minutes
• Soft-skills interview: 30 minutes
• Technical interview: 60 minutes

Find out more by visiting our website: https://www.datumo.io. If you like what we do and you dream about creating this world with us, don’t wait: apply now!
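The IoT project described above transforms raw telemetry into structured tables via Spark jobs. The shape of that transformation can be sketched in plain Python (the message format, field names, and payload are invented for illustration; a real Datumo-style job would express this in PySpark over a Databricks table):

```python
import json

def to_structured(raw: str) -> dict:
    """Flatten one raw telemetry JSON message into a table-ready row.

    Nested device metadata becomes flat columns, and the string-typed
    sensor reading is cast to a numeric column.
    """
    msg = json.loads(raw)
    return {
        "device_id": msg["device"]["id"],
        "ts": msg["ts"],
        "temperature_c": float(msg["readings"]["temp"]),
    }

raw_message = '{"device": {"id": "dev-42"}, "ts": 1700000000, "readings": {"temp": "21.5"}}'
row = to_structured(raw_message)
```

In Spark the same logic would be a schema-on-read parse plus column projection applied to a whole stream or batch, rather than one message at a time.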
Data Engineer
Commerzbank, Kyiv, Kyiv city, ua
Join our team as a Data Engineer!

About the project: In your role as a Data Engineer you will work in the Data as a Service (DaaS) cell, which builds products and services designed to provide barrier-free, easy-to-use information and test data for the whole bank. This includes data anonymization services, user interfaces, and a consolidated data layer. On a daily basis you will develop products based on database technology that run in the cloud (GCP) and on-premise. Your responsibilities will encompass the full software development lifecycle, including analysis, architecture, and testing, as well as support for production issues during normal working hours. The Big Data cluster is the enabler for data scientists, providing a huge collection of data and a data science workbench in one place.

• BI technology within the lake infrastructure
• Establish a stable, state-of-the-art technology base with on-premise and cloud solutions
• Set up the data lake as a single data and analytics hub and effectively ingest the most important data sources
• Establish data quality and metadata management
• Provide data marts and sandboxes for segments and functions with the most important combinations of data sources

What you will be doing:
• Build and enhance automated data pipelines that transform data into usable information
• Software development, defect and requirements analysis, testing, and bug fixing of applications
• Keep system documentation up to date
• 3rd-level support for our daily processes; site reliability engineering
• Drive technical discussions and implement new features to enable new business cases
• Participate in regular Scrum ceremonies (Daily, Planning, Review, Retro)

Which technologies and skills are important to us?
- Strong knowledge of SQL
- Basic knowledge of data warehouse, relational, and non-relational database solutions
- Basic knowledge of UNIX/Linux (Red Hat) RH 7.x
- Google BigQuery knowledge
- Good knowledge of Hadoop Cloudera Data Platform (CDP)
- Basic knowledge of Git
- Basic knowledge of cloud solutions (GCP)

How? Hybrid at Wersalska 6 street (twice a week work from the office - Łódź)

Below you can find more information about Commerzbank. Commerzbank is a leading international commercial bank with branches and offices in almost 50 countries. The world is changing, becoming digital, and so are we. We are leaving the traditional bank behind us and choosing to move forward as a digital enterprise. This is exactly why we need talented people who will join us on this journey. We work in inter-locational and international teams using agile methodologies.

What we offer?
- Development plans for employees
- Life insurance, flexible working hours, integration events, and much more

Important! Please add the clause below to your CV. You can find it at the end of the advert.

* * *

Please add the following clause to your application: 1. I consent to the processing of personal data contained in this document by Commerzbank Aktiengesellschaft with its registered office at Kaiserstrasse 16, 60311 Frankfurt am Main, Germany, operating through the Branch in Poland with its registered office in Łódź, 91-203 Łódź, ul. Wersalska 6, KRS 0000631053, for the implementation of the current recruitment process and for future recruitment for a period of 6 months, in accordance with the Regulation of the European Parliament and of the Council (EU) 2016/679 of 27 April 2016 on the protection of individuals with regard to the processing of personal data and the free flow of such data, and the repeal of Directive 95/46/EC (RODO), and in accordance with the Act of 10 May 2018 on the protection of personal data (Journal of Laws of 2018, item 1000).
I provided my personal data voluntarily and I declare that they are truthful. I have the right to withdraw this consent at any time. The withdrawal of consent shall not affect the lawfulness of processing based on consent before its withdrawal. 2. I have read the content of the information clause, including information about the purpose and methods of processing personal data and the right to access my personal data and about the right to correct, rectify and delete it.
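The Commerzbank role above emphasizes SQL and establishing data-quality controls. As a minimal, hypothetical illustration of that kind of check (the table, columns, and rows are all invented; Python's built-in sqlite3 stands in for the bank's actual databases), a null-rate audit expressed in SQL:

```python
import sqlite3

# In-memory demo of a simple data-quality check: count missing values per
# column. The "customers" table and its contents are hypothetical.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE customers (id INTEGER, email TEXT, country TEXT)")
con.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "a@x.de", "DE"), (2, None, "PL"), (3, "c@x.pl", None), (4, None, "DE")],
)

total, missing_email, missing_country = con.execute(
    """
    SELECT
        COUNT(*)                                         AS total,
        SUM(CASE WHEN email   IS NULL THEN 1 ELSE 0 END) AS missing_email,
        SUM(CASE WHEN country IS NULL THEN 1 ELSE 0 END) AS missing_country
    FROM customers
    """
).fetchone()
# total=4, missing_email=2, missing_country=1
```

In a real pipeline the same query pattern would run against each ingested source, with the resulting rates compared against agreed thresholds.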
Data Scientist
InPost, Kyiv, Kyiv city, ua
Key responsibilities:
- Partner with our product and marketing teams to understand their needs and provide actionable insights
- Build predictive models to optimize marketing strategies, enhance user experience, and shape targeting
- Collaborate closely with cross-functional teams to ensure seamless integration of data-driven initiatives
- Stay ahead of the curve by exploring new techniques and trends in marketing analytics
- Communicate the outcomes of your analyses to the management team and other data community members

Job requirements:
- Education - Bachelor's or Master's degree in a relevant field, e.g. data science, computer science, mathematics, econometrics
- Experience - at least 3 years of commercial experience as a data scientist; consulting and marketing analytics experience are a plus
- Mindset - goal-oriented and independent, skilled in change and time management, business-conscious, able to think long-term and decompose business problems
- Languages - proficient in English (knowledge of other languages is a plus)

Technical skills:
- Excellent knowledge of ML solutions and their impact on business, user experience, and operational processes (supervised and unsupervised learning, e.g. clustering, recommender systems, regression, classification)
- Hands-on experience working with large amounts of user data
- Proficiency in Python 3, as well as ML and data analysis libraries (e.g. pandas, numpy, scikit-learn, keras/pytorch)
- Knowledge and experience in PySpark, relational databases, and cloud solutions (e.g. Databricks, Azure, GCP, AWS, Snowflake)

Nice to have:
- Experience in leveraging CI/CD pipelines in data-based products
- Experience with data pipeline frameworks, preferably Kedro
- Experience with CLI tools: bash/zsh

Why join us?
- Impact: your work will directly influence strategic decisions and operational efficiency across multiple international markets
- Innovation: be part of a team that's pushing the boundaries of data analytics, working with the latest technologies and methodologies
- Growth: this role offers unparalleled opportunities for professional development in a data-driven, technology-forward environment
- Collaboration: engage with cross-functional teams and share knowledge and best practices, fostering a culture of continuous learning and improvement

Benefits:
- Flexible work
- Contract selection
- Worksmile cafeteria
- Language learning platform
- Conceptual freedom
- Employee referral program
- Promotion opportunities
- Initiatives and competitions for employees

If you're passionate about using data insights to drive meaningful change in marketing strategies, this role is for you. We're searching for an experienced Data Scientist with a keen focus on marketing analytics to support our marketing efforts and make a real difference. As a (Senior) Data Scientist at the heart of our Marketing hub, you'll embark on a journey of data exploration, unravelling trends and unlocking insights that redefine our brand's narrative. Your mission? To wield the power of machine learning techniques to optimize our marketing campaigns, driving brand visibility and sharpening our edge in the market. If you're ready to channel your skills into shaping the future of marketing in the logistics industry (and beyond), we're eager to welcome you aboard.
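The InPost role above lists unsupervised learning, such as clustering, with scikit-learn among its technical skills. A minimal sketch of what that looks like in practice (the 2-D toy points and cluster count are invented for illustration):

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy 2-D data: two obvious groups of points (values are hypothetical).
X = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]])

# Fit k-means with k=2; random_state makes the run reproducible.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
labels = km.labels_
# The two left points share one cluster and the two right points the other.
```

On real user data the features would come from behavioural aggregates (e.g. purchase frequency, basket value), and choosing k would itself be part of the analysis.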
Data Engineer for Voice Assistant
Samsung R&D Institute Poland, Kyiv, Kyiv city, ua
About our Team
We invite you to one of the largest speech and language processing teams in Europe. We work closely with other R&D teams to develop and test our next-generation personal intelligent assistant. In our lab, engineers, researchers, and linguists work together on innovative products for the multilingual European market. We define the way users access, explore, and interact with devices, knowledge, information, and services. With us you have a unique opportunity to work on a product available on a wide range of devices and used by millions of users.

Role and Responsibilities
- Development and maintenance of dashboards and internal web services to present, access, annotate, or visualize usage data related to the Voice Assistant
- Management of Linux servers used for data acquisition and processing
- Development and maintenance of data processing pipelines used for language analytics tasks
- Automation of repetitive Natural Language Processing (NLP) tasks, such as retrieval of text data, text corpora management, and text corpora annotation
- Exploration of available text data to create meaningful reports (e.g. trend reports, usage pattern reports) and define metrics (e.g. end-to-end success rate) for other development teams
- Significant influence on the direction of work in the team, with the opportunity to participate in the creation of project proposals, research, and patent applications (especially in the field of data processing and analytics)
- Significant impact on the technology stack: this is an R&D team, so we can choose our technologies more freely than regular development teams
Technologies in use
- Python
- DevOps (Linux, Bash, Git, Jenkins, Docker, OpenStack, nginx, Ansible)
- Data Engineering & Data Science (a variety of libraries for training & test data collection, data augmentation, and text corpus processing)
- Databases (PostgreSQL, InfluxDB)
- Data visualization and dashboarding tools (Voila, Dash, Grafana, Flask, Jupyter, the Python visualization stack)

Skills and Qualifications
- Bachelor's or Master's degree in Computer Science, Mathematics, Telecommunications, or related fields
- Proficiency in Python
- Practical knowledge of the Linux environment and Bash scripting
- Experience with Git, GitHub, Jenkins, Grafana, Docker, or similar tools
- Knowledge of English at a level that allows for easy communication
- Creativity, an open mind, and the ability to adapt knowledge to create innovation are a plus

Nice to have
- Practical knowledge of Data Engineering and/or Data Science
- Experience with databases (especially PostgreSQL, InfluxDB)
- Experience in any subdomain of Natural Language Processing (text classification, word & sentence embeddings, named entity recognition, information extraction, evaluation of machine learning models, sentiment analysis, deep learning methods)
- Experience in developing text- or voice-based human-computer interaction applications (chatbot development, voice assistants, messenger bots, Alexa Skills, Google Assistant Actions, etc.)
- Practical ability to use data visualization and dashboarding tools in Python
We offer
Team:
- Friendly working atmosphere
- Wide range of trainings (technical / soft skills / e-learning platform)
- Opportunity to work on multiple projects
- Multidisciplinary and multicultural team
- Working with the latest technologies on the market
- Monthly integration budget
- Possibility to attend local and foreign conferences
- Opportunity to participate in scientific research (papers, project proposals, patent applications, development of your own side projects)

Equipment:
- Laptop and PC workstation + 2 external monitors
- OS: Windows, Linux

Benefits:
- Private medical care (possibility to add family members)
- Multisport card
- Life insurance
- Lunch card
- Partial reimbursement of the cost of an English language course
- Possibility to learn Korean for free
- Variety of discounts (Samsung products, theaters, restaurants)
- Unlimited free access to the Copernicus Science Center for you and your friends
- Possibility to test new Samsung products

Location: Office in Warsaw Spire near a metro station / Office in Cracow Quattro Business Park; hybrid work system (3 days per week from the office)
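The Samsung role above centres on exploring text corpora to produce reports such as usage-pattern summaries. As a toy, standard-library-only sketch of that kind of analysis (the helper name and the sample utterances are invented), a token-frequency report:

```python
import re
from collections import Counter

def top_terms(corpus, k=3):
    """Return the k most frequent lowercase tokens across a list of utterances."""
    tokens = []
    for utterance in corpus:
        # Naive tokenizer: lowercase alphabetic runs (keeps apostrophes).
        tokens.extend(re.findall(r"[a-z']+", utterance.lower()))
    return Counter(tokens).most_common(k)

# Hypothetical voice-assistant utterances.
sample = ["play some music", "play jazz music", "what's the weather"]
top_terms(sample, k=2)  # 'play' and 'music' each appear twice
```

Real pipelines would add language-aware tokenization and run over millions of utterances, but the report structure (term, count) stays the same.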
Data Science Analyst
Allegro, Kyiv, Kyiv city, ua
A hybrid work model requires 1-2 days a week in the office.

Please note: despite being a role within the Data Science team, this is primarily an Analyst position. We are seeking candidates with strong analytical prowess, not specifically looking for a Data Scientist.

About the team
The Data Science Hub is the place where we apply analytical techniques, mathematics, and machine learning to solve a wide range of business problems. We provide valuable insights and make informed decisions by processing terabytes of data on a daily basis. Our team offers excellent growth opportunities and a rare chance to gain interdisciplinary knowledge about the functioning of e-commerce platforms. The breadth of our impact on different business domains is exemplified by our diverse portfolio of projects, which includes logistics, logistic network optimization, marketing, pricing, finance, and more. The Data Science Hub consists of 5 teams: 3 Data Science teams, a Data Analytics team, and a Data Engineering team. We are looking for new members for the Data Analytics team.
We are looking for people who
- Are very familiar with Python and SQL
- Have basic knowledge of ML modeling
- Have at least 1 year of experience working as an analyst
- Have worked with Git
- Knowing Looker Studio or Tableau is advantageous
- Have worked with tabular data and their visual representations
- Understand mathematical concepts, statistical modeling, and probability theory
- Are not afraid to challenge the current state of things
- Understand how the business side of data projects works
- Are curious and open to learning new things
- Know English at B2+ level

In your daily work you will handle the following tasks
- You will become part of a team that is responsible for assessing the incremental value of ML projects
- You will analyze the complete ML pipeline, highlighting potential areas for improvement and bottlenecks
- Daily, you will mine data to prove or disprove hypotheses
- You will design experiments for ML projects, such as A/B tests, causality exploration, and model analysis, and offer insights derived from your observations
- You will co-develop new methodologies and functionalities inside an internal Python library

What we offer
- Well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms)
- A wide selection of fringe benefits in a cafeteria plan - you choose what you like (e.g.
medical, sports, or lunch packages, insurance, purchase vouchers)
- English classes that we pay for, related to the specific nature of your job
- MacBook Pro / Air (depending on the role) or Dell with Windows (if you don't like Macs) and other gadgets that you may need
- Working in a team you can always count on - we have on board top-class specialists and experts in their areas
- A high degree of autonomy in organizing your team's work; we encourage you to develop continuously and try out new things
- Hackathons, team tourism, a training budget, and an internal educational platform, MindUp (including training courses on work organization, means of communication, motivation, and various technologies and subject-matter issues)

If you want to learn more, check it out.

Why is it worth working with us
- We are researching and developing our own state-of-the-art tools
- Big Data - several petabytes of data and Machine Learning used in production
- We practice Code Review, Continuous Integration, Scrum/Kanban, Domain-Driven Design, Test-Driven Development, and Pair Programming, depending on the team
- Our deployment environment combines private data centers (tens of thousands of servers) and public clouds (Google Cloud and Microsoft Azure)
- Over 100 original open-source projects and a few thousand stars on GitHub

This may also interest you: Allegro Tech Podcast - https://podcast.allegro.tech/

Send in your CV and see why it is #dobrzetubyć (#goodtobehere)
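The Allegro role above involves designing experiments such as A/B tests and turning their results into insights. A minimal sketch of one standard analysis, a two-sided two-proportion z-test on conversion counts (the function name and all counts are hypothetical; standard library only):

```python
from math import sqrt, erf

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference in conversion rates between
    variant A (conv_a successes out of n_a) and variant B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal CDF via erf: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 120/2400 conversions for A, 150/2400 for B.
z, p = two_proportion_ztest(120, 2400, 150, 2400)
```

Here the uplift is positive but the p-value sits just above the usual 0.05 threshold, which is exactly the kind of borderline result that drives decisions about sample size and test duration.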
Data Scientist
Playtech, Kyiv, Kyiv city, ua
Full-time

Company Description
Founded in 1999 and premium listed on the Main Market of the London Stock Exchange, Playtech is a technology leader in the gambling industry with over 7,000 employees across 20 countries. Playtech is the gambling industry's leading technology company, delivering business-intelligence-driven gambling software, services, content, and platform technology across the industry's most popular product verticals, including casino, live casino, sports betting, virtual sports, bingo, and poker. Here at Playtech, we genuinely believe that people are our biggest asset. Diverse thoughts, experiences, and individual characteristics enrich our work environment and lead to better business decisions. Recognizing differences and ensuring our processes are transparent is the core of Playtech's overall commitment to responsible business practices. The BIT unit is looking for a proactive Data Scientist with analytical and good communication skills to develop production data science applications in Python, end to end.

Job Description
Your influential mission. You...
- Design and develop new AI services from scratch
- Maintain and enhance the current production services
- Cooperate with R&D and DevOps teams
- Research, invent, and adapt machine learning algorithms for dedicated business needs
- Perform predictive and statistical modelling
- Perform ad-hoc analyses as required

Qualifications
Components for success. You...
- Have 4 years of proven experience as a Python developer (PyCharm, Git, debugging)
- Have 3 years of proven experience in predictive modeling, preferably in data science production services
- Have a minimum of 3 years of tabular or time series data analysis as a data developer, data scientist, or data analyst
- Have an advanced understanding of machine learning algorithms, capable of independently selecting and applying the most effective models
- Have strong skills in data manipulation and SQL
- Are a strong team player with excellent communication skills, capable of working collaboratively in a remote environment
- Have a high level of English
- Have at least a BS in Economics, Mathematics, or Statistics

Additional Information
The Playtech BI and Data team is responsible for collecting, organizing, presenting, and utilizing data across all Playtech products and services. The team is growing, accounting for almost 20 developers, data scientists, and product managers. BI at Playtech isn't just used internally but is also offered to licensees as a product alongside its casino, poker, and other gaming portfolio. The data science team comprises 10 data scientists who operate like a regular development team. The team develops production systems such as personalized recommendations, replacing business-rule configuration with real-time models, churn detection, and more. These models control every major aspect of the player user experience. Playtech is an equal opportunities employer. Our mission is to welcome everyone and create inclusive teams. We celebrate differences and encourage everyone to join us and be themselves at work.
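Among the production systems the Playtech team mentions is churn detection. As a minimal, hypothetical sketch of the idea only (the features, labels, and data points are all invented, and a real system would use far richer behavioural features), a logistic-regression churn classifier with scikit-learn:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per player: [days_since_last_activity, avg_stake].
# Label: 1 = churned, 0 = active. Six toy training examples.
X = np.array([[30, 5], [45, 2], [60, 1], [2, 20], [1, 35], [3, 15]], dtype=float)
y = np.array([1, 1, 1, 0, 0, 0])

model = LogisticRegression().fit(X, y)

# Score two unseen players: one long inactive, one recently active.
preds = model.predict([[50.0, 2.0], [2.0, 25.0]])
# The long-inactive player is flagged as likely churn, the active one is not.
```

In production such a model would be retrained regularly and its scores fed into real-time decisioning rather than a batch predict call.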
Data Engineer
ITDS, Kyiv, Kyiv city, ua
Join us and innovate with cutting-edge cloud technologies! This is a Krakow-based opportunity with the possibility of working 90% remotely. As a Data Engineer, you will work for our client, a leading global banking and finance institution, in the Global Banking and Markets sector. You will play a pivotal role in the GBI Transformation project, facilitating data integration across MSS Ops globally.

Your main responsibilities:
- Onboarding new data sources; designing, building, testing, and deploying cloud data ingest pipelines and data models using GCP technologies (CloudStore, BigQuery, Data Fusion)
- Implementing appropriate authorization, authentication, encryption, and other security measures
- Developing procedures and scripts for data migration and initialization
- Reviewing and refining business and technical requirements, prioritizing tasks in Jira
- Managing code artifacts and CI/CD processes
- Ensuring applications meet non-functional requirements and IT standards
- Writing well-commented, maintainable code and providing support during the development lifecycle

You're ideal for this role if you have:
- Experience with Data Fusion, CDAP, or Spark data pipelines
- Proficiency in Java or Python for custom plugin development
- Strong SQL/T-SQL skills and expertise in database administration
- Expertise in the administration and development of on-prem or cloud databases, warehouses, and lakes
- An excellent understanding of GCP architecture and solution design
- Excellent knowledge of DevOps tools: Ansible, Jenkins, Puppet, Chef, Google Secret Manager, GitHub
- Knowledge of Agile/Scrum, DevOps, and ITIL principles
- A BS/MS degree in Computer Science or a related field
- Fluent English
- An enthusiastic willingness to learn and develop technical skills independently
- Strong organizational and multitasking abilities

It is a strong plus if you have:
- Experience with application monitoring and production support
- Broad experience with IT development and collaboration tools
- Understanding of IT security and application development best practices
- Interest in investment products and the investment banking business
- Experience working in global teams with diverse cultures

We offer you:
ITDS Business Consultants is involved in many varied, innovative, and professional IT projects for international companies in the financial industry in Europe. We offer an environment for professional, ambitious, and driven people. The offer includes:
- Stable and long-term cooperation with very good conditions
- Opportunities to enhance your skills and develop expertise in the financial industry
- Work on the most strategic projects available on the market
- Define your career roadmap and develop in the best and fastest possible way by delivering strategic projects for different ITDS clients over several years
- Participation in social events, training, and work in an international environment
- Access to an attractive medical package
- Access to the Multisport program

Internal number #4879