Overview of salary statistics for the profession "Data Security Engineer in Ukraine"

17,210 ₴ average salary per month

Average salary level over the last 12 months: "Data Security Engineer in Ukraine" (currency: UAH; year: 2024)

The histogram shows how the average salary for the profession Data Security Engineer in Ukraine has changed.

Distribution of "Data Security Engineer" vacancies across the oblasts of Ukraine

As the chart shows, the largest number of open Data Security Engineer vacancies in Ukraine is in Donetsk Oblast, with Poltava Oblast in second place and Zhytomyr Oblast in third.

Recommended vacancies

Senior Java Engineer for Data&AI platform
INTELLIAS, Ukraine (remote), Poland (remote)
Let's connect people globally! With a first-class team of engineers, Intellias provides and supports top-level services in the telecom sector all over the world. Join in and be among those who create the future of communications!

Project Overview: Our client, a leading software and services provider to communications and media companies, is looking for a strong and motivated Java Developer with experience in Spring/Kafka/Kubernetes to build high-performing, scalable, enterprise-grade applications and frameworks. You will be part of a talented software scrum team that is building the next generation of the product using a wide range of technologies.

Responsibilities:
- Work with other team members to improve, maintain, and monitor the data audit domain in the Data and AI platform
- Innovate in the audit domain and extend the capabilities and knowledge within the team
- Work with the architect of the domain

Requirements:
- Bachelor's degree in Science/IT/Computing or equivalent
- Java development expert or specialist with at least 7 years of experience
- Experience and deep knowledge of Spring Boot microservices, Kubernetes, Kafka, and event-based services that use Kafka Streams (4+ years)
- Experience working with Linux (2+ years)
- Experience with Agile methodology
- Good communication skills (English level B2+)

Will be a plus:
- Experience with Helm charts
- Experience with Grafana and Prometheus
- Experience with OpenTelemetry

#LI-VT1
Data Platform Engineer
ITDS, Odesa, Odesa Oblast, ua
Join us, and drive data excellence in financial technology! This is a Krakow-based opportunity with the possibility to work 80% remotely.

As a Data / Platform Engineer, you will be working for our client, a leading financial institution. Joining the client's department, you'll support various tasks related to the development, analysis, and maintenance of data processes on GCP platforms.

Your main responsibilities:
- Analyzing data engineering pipelines
- Refactoring pipelines for optimization
- Migrating processes between platforms and technologies
- Establishing best practices for data engineering
- Supporting CI/CD processes and DataOps
- Developing reporting tools
- Collaborating with technical and business teams
- Problem-solving and troubleshooting data-related issues
- Supporting system administration of Airflow for DAG development (see the sketch after this listing)
- Engaging in application life-cycle management

You're ideal for this role if you have:
- Experience with Spark (Python or Scala)
- Strong programming and data analytics skills
- Proficiency in GCP data services, especially BigQuery and Dataproc
- Understanding of cost and performance within GCP
- Excellent communication and collaboration skills

It is a strong plus if you have:
- Familiarity with Airflow for DAG development
- Experience with CI/CD tools and Infrastructure as Code
- Understanding of SRE and Service Management principles

We offer you: ITDS Business Consultants is involved in many varied, innovative, and professional IT projects for international companies in the financial industry in Europe. We offer an environment for professional, ambitious, and driven people. The offer includes:
- Stable, long-term cooperation with very good conditions
- Enhancing your skills and developing your expertise in the financial industry
- Work on the most strategic projects available in the market
- Defining your career roadmap and developing yourself in the best and fastest possible way by delivering strategic projects for different clients of ITDS over several years
- Participation in social events and training, and work in an international environment
- Access to an attractive medical package
- Access to the Multisport program

Internal number #5219
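For readers unfamiliar with the Airflow work this listing describes, here is a minimal sketch of the kind of DAG such a role maintains. It assumes Airflow 2.4 or newer; the DAG ID, task names, and callables are hypothetical, not taken from the client's platform.

```python
# Illustrative Airflow DAG (assumes Airflow 2.4+); all names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull a daily batch from a source system.
    print("extracting batch for", context["ds"])


def transform(**context):
    # Placeholder: the kind of step you would refactor for optimization.
    print("transforming batch for", context["ds"])


with DAG(
    dag_id="fin_data_pipeline_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```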
Data Software Engineer - MLE (Machine Learning Engineer)
Epam, Ukraine
Description: We are seeking a highly skilled and motivated Data Software Engineer for our Machine Learning Engineering (MLE) team in Ukraine. As a Data Software Engineer, you will play a crucial role in developing and optimizing data-driven solutions, with a focus on Machine Learning. The successful candidate will have a strong background in Data Software Engineering, proficiency in Python and SQL, and experience working with cloud platforms such as Azure, AWS, or GCP. Knowledge of Databricks and Spark is considered a plus.

Retraining/Upskilling Format: The selected candidate will undergo a two-month training period as part of the probation. Successful completion of the probation will be contingent upon obtaining a Databricks certification, with the associated costs covered by EPAM.

If you are a passionate Data Software Engineer with a strong interest in Machine Learning and you meet the specified requirements, we invite you to apply and join our dynamic team in Ukraine. Together, we will drive innovation and contribute to the evolution of cutting-edge data solutions. The remote option applies only to candidates who will be working from any location in Ukraine. #LI-IRINABENKO

Responsibilities:
- Collaborate with the MLE team to design and implement data-driven solutions
- Develop and maintain data pipelines for machine learning applications
- Utilize Python and SQL to manipulate and analyze large datasets
- Work with cloud platforms, with a priority on Azure, AWS, or GCP
- Explore and implement advanced data processing techniques using Databricks and Spark
- Participate in code reviews, debugging, and troubleshooting to ensure high-quality deliverables
- Engage in continuous learning and stay updated on emerging trends in data engineering and machine learning

Requirements:
- 4+ years of hands-on experience in Software Engineering roles
- Bachelor's degree in Computer Science, Data Science, or a related field
- Proficiency in Python (must have)
- Strong SQL skills (must have)
- Experience with at least one cloud platform (Azure, AWS, GCP)
- Upper-intermediate or higher English level, both spoken and written (B2+)

Nice to have:
- Familiarity with Databricks and Spark

We Offer:
- Competitive compensation depending on experience and skills
- An individual career path
- Social package: medical insurance, sports
- Compensation for sick leave and regular vacations
- English classes with certified English teachers
- Unlimited access to LinkedIn Learning solutions
- Flexible work hours

About EPAM: EPAM is a leading global provider of digital platform engineering and development services. We are committed to positively impacting our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to learn and grow continuously. You will join a dedicated, creative, and diverse community that will help you discover your fullest potential. EPAM is committed to providing our global team of 61,000+ EPAMers with inspiring careers. EPAMers lead with passion and honesty and think creatively. Our people are the source of our success, and we value collaboration, always try to understand our customers' business, and strive for the highest standards of excellence.
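The pipeline work this listing describes (Python plus SQL-style transformations, optionally on Databricks/Spark) can be pictured with a short PySpark sketch. The paths, tables, and columns below are hypothetical; this is only an illustration of a feature pipeline for ML, not EPAM's actual code.

```python
# Minimal PySpark sketch of a feature pipeline for ML; runs on any
# Spark 3.x cluster, including Databricks. All names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("mle_feature_pipeline").getOrCreate()

# Hypothetical raw events table.
events = spark.read.parquet("/data/raw/events")

# Aggregate per-user features with SQL-style expressions.
features = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("user_id")
    .agg(
        F.count("*").alias("event_count"),
        F.countDistinct("event_date").alias("active_days"),
        F.max("event_ts").alias("last_seen"),
    )
)

# Persist as a feature table for downstream model training.
features.write.mode("overwrite").parquet("/data/features/user_activity")
```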
Data Azure Engineer
Devapo Sp. z o. o., Odesa, Odesa Oblast, ua
Our company is seeking a skilled Data Azure Engineer to join our team and be part of a project for a large financial institution. As a Data Azure Engineer, you will develop cutting-edge data platforms using Azure technology. You will have the opportunity to lead important projects, participate in decision-making processes, and collaborate with a diverse team of experts. Your expertise will drive breakthrough solutions for financial institutions and beyond. As part of our team, you will receive a multitude of additional benefits to enjoy outside of work.

What we expect:
- 3+ years of experience in data engineering
- Python programming knowledge
- Deep knowledge of and strong experience with the relevant technologies (Azure Synapse, Databricks, Azure Data Factory, PySpark)
- Strong, proven experience working with data lakes, lakehouses, and data warehouses in the cloud (Databricks, Snowflake, Azure)
- Advanced proficiency in SQL
- Strong experience with ETL/ELT processes, data ingestion, data transformation, and data modeling
- Experience with code repositories (Git)

Nice to have:
- dbt knowledge and experience
- Knowledge of Kubernetes

Responsibilities:
- Developing data platforms, data lakehouses, and data warehouses
- End-to-end development of ETL/ELT processes
- Building and designing data platform components to enable clients to produce and consume data
- Developing, implementing, and maintaining change control and testing processes
- Researching and implementing best practices and new approaches for our current data stack and systems
- Data modeling

What we offer:
- Salary: 90 - 120 PLN + VAT (B2B contract)
- Co-financing of training and certificates, and assured time for learning within working hours
- Private medical care and Multisport card
- Language classes (English)
- Flexible working hours
- Meetings and integration events
- Referral bonus for recommending a new hire
- An individually tailored path for your career development
- The ability to work in a hybrid form from our Warsaw office
Senior Data Integration Engineer (Python, Databricks, Spark SQL)
Epam, Kyiv, Kyiv city, ua
Description: Our client is a global, privately owned company that connects people with ideas, data with insights, supply with demand, restaurants with deliveries, and ultimately, people with the products they love.

Responsibilities:
- Implementation of DWH and Data Hubs, including the full ETL process
- Implementation of data models
- Unit testing

Requirements:
- Experience building data ingestion pipelines with tools like Databricks, SSIS, Talend, Informatica, etc.
- Experience implementing data models that have been designed by someone else
- Strong SQL; the plan is to have the data engineers create Databricks notebooks using either SQL or Python, but SQL is preferable
- Experience unit testing code
- Upper-Intermediate level of English, both spoken and written (B2+)

Nice to have:
- Azure or other cloud storage technologies, SSAS Tabular, Azure Data Factory, Azure DevOps

We Offer:
- Competitive compensation depending on experience and skills
- Individual career path
- Unlimited access to LinkedIn Learning solutions
- Sick leave and regular vacation
- English classes with certified English teachers
- Flexible work hours
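This listing asks for experience unit testing data transformation code. A minimal, hypothetical example of what that looks like in Python: a small transformation plus a pytest test. The function and columns are invented for illustration, not taken from the client's codebase.

```python
# A small transformation and its pytest test; all names are hypothetical.
import pandas as pd


def deduplicate_orders(df: pd.DataFrame) -> pd.DataFrame:
    """Keep the most recent row per order_id."""
    return (
        df.sort_values("updated_at")
        .drop_duplicates(subset="order_id", keep="last")
        .reset_index(drop=True)
    )


def test_deduplicate_orders_keeps_latest():
    df = pd.DataFrame(
        {
            "order_id": [1, 1, 2],
            "updated_at": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-01"]),
            "status": ["new", "paid", "new"],
        }
    )
    result = deduplicate_orders(df)
    assert len(result) == 2
    # Order 1 should keep its latest status.
    assert result.loc[result.order_id == 1, "status"].item() == "paid"
```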
Security Engineer
Remitly, Odesa, Odesa Oblast, ua
Remitly's vision is to transform lives with trusted financial services that transcend borders. Since 2011, we have been tirelessly delivering on our promises to people who send money around the world. Today, we are reimagining global financial services and building products that extend beyond traditional barriers to give customers access to more of the services they need, no matter where they call home. Join over 2,700 employees worldwide who are growing their careers with purpose and connection with our customers while having a positive impact on millions of people around the globe.

The Role: We're searching for a Security Engineer to join Remitly's Intrusion Detection & Response Team. This role will help the team build out and own tools and capabilities and help advance the D&R program at Remitly. The role reports to the Director of Detection & Response. You'll accomplish this with a "detection as code" engineering mindset and partner closely with other team members and stakeholders in external teams. Your work will directly impact the security of Remitly data and help to safeguard our users.

What You'll Do:
- Design and build systems to detect and investigate potentially malicious activity
- Create and tune analytics to proactively detect threats with high-quality ATT&CK coverage and low false-positive rates
- Investigate and triage interesting or suspicious events
- Drive incident response efforts across cross-functional teams
- Help define and execute threat detection and response strategy
- Participate in the team's on-call service rotation

You Have:
- 3+ years of experience in security or systems engineering
- 2+ years of that experience in threat detection or threat response, preferably in a cloud-first environment (IaaS, PaaS, SaaS)
- Bachelor's degree in a related discipline OR equivalent practical experience
- Ability to work independently in complex operating environments, sometimes in high-stress situations
- Experience building and automating threat detection analytics and threat hunting methodologies
- Familiarity with the MITRE ATT&CK framework and how to apply it
- Strong alignment with our mission and values
- Attention to detail; operates with a high degree of discretion
- Strong written and verbal communication skills in English

Our Benefits:
- Employee Stock Purchase Plan (ESPP)
- Equity in the company (RSUs)
- Min. 26 days paid holidays + additional Remitly days off
- Royalties (KUP)
- Hybrid work arrangements with an office in Kraków city centre
- Reimbursement of commuting expenses
- Health/dental coverage (LUX MED VIP for employee and family)
- Life insurance
- Travel insurance for employee and family
- Sodexo lunch card / Multisport
- Education / conferences budget
- Equipment of your choice
- Mental health program for employees and their dependents
- Family planning program
- Employee Pension Plan (PPK)
- Headphones reimbursement
- Referral bonus scheme
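"Detection as code" means expressing detections as versioned, testable code rather than console-configured rules. A minimal Python sketch of that idea follows; the rule, event fields, and ATT&CK mapping are illustrative, not Remitly's actual detections.

```python
# Illustrative "detection as code" sketch: rules as data plus predicates.
from dataclasses import dataclass
from typing import Callable, Iterable


@dataclass
class Detection:
    name: str
    attack_technique: str          # MITRE ATT&CK technique ID
    predicate: Callable[[dict], bool]


# Example rule: console login without MFA (hypothetical field names).
console_login_no_mfa = Detection(
    name="console_login_without_mfa",
    attack_technique="T1078",      # Valid Accounts
    predicate=lambda e: (
        e.get("event_type") == "console_login"
        and not e.get("mfa_used", False)
    ),
)


def run_detections(events: Iterable[dict], detections: list[Detection]) -> list[dict]:
    """Return one alert per (event, detection) match."""
    return [
        {"detection": d.name, "technique": d.attack_technique, "event": e}
        for e in events
        for d in detections
        if d.predicate(e)
    ]


alerts = run_detections(
    [{"event_type": "console_login", "mfa_used": False, "user": "alice"}],
    [console_login_no_mfa],
)
print(alerts)
```

Because the rules are plain code, they can be unit tested and reviewed in pull requests, which is the core of the mindset the listing describes.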
Senior Data DevOps Engineer with Azure, Databricks
Epam, Lviv, Lviv Oblast, ua
Description: We are looking for a Senior Data DevOps engineer to join EPAM and contribute to a project for a large customer. As a Senior Data DevOps engineer on the Data Platform, you will focus on maintaining and implementing new features in the data transformation architecture, which is the backbone of the customer's analytical data platform. As a key figure in our team, you'll implement and deliver high-performance data processing solutions that are efficient and reliable at scale.

Responsibilities:
- Design, build, and maintain highly available production systems utilizing Azure data solutions, including Data Lake Storage, Databricks, ADF, and Synapse Analytics
- Design and implement build, deployment, and configuration management systems, together with CI/CD improvements, based on Terraform and Azure DevOps pipeline solutions across multiple subscriptions and environments
- Improve the user experience of the Databricks platform based on best practices for Databricks cluster management, cost-effective setups, data security models, etc.
- Design, implement, and improve the monitoring and alerting system
- Collaborate with architecture teams to ensure platform architecture and design standards align with support-model requirements
- Identify opportunities to optimize platform activities and processes, and implement automation mechanisms to streamline operations

Requirements:
- 4 years of professional experience
- 2 years of hands-on experience with a variety of Azure services
- Proficiency in Azure data solutions, including Data Lake Storage, Databricks, ADF, and Synapse Analytics
- Solid Linux/Unix systems administration background
- Advanced skill in configuring, managing, and maintaining networking in the Azure cloud
- Solid experience managing production infrastructure with Terraform
- Hands-on experience with one of the Azure DevOps / GitLab CI / GitHub Actions pipelines for infrastructure management and automation
- Hands-on experience with the Databricks platform
- Practical knowledge of Python combined with SQL knowledge
- Hands-on experience in one scripting language: Bash, Perl, Groovy
- Advanced skill in Kubernetes/Docker
- Good knowledge of security best practices
- Good knowledge of monitoring best practices
- Good organizational, analytical, and problem-solving skills
- Ability to present and communicate the architecture in visual form
- English sufficient for direct communication with the customer (B2 level is required)

We Offer:
- Competitive compensation depending on experience and skills
- Individual career path
- Unlimited access to LinkedIn Learning solutions
- Sick leave and regular vacation
- English classes with certified English teachers
- Flexible work hours

About EPAM: EPAM is a leading global provider of digital platform engineering and development services. We are committed to positively impacting our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to learn and grow continuously. You will join a dedicated, creative, and diverse community that will help you discover your fullest potential. EPAM is committed to providing our global team of 54,000+ EPAMers with inspiring careers. EPAMers lead with passion and honesty and think creatively. Our people are the source of our success, and we value collaboration, always try to understand our customers' business, and strive for the highest standards of excellence.
Senior Data Integration Engineer (Azure, Databricks, SQL)
Epam, Kyiv, Kyiv city, ua
Description: Our client is a German multinational pharmaceutical and biotechnology company and is one of the largest pharmaceutical and biomedical companies in the world.

Responsibilities:
- Create data models and related data pipelines in Azure Databricks and Data Factory for analytical dashboards, integrating multiple data assets
- Support architectural decisions and participate in the elaboration of new implementation proposals for our customers, e.g. providing high-level estimations and helping establish the right assumptions
- Drive, lead, and coach other BE engineers to implement data pipelines following best practices, and influence the customer requirements
- Strive to understand the problems to solve and proactively make suggestions on the best way to address them (performance, data volume, data discrepancies or mismatches, operational costs, etc.)
- Understand, from experience with sales and consumer goods analytics, the most important metrics and aggregations to provide and the challenges they often involve
- Work with and support several analytical teams (frontend developers, product owners, QAs, solution architect)

Requirements:
- 5+ years of relevant development experience and practice with data management, data storage, data modeling, data analytics, data migration, and database design
- Expert hands-on experience with Databricks
- Expert-level knowledge of SQL
- Experience with Azure Data Factory
- Production coding experience with one of the data-oriented programming languages
- Delta Lake
- Upper-Intermediate level of English, both spoken and written (B2+)

We Offer:
- Competitive compensation depending on experience and skills
- Individual career path
- Unlimited access to LinkedIn Learning solutions
- Sick leave and regular vacation
- English classes with certified English teachers
- Flexible work hours
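Since this role centers on Databricks, Delta Lake, and Azure Data Factory pipelines, a short PySpark sketch of a Delta Lake upsert (MERGE) may clarify the kind of pipeline step involved. It assumes a Databricks runtime, or Spark with the delta-spark package configured; the table and column names are hypothetical.

```python
# Illustrative Delta Lake upsert; assumes Databricks or delta-spark.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical incoming batch landed by Azure Data Factory.
updates = spark.read.parquet("/mnt/landing/products_batch")

target = DeltaTable.forName(spark, "analytics.dim_product")

# Upsert: update matched rows, insert new ones.
(
    target.alias("t")
    .merge(updates.alias("u"), "t.product_id = u.product_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```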
Data Center Engineer
Experis, Kyiv, Kyiv city, ua
As a Data Center Network Engineer, you will join a team of network professionals, where you will be responsible for maintaining stable, reliable, and secure data center network services, and for their development and improvement.

Requirements (not everything is required):
- Experience with Cisco Nexus 3K/5K/9K
- Knowledge of dynamic routing protocols: OSPF, BGP
- Understanding of NGFW firewalls (Palo Alto, FortiGate, Check Point)
- Understanding of IP networking, L2/L3 network protocols, TCP/IP, VLAN, VRRP, LACP, MC-LAG, EVPN with VXLAN, DHCP, DNS
- Good understanding of Application Delivery Controllers (F5)
- Excellent analytical skills to troubleshoot complex technical problems

Not required, but it's an advantage if you are experienced with:
- Network automation for infrastructure deployment (a sketch follows this listing)
- Python programming
- VMware NSX-V/NSX-T technologies
- Public cloud network technologies

Your personality:
- Strong organizational skills
- Team player
- Great time management skills
- Strong goal-oriented mindset and focus on high quality
- Strong sense of ownership of the network
- Proactive problem solver
- Fluent in English, written and spoken

Offer:
- 100% remote work
- B2B via Experis
- MultiSport Plus
- Group insurance
- Medicover
- E-learning platform
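The listing names Python-based network automation as an advantage. Below is a minimal, hypothetical sketch using the netmiko library, a common choice for pushing configuration to Cisco NX-OS devices; the host, credentials, and VLAN values are invented.

```python
# Illustrative netmiko sketch: push a VLAN to a Cisco Nexus switch.
# Host, credentials, and VLAN details are hypothetical.
from netmiko import ConnectHandler

device = {
    "device_type": "cisco_nxos",
    "host": "dc1-leaf01.example.net",   # hypothetical switch
    "username": "netops",
    "password": "REDACTED",             # load from a secrets manager in practice
}

config_commands = [
    "vlan 110",
    "name app-tier",
]

with ConnectHandler(**device) as conn:
    output = conn.send_config_set(config_commands)
    print(output)
```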
Data Quality Engineer
NeoGames, Odesa, Odesa Oblast, ua
/Only for candidates based in Warsaw or Cracow who are comfortable working on Umowa o Pracę (employment contract); no possibility to work on B2B/

Project description: Aristocrat Interactive is a leader in the Online Real Money Gaming space, offering solutions spanning game content and aggregation, lotteries, online casino, sportsbook, game systems, and managed services on industry-leading platforms. The Data & BI team owns the group's Data & Analytics platforms, spanning Data Engineering, Analytical Engineering, and Business Intelligence, to lead the group's data-driven modernization both internally and for its clients. The Data Quality Engineer will play a vital role in developing and owning automated and manual testing of data pipelines, reports, and data products for customers and internal stakeholders. In this role, the chosen candidate will drive the implementation of Data Quality automation frameworks, tools, and processes to deliver scalable and robust data-focused test frameworks and scripts. The ideal candidate will have experience using modern data stacks including Apache Airflow, Data Build Tool (dbt), Snowflake, and Kafka, along with a deep understanding of Python.

Responsibilities:
- Establish Data Quality automation frameworks, tools, and processes
- Design and implement scalable and robust data-focused test frameworks and tools across all stages of the data lifecycle, from ingestion through transformation to consumption
- Develop high-quality, automated, data-centric tests using Airflow and dbt
- Create thorough test strategies, test plans, and test cases for all Data features and products
- Participate in the data development lifecycle, from research and technical design to implementation and maintenance
- Collaborate with the DevOps team to integrate automation test frameworks and tools into CI/CD
- Collaborate with the other development and product teams to define high-value test strategies and test cases on large datasets and transformations to determine data quality and data integrity
- Rigorously test data infrastructure components to comply with privacy regulations, including the EU GDPR, PCI, etc.
- Participate in security audits and implement remedial actions
- Identify opportunities to compound growth and efficiency in testing and automation
- Drive innovation by constantly learning about new automation technologies and methodologies, along with experimentation

Required Skills and Experience:
- 3+ years of work experience in testing automation, preferably in a role related to data warehouses, databases, ETL/ELT, and/or data visualisation
- Experience in testing applications using frameworks like Cypress, Selenium, and Appium
- Prior use of Apache Airflow, Data Build Tool (dbt), Snowflake, Kafka, Ansible, and other data-specific tools and applications
- Proficiency in advanced SQL and Python
- Experience with data warehouses like Snowflake, BigQuery, Redshift, or equivalent
- Experience with databases like Microsoft SQL Server and PostgreSQL
- Experience with data visualization tools like Power BI, Metabase, Qlik Sense, Tableau, Sisense, or equivalent
- Understanding of software application testing best practices and philosophies, with an emphasis on data integrity and data quality
- Familiarity with data streaming and/or event-driven data pipelines will be considered an asset
- Clear English written and spoken business communication skills

What we offer:
- High-level compensation on an employment contract, and regular performance-based salary and career development reviews
- Medical insurance (health), employee assistance program
- Multisport Card
- English classes with native speakers, trainings, conference participation
- Referral program
- Team buildings, corporate events

How many interview stages do we have?
- HR interview
- Technical interview with the manager

Send your CV to our email ASAP, because we can't wait to start working with you and create cool projects together! LET'S MOVE THE WORLD TOGETHER!
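A data-centric automated test of the kind this listing describes can be as simple as a set of SQL rules whose violation counts must be zero. The sketch below is illustrative: the table, rules, and stubbed query runner are hypothetical, and in practice the runner would be a Snowflake client call wrapped in an Airflow task or a dbt test.

```python
# Illustrative data-quality check runner; rules and tables are hypothetical.
from typing import Callable

# Each rule is a SQL query returning a count of violations; zero means pass.
QUALITY_RULES = {
    "orders_no_null_ids": "SELECT COUNT(*) FROM orders WHERE order_id IS NULL",
    "orders_positive_amounts": "SELECT COUNT(*) FROM orders WHERE amount <= 0",
}


def check_data_quality(run_query: Callable[[str], int]) -> dict[str, bool]:
    """Run every rule; a rule passes when its violation count is zero."""
    return {name: run_query(sql) == 0 for name, sql in QUALITY_RULES.items()}


# Stubbed runner for demonstration; swap in a real warehouse client.
print(check_data_quality(lambda sql: 0))
# {'orders_no_null_ids': True, 'orders_positive_amounts': True}
```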
Senior/Lead Embedded Security Engineer
GlobalLogic, Ukraine, Lviv
Description: The client is a clinical environmental design company that enables a better care experience at the point of care in medical, dental, and animal health.

Requirements:
- Bachelor's degree in Computer Science, Cybersecurity, or a related field
- Experience in firmware development
- Profound experience with Linux Yocto, including setting up and managing encryption using dm-crypt/LUKS for full-disk encryption and eCryptfs for file-level encryption
- Demonstrated ability in securing Windows environments, particularly with EMR Plugins on Windows 10, 11, Server 2019, and Server 2022
- Extensive knowledge of secure communication protocols and encryption techniques such as AES-256 and TLS over a USB-HID connection
- Familiarity with healthcare compliance and regulatory standards relevant to medical device security (e.g., FDA, HIPAA, GDPR)
- Proven ability to design and implement CI/CD pipelines with integrated security testing (preferred but not required)
- Practical experience with hardware security modules (HSM)/Key Vault or other key storage tools and sophisticated key management systems (preferred but not required)
- Excellent problem-solving, communication, and teamwork skills

Responsibilities:
- Implement full-disk encryption using dm-crypt with LUKS, or LUKS on dm-integrity, and file-level encryption using eCryptfs or EncFS on Linux-based devices
- Develop and maintain secure data transmission protocols between devices and Plugins over secured interfaces
- Apply robust encryption measures on Windows platforms (Windows 10, 11, Server 2019, and Server 2022) to secure data within Plugins
- Adhere to and enforce compliance with healthcare industry security standards and regulatory requirements
- Perform periodic security audits and vulnerability assessments to continuously fortify security measures
- Collaborate with software development teams to integrate security best practices into the product development lifecycle
- Document and update security protocols and encryption standards

What We Offer:
- Empowering Projects: With 500+ clients spanning diverse industries and domains, we provide an exciting opportunity to contribute to groundbreaking projects that leverage cutting-edge technologies. As a team, we engineer digital products that positively impact people's lives.
- Empowering Growth: We foster a culture of continuous learning and professional development. Our dedication is to provide timely and comprehensive assistance for every consultant through our dedicated Learning & Development team, ensuring their continuous growth and success.
- DE&I Matters: At GlobalLogic, we deeply value and embrace diversity. We are dedicated to providing equal opportunities for all individuals, fostering an inclusive and empowering work environment.
- Career Development: Our corporate culture places a strong emphasis on career development, offering abundant opportunities for growth. Regular interactions with our teams ensure their engagement, motivation, and recognition. We empower our team members to pursue their career goals with confidence and enthusiasm.
- Comprehensive Benefits: In addition to equitable compensation, we provide a comprehensive benefits package that prioritizes the overall well-being of our consultants. We genuinely care about their health and strive to create a positive work environment.
- Flexible Opportunities: At GlobalLogic, we prioritize work-life balance by offering flexible opportunities tailored to your lifestyle. Explore relocation and rotation options for diverse cultural and professional experiences in different countries with our company.

About GlobalLogic: GlobalLogic is a leader in digital engineering. We help brands across the globe design and build innovative products, platforms, and digital experiences for the modern world. By integrating experience design, complex engineering, and data expertise, we help our clients imagine what's possible and accelerate their transition into tomorrow's digital businesses. Headquartered in Silicon Valley, GlobalLogic operates design studios and engineering centers around the world, extending our deep expertise to customers in the automotive, communications, financial services, healthcare and life sciences, manufacturing, media and entertainment, semiconductor, and technology industries. GlobalLogic is a Hitachi Group Company operating under Hitachi, Ltd. (TSE: 6501), which contributes to a sustainable society with a higher quality of life by driving innovation through data and technology as the Social Innovation Business.
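To make the AES-256 requirement in the GlobalLogic listing concrete, here is a minimal Python sketch using the cryptography package's AES-GCM authenticated encryption. It is an illustration only: on a real device, the key would come from an HSM or Key Vault rather than being generated in place, and the payload is invented.

```python
# Illustrative AES-256-GCM encrypt/decrypt with the `cryptography` package.
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit key, i.e. AES-256
aesgcm = AESGCM(key)

nonce = os.urandom(12)                       # 96-bit nonce, unique per message
plaintext = b"patient telemetry frame"       # hypothetical device payload

# Third argument is optional associated data (none here).
ciphertext = aesgcm.encrypt(nonce, plaintext, None)
recovered = aesgcm.decrypt(nonce, ciphertext, None)
assert recovered == plaintext
```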