
Data Integration Engineer Kazakhstan


Data Integration Engineer Description

Job #: 66100
EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.

We are looking for a Data Integration Engineer to make our team even stronger! As a member of the team, you will have a chance to learn the most modern technologies, prove your proficiency during challenging project engagements and be recognized as a world-class specialist.

What You’ll Do

  • Design and implement Data Integration solutions, model databases, build enterprise data platforms using classic Data technologies and tools (Databases, ETL/ELT technology & tools, MDM tools, etc.) and implement modern Cloud or Hybrid data solutions
  • Work with the project tech lead/architect to understand data product requirements, estimate new features, and develop the corresponding solution components
  • Perform detailed analysis of business problems and technical environments and use this in designing high-quality technical solutions
  • Actively participate in code review and testing of solutions to ensure they meet specification and quality requirements
  • Support a high-performance engineering culture
  • Write project documentation, participate in project meetings, meet with customers, and conduct demos to present the results of completed work

What You Have

  • At least 1 year of relevant development experience and practice with data integration, data management, data storage, data modeling, and database design
  • Some experience working with public cloud providers (AWS, Azure, GCP)
  • General knowledge of leading cloud data warehousing solutions (Google BigQuery, Snowflake, Redshift, Azure Synapse Analytics, etc.)
  • Production coding experience in one of the data-oriented programming languages
  • Outstanding analytical and problem-solving skills
  • Ability to play a developer role on a project and ensure that delivered solutions meet product requirements
  • Experience working with modern Agile development methodologies and tools

Technologies

  • Cloud providers stack (AWS/Azure/GCP): Storage, Compute, Networking, Identity and Security, Data Warehousing and DB solutions (Redshift, Snowflake, BigQuery, Azure Synapse, etc.)
  • Standard Data Integration tools (Azure Data Factory, AWS Glue, GCP Dataflow, Talend, Informatica, Pentaho, Apache NiFi, KNIME, SSIS, etc.)
  • Coding in one of the data-oriented programming languages: SQL, Python, SparkSQL, PySpark, R, Bash, Scala
  • Relational Databases (RDBMS: MS SQL Server, Oracle, MySQL, PostgreSQL)
  • Dataflow orchestration, replication and preparation tools
  • Version Control Systems (Git, SVN)
  • Testing: Component / Integration / Reconciliation testing
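
To give candidates a feel for the day-to-day coding behind the stack above, here is a minimal, illustrative extract-transform-load (ETL) sketch using only Python and SQL (both listed in the languages above). All table and column names (`staging_orders`, `orders_clean`, `amount_usd`) are invented for the example; real engagements would use one of the integration tools and warehouses named in the list.

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Extract raw orders, normalize dollar amounts to integer cents,
    and load the cleaned rows into a curated table."""
    cur = conn.cursor()
    # Extract: read from the (hypothetical) staging table
    rows = cur.execute("SELECT order_id, amount_usd FROM staging_orders").fetchall()
    # Transform: convert dollars to cents, dropping rows with NULL amounts
    cleaned = [(oid, round(amt * 100)) for oid, amt in rows if amt is not None]
    # Load: upsert into the curated table (re-runs are idempotent)
    cur.executemany(
        "INSERT OR REPLACE INTO orders_clean (order_id, amount_cents) VALUES (?, ?)",
        cleaned,
    )
    conn.commit()
    return len(cleaned)

# Usage with an in-memory database
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_orders (order_id INTEGER, amount_usd REAL);
    CREATE TABLE orders_clean (order_id INTEGER PRIMARY KEY, amount_cents INTEGER);
    INSERT INTO staging_orders VALUES (1, 19.99), (2, NULL), (3, 5.50);
""")
print(run_etl(conn))  # loads 2 rows; the NULL amount is dropped
```

The `INSERT OR REPLACE` load and the reconciliation-style row count returned by `run_etl` mirror the testing practices (component, integration, reconciliation) mentioned above.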

We offer

  • Outstanding career development opportunities
  • Knowledge-sharing with colleagues all around the world
  • Unlimited access to learning courses (LinkedIn learning, EPAM training courses, English regular classes, Internal Library)
  • A community of 43,500+ of the industry’s top professionals
  • Regular assessments and salary reviews
  • Competitive compensation
  • Friendly team and enjoyable working environment
  • Social package – medical & family care
  • Flexible working schedule
  • Corporate and social events
