Senior Data Integration Engineer - Remote, Canada
Description
Job #: 74960
You are sharp, driven, and inquisitive. You are not afraid to take risks and grow by learning from mistakes. You let your voice be heard and love a good challenge. If this sounds like you, this could be the perfect opportunity to join EPAM as a Senior Data Integration Engineer. Read on to learn more about the position’s responsibilities and requirements.
Req. #306881847
#REF_DATAQ122_CA
What You’ll Do
- Design and implement Data Integration solutions and contribute to building data platforms using classic Data technologies and tools (databases, ETL/ELT tools, MDM tools)
- Implement modern Cloud or Hybrid Data solutions
- Participate in Data Modeling
- Work with product and engineering teams to understand data product requirements and evaluate new features and architecture to help drive decisions
- Build collaborative partnerships with architects and key individuals within other functional groups
- Perform detailed analysis of business problems and technical environments and use this in designing high-quality technical solutions
- Actively participate in code review and testing of solutions to ensure they meet specification and quality requirements
- Build and foster a high-performance engineering culture; supervise junior team members and provide them with technical leadership
Requirements
- Advanced knowledge of Data Integration tools (Azure Data Factory, AWS Glue, GCP Dataflow, Talend, Informatica, Pentaho, Apache NiFi, KNIME, SSIS)
- Advanced knowledge of Relational Databases (SQL Optimization, Relations, Stored Procedures, Transactions, Isolation Levels, Security)
- Practical hands-on experience with Data Solutions in Cloud environments (AWS, Azure, GCP)
- Ability to design, implement, deploy and monitor scalable and fault-tolerant data solutions
- Solid understanding of core cloud technologies and approaches
- Awareness of niche and case-specific cloud services
- Ability to troubleshoot outages of average complexity and to identify and trace performance issues
- Pattern-driven solution design, choosing the approach that best fits the requirements and technical constraints
- Advanced knowledge of Data Security (Row-level data security, audit)
- Production experience in data-oriented programming languages (SQL, Python, SparkSQL, PySpark, R, Bash)
- Production project experience in Data Management, Data Storage, Data Analytics, Data Visualization, Data Integration, MDM, Disaster Recovery, Availability, Operation & Security
- Experience with Data Modeling (OLAP, OLTP, ETL and DWH / Data Lake / Delta Lake / Data Mesh methodologies; exposure to Inmon vs. Kimball, Staging Areas, SCD and other dimension types)
- Good understanding of online and streaming integrations and micro-batching
- Understanding of CDC methods and delta extracts
- General understanding of Housekeeping processes (archiving, purging, retention policies, hot/cold data)
- Good understanding of CI/CD principles and best practices
- Understanding of Canary release, Blue-Green, and Red-Black deployment models
- Data-oriented focus and compliance awareness (e.g., PI, GDPR, HIPAA)
What We Offer
- Extended Healthcare with Prescription Drugs, Dental and Vision Insurance (Company Paid)
- Life and AD&D Insurance (Company Paid)
- Employee Assistance Program (Company Paid)
- Unlimited access to LinkedIn Learning solutions
- Long-Term Disability
- Registered Retirement Savings Plan (RRSP) with company match
- Paid Time Off
- Critical Illness Insurance
- Employee Discounts