Dutech Jobs

Informatica Data Engineer

Austin, TX

Date Posted: 3/13/2024 4:33:45 AM

Job Number: DTS1017185992
Job Type: W2
Skills: Knowledge of Snowflake data sharing; familiarity with GitHub or equivalent version control systems.
Job Description

Performs advanced (senior-level) data pipeline development work with Informatica Cloud (IICS), including data integration, source-to-target data modeling, Extract-Transform-Load (ETL) development, consuming Oracle and RESTful API data connections, targeting Snowflake data connections, designing Snowflake target data lake databases, and data warehouse modeling with ELT. The candidate will use Informatica Cloud to build mass ingestion pipelines and other extract-and-load processes from Oracle transactional databases to our Snowflake Data Lake. Additional duties may include working within Snowflake to create API-based data ingestion routines and to set up Data Sharing of the accumulated data. Dimensional modeling may be required in our Data Warehouse to support objectives such as data reporting and improved data-handling performance.
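Since Snowflake Data Sharing is called out both in the skills summary and in the duties above, here is a minimal Snowflake SQL sketch of how a share might be set up; the database, schema, and account names (DATALAKE_DB, RAW, PARTNER_ACCT) are hypothetical placeholders, not details from this posting.

    -- Create a share over a landed data-lake database
    -- (datalake_db, raw, and partner_acct are hypothetical names)
    CREATE SHARE agency_share;
    GRANT USAGE ON DATABASE datalake_db TO SHARE agency_share;
    GRANT USAGE ON SCHEMA datalake_db.raw TO SHARE agency_share;
    GRANT SELECT ON ALL TABLES IN SCHEMA datalake_db.raw TO SHARE agency_share;
    -- Grant the consuming account access to the share
    ALTER SHARE agency_share ADD ACCOUNTS = partner_acct;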

Salary Range: $90K - $100K per annum

  • Development of ETL/ELT data mappings and workflows with Informatica Cloud Data Integration.
  • Practical experience using and building Informatica Mass Ingestion Pipelines.
  • Demonstrated experience with Oracle Database as a Data Connector source.
  • Expert-level experience with Snowflake as a target database platform.
  • Experience with the Snowflake platform and ecosystem. Knowledge of Snowflake data sharing and Snowpark is a plus.
  • Knowledge of the advantages of, and previous experience working with, Informatica push-down optimization.
  • Experience with Snowflake database creation, optimization, and architectural advantages.
  • Practical experience with Snowflake SQL.  
  • Determines database requirements by analyzing business operations, applications, and programming; reviewing business objectives; and evaluating current systems.
  • Obtains data model requirements, develops, and implements data models for new projects, and maintains existing data models and data architectures.
  • Creates graphics and other flow diagrams (including ERDs) to present complex database designs and models more simply.
  • Performs related work as assigned.
  • Practical experience with one-time data loads as well as Change Data Capture (CDC) for bulk data movement (see the sketch following this list).
  • Creation of technical process and interface documentation is a key element of this role, as the team is working on release two of a multi-release effort.
  • Ability to review the work of others, troubleshoot, and provide feedback and guidance to meet tight deliverable deadlines is required.
  • Ability to promote code from development environments to production.
  • Familiarity with GitHub or equivalent version control systems.
  • Experience working with state agencies as well as security protocols and processes.
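One duty above calls for Change Data Capture (CDC) alongside one-time loads. As a hedged illustration only, once data has landed in Snowflake, a stream-plus-task pattern is one common way to merge captured changes forward; the table, column, and warehouse names below (STG_ORDERS, FACT_ORDERS, INGEST_WH) are hypothetical, not the team's actual design.

    -- Track inserts/updates on a landed staging table with a stream
    CREATE OR REPLACE STREAM stg_orders_stream ON TABLE stg_orders;

    -- Periodically merge the captured changes into the warehouse table
    CREATE OR REPLACE TASK merge_orders_task
      WAREHOUSE = ingest_wh
      SCHEDULE = '15 MINUTE'
    AS
      MERGE INTO dw.fact_orders t
      USING stg_orders_stream s
        ON t.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET t.amount = s.amount
      WHEN NOT MATCHED THEN INSERT (order_id, amount) VALUES (s.order_id, s.amount);

    -- Tasks are created suspended; resume to start the schedule
    ALTER TASK merge_orders_task RESUME;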

CANDIDATE SKILLS AND QUALIFICATIONS

Minimum Requirements: Candidates who do not meet or exceed the minimum stated requirements (skills/experience) will be displayed to customers but may not be chosen for this opportunity.

Years Required/Preferred Experience
8 Required Generating advanced SQL queries and using other data interrogation methods.
8 Required Experience with Informatica Products with at least 4 years direct experience with Informatica Cloud Data Integration
8 Required Reviewing, interpreting, and translating business requirements and design specifications into data mappings and data pipeline development using all major data integration patterns.
8 Required Experience with relational database design concepts, including direct experience with Oracle RDBMS.
8 Required Experience with Data Warehouse architectural patterns including modeling Facts and Dimensions.
8 Required Experience with static mapping and Change Data Capture (CDC) for bulk data movement.
8 Required Experience with Informatica Mass Ingestion
8 Required Experience using Snowflake (creating, managing, and optimizing databases, as well as time-travel and other technical advantages of cloud data warehousing; see the sketch following this list).
1 Preferred Any experience working with Snowpark, servicing the needs of Python programmers, or understanding Data Scientists' requirements against Snowflake databases will make the candidate more successful.
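The Snowflake requirement above references time travel as one of the platform's technical advantages. As a brief, hedged illustration in Snowflake SQL, using a hypothetical ORDERS table:

    -- Query the table as it existed one hour ago
    SELECT * FROM orders AT(OFFSET => -3600);

    -- Recover a dropped table within the data retention window
    DROP TABLE orders;
    UNDROP TABLE orders;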
