Dutech’s Job

Data Integration Architect – Enterprise Data Platforms

Austin, TX

Date Posted: 4/22/2026 8:44:04 AM

Job Number: DTS1017187698
Job Type: Contract
Skills: Data Engineering | ETL/ELT | SSIS | Snowflake | SQL | Data Pipelines | API Integration | Kafka | Python | Power BI | Data Quality | Cloud Data Platforms
Job Description

We are seeking a highly skilled Senior Data Engineer with strong expertise in data integration, ETL/ELT, and cloud data platforms. The ideal candidate will design and build scalable data pipelines, define integration strategies, and ensure high-quality, reliable data movement across enterprise systems.


Key Responsibilities:

  • Define and implement data integration strategy, architecture, and roadmap (batch vs real-time, API vs ETL)
  • Design, develop, and maintain scalable data pipelines for internal and external data sources
  • Build and manage ETL/ELT processes using tools such as SSIS and modern data platforms
  • Develop API-based integrations and data transformation workflows
  • Ensure data quality through validation, cleansing, and reconciliation processes
  • Optimize data workflows for performance, scalability, and reliability
  • Implement monitoring, logging, error handling, and alerting mechanisms for data pipelines
  • Maintain documentation for data flows, mappings, and transformation logic
  • Ensure compliance with data governance, security, and regulatory standards (PII, etc.)
  • Collaborate with business and technical stakeholders to understand data requirements and deliver solutions

Required Qualifications:

  • 7+ years of experience in Data Engineering and Data Integration
  • Strong expertise in Data Warehousing concepts and architecture
  • Hands-on experience with ETL/ELT tools (SSIS preferred)
  • Strong proficiency in SQL Server or other RDBMS
  • Experience building and maintaining data pipelines
  • Experience implementing data quality frameworks (QA/QC)
  • Hands-on experience with cloud data platforms (e.g., Snowflake)
  • Experience with BI tools (e.g., Power BI)
  • Experience with API integrations (e.g., MuleSoft)
  • Strong communication and stakeholder management skills

Preferred Qualifications:

  • Experience with real-time data pipelines (Kafka, Python)
  • Knowledge of Data Lakehouse architecture
  • Experience handling sensitive data (PII, HIPAA compliance)
  • Exposure to Salesforce integrations
  • Familiarity with AWS (Lambda), Kubernetes, or containerization tools
  • Experience with Microsoft 365 and MS Access
