
IBM

Sr. Consultant, Data Architect


Published 03 Apr 2026
Carrollton, TX, USA & other locations
Temporary


Role Highlights

Languages used

SQL
Python

Key skills

Data Engineer
Data Architect
Data Governance
Integrations
API
CI/CD
Data Protection
Data Warehousing
Solution Architect
Team Management
Enterprise Architect
AI
Architecture
Cloud
Security
SaaS
Authentication
Testing
IAM
Glue
Clustering
Machine Learning

Tools, Libraries and Frameworks

GitLab CI
IBM
Snowflake
AWS
S3
Lambda
SQS
RBAC
Terraform
Kinesis

Description

Introduction

At IBM, work is more than a job - it's a calling: to build, to design, to code, to consult, to think along with clients and sell, to make markets, to invent, to collaborate. Not just to do something better, but to attempt things you've never thought possible. Are you ready to lead in this new era of technology and solve some of the world's most challenging problems? If so, let's talk.

Your role and responsibilities

As a Solution Data Architect within IBM Consulting's Data & AI practice, you will operate at the intersection of enterprise architecture, cloud data platform delivery, and client leadership. In this role, you will serve as the primary technical authority on large-scale, complex engagements - owning end-to-end architecture decisions, guiding a high-performing delivery team, and building trusted relationships with client stakeholders to drive measurable outcomes. This is a senior, client-facing role for an architect who is equally comfortable whiteboarding a platform strategy with an executive and diving deep into technical design with an engineering team.
· Serve as the primary technical point of contact for client stakeholders and executives; lead engagement planning, sprint ceremonies, and executive status reporting
· Proactively identify and communicate risks, blockers, and scope changes with proposed resolutions; partner with project managers and practice leadership on delivery health
· Own end-to-end Snowflake platform architecture, including objects, governance models, and security at SnowPro Advanced: Architect depth
· Architect Bronze/Silver/Gold medallion data models optimized for Snowflake performance and downstream BI consumption
· Design and govern the API strategy for exposing data products, including integrations with internal applications, external partners, and SaaS platforms via API-driven and event-based patterns on AWS (API Gateway, Lambda, EventBridge, SQS)
· Implement secure API authentication patterns, including OAuth2 and token-based access; manage API lifecycle and cataloging aligned to enterprise standards
· Translate healthcare payer business requirements spanning claims, member, provider, eligibility, and care management domains into scalable data and integration architectures that comply with HIPAA, PHI/PII protection policies, and payer-specific regulatory standards
· Lead dbt model design, including materialization strategy, macros, packages, snapshots, and testing; establish GitLab CI/CD pipelines and enforce coding and documentation standards
· Architect Snowflake-AWS integrations across S3, IAM, VPC, PrivateLink, and Glue; monitor pipeline SLAs and costs and lead root-cause analysis for incidents; support migration of legacy data feeds to modern cloud-based, event-driven architectures
· Manage and mentor data and analytics engineers; drive enablement on Snowflake, dbt, and GitLab best practices and contribute reusable accelerators to the broader practice

This role can be performed from anywhere in the US.

Required technical and professional expertise

· 8+ years in data engineering, data warehousing, or solution architecture roles
· Client-facing consulting or professional services delivery experience with concurrent stakeholder and team management
· 3+ years of hands-on Snowflake experience: warehouse management, clustering, RBAC, security model, Streams, Tasks, and Dynamic Tables
· dbt Core or Cloud: materialization strategies, macros, packages, snapshots, and testing framework
· GitLab CI/CD: pipeline configuration, merge request workflows, and dbt integration
· AWS production experience: S3, Glue, IAM, VPC, PrivateLink, and Secrets Manager
· SQL and Python proficiency for pipeline development and automation
· Healthcare payer domain knowledge spanning claims, member, provider, eligibility, care/case management, and risk adjustment
· Familiarity with HIPAA, PHI/PII data protection requirements, and secure healthcare data integration standards

Preferred technical and professional experience

· Snowflake SnowPro Advanced: Architect certification
· AWS Certified Data Analytics - Specialty or equivalent
· Snowflake Cortex AI/ML functions and agentic data workflows
· Snowflake Iceberg tables and external volume architecture
· Terraform for Snowflake and AWS infrastructure-as-code
· Event-driven integration patterns: EventBridge, Kinesis, SQS for operational and analytic data movement
· Enterprise metadata, lineage, and data governance tooling
· Experience designing secure healthcare APIs with strict PHI/PII access controls

IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics.
IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
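As a rough illustration of the Bronze/Silver/Gold medallion layering named in the responsibilities above, here is a toy sketch in plain Python. All field names and cleanup rules are invented for the example; in the actual role these layers would be Snowflake tables materialized through dbt models, not in-memory records.

```python
# Illustrative only: a toy medallion flow using in-memory records in place of
# Snowflake tables. Claim fields and cleanup rules here are hypothetical.

# Bronze: raw records exactly as landed from the source feed.
bronze = [
    {"claim_id": "C1", "member_id": "M1", "amount": "120.50", "status": "PAID"},
    {"claim_id": "C2", "member_id": "M1", "amount": "80.00", "status": "paid"},
    {"claim_id": "C2", "member_id": "M1", "amount": "80.00", "status": "paid"},  # duplicate feed row
]


def to_silver(rows):
    """Silver: de-duplicate on the business key and standardize types and casing."""
    seen, out = set(), []
    for row in rows:
        if row["claim_id"] in seen:
            continue
        seen.add(row["claim_id"])
        out.append({**row, "amount": float(row["amount"]), "status": row["status"].upper()})
    return out


def to_gold(rows):
    """Gold: aggregate to a BI-ready grain -- total paid amount per member."""
    totals = {}
    for row in rows:
        if row["status"] == "PAID":
            totals[row["member_id"]] = totals.get(row["member_id"], 0.0) + row["amount"]
    return totals


print(to_gold(to_silver(bronze)))  # {'M1': 200.5}
```

In Snowflake/dbt terms, `to_silver` and `to_gold` would correspond roughly to staging and mart models, with the layer boundaries enforced by schemas and RBAC rather than function calls.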

Required Qualifications and Skills

The role requires over 8 years of experience in data engineering, data warehousing, or solution architecture. Client-facing consulting or professional services delivery experience with stakeholder and team management is necessary. A minimum of 3 years of hands-on Snowflake experience is required, including warehouse management, clustering, RBAC, security model, Streams, Tasks, and Dynamic Tables. Proficiency in dbt Core or Cloud, covering materialization strategies, macros, packages, snapshots, and testing framework, is essential. Experience with GitLab CI/CD for pipeline configuration, merge request workflows, and dbt integration is also required. Additionally, AWS production experience with S3, Glue, IAM, VPC, PrivateLink, and Secrets Manager, along with SQL and Python proficiency for pipeline development and automation, is needed. Healthcare payer domain knowledge spanning claims, member, provider, eligibility, care/case management, and risk adjustment is a requirement, as is familiarity with HIPAA, PHI/PII data protection requirements, and secure healthcare data integration standards.
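To make the token-based API authentication requirement above concrete, here is a minimal sketch of an HMAC-signed, expiring bearer token. The token format, service name, and secret handling are all hypothetical; a real deployment would use an identity provider issuing standard OAuth2/JWT tokens, with signing material held in AWS Secrets Manager rather than in code.

```python
# Illustrative only: a minimal expiring bearer token signed with HMAC-SHA256.
# Format, names, and secret handling are hypothetical stand-ins for a proper
# OAuth2/JWT setup backed by an identity provider and Secrets Manager.
import base64
import binascii
import hashlib
import hmac
import time

SECRET = b"demo-secret"  # placeholder; never hard-code real secrets


def _b64e(raw: bytes) -> str:
    return base64.urlsafe_b64encode(raw).decode()


def issue_token(subject: str, ttl_seconds: int, now: float = None) -> str:
    """Sign 'subject|expiry' and pack it as payload.signature (both base64)."""
    now = time.time() if now is None else now
    payload = f"{subject}|{int(now) + ttl_seconds}".encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).digest()
    return _b64e(payload) + "." + _b64e(sig)


def verify_token(token: str, now: float = None):
    """Return the subject if the signature checks out and the token is unexpired."""
    now = time.time() if now is None else now
    try:
        p64, s64 = token.split(".")
        payload = base64.urlsafe_b64decode(p64)
        sig = base64.urlsafe_b64decode(s64)
    except (ValueError, binascii.Error):
        return None  # malformed token
    expected = hmac.new(SECRET, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(sig, expected):
        return None  # tampered with or wrongly signed
    subject, _, exp = payload.decode().rpartition("|")
    return subject if now < int(exp) else None


tok = issue_token("svc-claims-api", ttl_seconds=60, now=1000.0)
print(verify_token(tok, now=1030.0))  # svc-claims-api
print(verify_token(tok, now=2000.0))  # None (expired)
```

The constant-time comparison via `hmac.compare_digest` and the explicit expiry check mirror the kind of controls expected when exposing PHI/PII-bearing data products through APIs.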

Disclaimer

Disclaimer: Job and company description information and some of the data fields may have been generated via GPT-4 summarisation and could contain inaccuracies. The full external job listing link should always be relied on for authoritative information.

About the company

IBM

Size

305978

Website

ibm.com

HQ

Armonk, New York, US

Public/Private

Public Company

Description

IBM infuses core business operations with intelligence, from machine learning to generative AI, to make organizations more responsive, productive, and resilient. It helps clients put AI into action now, creating real value with trust, speed, and confidence across areas such as digital labor, IT automation, and security. Because AI's effectiveness depends on the quality of the data fueling it, the ability to utilize all data is critical; IBM's AI and data platform aims to scale and accelerate AI's impact with trusted data. IBM's hybrid cloud platform offers a comprehensive approach to development, security, and operations across hybrid environments, laying a flexible foundation for leveraging data wherever it resides.
