Summary
Data Solutions Architect with 10+ years of experience designing enterprise-grade data platforms that translate complex business requirements into scalable, high-trust analytical systems. Deep expertise in cloud architecture, medallion design patterns, dimensional modeling, and end-to-end pipeline orchestration across Azure and AWS. Proven ability to define data strategy, establish governance frameworks, and deliver integrated solutions — from raw ingestion through semantic layer and self-serve reporting — that directly support experimentation, measurement, and executive decision-making. Known for bridging the gap between technical architecture and business outcomes, with a track record of building platforms that scale with the organization.
Experience
National CineMedia (NCM)
- Defined and implemented the enterprise data architecture strategy for a centralized analytics platform, integrating disparate source systems via Fivetran, Azure Data Factory, and custom pipelines to support experimentation, measurement, and strategic decision-making across the organization.
- Designed a medallion architecture (Bronze → Silver → Gold) in Databricks, establishing design standards for each layer to ensure long-term analytical flexibility, performance optimization, and maintainability at scale.
- Designed and governed automated ingestion pipelines using Azure Data Factory, REST APIs, incremental sync patterns, and distributed processing in Databricks and Azure Data Lake Storage, with MERGE-based upsert logic and Z-order optimization applied at scale.
- Established enterprise data quality and governance frameworks through schema change detection, automated validation, and CI/CD workflows via Azure DevOps — defining standards that prevent silent data corruption and ensure high trust across critical datasets.
- Defined analytics-focused dimensional models and semantic layer architecture in SSAS and Azure SQL, balancing reporting performance, self-service flexibility, and long-term maintainability across business domains.
- Architected Power BI semantic models with DAX measures, calculated columns, and row-level security — translating complex business requirements into intuitive, governed reporting solutions for cross-functional stakeholders.
- Led end-to-end delivery of self-serve analytics solutions, from platform architecture through interactive dashboarding, enabling analysts and non-technical stakeholders to explore insights without engineering dependencies.
Tools: Python, SQL, Spark SQL, DAX, M, Databricks, Azure Data Factory, Azure Data Lake Storage, Azure DevOps, Fivetran, Power BI, SSAS, SSMS, SQL Server, MySQL, AWS (Glue, Athena, Lambda, Redshift, S3), SSIS
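The MERGE-based upsert pattern above can be sketched in miniature. This example uses Python's built-in sqlite3 in place of Databricks, and the table and column names are illustrative, but the incremental-sync logic is the same as a Delta Lake MERGE INTO: match on the key, update changed rows in place, insert new ones.

```python
import sqlite3

# In-memory database stands in for the Silver-layer target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, updated_at TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'Acme', '2024-01-01')")

# Incremental batch: one changed row (id=1) and one new row (id=2).
batch = [(1, "Acme Corp", "2024-02-01"), (2, "Globex", "2024-02-01")]

# SQLite's UPSERT plays the role of MERGE INTO: the WHERE guard keeps
# stale records from overwriting newer data on out-of-order arrivals.
conn.executemany(
    """
    INSERT INTO customers (id, name, updated_at) VALUES (?, ?, ?)
    ON CONFLICT(id) DO UPDATE SET
        name = excluded.name,
        updated_at = excluded.updated_at
    WHERE excluded.updated_at > customers.updated_at
    """,
    batch,
)
rows = conn.execute("SELECT id, name FROM customers ORDER BY id").fetchall()
print(rows)  # [(1, 'Acme Corp'), (2, 'Globex')]
```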
Insight Global (AT&T Contractor)
- Designed and delivered analytics-ready data solutions supporting executive decision-making, strategic performance reviews, and operational planning for a large enterprise client.
- Defined data modeling standards and transformation logic to ensure metric correctness, consistency, and long-term scalability across analytical use cases.
- Architected end-to-end ETL pipelines ingesting data from relational databases, flat files, and APIs into centralized reporting and warehouse layers, supporting full and incremental load patterns.
- Designed SSIS-based pipeline architecture to automate extraction, transformation, and loading workflows, incorporating business rule logic, cleansing, deduplication, and error handling.
- Established error handling, audit logging, and alerting standards across pipeline workflows to ensure reliable nightly processing and rapid identification of failures.
- Developed and maintained stored procedures and SQL Agent jobs to automate data processing, scheduling, and pipeline workflows.
- Architected and optimized semantic data models and query patterns, improving downstream analytics performance and reducing query latency by approximately 40%.
Tools: Python, T-SQL, SQL Server, Power BI, SSAS, SSRS, SSIS, DAX, M, SSMS, ALM Toolkit, Tabular Editor, DAX Studio, Azure DevOps
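The audit-logging and alerting standards described above typically wrap each pipeline step so that duration, row counts, and failures are recorded uniformly. A minimal sketch, with hypothetical step names and an in-memory list standing in for a persisted audit table:

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("nightly_etl")

audit_log = []  # stands in for a persisted audit table

def run_step(name, func):
    """Run one pipeline step, recording duration and outcome for auditing."""
    start = time.monotonic()
    try:
        result = func()
        audit_log.append({"step": name, "status": "success",
                          "rows": result, "seconds": time.monotonic() - start})
        return result
    except Exception as exc:
        audit_log.append({"step": name, "status": "failed",
                          "error": str(exc), "seconds": time.monotonic() - start})
        log.error("Step %s failed: %s", name, exc)  # hook alerting in here
        raise

# Fake step returning a row count; a real step would run an extract or load.
rows_loaded = run_step("load_orders", lambda: 1250)
print(audit_log[-1]["status"])  # success
```

The uniform wrapper is what makes "rapid identification of failures" possible: every step, successful or not, leaves one audit row.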
Provisions Group
- Architected end-to-end ETL pipelines ingesting data from relational databases, flat files, and APIs into centralized data warehouse layers, establishing design patterns for full and incremental load strategies.
- Designed SSIS-based pipeline architecture with business rule logic, data cleansing, type casting, deduplication, error handling, and audit logging to ensure reliable, production-grade data delivery.
- Defined dimensional modeling standards in SSAS and Azure SQL, balancing reporting performance, self-service flexibility, and governance across business domains.
- Architected Power BI semantic models with DAX measures, calculated columns, and row-level security, translating stakeholder requirements into governed, intuitive reporting solutions.
- Established dataset reliability standards through performance tuning, validation frameworks, and clear documentation — ensuring consistent, trustworthy outputs across reporting and semantic layers.
Tools: Python, T-SQL, SQL Server, Power BI, SSAS, SSRS, SSIS, DAX, M, SSMS, ALM Toolkit, Tabular Editor, DAX Studio, Azure DevOps
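The dimensional models referenced above follow the standard star-schema shape: a fact table at a fixed grain joined to conformed dimensions. A toy version using sqlite3 (table contents are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (product_key INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'Hardware'), (2, 'Software');
    INSERT INTO fact_sales VALUES (1, 100.0), (1, 50.0), (2, 200.0);
""")

# A typical reporting query: aggregate the fact grain up to a dimension
# attribute; this is the shape semantic-layer measures compile down to.
totals = conn.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p USING (product_key)
    GROUP BY p.category
    ORDER BY p.category
""").fetchall()
print(totals)  # [('Hardware', 150.0), ('Software', 200.0)]
```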
- Built end-to-end ETL pipelines ingesting data from relational databases, flat files, and APIs into centralized data marts and warehouse layers, supporting full and incremental load patterns.
- Developed SSIS packages and stored procedures to automate ETL workflows with business rule logic, cleansing, deduplication, and error handling.
- Designed and maintained SSAS tabular and OLAP models with DAX measures, calculated columns, and row-level security to support self-service and governed analytics.
- Built Power BI dashboards and reports using star schema data models, DAX, M, and Power Query — deploying to Power BI Premium capacity via deployment pipelines.
- Converted legacy Tableau reports to Power BI and created SSRS paginated reports with parameterization, scheduled delivery, and export to PDF and Excel.
- Analyzed and optimized slow-running DAX queries and reports using DAX Studio and Performance Analyzer, improving reporting performance across high-volume datasets.
- Collaborated cross-functionally to define metrics, enforce governance standards, and document data assets to improve trust and adoption.
Tools: Python, T-SQL, PL/SQL, DAX, M, SQL Server, Oracle, MySQL, Redshift, Power BI, SSAS, SSRS, SSIS, SSMS, DAX Studio, Tabular Editor, Tableau, Google Analytics, Google Data Studio, Power Automate, Azure Data Factory, Azure DevOps
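The full-versus-incremental load patterns mentioned in several bullets above usually reduce to a high-water-mark check: load only rows modified since the last successful run, then advance the watermark. A minimal sketch (the source rows and column names are illustrative stand-ins for a filtered source-system query):

```python
from datetime import date

# Source rows with a modification date; in production this would be a
# WHERE-filtered query against the source system, not an in-memory list.
source = [
    {"id": 1, "modified": date(2024, 1, 5)},
    {"id": 2, "modified": date(2024, 2, 1)},
    {"id": 3, "modified": date(2024, 2, 9)},
]

def incremental_extract(rows, watermark):
    """Return only rows changed since the last successful load."""
    return [r for r in rows if r["modified"] > watermark]

last_loaded = date(2024, 1, 31)                    # persisted after the prior run
delta = incremental_extract(source, last_loaded)
new_watermark = max(r["modified"] for r in delta)  # persist for the next run
print([r["id"] for r in delta])  # [2, 3]
```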
- Built and maintained ETL pipelines extracting data from Progress OpenEdge and other source systems into SQL Server staging tables, performing transformations and loading into data warehouse layers using SSIS.
- Designed and deployed parameterized SSRS reports and SSAS reporting cubes using SQL Server views, complex stored procedures, and DAX — delivering operational and financial insights to leadership.
- Designed dimensional data models in Power BI and SSAS with DAX, calculated columns, and row-level security, producing rich dashboards and mobile solutions for self-service analytics.
- Developed ABL-based dashboards and reporting pipelines using inmydata and Phocas Software for live operational data insights across logistics and manufacturing environments.
- Partnered directly with business stakeholders, consultants, and end users to translate complex operational requirements into durable, production-grade data pipelines and reporting solutions.
- Supported development lifecycle activities including technical specifications, work estimates, testing, and go-live actions to ensure reliable delivery of data and application projects.
Tools: T-SQL, PL/SQL, DAX, M, SQL Server, Oracle, MySQL, Power BI, SSAS, SSRS, SSIS, SSMS, DAX Studio, Tabular Editor, Tableau, Progress OpenEdge, OpenEdge ABL, inmydata, Phocas, Azure DevOps
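Row-level security, used in the Power BI and SSAS models above, is conceptually a per-role filter applied before any aggregation runs. A sketch in plain Python with invented roles and rows, mirroring how a tabular-model RLS rule restricts the fact rows a user can see:

```python
# Each role maps to a filter predicate, analogous to an RLS table filter
# expression in a tabular model (roles and data are hypothetical).
rows = [
    {"region": "East", "sales": 100},
    {"region": "West", "sales": 250},
]

role_filters = {
    "east_manager": lambda r: r["region"] == "East",
    "admin": lambda r: True,  # unrestricted role
}

def visible_sales(role):
    """Total sales the given role is allowed to see."""
    allowed = filter(role_filters[role], rows)
    return sum(r["sales"] for r in allowed)

print(visible_sales("east_manager"))  # 100
print(visible_sales("admin"))         # 350
```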
Tec-Masters Inc.
- Led the design and maintenance of database-driven systems supporting enterprise logistics, operations, and reporting.
- Built automated SQL-based data pipelines and scheduled jobs to ensure data integrity and reliable nightly processing.
- Developed high-performance reporting solutions providing operational visibility for leadership and program stakeholders.
- Established early version control, automation, and documentation practices for long-running production data workflows.
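Nightly integrity checks of the kind referenced above commonly reconcile source and target row counts before a run is marked successful. A minimal sketch, with hard-coded counts standing in for COUNT(*) queries on each side:

```python
def reconcile(source_count, target_count, tolerance=0):
    """Flag a load as suspect when target row counts drift from the source."""
    drift = abs(source_count - target_count)
    return drift <= tolerance

# In production both counts would come from queries, not literals.
ok = reconcile(source_count=10_000, target_count=10_000)
bad = reconcile(source_count=10_000, target_count=9_950)
print(ok, bad)  # True False
```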