Aaron Rodriguez
Data Solutions Architect
Toney, AL · 901.607.6553 · ajrodr1982@gmail.com · ajrodriguez.me · github.com/ajrodr82

Data Solutions Architect with 10+ years of experience designing enterprise-grade data platforms that translate complex business requirements into scalable, high-trust analytical systems. Deep expertise in cloud architecture, medallion design patterns, dimensional modeling, and end-to-end pipeline orchestration across Azure and AWS. Proven ability to define data strategy, establish governance frameworks, and deliver integrated solutions — from raw ingestion through semantic layer and self-serve reporting — that directly support experimentation, measurement, and executive decision-making. Known for bridging the gap between technical architecture and business outcomes, with a track record of building platforms that scale with the organization.

Data Solutions Architect
May 2022 – Present
National CineMedia (NCM)
  • Defined and implemented the enterprise data architecture strategy for a centralized analytics platform, integrating various data sources via Fivetran, Azure Data Factory, and custom pipelines to support experimentation, measurement, and strategic decision-making across the organization.
  • Designed medallion architecture patterns (Bronze → Silver → Gold) in Databricks, setting standards for each layer to ensure long-term analytical flexibility, performance optimization, and maintainability at scale.
  • Designed and governed automated ingestion pipelines using Azure Data Factory, REST APIs, incremental sync patterns, and distributed processing in Databricks and Azure Data Lake Storage, with MERGE-based upsert logic and Z-order optimization applied at scale.
  • Established enterprise data quality and governance frameworks through schema change detection, automated validation, and CI/CD workflows via Azure DevOps — defining standards that prevent silent data corruption and ensure high trust across critical datasets.
  • Defined analytics-focused dimensional models and semantic layer architecture in SSAS and Azure SQL, balancing reporting performance, self-service flexibility, and long-term maintainability across business domains.
  • Architected Power BI semantic models with DAX measures, calculated columns, and row-level security — translating complex business requirements into intuitive, governed reporting solutions for cross-functional stakeholders.
  • Led end-to-end delivery of self-serve analytics solutions, from platform architecture through interactive dashboarding, enabling analysts and non-technical stakeholders to explore insights without engineering dependencies.
Tools: Python, SQL, Spark SQL, DAX, M, Databricks, Azure Data Factory, Azure Data Lake Storage, Azure DevOps, Fivetran, Power BI, SSAS, SSMS, SQL Server, MySQL, AWS (Glue, Athena, Lambda, Redshift, S3), SSIS
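The MERGE-based upsert logic referenced above can be sketched in plain Python. In production this runs as a Delta Lake `MERGE INTO` statement in Databricks; the record shape and key name here are hypothetical, chosen only to show the WHEN MATCHED / WHEN NOT MATCHED semantics.

```python
def merge_upsert(target, updates, key="id"):
    """MERGE-style upsert: update rows whose key already exists in the
    target, insert rows whose key does not."""
    by_key = {row[key]: dict(row) for row in target}
    for row in updates:
        if row[key] in by_key:
            by_key[row[key]].update(row)   # WHEN MATCHED THEN UPDATE
        else:
            by_key[row[key]] = dict(row)   # WHEN NOT MATCHED THEN INSERT
    return list(by_key.values())

# Incremental batch applied against the existing target table.
target = [{"id": 1, "status": "new"}, {"id": 2, "status": "new"}]
updates = [{"id": 2, "status": "shipped"}, {"id": 3, "status": "new"}]
result = merge_upsert(target, updates)
```

Because the operation is keyed, replaying the same batch is idempotent, which is what makes incremental sync patterns safe to retry.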
Senior BI Developer
March 2022 – May 2022
Insight Global (AT&T Contractor)
  • Designed and delivered analytics-ready data solutions supporting executive decision-making, strategic performance reviews, and operational planning for a large enterprise client.
  • Defined data modeling standards and transformation logic to ensure metric correctness, consistency, and long-term scalability across analytical use cases.
  • Architected end-to-end ETL pipelines ingesting data from relational databases, flat files, and APIs into centralized reporting and warehouse layers, supporting full and incremental load patterns.
  • Designed SSIS-based pipeline architecture to automate extraction, transformation, and loading workflows, incorporating business rule logic, cleansing, deduplication, and error handling.
  • Established error handling, audit logging, and alerting standards across pipeline workflows to ensure reliable nightly processing and rapid identification of failures.
  • Developed and maintained stored procedures and SQL Agent jobs to automate data processing, scheduling, and pipeline workflows.
  • Architected and optimized semantic data models and query patterns, improving downstream analytics performance and reducing query latency by approximately 40%.
Tools: Python, T-SQL, SQL Server, Power BI, SSAS, SSRS, SSIS, DAX, M, SSMS, ALM Toolkit, Tabular Editor, DAX Studio, Azure DevOps
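The audit logging and error handling standards described above amount to wrapping each pipeline step so that start, success, and failure events are recorded before any exception propagates to alerting. A minimal sketch, with illustrative names rather than anything from the actual client codebase:

```python
import datetime
import functools

audit_log = []  # in production this would be a SQL audit table

def audited(step_name):
    """Record start, success, or failure of a pipeline step, then re-raise
    failures so the nightly job (and its alerting) still sees the error."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            audit_log.append((step_name, "started", datetime.datetime.now()))
            try:
                result = fn(*args, **kwargs)
            except Exception as exc:
                audit_log.append((step_name, f"failed: {exc}", datetime.datetime.now()))
                raise
            audit_log.append((step_name, "succeeded", datetime.datetime.now()))
            return result
        return wrapper
    return decorator

@audited("load_orders")
def load_orders(rows):
    return len(rows)

load_orders([{"order_id": 1}])
```

Re-raising after logging keeps the audit trail complete without masking failures from the scheduler.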
Senior Data Engineer
September 2021 – March 2022
Provisions Group
  • Architected end-to-end ETL pipelines ingesting data from relational databases, flat files, and APIs into centralized data warehouse layers, establishing design patterns for full and incremental load strategies.
  • Designed SSIS-based pipeline architecture with business rule logic, data cleansing, type casting, deduplication, error handling, and audit logging to ensure reliable, production-grade data delivery.
  • Defined dimensional modeling standards in SSAS and Azure SQL, balancing reporting performance, self-service flexibility, and governance across business domains.
  • Architected Power BI semantic models with DAX measures, calculated columns, and row-level security, translating stakeholder requirements into governed, intuitive reporting solutions.
  • Established dataset reliability standards through performance tuning, validation frameworks, and clear documentation — ensuring consistent, trustworthy outputs across reporting and semantic layers.
Tools: Python, T-SQL, SQL Server, Power BI, SSAS, SSRS, SSIS, DAX, M, SSMS, ALM Toolkit, Tabular Editor, DAX Studio, Azure DevOps
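A validation framework in the sense used above can be as simple as a table of rule functions run against every batch before it is published to the reporting layer. The rules and column names below are made up for illustration:

```python
def validate(rows, rules):
    """Run each named rule against each row; return a list of
    (row_index, rule_name) failures. An empty list means the batch passes."""
    failures = []
    for i, row in enumerate(rows):
        for name, rule in rules.items():
            if not rule(row):
                failures.append((i, name))
    return failures

# Hypothetical rules for a sales feed.
rules = {
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
    "customer_id_present": lambda r: r.get("customer_id") is not None,
}

batch = [
    {"customer_id": "C1", "amount": 125.0},
    {"customer_id": None, "amount": -5.0},
]
failures = validate(batch, rules)
```

Returning structured failures, rather than raising on the first bad row, lets the pipeline log every violation in one pass and decide whether to quarantine or reject the batch.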
Data Engineer / BI Developer
2018 – 2021
  • Built end-to-end ETL pipelines ingesting data from relational databases, flat files, and APIs into centralized data marts and warehouse layers, supporting full and incremental load patterns.
  • Developed SSIS packages and stored procedures to automate ETL workflows, applying business rule logic, cleansing, deduplication, and error handling.
  • Designed and maintained SSAS tabular and OLAP models with DAX measures, calculated columns, and row-level security to support self-service and governed analytics.
  • Built Power BI dashboards and reports using star schema data models, DAX, M, and Power Query — deploying to Power BI Premium capacity via deployment pipelines.
  • Converted legacy Tableau reports to Power BI and created SSRS paginated reports with parameterization, scheduled delivery, and export to PDF and Excel.
  • Analyzed and optimized slow-running DAX queries and reports using DAX Studio and Performance Analyzer, improving reporting performance across high-volume datasets.
  • Collaborated cross-functionally to define metrics, enforce governance standards, and document data assets to improve trust and adoption.
Tools: Python, T-SQL, PL/SQL, DAX, M, SQL Server, Oracle, MySQL, Redshift, Power BI, SSAS, SSRS, SSIS, SSMS, DAX Studio, Tabular Editor, Tableau, Google Analytics, Google Data Studio, Power Automate, Azure Data Factory, Azure DevOps
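The star-schema pattern behind those dashboards, a fact table keyed to conformed dimension tables, can be illustrated with an in-memory SQLite database; table and column names here are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# One dimension table and one fact table keyed to it (star schema).
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT)")
cur.execute("CREATE TABLE fact_sales (product_key INTEGER, amount REAL)")
cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "Hardware"), (2, "Software")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(1, 100.0), (1, 50.0), (2, 200.0)])

# The typical BI query shape: join facts to a dimension, group by an attribute.
rows = cur.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
```

Keeping descriptive attributes in the dimension and measures in the fact table is what lets DAX measures and slicers stay fast: every report query reduces to this join-and-aggregate shape.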
Data Engineer / BI Developer / Application Developer
2013 – 2018
  • Built and maintained ETL pipelines extracting data from Progress OpenEdge and other source systems into SQL Server staging tables, performing transformations and loading into data warehouse layers using SSIS.
  • Designed and deployed parameterized SSRS reports and SSAS reporting cubes using SQL Server views, complex stored procedures, and DAX — delivering operational and financial insights to leadership.
  • Designed dimensional data models in Power BI and SSAS with DAX, calculated columns, and row-level security, producing rich dashboards and mobile solutions for self-service analytics.
  • Developed ABL-based dashboards and reporting pipelines using inmydata and Phocas Software for live operational data insights across logistics and manufacturing environments.
  • Partnered directly with business stakeholders, consultants, and end users to translate complex operational requirements into durable, production-grade data pipelines and reporting solutions.
  • Supported development lifecycle activities including technical specifications, work estimates, testing, and go-live actions to ensure reliable delivery of data and application projects.
Tools: T-SQL, PL/SQL, DAX, M, SQL Server, Oracle, MySQL, Power BI, SSAS, SSRS, SSIS, SSMS, DAX Studio, Tabular Editor, Tableau, Progress OpenEdge, OpenEdge ABL, inmydata, Phocas, Azure DevOps
Database & Reporting Engineer
2002 – 2013
Tec-Masters Inc.
  • Led the design and maintenance of database-driven systems supporting enterprise logistics, operations, and reporting.
  • Built automated SQL-based data pipelines and scheduled jobs to ensure data integrity and reliable nightly processing.
  • Developed high-performance reporting solutions providing operational visibility for leadership and program stakeholders.
  • Established early version control, automation, and documentation practices for long-running production data workflows.
Languages
Python, SQL, Spark SQL, DAX, M, D3.js
Architecture & Modeling
Medallion architecture, dimensional modeling, schema evolution, schema change detection, multitenant design
Data Engineering
ETL, batch and near-real-time pipelines, API ingestion, incremental sync patterns, MERGE-based upserts, distributed processing
Platforms
Databricks, Azure Data Factory, Azure Data Lake Storage, Fivetran, Power BI, SSAS, Snowflake, AWS (Glue, Athena, Lambda, Redshift, S3), SQL Server, MySQL, DuckDB, MotherDuck, Supabase, Streamlit, Vercel
DevOps & Governance
Azure DevOps, GitHub Actions, CI/CD, automated validation, access controls, documentation
Orchestration
dbt, SSIS, Dagster, Apache Airflow
Storage
Cloudflare R2, Azure Data Lake Storage, AWS S3, Supabase
Box Office Analytics Pipeline
End-to-end data pipeline using web scraping, cloud orchestration, DuckDB/MotherDuck, dbt transformations, and an interactive Streamlit dashboard for box office trend analysis. Built as a portfolio project demonstrating modern data stack tooling.
Tools: Python, DuckDB, MotherDuck, dbt, Apache Airflow, Streamlit
Weather Data Pipeline
End-to-end pipeline pulling live data from a public weather API, staging raw CSV files in Cloudflare R2 storage, and orchestrating ingestion into Snowflake via Apache Airflow DAGs. Demonstrates cloud storage integration, workflow orchestration, and data warehouse loading patterns.
Tools: Python, Apache Airflow, Cloudflare R2, Snowflake
YNAB Personal Finance Dashboard
Incremental sync pipeline from the YNAB API to Supabase PostgreSQL, scheduled via GitHub Actions, with a D3.js frontend deployed on Vercel. Demonstrates full-stack data engineering from API ingestion through interactive visualization.
Tools: Python, Supabase, PostgreSQL, GitHub Actions, D3.js, Vercel
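An incremental sync of this shape keeps a cursor from the previous run and asks the API only for changes since then. The sketch below uses a stand-in fetch function and an in-memory upsert in place of the real YNAB client and Supabase writes:

```python
def incremental_sync(state, fetch_changes, upsert):
    """Pull only changes since the saved cursor, upsert them, and persist
    the new cursor for the next scheduled run."""
    cursor = state.get("cursor", 0)
    changes, new_cursor = fetch_changes(since=cursor)
    for record in changes:
        upsert(record)
    state["cursor"] = new_cursor
    return len(changes)

# Stand-ins for the API client and the database upsert.
ledger = {}

def fake_fetch(since):
    data = [(1, {"id": "t1"}), (2, {"id": "t2"}), (3, {"id": "t3"})]
    changed = [rec for ver, rec in data if ver > since]
    return changed, 3

def fake_upsert(rec):
    ledger[rec["id"]] = rec

state = {"cursor": 1}  # cursor persisted between GitHub Actions runs
synced = incremental_sync(state, fake_fetch, fake_upsert)
```

Persisting the cursor between scheduled runs keeps each GitHub Actions invocation cheap, and because the writes are keyed upserts, a rerun after a partial failure is safe.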