Passenger Maritime Transportation
Data QA and Data Engineering leadership for enterprise marketing and customer data integration
Led data quality and data engineering efforts in a large passenger transportation environment, building
Python and PySpark testing frameworks in Azure Databricks with Unity Catalog, and developing ADF pipelines
and integration notebooks for Salesforce Marketing Cloud, CRM and Adobe APIs.
The delivery included large-scale transfer paths spanning Oracle, Unity Catalog and file-based ingestion,
with record volumes ranging from millions to billions across multiple business-critical flows.
Reduced data quality issues by 40% through implementation of automated testing frameworks.
Reduced data processing time from approximately 16 hours to 45 minutes through automation and optimization of data pipelines and validation logic.
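The automated checks behind these gains can be illustrated with a minimal sketch. In the actual project this logic ran as PySpark assertions inside Databricks notebooks; here it is reduced to plain Python over row dictionaries, and all names and thresholds are hypothetical.

```python
# Minimal data-quality check sketch: fail a pipeline step when a critical
# column exceeds its null budget. Column names and thresholds are
# illustrative, not the project's actual rules.

def null_rate(rows, column):
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def check_completeness(rows, column, max_null_rate=0.01):
    """Return (passed, observed_rate) for a completeness rule on `column`."""
    rate = null_rate(rows, column)
    return rate <= max_null_rate, rate
```

Checks like this one, run automatically on every load, are what turned manual spot-checking into a repeatable gate.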
Oil & Gas
Principal Data QA support for campaigns, dashboards and cross-platform data validation
Acted as the lead Data QA professional on a data-focused team, validating Snowflake, Salesforce and campaign data
as well as Tableau dashboards, and helping define testing scenarios, acceptance criteria and quality expectations.
Global Maritime Cargo Transportation
Cloud migration quality strategy for highly sensitive financial, HR and customer data
Supported a multi-layer cloud migration from on-premises systems into an Azure-based medallion architecture, helping define the
testing framework, architecture direction and end-to-end validation approach.
Identified a critical financial-precision defect before it reached production, protecting downstream data integrity
in high-value calculations.
Detected precision loss beyond the 10th decimal place during financial data validation,
preventing potential multi-million-dollar reporting discrepancies.
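The kind of check that surfaces such a defect can be sketched by comparing source and target values as exact decimals rather than floats, so drift below display precision is still visible. Function names and the tolerance here are illustrative assumptions, not the project's actual implementation.

```python
# Precision-drift sketch: compare financial values from source and target
# systems using decimal.Decimal so sub-float-display differences surface.
# The `places` tolerance is a hypothetical example, not the real threshold.
from decimal import Decimal

def precision_drift(source: str, target: str) -> Decimal:
    """Absolute difference between two values kept as exact decimals."""
    return abs(Decimal(source) - Decimal(target))

def within_tolerance(source: str, target: str, places: int = 12) -> bool:
    """True when source and target agree to `places` decimal places."""
    return precision_drift(source, target) < Decimal(1).scaleb(-places)
```

Keeping the values as strings until the `Decimal` conversion is the design point: parsing them as `float` first would silently reintroduce the very rounding the check is meant to catch.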
Large-Scale Ecommerce
Real-time validation across AWS and GCP in a high-throughput migration scenario
Worked in a highly demanding ecommerce environment handling millions of transactions per second, validating real-time
bidirectional synchronization across AWS and GCP using Kafka, MongoDB, Docker, Rancher, SQL Server and PostgreSQL.
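Bidirectional sync validation ultimately reduces to reconciling snapshots of the same entities from both sides. A minimal sketch, assuming key-to-value snapshots taken from each store (the snapshot shape and names are assumptions for illustration, not the project's actual tooling):

```python
# Reconciliation sketch for bidirectional sync checks: given snapshots of
# the same entity set from two stores (e.g. a document store on one cloud
# and a relational store on the other), report what diverged.

def sync_diff(side_a: dict, side_b: dict):
    """Return (missing_in_b, missing_in_a, mismatched) for two snapshots."""
    keys_a, keys_b = set(side_a), set(side_b)
    mismatched = {k for k in keys_a & keys_b if side_a[k] != side_b[k]}
    return keys_a - keys_b, keys_b - keys_a, mismatched
```

In a high-throughput setting a check like this runs on sampled key ranges or time windows rather than full tables, since full-scan comparison cannot keep pace with the write rate.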
Media, Radio, Communication & Entertainment
Data QA strategy and data integrity validation for hundreds of radio entities across the US
Defined testing scope, strategy and framework direction together with client leadership, validating large file-based
datasets with millions of records stored in AWS S3 and transferred into cloud and Salesforce destinations.
Defined data quality rules that prevented the publication of inaccurate radio audience metrics, preserving client trust and advertiser confidence.
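Rules of this kind typically act as a publish gate: records are screened against explicit constraints, and publication is blocked while any violation remains. A hedged sketch, assuming audience metrics arrive as records with a station identifier and a listener count; the field names and rules are hypothetical, not the client's actual thresholds:

```python
# Publish-gate sketch for audience-metric quality rules. Fields and rule
# wording are illustrative assumptions.

def violates_rules(record: dict) -> list:
    """Return the list of rule violations for one audience-metrics record."""
    problems = []
    if not record.get("station_id"):
        problems.append("missing station_id")
    listeners = record.get("listeners")
    if listeners is None or listeners < 0:
        problems.append("listeners must be a non-negative number")
    return problems

def publishable(records: list) -> bool:
    """Gate: publish only when no record violates any rule."""
    return all(not violates_rules(r) for r in records)
```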
Advanced Manufacturing / Battery Industry
QA modernization through transition from Selenium to Cypress with parallel framework alignment
Supported modernization of the testing approach, helping transition a Selenium-based solution to Cypress while
also validating APIs and internal processes in parallel with another automation stream using WebdriverIO.
Implementation of the Cypress-based testing framework reduced test execution time by 70% and improved test reliability by 30% compared with the previous Selenium-based approach.
Geospatial Utilities
Quality and systems support for geospatial network mapping in utility environments
Worked in geoprocessing and utility-related systems supporting energy, telecommunications, water and sewage contexts,
with focus on platforms used for geographical survey and operational mapping of electrical networks and related infrastructure.
The work required understanding domain-specific data structures, geographic representation and operational consistency
across technical environments where field accuracy matters.
Implemented QA processes that improved data accuracy and reduced errors in geospatial mapping.
As QA Advocate, helped align technical teams with operational needs, ensuring that geospatial data reliably supported critical utility operations.
Healthcare
System validation in healthcare environments with unstable connectivity and constantly evolving public APIs
Worked in healthcare delivery contexts involving care units with intermittent internet availability, which introduced
major operational and validation challenges for systems that depended on synchronization and API communication.
A key difficulty was dealing with frequent API changes from the Brazilian Ministry of Health while maintaining functional
system behavior and reliability in real-world environments with inconsistent connectivity.
Public Safety / Police Intelligence
BI solution built from the ground up for public security decision support
Built a business intelligence solution from scratch in 2012 for a Brazilian public safety environment, supporting the
analysis and decision-making needs of the security sector.
The project required end-to-end ownership of the BI structure, combining system thinking, data organization and reporting
strategy in a highly sensitive institutional context.
The solution enabled a more informed and responsive public safety decision-making process.