Building data pipelines, architecting solutions, and embracing my inner nerd.
With six years in the tech world, I spend my days turning complex architectures into data flows that actually work. I’m addicted to solving integration puzzles and ensuring that every bit of information has governance, security, and high availability.
- Postgraduate student in Data Architecture (PUC Minas) and B.S. in Computer Engineering (UTFPR)
- Data Architecture: Designing scalable solutions for data transport, Data Lakehouse strategies, and Governance models
- Integration Architecture: Designing complex service buses, API governance, and messaging strategies for cross-system communication
- Data Engineering & DataOps: Building ETL/ELT pipelines, deployment automation, and continuous monitoring of the data lifecycle
- Data Science & Observability: Applying statistical models and telemetry monitoring to ensure data health and business value
- Data Science & Analytics: Developing predictive models, exploratory analyses, and intelligent automation with AI tools (Azure AI/OCR)
- Observability (Full-Stack): Implementing distributed telemetry, performance dashboards in Grafana/Datadog, and log analysis
- Data Engineering: Building pipelines for large-scale data orchestration with PySpark, Scala, and SQL (see the sketch below)
- DataOps: Driving a culture of monitoring, automated testing, and CI/CD to ensure the integrity of analytical environments
- Backend & Integrations: Developing APIs (.NET/Java) and messaging flows for consistency across domains in hybrid clouds (Azure/AWS)
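
To give a concrete flavor of the data engineering work above, here is a minimal PySpark sketch of an ETL step. The paths, table, and column names are purely illustrative assumptions, not taken from any real project:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-load").getOrCreate()

# Extract: read raw order events landed in the lake (hypothetical path)
raw = spark.read.parquet("s3a://datalake/raw/orders/")

# Transform: basic cleansing plus a daily revenue aggregate
daily_revenue = (
    raw.filter(F.col("status") == "COMPLETED")
       .withColumn("order_date", F.to_date("created_at"))
       .groupBy("order_date")
       .agg(
           F.sum("amount").alias("total_revenue"),
           F.countDistinct("order_id").alias("orders"),
       )
)

# Load: write a curated, partitioned table for downstream analytics
(daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3a://datalake/curated/daily_revenue/"))
```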

