Education: Final-year IT student at Saigon University.
Passion: Crafting robust Data Pipelines, Big Data Architectures, and System Optimization.
Mission: Transforming raw data into actionable insights through scalable platforms.
Currently Mastering: Apache Airflow, Spark, dbt, and Cloud Infrastructure (AWS/GCP).
Let's Talk: Python, SQL, ETL/ELT, Data Modeling, and System Design.
Development: Strong Python & SQL fundamentals with a focus on high-performance code.
Engineering: End-to-end pipeline orchestration: Source ➔ Processing ➔ Storage ➔ Analytics.
Modern Stack: Hands-on with Airflow, Kafka, RabbitMQ, Docker, and Vector Databases.
Mindset: Data-quality first. Proactive about debugging data-consistency issues and performance bottlenecks.
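A "data-quality first" habit can be made concrete with lightweight validation that runs before records move downstream. The sketch below is purely illustrative (the field names and rules are assumptions, not from any real project of mine):

```python
# Hypothetical data-quality gate: check each record before loading it downstream.
from datetime import datetime

REQUIRED_FIELDS = {"order_id", "amount", "created_at"}

def validate_record(record: dict) -> list[str]:
    """Return a list of quality issues found in a single record."""
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")
    if "amount" in record and (
        not isinstance(record["amount"], (int, float)) or record["amount"] < 0
    ):
        issues.append("amount must be a non-negative number")
    if "created_at" in record:
        try:
            datetime.fromisoformat(record["created_at"])
        except (TypeError, ValueError):
            issues.append("created_at is not ISO-8601")
    return issues

good = {"order_id": 1, "amount": 9.5, "created_at": "2024-05-01T10:00:00"}
bad = {"order_id": 2, "amount": -3}
print(validate_record(good))  # []
print(validate_record(bad))   # two issues: missing created_at, negative amount
```

Checks like these catch bad input at the pipeline boundary, where it is cheapest to reject or quarantine.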
The Pipeline: Python → Airflow + Kafka → PostgreSQL → Analytics
I choose tools based on reliability and maintainability: Python for flexible ETL logic, Airflow for orchestration, Kafka/RabbitMQ for event-driven flow, and PostgreSQL for structured analytics-ready storage.
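The stage flow above can be sketched end to end in plain Python. This is a minimal stand-in, not production code: `sqlite3` substitutes for PostgreSQL, hard-coded JSON messages substitute for a Kafka topic, and plain functions substitute for Airflow tasks; every name here is illustrative.

```python
# Minimal sketch of Source -> Processing -> Storage with stdlib stand-ins.
import json
import sqlite3

def extract() -> list[str]:
    # Source: in a real pipeline these messages would arrive via Kafka/RabbitMQ.
    return ['{"order_id": 1, "amount": 12.0}', '{"order_id": 2, "amount": 7.5}']

def transform(raw_messages: list[str]) -> list[tuple[int, float]]:
    # Processing: parse raw messages into analytics-ready rows.
    rows = []
    for msg in raw_messages:
        payload = json.loads(msg)
        rows.append((payload["order_id"], float(payload["amount"])))
    return rows

def load(rows: list[tuple[int, float]], conn: sqlite3.Connection) -> None:
    # Storage: idempotent write into a structured table (sqlite3 stands in
    # for PostgreSQL here).
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER PRIMARY KEY, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 19.5
```

In the real stack, `extract` becomes a consumer task, `transform` a processing task, and `load` a database write, with Airflow scheduling and retrying each step.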
| Aspect | Details |
|---|---|
| Problem | Build a complete food-ordering workflow with a clear menu, cart, and order-tracking experience |
| My Role | Developed the backend business flow, API integration, and relational schema handling across the ordering modules |
| Scale / Impact | Delivered a complete end-to-end product flow, built collaboratively and deployed for a realistic academic project |
| Tech | Java, Spring Boot, MySQL, REST API, Git/GitHub |
| Learning outcomes | Improved system design mindset, transaction-aware CRUD handling, and cross-layer collaboration |
🔗 Repository: phatle224/sgu_cnpm_foodfast
I'm open to internship opportunities and collaboration around data engineering, backend systems, and practical product-building projects.



