End-to-End QA Infrastructure — Built from Zero
Built the entire testing infrastructure from scratch as sole QA engineer for a high-traffic property platform.
Skyloov Property Portal
The Problem
Skyloov was shipping high-risk releases with no test coverage and no performance data, on a platform serving 120K+ property listings. Bugs reached production users regularly, there was no definition of what "good performance" looked like, and there was no way to catch regressions between releases. I was hired as the only QA engineer on a ~12-person engineering team to change this, starting from day one.
Testing Pipeline Architecture
Built two parallel testing tracks that ran every release cycle. Functional track: Postman collections covering 500+ backend endpoints, executed automatically via Newman in Jenkins CI. Performance track: Gatling simulating 1,000 concurrent users across 200+ APIs with p95 latency, TPS, and error rate metrics exported to Grafana dashboards. For the first time, the team had a performance baseline they could compare against after every deployment.
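The performance metrics exported to Grafana can be reduced to three numbers per run. As a minimal Python sketch of how p95 latency, TPS, and error rate fall out of per-request records (the field names `latency_ms` and `ok` are illustrative, not Gatling's actual schema):

```python
# Sketch: compute p95 latency, TPS, and error rate from per-request
# records like those a load-test run produces. Field names are
# illustrative; a real Gatling run logs its own result format.
import math

def p95(latencies_ms):
    """95th-percentile latency via the nearest-rank method."""
    ordered = sorted(latencies_ms)
    rank = math.ceil(0.95 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

def summarize(requests, window_s):
    """requests: list of {'latency_ms': float, 'ok': bool} dicts;
    window_s: duration of the measurement window in seconds."""
    total = len(requests)
    errors = sum(1 for r in requests if not r["ok"])
    return {
        "p95_latency_ms": p95([r["latency_ms"] for r in requests]),
        "tps": total / window_s,
        "error_rate": errors / total,
    }
```

A run of 100 requests over a 10-second window yields a TPS of 10.0, and the p95 figure is the latency below which 95% of requests complete; these are the values a dashboard would plot per deployment.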
Business Impact
- 500+ endpoint regression suite runs on every deployment, catching regressions before they reach users of the 120K+ listing platform
- Platform launched successfully across 3 apps (web, iOS, Android), with pass/fail gates established for every release
- First-ever performance baselines defined: p95 latency, maximum throughput, and error-rate thresholds enforced per release
- AI-powered property search chatbot fully validated across search, support, and lead generation flows before launch
- Elasticsearch relevance and Kafka async event processing validated under real workload conditions
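The pass/fail gates above amount to comparing each release candidate's metrics against the recorded baseline. A minimal sketch, where the threshold values and metric names are illustrative rather than Skyloov's actual limits:

```python
# Sketch of a release gate: fail the release if any metric regresses
# past its baseline threshold. Values below are illustrative only.
BASELINE = {
    "p95_latency_ms": 800.0,   # must not exceed
    "min_tps": 120.0,          # must not fall below
    "max_error_rate": 0.01,    # must not exceed
}

def gate(metrics):
    """Return (passed, violations) for one release candidate's metrics."""
    violations = []
    if metrics["p95_latency_ms"] > BASELINE["p95_latency_ms"]:
        violations.append("p95 latency above baseline")
    if metrics["tps"] < BASELINE["min_tps"]:
        violations.append("throughput below baseline")
    if metrics["error_rate"] > BASELINE["max_error_rate"]:
        violations.append("error rate above baseline")
    return (not violations, violations)
```

Wired into CI, a non-empty violation list blocks the deployment, which is what turns a dashboard number into an enforceable release gate.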