PHD Belgium provides media planning and buying across all online and offline media. To deliver value to their customers, they need to access and analyse massive amounts of data, most of which is scattered across several external services and ad-buying platforms. Bringing that data together is where we help them.
For different purposes we developed several ETL data pipelines that transform or validate the data as needed. The result is a robust system that handles concurrent workloads and combines retry mechanisms, temporization strategies and validation algorithms to align the data from all sources.
We also run a scraping system that validates ad tagging on a major client's website. It detects regressions and reports them through daily status updates.
Instead of manually gathering data via exports, PHD now has fully automated ETL pipelines in use that allow them to report on most of their data and explore running campaigns and projects.
Phoenix/Elixir, PostgreSQL, Ansible, Docker, Amazon (Lambda, EC2, RDS, S3)

Visit PHD Belgium