By Jesse Michelsen, Ad Proxy Technical Lead, Verizon Media –
OTT enables broadcasters and content creators to go well beyond the traditional linear TV experience, making it possible to personalize video streams for every viewer. By modifying the raw stream server-side before sending it to the player, providers can create a custom experience for each viewer, including tailored content and programming and highly targeted advertising. The result is stronger viewer engagement and higher CPMs.
Another compelling aspect of ad-supported streams is that, unlike traditional linear broadcasts, they can provide advertisers, content creators and distributors with access to a wide range of data about their viewers.
Brands can run multi-screen ad campaigns that can be bought programmatically. Media companies can glean better data on who their users are and what shows they’re watching, informing programming decisions.
OTT is everything the industry has been promising for some time but has largely been unable to deliver.
The gap stems from challenges around ad sourcing, playback, and verification. Many standards for OTT advertising are nascent, fragmented, and still evolving. In-depth debugging tools and quality-of-service (QoS) analysis are often limited, creating uncertainty about the viewer’s quality of experience (QoE) and reducing advertisers’ confidence.
One approach to regaining advertisers’ trust is to place tracking pixels in an ad so providers can track when and where the ad has run. While the technology is sound in theory, it can be problematic with server-side ad insertion because of a lack of standards across devices and connected TVs. Also, some publishers and device manufacturers intentionally mask key parameters in an attempt to prevent data leakage.
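To make the mechanics concrete: with server-side insertion, it is the server rather than the client that expands macros in a pixel URL before firing it. The sketch below is a minimal, hypothetical Python version; the bracketed macro names follow the VAST token convention, and the URL and function name are illustrative, not any vendor’s actual API:

```python
import random
import time

def expand_pixel_macros(pixel_url):
    """Substitute common tracking-pixel macros server-side before firing.

    The two macros handled here ([TIMESTAMP], [CACHEBUSTING]) are
    illustrative; a real implementation would cover the full macro set
    its ad servers require.
    """
    return (pixel_url
            .replace("[TIMESTAMP]", str(int(time.time())))
            .replace("[CACHEBUSTING]", str(random.randrange(10**8))))
```

Because the server fills in these values, it has no access to client-side signals (device identifiers, viewability) unless the player reports them separately, which is exactly where the cross-device standards gap bites.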
Another approach, which has far fewer dependencies, is for the servers responsible for assembling the personalized stream and obtaining and inserting ads to also capture and store data about ad performance and viewership. This data can then be used to evaluate QoS and derive useful insights into QoE and its impact on viewer retention.
The diagram above gives an overview of how such a system works. As the workflow starts, the player makes requests to a content server until it has enough information to request ads from an ad decision server (ADS) such as FreeWheel or Google Ad Manager. Once that happens, a dedicated ad proxy server is responsible for fetching and validating each ad asset and inserting it into the stream. Critically, the ad proxy server also stores all the data it captures about the performance of the ADS and the ads themselves in a central database, along with beacon information from clients.
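The fetch-and-measure step the ad proxy performs can be sketched in a few lines of Python. This is a simplified illustration under assumed names (the function, record fields, and timeout are made up, not the actual implementation described in the article):

```python
import time
import urllib.request

def fetch_ad_response(ads_url, timeout=2.0):
    """Request an ad pod from the ADS and capture basic QoS metrics.

    Returns the raw response body plus a telemetry record suitable for
    writing to a central database alongside client beacon data.
    """
    record = {"url": ads_url, "error": None}
    start = time.monotonic()
    try:
        with urllib.request.urlopen(ads_url, timeout=timeout) as resp:
            body = resp.read()
            record["status"] = getattr(resp, "status", None)
            record["response_bytes"] = len(body)
    except Exception as exc:  # connection errors, timeouts, etc.
        record["error"] = str(exc)
        body = b""
    record["response_ms"] = round((time.monotonic() - start) * 1000, 1)
    return body, record
```

Capturing the timing and error data at this single choke point is what lets the proxy report on every demand source without any cooperation from the player or device.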
Deliver as promised
A significant challenge is that many of the ads on a given server are just wrappers pointing to the actual ads on a different server, and you can have wrappers pointing to other wrappers. Some wrappers never resolve to an actual ad asset. For live streams, in particular, such “waterfalling” can increase latency and result in a degraded viewership experience, not to mention a lost revenue opportunity.
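A common defense against unbounded waterfalling is to cap how many wrapper redirects the proxy will follow. The sketch below, assuming VAST-style `Wrapper` and `InLine` elements, resolves a wrapper chain up to a fixed depth; the depth limit and function names are illustrative:

```python
import xml.etree.ElementTree as ET

MAX_WRAPPER_DEPTH = 5  # cap "waterfalling" to bound latency

def resolve_vast(xml_text, fetch, depth=0):
    """Follow VAST wrapper redirects until an inline ad is found,
    the chain dead-ends, or the depth cap is hit.

    `fetch` is a callable mapping a URL to the VAST XML at that URL;
    it is injected so the resolver can be exercised without a network.
    Returns (inline_ad_element_or_None, wrappers_followed).
    """
    if depth > MAX_WRAPPER_DEPTH:
        return None, depth
    root = ET.fromstring(xml_text)
    inline = root.find(".//InLine")
    if inline is not None:
        return inline, depth
    tag_uri = root.find(".//Wrapper/VASTAdTagURI")
    if tag_uri is None or not (tag_uri.text or "").strip():
        return None, depth  # a wrapper that never resolves to an ad
    return resolve_vast(fetch(tag_uri.text.strip()), fetch, depth + 1)
```

Recording the wrapper count per request (rather than just success or failure) is what makes it possible to spot demand sources whose chains are technically resolvable but too slow for live streams.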
Data collected during the ad insertion process includes the raw request and response data from ad servers, response times and sizes, the number of ads returned, ad pod location, total wrappers or redirects, as well as details about errors (e.g., parsing failures or connection errors), ad provider warnings or request failures.
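One record of that telemetry might be modeled as a simple data structure. The field names below are illustrative, not a published schema:

```python
from dataclasses import dataclass, field

@dataclass
class AdInsertionRecord:
    """One row of ad-workflow telemetry, mirroring the fields above."""
    request_url: str            # raw ADS request
    response_ms: float          # ADS response time
    response_bytes: int         # ADS response size
    ads_returned: int           # number of ads in the returned pod
    pod_position: str           # e.g. "preroll", "midroll-2"
    wrapper_redirects: int      # total wrappers/redirects followed
    errors: list = field(default_factory=list)    # parsing/connection errors
    warnings: list = field(default_factory=list)  # ad-provider warnings
```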
With this ad workflow data — available in near real time — publishers can quickly identify and resolve demand sources that don’t result in ads served and take well-informed steps to optimize the viewing experience. Further, by joining ad performance data with session data and appropriate metadata, they can gain contextual insights into how the performance of ads and content relates to QoE, and ultimately viewer retention.
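The join described above can be as simple as keying ad records and session records on a shared session ID. A minimal sketch, assuming hypothetical record shapes:

```python
def join_ads_with_sessions(ad_records, sessions):
    """Attach session/QoE context to each ad record by session ID.

    Assumed shapes: each ad record carries a "session_id" plus ad
    metrics; each session dict maps session metadata for one viewer.
    """
    by_id = {s["session_id"]: s for s in sessions}
    joined = []
    for rec in ad_records:
        session = by_id.get(rec["session_id"], {})
        # Prefix session fields so they never collide with ad metrics.
        joined.append({**rec, **{f"session_{k}": v
                                 for k, v in session.items()
                                 if k != "session_id"}})
    return joined
```

In practice this join would run in a warehouse or stream processor rather than in application code, but the principle is the same: once both datasets share a key, ad performance can be analyzed in the context of device, network, and viewing behavior.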
Knowledge is power. Visibility into an otherwise opaque workflow like server-side ad insertion enables publishers to optimize streams by troubleshooting with near-real-time data, then analyzing data in batches to identify trends that answer business questions, inform strategy, and increase revenue.