eCDN Considerations for Streaming behind and beyond the Firewall
Reaching a large audience with a live webcast of an executive speech, product announcement, or shareholders’ meeting isn’t new. Enterprises have been doing it for years. However, such video is now an expectation. High-quality video streaming—from Netflix and Hulu to Periscope and Facebook Live—has become part of our daily lives. Increasingly, no matter our device or bandwidth, we can watch live streams anywhere. And these trends in the consumer market have raised expectations inside corporate firewalls. CIOs everywhere are being challenged to step up their game to use streaming as a tool to improve internal communications.
Distributing live streaming content in an enterprise can be a challenge because you want to keep it behind the firewall across work sites while also reaching mobile workers, shareholders, customers, etc. How can CIOs implement a streamlined enterprise content delivery network (CDN) architecture to deliver one-to-many live streaming to the right places, both behind and beyond the firewall, while efficiently managing bandwidth?
Ensuring that the network engineers and the streaming team are closely aligned is an important first step. Before you start a deployment, you’ll want to scope out the likely size and locations of your streaming audience, and the average streaming bitrates you expect to deliver. You can then model the impact of streaming to the full audience over your current network topology, assuming one or more centralized streaming media servers delivering across your network via unicast.
Unicast streaming is the traditional method of delivering audio and video over an IP network, in which there is a one-to-one connection between each viewer and your streaming server. An organization relying only on standard unicast streaming may overload its streaming media server and saturate segments of its local network.
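As a back-of-the-envelope sketch of that modeling exercise (the viewer count and bitrate below are hypothetical, not figures from any particular deployment), unicast load grows linearly with audience size:

```python
# Rough unicast sizing model (illustrative numbers only).
viewers = 2000            # expected concurrent viewers
bitrate_mbps = 2.5        # average stream bitrate in Mbps

# Each viewer holds a one-to-one connection to the origin server,
# so server egress scales linearly with audience size.
server_egress_mbps = viewers * bitrate_mbps
print(f"Origin egress: {server_egress_mbps / 1000:.1f} Gbps")  # Origin egress: 5.0 Gbps
```

Even a modest audience at a moderate bitrate can exceed the capacity of a single server uplink or WAN segment, which is exactly the problem the distribution models discussed here are designed to avoid.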
From there, you can map out the streaming or network upgrade plans that best meet your business requirements. In the meantime, if you need to get started with streaming right away and your current topology is not up to the task, you might see whether you can start by streaming to a smaller audience, constraining video to lower bitrates, or offloading delivery for some audience locations to cloud-based streaming.
Assuming you’ll need to change your streaming architecture, common CDN technologies for streaming include multicast, peer-to-peer, origin-edge streaming servers, HTTP caching, cloud streaming, and a hybrid of these:
Multicast

IP multicast streaming, much like an old-fashioned TV tower, sends out one live signal on a network that any viewer can “tune” into. On any LAN or WAN segment, you’ll only ever have one stream from any given event crossing it, greatly reducing the load on your network as compared with unicast.
Multicast requires that all of your network routers support multicast, which may require either a simple configuration change or a multimillion-dollar hardware upgrade. Further, multicast often does not satisfactorily address Wi-Fi streaming bottlenecks, so organizations sometimes block multicast streaming over Wi-Fi and have Wi-Fi users roll over to unicast.
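The per-segment saving can be sketched with illustrative numbers (the viewer count and bitrate here are hypothetical):

```python
# Load on one WAN segment for a single live event
# (viewer count and bitrate are hypothetical).
viewers_on_segment = 300
bitrate_mbps = 2.5

unicast_mbps = viewers_on_segment * bitrate_mbps  # one stream per viewer
multicast_mbps = 1 * bitrate_mbps                 # one shared stream, total

print(unicast_mbps, multicast_mbps)  # 750.0 2.5
```

The segment carries the same 2.5 Mbps whether 1 or 10,000 viewers tune in, which is why multicast scales so well on wired networks that support it.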
Peer-to-Peer Delivery

Peer-to-peer (P2P) delivery, sometimes also referred to as peer acceleration, grid delivery, mesh networking, or application multicast, shares a live stream with a few of your colleagues, who in turn share the stream with others. When designed properly, this offloads more than 80 percent of the network traffic from major segments and onto local routers.
Unfortunately, peer-to-peer streaming got a bad reputation among network administrators, both for the language it was often based on (Java) and the association with applications such as Napster and BitTorrent. Fortunately, new HTML5-based P2P technologies eliminate the need for Java and other installed client code. In addition, because the P2P algorithms require a deep understanding of conditions at each client device, using P2P means you get built-in, detailed viewing analytics.
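A toy fan-out model shows where an offload figure in that range can come from (the audience size and branching factor below are hypothetical, and real P2P algorithms adapt these dynamically per client):

```python
# Toy P2P fan-out model (hypothetical parameters).
viewers = 100
peers_served_per_source = 5   # each receiving client re-shares to 5 peers

# Roughly one "seed" pull from the server side per fan-out group;
# everyone else receives the stream from a nearby peer.
origin_pulls = -(-viewers // peers_served_per_source)  # ceiling division
offload_fraction = 1 - origin_pulls / viewers
print(f"{offload_fraction:.0%} of traffic offloaded from the origin")
```

In practice the offload achieved depends on how many peers sit on the same subnet and how well the algorithm keeps exchanges local.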
Origin-Edge Streaming Servers
Tiered servers offload streaming from the origin to the edge by fanning live streams out to servers in each heavily populated subnet throughout the organization's network. As with P2P, this keeps the traffic off major segments and largely confined to local subnet routers. Edge servers can often cache on-demand content as well, allowing you to offload that traffic too.
The biggest challenge is usually the high cost and management burden of supporting a large deployment of new streaming media edge servers. However, it is usually possible to control the edge servers through a central management console, or with automated content-delivery rules and monitoring. These edge streaming servers often put little load on the server hardware, especially for live events, and can sometimes be piggybacked on an existing storage, email, or caching server.
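The origin-side saving from tiering can be sketched the same way (the site layout and bitrate below are hypothetical):

```python
# Origin-to-edge fan-out model (hypothetical site layout).
sites = {"hq": 800, "plant_a": 250, "plant_b": 150}  # viewers per subnet
bitrate_mbps = 2.5

# Pure unicast: the origin serves every viewer directly.
unicast_origin_mbps = sum(sites.values()) * bitrate_mbps

# Origin-edge: the origin sends one stream per edge server,
# and each edge server fans out to its own subnet.
edge_origin_mbps = len(sites) * bitrate_mbps

print(unicast_origin_mbps, edge_origin_mbps)  # 3000.0 7.5
```

The origin's load now scales with the number of sites rather than the number of viewers, which is why edge servers for live events can run on modest hardware.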
HTTP Caching

Similar to tiered streaming servers, HTTP caching servers are often distributed in each heavily populated subnet to cache frequently used web content. If you adopt HTTP streaming formats, such as Apple HLS or MPEG-DASH, you may be able to use existing or new HTTP caches to intelligently distribute not only web content, but also your live and on-demand streaming content.
If you also deploy HTTP caches at your Internet gateways, HTTP streaming may give you another option: moving your origin servers to the cloud. While streaming directly to each user inside the firewall from an external source would normally be bandwidth-prohibitive, making use of HTTP caching where the streams enter your LAN almost eliminates the streaming of duplicate content across your Internet connection.
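As an illustration, a gateway cache for HLS might look something like the following nginx sketch. The origin hostname, cache sizes, and TTLs are placeholders, and a real deployment would need tuning for your playlist interval and segment length:

```nginx
# Hypothetical HLS cache at an Internet gateway (sketch, not a drop-in
# config). "cloud-origin.example.com" is a placeholder origin hostname.
proxy_cache_path /var/cache/hls levels=1:2 keys_zone=hls:10m max_size=1g;

server {
    listen 80;

    # Live playlists change every few seconds, so cache them only briefly.
    location ~ \.m3u8$ {
        proxy_pass https://cloud-origin.example.com;
        proxy_cache hls;
        proxy_cache_valid 200 2s;
    }

    # Media segments are immutable once written, so cache them longer.
    # Each segment then crosses the Internet link once, no matter how
    # many internal viewers request it.
    location ~ \.ts$ {
        proxy_pass https://cloud-origin.example.com;
        proxy_cache hls;
        proxy_cache_valid 200 10m;
    }
}
```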
Cloud Streaming

Cloud streaming, whether using DIY virtual machines or a streaming service provider, eliminates the CapEx of streaming server purchases, server facilities, and other cost burdens. It replaces them with a predictable yet elastic OpEx model of delivery that offloads the heavy stream processing and hardware maintenance from your infrastructure.
Hybrid Models

While any one of the models above might enable your organization to confidently embrace high-quality streaming, sometimes the best solution is a hybrid. Within a large main campus, you might receive streams from cloud-based origin servers at an on-premises media server, then deliver streams from there via multicast to desktops and use origin-edge servers to support viewers on Wi-Fi. You might use HTTP caching at the Internet gateway of your larger field offices to reduce incoming stream duplication, employ P2P in small offices, and deliver unicast to reach your distributed mobile workers.
Whatever your network challenges and constraints, there is a wide range of streaming distribution models available to address them. Many streaming technology vendors and consultants have deep experience building out robust enterprise CDN solutions that can let you offer high-quality streaming to all of the stakeholders in your organization.