DTV Delivery and Other IIJ Streaming Technology Development

November 9, 2011
(Original Japanese article translated on October 3, 2012)

In the streaming field, the main challenge of the 1990s was how to implement and operate applications developed in the United States. Today, however, it is increasingly common for IIJ to develop its own applications, including a DTV delivery server and an independently-developed player. Here, we put these development projects in context and discuss future prospects.

The Move Towards the International Standardization of Video Compression/Decompression Technology

In the 1990s we dealt with applications such as StreamWorks, RealAudio (RealVideo), and VDOLive. These encoders, servers, and clients were designed as total solutions, and everything about them was proprietary, including the transfer protocols. In the Internet world of that era, however, it was generally accepted for each company to develop its own proprietary products.
A shift towards international standards began in the 2000s. One example of this was the establishment of the H.264 video compression/decompression technology as an international standard. H.264 is said to be twice as efficient as the MPEG2 format that had been widely used until then. This makes it possible to send content of the same quality using half the bandwidth, meaning that file sizes are also cut in half. Due to this efficiency, H.264 came to be used throughout the world and was applied to a wide range of fields, from 1Seg broadcasting with its tight bandwidth restrictions to Blu-ray Discs that had to support Full HD content.

The Advantage of Using File Conversion

When Apple announced HTTP Live Streaming (HLS), I thought it was a groundbreaking protocol typical of Apple. That is because, at the protocol level, it made no mention of quality control for video transmission, something that streaming vendors had previously taken great pains over. The proposal confidently proclaimed that video delivery over the Internet was possible using plain HTTP. Having assumed until then that quality had to be secured by the protocol itself, I found this a pleasant surprise.
HLS works by transferring MPEG2-TS files in segments to the client via HTTP. This segmented nature is its strength, as it makes progressive download possible: a file can be played back while it is still being downloaded. Because the approach is purely file-oriented, there is essentially no distinction between live and VoD (Video on Demand) delivery, and the segmented files are easy to reuse. The fact that HTTP is used without any extensions is another well-conceived point. Because no special server is needed for HLS delivery, we use general-purpose software such as Apache or Squid.
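The mechanics described above can be illustrated with the playlist file at the heart of HLS. A minimal sketch follows, assuming a simple VoD-style M3U8 playlist (the segment names are hypothetical); a client reads the playlist, then fetches each listed MPEG2-TS segment in order over plain HTTP.

```python
# A minimal sketch of how an HLS client reads a media playlist (M3U8).
# The playlist lists MPEG2-TS segment files with their durations; the
# client then downloads each URI in order over ordinary HTTP.

def parse_media_playlist(text):
    """Return (target_duration, [(duration_seconds, uri), ...])."""
    target = None
    segments = []
    pending = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXT-X-TARGETDURATION:"):
            target = int(line.split(":", 1)[1])
        elif line.startswith("#EXTINF:"):
            # Format is "#EXTINF:<duration>,<optional title>".
            pending = float(line.split(":", 1)[1].split(",", 1)[0])
        elif line and not line.startswith("#"):
            # A non-comment line is a segment URI for the pending EXTINF.
            segments.append((pending, line))
            pending = None
    return target, segments

# A hypothetical two-segment VoD playlist.
playlist = """#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10,
segment0.ts
#EXTINF:10,
segment1.ts
#EXT-X-ENDLIST
"""

target, segs = parse_media_playlist(playlist)
print(target, segs)  # 10 [(10.0, 'segment0.ts'), (10.0, 'segment1.ts')]
```

For live delivery the client simply re-fetches the playlist periodically as new segments are appended, which is why the same mechanism covers both live and VoD.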

Apple HTTP Live Streaming

Development of Server Technology for Cloud-Based Format Conversion

In light of these trends, IIJ has now begun development of related applications centered on MPEG technology, a compression method for video and audio data.
Between 2007 and 2008 we developed the "mod_ntv" DTV delivery server for acTVila, but at that time we had not yet fully grasped the inner workings of MPEG technology, and were not able to tie the client-server system together well, resulting in issues such as frames being dropped that should have been played back. Building on this experience, we later began development of video transmission/delivery systems including MPEG.
We are currently working on server technology for saving an MPEG2-TS stream transmitted from a hardware encoder to the cloud as a file, and converting its format. MPEG2-TS is a widely used standard in the world of communications, but it cannot be viewed as-is on iPhone or PC players. This means that conversion technology is required. The targets for conversion include the compression and expansion method used for the video and audio signals, as well as the "container" format. This makes it possible to deliver content to match the streaming method required by users regardless of the hardware encoder format.
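To show concretely what such a server handles, here is a hedged sketch of the very first step in processing an incoming MPEG2-TS stream: splitting the byte stream into fixed 188-byte transport packets and reading the 13-bit PID that identifies which elementary stream (video, audio, tables) each packet carries. This is only an illustration of the TS container layout, not IIJ's actual implementation; the sample packets are fabricated for the example.

```python
# Sketch: the first step of any TS-handling server is demultiplexing.
# Each MPEG2-TS packet is 188 bytes, starts with sync byte 0x47, and
# carries a 13-bit PID identifying the elementary stream it belongs to.

TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def iter_ts_pids(data):
    """Yield the PID of each 188-byte transport packet in `data`."""
    for off in range(0, len(data) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        pkt = data[off:off + TS_PACKET_SIZE]
        if pkt[0] != SYNC_BYTE:
            raise ValueError("lost sync at offset %d" % off)
        # The PID is the low 5 bits of byte 1 plus all 8 bits of byte 2.
        yield ((pkt[1] & 0x1F) << 8) | pkt[2]

# Two fabricated packets: PID 0x0000 (the PAT) and PID 0x0100
# (hypothetically, a video stream), each padded out to 188 bytes.
pat = bytes([0x47, 0x40, 0x00, 0x10]) + bytes(184)
vid = bytes([0x47, 0x41, 0x00, 0x10]) + bytes(184)
print(list(iter_ts_pids(pat + vid)))  # [0, 256]
```

Conversion to a container that an iPhone or PC player accepts amounts to demultiplexing packets like these by PID, then rewrapping (and, if the codec also differs, re-encoding) the elementary streams.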
However, encoders developed for streaming in the past were extremely expensive, difficult to use, and required engineers to monitor them on-site. We decided to adopt a hardware encoder to reduce the associated costs. The hardware encoder is an appliance for compressing the input audio/video signals and transmitting them over Internet protocol, and features SDI (a professional-spec video interface), which is used extensively by broadcasters. Using an encoder like this makes it possible to drastically reduce on-site operating costs. It also enables power savings and reduced installation space, allowing for quick rollout and removal when broadcasting events, and the high reliability of dedicated hardware means it can be used with peace of mind.
We also began development of technology for editing this video footage online. In the past, hardware encoders such as these were used in combination with hardware decoders. Encoders and decoders were treated as a kind of media converter from the perspective of video, merely serving as a tunnel for the video signal. After the video was extracted, editing work was carried out on a conventional non-linear editing machine. We propose a change to a web-based workflow. This system has the following benefits.

  1. Clips can be created more simply than on a non-linear editing machine, allowing for large-scale production.
  2. Cloud-based editing makes it easy to secure resources temporarily. Especially large volumes of disk space can be secured cheaply and swiftly.
  3. Encoder portability, operability, and reliability are improved.
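Benefit 1 above follows directly from segmented delivery: a rough cut can be made just by selecting whole segments, with no re-encoding. The sketch below illustrates the idea under the simplifying assumption (ours, not the article's) that all segments have a fixed duration.

```python
# Sketch: clip creation by segment selection. Assuming fixed-length
# segments (a simplification for illustration), a web-based editor can
# map an in/out point in seconds to a range of segment indices and
# build the clip by concatenating those segments, with no re-encoding.

SEGMENT_SECONDS = 10  # assumed fixed segment duration

def segments_for_clip(in_point, out_point, segment_seconds=SEGMENT_SECONDS):
    """Return the indices of segments covering [in_point, out_point) seconds."""
    if out_point <= in_point:
        raise ValueError("out point must come after in point")
    first = int(in_point // segment_seconds)
    last = int((out_point - 1e-9) // segment_seconds)
    return list(range(first, last + 1))

# A clip from 0:25 to 0:47 needs segments 2, 3, and 4.
print(segments_for_clip(25, 47))  # [2, 3, 4]
```

Frame-accurate trimming of the first and last segments would still require decoding, but the bulk of the clip is assembled by pure file copying, which is what makes large-scale, cloud-based production practical.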

We are also working to expand the playback infrastructure for HLS. Apple developed HLS with delivery to devices such as the iPhone and iPad in mind, so on computers it is only supported by the QuickTime Player on Mac OS X. We solved the problem of playback not being possible on Windows PCs or Android devices by implementing a program that interprets the HLS format inside Flash Player. This independently-developed player enables playback on any platform that can run Flash Player, without any server-side conversion of the HLS format. We believe this is a true realization of single-source, multi-use.

Bunji Yamamoto

Author Profile

Bunji Yamamoto

Digital Content Delivery Section, Application Service Department, Service Division, IIJ
Mr. Yamamoto joined IIJ Media Communications in 1995. He has worked at IIJ since 2005. He is chiefly engaged in the development of streaming technology. He also contributes to market development through activities such as hosting the Streams-JP Mailing List for discussion of this technology.
