Processing Large Media Objects

Rainer Schmidt and Matthias Rella:
An Approach for Processing Large and Non-Uniform Media Objects on MapReduce-based Clusters.
Lecture Notes in Computer Science, 2011, Volume 7008/2011, 172-181.
DOI: 10.1007/978-3-642-24826-9_23

Abstract
Cloud computing enables us to create applications that take advantage of large computer infrastructures on demand. Data-intensive computing frameworks leverage these technologies in order to generate and process large data sets on clusters of virtualized computers. In this context, MapReduce provides a highly scalable programming model that has proven widely applicable for processing structured data. In this paper, we present an approach and implementation that utilizes this model for the processing of audiovisual content. The application can analyze and modify large audiovisual files using multiple compute nodes in parallel, and is thereby able to dramatically reduce processing times. The paper discusses the programming model and its application to binary data. Moreover, we summarize key concepts of the implementation and provide a brief evaluation.
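The core idea the abstract describes, splitting a large binary file into chunks, mapping each chunk to a partial result, and reducing the partials into one, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the chunk size and the byte-histogram "analysis" are stand-in assumptions for whatever media analysis the paper's system performs.

```python
# Sketch of the MapReduce pattern over a binary file: split the file
# into fixed-size chunks (analogous to input splits), map each chunk
# to a partial result, then reduce the partials into one result.
# The histogram analysis is purely illustrative, not from the paper.
import os
import tempfile
from collections import Counter
from functools import reduce

CHUNK_SIZE = 4  # tiny for demonstration; real splits would be megabytes

def split(path, chunk_size=CHUNK_SIZE):
    """Yield byte chunks of the file, analogous to input splits."""
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            yield chunk

def map_chunk(chunk):
    """Map step: per-chunk byte-value histogram (stand-in for media analysis)."""
    return Counter(chunk)

def reduce_counts(a, b):
    """Reduce step: merge two partial histograms."""
    a.update(b)
    return a

# Usage: write a small binary file, map over its splits, fold with reduce.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(bytes([1, 2, 2, 3]) * 3)  # 12 bytes -> three 4-byte chunks
    path = tmp.name

partials = [map_chunk(c) for c in split(path)]
total = reduce(reduce_counts, partials, Counter())
os.unlink(path)
print(total[2])  # byte value 2 occurs 6 times
```

In a real deployment each `map_chunk` call would run on a different node; the hard part the paper addresses is that audiovisual formats are non-uniform, so splits cannot simply be cut at fixed byte boundaries as done here.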

Download: link
