Real-Time Music-Driven Movie Design Framework

CCS Concepts: Computer systems organization --> Real-time operating systems; Applied computing --> Sound and music computing; Applied computing --> Media arts
Description: Cutting to music is a widely used stylistic device in filmmaking. The usual process involves an editor manually adjusting the movie's sequences according to the beat or other musical features. However, with today's movie productions starting to leverage real-time systems, this manual effort can be reduced: automatic cameras can make decisions on their own according to pre-defined rules, even in real time. In this paper, we present an approach to automatically create a music video. We have realised its implementation as a coding framework integrating with the FMOD API and Unreal Engine 4. The framework provides the means to analyze a music stream at runtime and to translate the extracted features into an animation storyline, supported by cinematic cutting. We demonstrate its workings by means of an instance of an artistic, music-driven movie.
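The abstract does not detail the runtime analysis, but as a rough illustration of the kind of pipeline it describes, the following minimal C++ sketch polls an FMOD FFT DSP each frame and fires a cut callback when low-frequency energy spikes above its running average. This is an assumption about one possible realisation, not the authors' framework; the class name MusicCutDriver, the onCut callback, and the threshold values are hypothetical.

```cpp
// Minimal sketch (not the authors' implementation): poll FMOD's FFT DSP each
// frame and fire a cut callback when low-frequency energy spikes above its
// running average -- a crude beat/onset heuristic.
#include <fmod.hpp>
#include <functional>

// Hypothetical hook a game engine would bind to a camera switch.
using CutCallback = std::function<void()>;

class MusicCutDriver {  // hypothetical name, for illustration only
public:
    bool Init(const char* trackPath) {
        FMOD::System_Create(&system);
        system->init(64, FMOD_INIT_NORMAL, nullptr);
        system->createSound(trackPath, FMOD_CREATESTREAM, nullptr, &sound);
        system->playSound(sound, nullptr, false, &channel);
        // Attach an FFT DSP so the live spectrum of the stream can be read.
        system->createDSPByType(FMOD_DSP_TYPE_FFT, &fft);
        return channel->addDSP(0, fft) == FMOD_OK;
    }

    // Call once per engine tick.
    void Update(const CutCallback& onCut) {
        system->update();
        FMOD_DSP_PARAMETER_FFT* data = nullptr;
        fft->getParameterData(FMOD_DSP_FFT_SPECTRUMDATA,
                              (void**)&data, nullptr, nullptr, 0);
        if (!data || data->numchannels == 0 || data->length == 0) return;

        // Sum energy in the lowest bins (rough bass/kick band) of channel 0.
        float energy = 0.0f;
        const int bins = data->length / 16;
        for (int i = 0; i < bins; ++i) energy += data->spectrum[0][i];

        // Onset heuristic: energy well above its running average -> cut.
        if (average > 0.0f && energy > threshold * average) onCut();
        average = 0.9f * average + 0.1f * energy;  // exponential moving average
    }

private:
    FMOD::System*  system  = nullptr;
    FMOD::Sound*   sound   = nullptr;
    FMOD::Channel* channel = nullptr;
    FMOD::DSP*     fft     = nullptr;
    float average   = 0.0f;
    float threshold = 1.8f;  // assumed sensitivity; would need tuning per track
};
```

In an Unreal Engine 4 project, Update() would typically be called from an Actor's Tick, with onCut bound to whatever switches the active cinematic camera.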
Film Editing and Directing
Sarah Hofmann, Maximilian Seeger, Henning Rogge-Pott, and Sebastian von Mammen
DOI: 10.2312/wiced.20221052
Access URL: https://explore.openaire.eu/search/publication?articleId=doi_________::a0fa9806b283961655103df172a5c49e
Accession number: edsair.doi...........a0fa9806b283961655103df172a5c49e
Author: Hofmann, Sarah, Seeger, Maximilian, Rogge-Pott, Henning, von Mammen, Sebastian
Year of publication: 2022
Subject: Film Editing and Directing
Database: OpenAIRE