Deconstruction of Cinematic Visualization
The aim of the project was the deconstruction and abstracted recomposition of the imaginary space created when interpreting scenes from cinema, as well as a speculative study into the kinds of spatial experiences that can be generated. Looking through the lens of cinematography, we analyzed the color, light, and camera angles that compose a movie scene. An emotional scene is formed by an exquisitely crafted balance of multiple elements. What happens if we isolate one or a few of those elements? Through a series of experiments, we investigate these questions, translate the sensory aspects of a cinematic scene into an experiential virtual reality world, and explore the uncanny aesthetics generated along the way.
Academic: Design Project
Team: Yulia Marouda, Hesham Hattab
The simulation is inspired by the ambiguous, uncertain, ghostly, and mystic scenes of the movie "The Shining". The Colorado Lounge and the space around it in the Overlook Hotel were chosen as the simulation space. Through this experiment, we want to see what kind of space is generated by deconstructing a cinema scene. Can this method be developed into a syntax that can guide an AI (artificial intelligence) application to create virtual imaginary spaces from critically acclaimed films, translating a strong 2D motion picture into a real virtual experience?
Taking the scene from "The Shining" in which Danny encounters the two ghostly daughters, pixel sorting was applied to the frames of the scene to create a glitchy image from the hues of each frame. The idea is to analyze whether the colors of a thrilling scene, combined with its background score, can recreate the sensory experience of the original cinematic scene. The pixel-sorted frames depict each shift in the scene and in the characters' expressions, attempting to recapture both the motion and the stillness of the scene.
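The pixel-sorting pass described above can be sketched as follows. This is a minimal illustrative version, not the project's actual pipeline: the function name, the Rec. 709 luminance weights, and the brightness threshold are assumptions chosen for the example. Within each row, contiguous runs of pixels brighter than the threshold are re-sorted by luminance, which produces the characteristic streaked "glitch" look while keeping every pixel of the frame.

```python
import numpy as np

def pixel_sort_rows(frame, threshold=0.25):
    """Sort bright pixel runs of each row by luminance.

    frame: H x W x 3 float RGB array in [0, 1].
    Pixels whose luminance exceeds `threshold` are grouped into
    contiguous horizontal runs; each run is reordered from darkest
    to brightest, creating the streaked pixel-sorting aesthetic.
    """
    out = frame.copy()
    # Per-pixel luminance (Rec. 709 weights, an assumed choice).
    luma = frame @ np.array([0.2126, 0.7152, 0.0722])
    for y in range(frame.shape[0]):
        mask = (luma[y] > threshold).astype(np.int8)
        # Pad with zeros so diff marks run starts (+1) and ends (-1).
        edges = np.flatnonzero(np.diff(np.concatenate(([0], mask, [0]))))
        for start, stop in zip(edges[::2], edges[1::2]):
            order = np.argsort(luma[y, start:stop])
            out[y, start:stop] = frame[y, start:stop][order]
    return out
```

In a full pipeline each movie frame would be loaded as an array, sorted this way, and written back out as an image sequence; only the sorting step is shown here.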
In the same scene, the color pixels are grouped and extruded, negating black and white. These extruded frames are then aligned according to the movement across frames to create a motion particle simulation.
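The grouping-and-extrusion step can be sketched along these lines; the cutoff values, the choice of saturation as extrusion height, and the function name are assumptions for illustration, not the project's exact method. Near-black and near-white pixels are discarded ("negating black and white"), and every surviving pixel becomes a particle whose height is driven by its color saturation, turning a flat frame into a 3D point cloud that a particle simulation can animate.

```python
import numpy as np

def extrude_color_pixels(frame, black_cut=0.12, white_cut=0.92, z_scale=10.0):
    """Drop near-black/near-white pixels and extrude the rest.

    frame: H x W x 3 float RGB array in [0, 1].
    Returns an N x 3 array of (x, y, z) particle positions: each
    surviving pixel's height z is proportional to its saturation,
    so vivid hues rise above muted ones.
    """
    v = frame.max(axis=2)                # HSV value
    c = v - frame.min(axis=2)            # chroma
    near_black = v < black_cut
    near_white = (v > white_cut) & (c < 0.1)
    keep = ~(near_black | near_white)    # negate black and white
    # HSV saturation, guarding against division by zero.
    sat = np.divide(c, v, out=np.zeros_like(v), where=v > 0)
    ys, xs = np.nonzero(keep)
    z = sat[ys, xs] * z_scale
    return np.column_stack([xs, ys, z]).astype(float)
```

Running this per frame yields one point cloud per frame; aligning the clouds in sequence gives the particle motion described above.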