Multi-Scale Auralization for Multimedia Analytical Feature Interaction
Modern human-computer interaction systems employ multiple perceptual dimensions to improve the user's situational awareness and thereby enhance intuition and efficiency. A signal processing and interaction framework is proposed that auralizes signal patterns to augment visualization-focused analysis tasks in social media content analysis and annotation, with the goal of assisting the user in analyzing, retrieving, and organizing information relevant to marketing research. In this auralization framework, audio signals are generated from video/audio signal patterns, for example by frequency-modulating an audio carrier so that it follows the magnitude contour of video color saturation. Integrating visual and aural presentations benefits user interaction by reducing fatigue and sharpening the user's sensitivity, thereby improving work efficiency, confidence, and satisfaction.
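The abstract does not specify the exact mapping, but the idea of letting an audio frequency modulation follow a color-saturation contour can be illustrated with a minimal sketch. The snippet below is an assumption-laden example, not the authors' implementation: it assumes a per-frame mean saturation contour (e.g., the averaged S channel of HSV frames) and uses an illustrative function name and carrier/deviation parameters.

```python
import numpy as np

def saturation_to_fm_audio(saturation, video_fps=30.0, sample_rate=44100,
                           carrier_hz=440.0, freq_deviation_hz=200.0):
    """Hypothetical sketch: map a per-frame color-saturation contour
    to a frequency-modulated audio signal."""
    saturation = np.asarray(saturation, dtype=float)

    # Normalize the contour to [0, 1] so the frequency deviation stays bounded.
    s_min, s_max = saturation.min(), saturation.max()
    contour = (saturation - s_min) / (s_max - s_min + 1e-12)

    # Upsample the frame-rate contour to the audio sample rate.
    n_samples = int(len(saturation) / video_fps * sample_rate)
    frame_times = np.arange(len(saturation)) / video_fps
    audio_times = np.arange(n_samples) / sample_rate
    contour_audio = np.interp(audio_times, frame_times, contour)

    # Frequency modulation: the instantaneous frequency follows the contour.
    inst_freq = carrier_hz + freq_deviation_hz * contour_audio
    phase = 2.0 * np.pi * np.cumsum(inst_freq) / sample_rate
    return np.sin(phase)

# Example: a slowly rising saturation contour yields a rising pitch glide.
if __name__ == "__main__":
    fake_saturation = np.linspace(0.2, 0.8, 300)  # 10 s of video at 30 fps
    audio = saturation_to_fm_audio(fake_saturation)
```

In this sketch, higher video saturation raises the instantaneous pitch of the carrier, so temporal patterns in the visual feature become audible contours the analyst can monitor alongside the visualization.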
Le Thanh Nguyen, N., Lee, H., Johnson, J., Ogihara, M., Ren, G., & Beauchamp, J. W. (2019). Multi-scale auralization for multimedia analytical feature interaction. Paper presented at the 147th Audio Engineering Society International Convention 2019, New York, United States. https://www.aes.org/e-lib/browse.cfm?elib=20579