Overview
Taxonomy
Listening Modes
Technology


Key Readings

Bovermann, T., Rohrhuber, J., & de Campo, A. (2011). Laboratory Methods for Experimental Sonification. In The sonification handbook. Berlin: Logos Verlag.

De Campo, A., Rohrhuber, J., Bovermann, T., & Frauenberger, C. (2011). Sonification and Auditory Display in SuperCollider. In The SuperCollider Book. Retrieved from http://pub.uni-bielefeld.de/publication/2018277

Ballora, M. (2011). Opening Your Ears to Data. TEDxPSU. Retrieved from http://www.youtube.com/watch?v=aQJfQXGbWQ4&feature=youtube_gdata_player

De Campo, A. (2007). Toward a data sonification design space map. In Proceedings of the International Conference on Auditory Display (ICAD) (pp. 342–347). Retrieved from http://dev.icad.org/Proceedings/2007/deCampo2007.pdf

Grond, F., & Hermann, T. (2014). Interactive Sonification for Data Exploration: How listening modes and display purposes define design guidelines. Organised Sound, 19(01), 41–51. doi:10.1017/S1355771813000393
Abstract:
A brief history of sonification and its definition are reviewed, outlining dichotomies such as artistic/scientific, symbolic/analogic, and listening/hearing, and leading into the question of the role of aesthetics in auditory display. The listening modes described by Schaeffer (1967) are compared with recent contributions by Vickers (2012), who proposes an extension of the direct listening modes to accommodate complementary modalities and interaction, and by Tuuri and Eerola (2012), who offer an alternative, embodied-cognition-centred conceptualization of listening modes. "Musical" and "everyday" listening modes are described, noting the role that "reduced listening" plays in interactive sonification, and the different theoretical perspectives (Schaeffer, Gaver, and Tuuri) are compared. The listening modes are then linked to display purposes related to descriptive and normative intentions, promoting the taxonomy proposed by Tuuri and Eerola and reduced listening for descriptive sonifications. The role of repetition is discussed, yielding four guidelines relating to data, interaction, playback, and reproduction. This is followed by an overview of interaction with the "sound object"; listening modes are considered in relation to sound making, and a listening-centred approach inspired by ergo-audition and Schaeffer's "musicianly listening" is proposed, with sonic exploitation, sonic exploration, and sonic expression suggested as three categories of sonic interaction.

Hermann, T., Hunt, A., & Neuhoff, J. G. (2011). Introduction (Chapter 1). In The sonification handbook. Berlin: Logos Verlag.
Annotation:
The authors describe sonification as a subcategory of auditory display: "Auditory Display encompasses all aspects of a human-machine interaction system, including the setup, speakers or headphones, modes of interaction with the display system, and any technical solution for the gathering, processing, and computing necessary to obtain sound in response to the data. In contrast, Sonification is a core component of an auditory display: the technique of rendering sound in response to data and interactions." The fields that currently use auditory display range from "chaos theory, bio-medicine, and interfaces for visually disabled people, to data mining, seismology, desktop computer interaction, and mobile devices, to name just a few". The research disciplines required include physics, acoustics, psychoacoustics, perceptual research, sound engineering, and computer science as core disciplines, with psychology, musicology, cognitive science, linguistics, pedagogy, the social sciences, and philosophy also being necessary. Auditory display benefits from the high complexity, power, and flexibility of our auditory system, although the chapter notes that the recent interactive aspects of sonification can at times be counterproductive, since the listening system requires time to adapt to the auditory display. The chapter concludes with an overview of the book.

Part I introduces the fundamentals of sonification, sound and perception: theoretical foundations (chapter 2), psychoacoustics (chapter 3), perception research (chapter 4), sonic interaction design (chapter 5), psychology and evaluation (chapter 6), and design (chapter 7). Part II moves towards the procedural aspects of sonification technology: the representation of data and statistical aspects of data (chapter 8), how sound is represented, generated or synthesized (chapter 9), suitable computer languages and programming systems (chapter 10), and the control and exploration of data using sound (chapter 11). The different sonification techniques are presented in Part III: audification (chapter 12), auditory icons (chapter 13), earcons (chapter 14), parameter mapping sonification (chapter 15) and model-based sonification (chapter 16). Part IV focuses on specific application fields for sonification and auditory display: assistive technology (chapter 17), process monitoring (chapter 18), intelligent auditory alarms (chapter 19), the use of sonification to assist navigation (chapter 20), and the interactive representation of body movements by sonification, driven by the idea that sound can support skill learning and performance without the need to attend to a visual display at a fixed location (chapter 21).
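
To make the core definition above concrete, "the technique of rendering sound in response to data", the short sketch below illustrates the simplest case, parameter mapping sonification (the subject of chapter 15). It is my own minimal example, not code from the handbook: a small hypothetical data series is mapped onto the pitch of successive sine tones and written to a WAV file using only the Python standard library.

    import math
    import struct
    import wave

    SAMPLE_RATE = 44100
    DATA = [0.10, 0.40, 0.35, 0.80, 0.60, 0.95, 0.20]   # hypothetical normalised data series

    def tone(freq, duration=0.25, amp=0.4):
        """Render one sine tone with a simple linear fade-out."""
        n = int(SAMPLE_RATE * duration)
        return [amp * (1.0 - i / n) * math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
                for i in range(n)]

    samples = []
    for value in DATA:
        # Parameter mapping: data value (0..1) -> pitch, on an exponential
        # scale so that equal data steps sound like equal musical intervals.
        freq = 200.0 * (2000.0 / 200.0) ** value
        samples.extend(tone(freq))

    with wave.open("sonification.wav", "wb") as wav:
        wav.setnchannels(1)            # mono
        wav.setsampwidth(2)            # 16-bit PCM
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in samples))

The same mapping idea carries over to the interactive, real-time systems discussed in chapters 10 and 11, where an audio programming environment such as SuperCollider (see De Campo, Rohrhuber, Bovermann & Frauenberger 2011 above) takes the place of offline file rendering.
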
Hermann, T. (2008). Taxonomy And Definitions For Sonification And Auditory Display.

Song, H. J., & Beilharz, K. (2008). Aesthetic and Auditory Enhancements for Multi-stream Information Sonification. In Proceedings of the 3rd International Conference on Digital Interactive Media in Entertainment and Arts (pp. 224–231). New York, NY, USA: ACM. doi:10.1145/1413634.1413678
sonification.de. (n.d.). Retrieved from http://sonification.de/

Tuuri, K., & Eerola, T. (2012). Formulating a Revised Taxonomy for Modes of Listening. Journal of New Music Research, 41(2), 137–152. doi:10.1080/09298215.2011.614951

Walker, B. N., & Nees, M. A. (2011). Theory of Sonification. Retrieved from http://sonify.psych.gatech.edu/~ben/references/nees_theory_of_sonification.pdf

Worrall, D. (2014). Can Micro-Gestural Inflections Be Used to Improve the Soniculatory Effectiveness of Parameter Mapping Sonifications? Organised Sound, 19(01), 52–59. doi:10.1017/S135577181300040X

Top Priority
Ballora, M. (2014). Sonification, Science and Popular Music: In search of the “wow.” Organised Sound, 19(01), 30–40. doi:10.1017/S1355771813000381
Barrass, S. (2005). A perceptual framework for the auditory display of scientific data. ACM Transactions on Applied Perception (TAP), 2(4), 389–402.
Barrass, S., Whitelaw, M., & Bailes, F. (2006). Listening to the mind listening: an analysis of sonification reviews, designs and correspondences. Leonardo Music Journal, 16, 13–19.
Ben-Tal, O., & Berger, J. (2004). Creative aspects of sonification. Leonardo, 37(3), 229–233.
Chafe, C., & Leistikow, R. (2001). Levels of temporal resolution in sonification of network performance. In Proceedings of the 2001 International Conference on Auditory Display. Retrieved from http://blog.zhdk.ch/zmoduletelematic/files/2014/02/tempResNetPerf.pdf
Dean, R. T., Whitelaw, M., Smith, H., & Worrall, D. (2006). The mirage of real-time algorithmic synaesthesia: Some compositional mechanisms and research agendas in computer music and sonification. Contemporary Music Review, 25(4), 311–326. doi:10.1080/07494460600760981

De Campo, A., Dayé, C., Frauenberger, C., Vogt, K., Wallisch, A., & Eckel, G. (2006). Sonification as an interdisciplinary working process. In Proc. of the 12th International Conference on Auditory Display, London, UK. Retrieved from http://iem.kug.ac.at/fileadmin/media/iem/altdaten/projekte/publications/paper/icad/icad.pdf
Eigenfeldt, A., & Pasquier, P. (2010). Real-Time Timbral Organisation: Selecting samples based upon similarity. Organised Sound, 15(02), 159–166. doi:10.1017/S1355771810000154
Fishwick, P. A. (2013). Aesthetic Computing. The Encyclopedia of Human-Computer Interaction, 2nd Ed. Retrieved from /encyclopedia/aesthetic_computing.html
Frauenberger, C. (2007). Ears))): a methodological framework for auditory display design. In CHI’07 extended abstracts on Human factors in computing systems (pp. 1641–1644). ACM. Retrieved from http://dl.acm.org/citation.cfm?id=1240872
Gingrich, O., Renaud, A., Emets, E., & Xiao, Z. (2014). Transmission: A Telepresence Interface for Neural and Kinetic Interaction. Leonardo, 47(4), 375–385.
Grond, F., & Hermann, T. (2012). Aesthetic strategies in sonification. AI & SOCIETY, 27(2), 213–222. doi:10.1007/s00146-011-0341-7

Hermann, T., & Hunt, A. (2005). An introduction to interactive sonification. IEEE Multimedia, 20–24.
Hermann, T., Hunt, A., & Neuhoff, J. G. (2011a). Interactive Sonification. In The sonification handbook. Berlin: Logos Verlag.
Hermann, T., Hunt, A., & Neuhoff, J. G. (2011b). Sonification Design and Aesthetics. In The sonification handbook. Berlin: Logos Verlag.
Hermann, T., Hunt, A., & Neuhoff, J. G. (2011c). The sonification handbook. Berlin: Logos Verlag.
Hermann, T., Niehus, C., & Ritter, H. (2003). Interactive visualization and sonification for monitoring complex processes. In International Conference on Auditory Display (pp. 247–250). Retrieved from http://www.icad.org/websiteV2.0/Conferences/ICAD2003/paper/60%20Hermann2-%20complex.pdf
Hussein, K., Tilevich, E., Bukvic, I. I., & Kim, S. (2009). Sonification design guidelines to enhance program comprehension. In Program Comprehension, 2009. ICPC’09. IEEE 17th International Conference on (pp. 120–129). IEEE. Retrieved from http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5090035
Introduction to Data Sonification - Oxford Handbooks. (n.d.). Retrieved October 23, 2014, from http://www.oxfordhandbooks.com/view/10.1093/oxfordhb/9780199792030.001.0001/oxfordhb-9780199792030-e-016
Johannsen, G. (2004). Auditory Displays in Human-Machine Interfaces. Proceedings of the IEEE, 92(4), 742–758. doi:10.1109/JPROC.2004.825905
Joy, J. (2012). What NMSAT says about sonification. AI & SOCIETY, 27(2), 233–244. doi:10.1007/s00146-011-0343-5
Kramer, G., Walker, B., & Bargar, R. (1999). Sonification report: Status of the field and research agenda. International Community for Auditory Display. Retrieved from http://www.researchgate.net/publication/224927615_Sonification_report_Status_of_the_field_and_research_agenda/file/32bfe510a92dcbb8c7.pdf
Lazar, J. (Ed.). (2007). Universal usability: designing computer interfaces for diverse user populations. Chichester ; Hoboken, NJ: John Wiley & Sons.
McGee, R. (2009). Auditory Displays and Sonification: Introduction and Overview. Retrieved from http://spatialization.org/writing/Sonification_Auditory_Display.pdf
Pauletto, S., & Hunt, A. (2004). A Toolkit for Interactive Sonification. In ICAD. Retrieved from http://www.icad.org/websiteV2.0/Conferences/ICAD2004/papers/pauletto_hunt.pdf
Pelz-Sherman, M. (n.d.). Suggested Applications of Musical Improvisation Analysis to Sonification. Retrieved from http://interactive-sonification.org/ISon2007/proceedings/papers/Pelz-Sherman_ISon2007.pdf
Schoon, A., & Dombois, F. (2009). Sonification in music. In Proceedings of the 15th International Conference on Auditory Display (ICAD 2009), Re:New Digital Arts Forum, Copenhagen, Denmark. Retrieved from http://www.icad.org/Proceedings/2009/SchoonDombois2009.pdf

Tuuri, K., Pirhonen, A., & Hoggan, E. (2009). Some severe deficiencies of the input-output HCI-paradigm and their influence on practical design. In Proceedings of the European conference on Cognitive Ergonomics (ECCE 2009), Designing beyond the product–understanding activity and user experience in ubiquitous environments, Helsinki, Finland, September (pp. 363–369). Retrieved from http://www.auditorysigns.com/kai/papers/Tuuri_etal_2009_Some_severe_deficiencies.pdf
Vickers, P. (2005). Ars Informatica–Ars Electronica: Improving Sonification Aesthetics. Retrieved from http://nrl.northumbria.ac.uk/11296/
Walker, B. N., & Nees, M. A. (2011). Theory of Sonification. Retrieved from http://sonify.psych.gatech.edu/~ben/references/nees_theory_of_sonification.pdf

Yeo, W. S., Berger, J., & Lee, Z. (2004). SonART: A framework for data sonification, visualization and networked multimedia applications. In Proceedings of the International Computer Music Conference. Retrieved from https://ccrma.stanford.edu/~zune/sources/projects/sonart/sonart.files/icmc2004.pdf

High Priority
Al Bregman’s Website. (n.d.). Retrieved September 11, 2014, from http://webpages.mcgill.ca/staff/Group2/abregm1/web/asaoutline.htm
Ballora, M. (2014). Sonification, Science and Popular Music: In search of the “wow.” Organised Sound, 19(01), 30–40. doi:10.1017/S1355771813000381
Barrass, S. (2005a). A comprehensive framework for auditory display: Comments on Barrass, ICAD 1994. ACM Transactions on Applied Perception (TAP), 2(4), 403–406.
Barrass, S. (2005b). A perceptual framework for the auditory display of scientific data. ACM Transactions on Applied Perception (TAP), 2(4), 389–402.
Barrass, S., & Kramer, G. (1999). Using sonification. Multimedia Systems, 7(1), 23–31.
Barrass, S., & Robertson, P. (1997). Auditory information design. Citeseer.
Barrass, S., Whitelaw, M., & Bailes, F. (2006). Listening to the mind listening: an analysis of sonification reviews, designs and correspondences. Leonardo Music Journal, 16, 13–19.
Ben-Tal, O., & Berger, J. (2004). Creative aspects of sonification. Leonardo, 37(3), 229–233.
Bonebright, T. L., & Miner, N. E. (2005). Evaluation of auditory displays: Comments on Bonebright et al., ICAD 1998. ACM Transactions on Applied Perception (TAP), 2(4), 517–520.
Cádiz, R. F. (2006). A fuzzy-logic mapper for audiovisual media. Computer Music Journal, 30(1), 67–82.
Chafe, C., & Leistikow, R. (2001). Levels of temporal resolution in sonification of network performance. In Proceedings of the 2001 International Conference on Auditory Display. Retrieved from http://blog.zhdk.ch/zmoduletelematic/files/2014/02/tempResNetPerf.pdf
Collins, N. (2012). LMJ 22: Acoustics. Leonardo Music Journal, 22(1), 1–2.
Cullen, C., & Coyle, E. (2004). Analysis of data sets using trio sonification. Retrieved from http://digital-library.theiet.org/content/conferences/10.1049/cp_20040532
Dean, R. T. (Ed.). (2009). The Oxford handbook of computer music. Oxford ; New York: Oxford University Press.
Dean, R. T., Whitelaw, M., Smith, H., & Worrall, D. (2006). The mirage of real-time algorithmic synaesthesia: Some compositional mechanisms and research agendas in computer music and sonification. Contemporary Music Review, 25(4), 311–326. doi:10.1080/07494460600760981
De Campo, A., Dayé, C., Frauenberger, C., Vogt, K., Wallisch, A., & Eckel, G. (2006). Sonification as an interdisciplinary working process. In Proc. of the 12th International Conference on Auditory Display, London, UK. Retrieved from http://iem.kug.ac.at/fileadmin/media/iem/altdaten/projekte/publications/paper/icad/icad.pdf
Eigenfeldt, A., & Pasquier, P. (2010). Real-Time Timbral Organisation: Selecting samples based upon similarity. Organised Sound, 15(02), 159–166. doi:10.1017/S1355771810000154
Eldridge, A. (2006). Issues in auditory display. Artificial Life, 12(2), 259–274.
Fishwick, P. (2006). An introduction to aesthetic computing. Aesthetic Computing, 3–27.
Fishwick, P., Diehl, S., Prophet, J., & Löwgren, J. (2005). Perspectives on Aesthetic Computing. Leonardo, 38(2), 133–141.
Frauenberger, C. (2007). Ears))): a methodological framework for auditory display design. In CHI’07 extended abstracts on Human factors in computing systems (pp. 1641–1644). ACM. Retrieved from http://dl.acm.org/citation.cfm?id=1240872
Frauenberger, C. (n.d.). Auditory Display Design: An Investigation of a Design Pattern Approach. Retrieved from http://frauenberger.name/files/frauenbergerthesis.pdf
Gingrich, O., Renaud, A., Emets, E., & Xiao, Z. (2014). Transmission: A Telepresence Interface for Neural and Kinetic Interaction. Leonardo, 47(4), 375–385.
Grond, F. (2013). Listening-Mode-Centered Sonification Design for Data Exploration. Retrieved from http://pub.uni-bielefeld.de/luur/download?func=downloadFile&recordOId=2675399&fileOId=2675400
Grond, F., & Hermann, T. (2014). Interactive Sonification for Data Exploration: How listening modes and display purposes define design guidelines. Organised Sound, 19(01), 41–51. doi:10.1017/S1355771813000393
Hermann, T. (2008). Taxonomy And Definitions For Sonification And Auditory Display.
Hermann, T., & Hunt, A. (2005). An introduction to interactive sonification. IEEE Multimedia, 20–24.
Hermann, T., Hunt, A., & Neuhoff, J. G. (2011). The sonification handbook. Berlin: Logos Verlag.
Hermann, T., & Ritter, H. (2005). Model-based sonification revisited—authors’ comments on Hermann and Ritter, ICAD 2002. ACM Transactions on Applied Perception (TAP), 2(4), 559–563.
Hoffman, M., & Cook, P. R. (2006). FEATURE-BASED SYNTHESIS FOR SONIFICATION AND. Retrieved from http://www.icad.org/Proceedings/2006/HoffmanCook2006.pdf
Hussein, K., Tilevich, E., Bukvic, I. I., & Kim, S. (2009). Sonification design guidelines to enhance program comprehension. In Program Comprehension, 2009. ICPC’09. IEEE 17th International Conference on (pp. 120–129). IEEE. Retrieved from http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5090035
Jekosch, U. (2005). Assigning Meaning to Sounds—Semiotics in the Context of Product-Sound Design. In Communication Acoustics (pp. 193–221). Springer. Retrieved from http://link.springer.com/chapter/10.1007/3-540-27437-5_8
Joy, J. (2012). What NMSAT says about sonification. AI & SOCIETY, 27(2), 233–244. doi:10.1007/s00146-011-0343-5
Kramer, G. (Ed.). (1994). Auditory Display: Sonification, Audification, and Auditory Interfaces. Reading, Mass: Westview Press.
Kramer, G., Walker, B., & Bargar, R. (1999). Sonification report: Status of the field and research agenda. International Community for Auditory Display. Retrieved from http://www.researchgate.net/publication/224927615_Sonification_report_Status_of_the_field_and_research_agenda/file/32bfe510a92dcbb8c7.pdf
Lazar, J. (Ed.). (2007). Universal usability: designing computer interfaces for diverse user populations. Chichester ; Hoboken, NJ: John Wiley & Sons.
Lewis, J. W., Talkington, W. J., Tallaksen, K. C., & Frum, C. A. (2012). Auditory object salience: human cortical processing of non-biological action sounds and their acoustic signal attributes. Frontiers in Systems Neuroscience, 6. doi:10.3389/fnsys.2012.00027
Lodha, S. K., Beahan, J., Heppe, T., Joseph, A., & Zane-Ulman, B. (1997). Muse: A musical data sonification toolkit. In Proceedings of International Conference on Auditory Display (ICAD’97). Retrieved from http://www.icad.org/websiteV2.0/Conferences/ICAD97/Lodha.pdf
Mandic, D. (2008). Signal Processing Techniques for Knowledge Extraction and Information Fusion.
McGee, R. (2009). Auditory Displays and Sonification: Introduction and Overview. Retrieved from http://spatialization.org/writing/Sonification_Auditory_Display.pdf
Miranda, E. R., Bull, L., Gueguen, F., & Uroukov, I. S. (2009). Computer Music Meets Unconventional Computing: Towards Sound Synthesis with In Vitro Neuronal Networks. Computer Music Journal, 33(1), 9–18.
Nort, D. V., Wanderley, M. M., & Depalle, P. (2014). Mapping Control Structures for Sound Synthesis: Functional and Topological Perspectives. Computer Music Journal, 38(3), 6–22.
Ox, J. (2010). A 21st-Century Pedagogical Plan for Artists: How Should We Be Training Artists for Today? Leonardo, 43(1), 2–2.
Parente, P., & Bishop, G. (2009). Out from behind the curtain: learning from a human auditory display. In CHI’09 Extended Abstracts on Human Factors in Computing Systems (pp. 2575–2584). ACM. Retrieved from http://dl.acm.org/citation.cfm?id=1520363
Pauletto, S., & Hunt, A. (2004). A Toolkit for Interactive Sonification. In ICAD. Retrieved from http://www.icad.org/websiteV2.0/Conferences/ICAD2004/papers/pauletto_hunt.pdf
Pelz-Sherman, M. (n.d.). Suggested Applications of Musical Improvisation Analysis to Sonification. Retrieved from http://interactive-sonification.org/ISon2007/proceedings/papers/Pelz-Sherman_ISon2007.pdf
Peres, S. C., & Lane, D. M. (2003). Sonification of statistical graphs. In International Conference on Auditory Display, Boston, MA. Retrieved from http://icad.org/Proceedings/2003/PeresLane2003.pdf
Polli, A. (2005). Atmospherics/Weather Works : A Spatialized Meteorological Data Sonification Project. Leonardo, 38(1), 31–36.
Polli, A. (2006). Heat and the Heartbeat of the City: Sonifying Data Describing Climate Change. Leonardo Music Journal, 16(1), 44–45.
Schoon, A., & Dombois, F. (2009). Sonification in music. In Proceedings of the 15th International Conference on Auditory Display (ICAD 2009), Re:New Digital Arts Forum, Copenhagen, Denmark. Retrieved from http://www.icad.org/Proceedings/2009/SchoonDombois2009.pdf
Song, H. J., & Beilharz, K. (2008). Aesthetic and Auditory Enhancements for Multi-stream Information Sonification. In Proceedings of the 3rd International Conference on Digital Interactive Media in Entertainment and Arts (pp. 224–231). New York, NY, USA: ACM. doi:10.1145/1413634.1413678
sonification.de. (n.d.). Retrieved from http://sonification.de/
Sturm, B. L. (2005). Pulse of an ocean: sonification of ocean buoy data. Leonardo, 38(2), 143–149.
Treadaway, C. (2009). Materiality, Memory and Imagination: Using Empathy to Research Creativity. Leonardo, 42(3), 231–237.
Tuuri, K., & Eerola, T. (2012). Formulating a Revised Taxonomy for Modes of Listening. Journal of New Music Research, 41(2), 137–152. doi:10.1080/09298215.2011.614951
Vickers, P. (2011). Sonification for Process Monitoring. In The sonification handbook. Berlin: Logos Verlag.
Vickers, P. (n.d.). Ways of Listening and Modes of Being: Electroacoustic Auditory Display. Journal of Sonic Studies. Retrieved from http://journal.sonicstudies.org/vol02/nr01/a04
Walker, B. N., & Kramer, G. (1996). Mappings and metaphors in auditory displays: An experimental assessment. In Proceedings of the Third International Conference on Auditory Display, ICAD (Vol. 96, pp. 71–74). Citeseer.
Walker, B. N., & Kramer, G. (2005). Sonification design and metaphors: Comments on Walker and Kramer, ICAD 1996. ACM Transactions on Applied Perception (TAP), 2(4), 413–417.
Walker, B. N., & Nees, M. A. (2011). Theory of Sonification. Retrieved from http://sonify.psych.gatech.edu/~ben/references/nees_theory_of_sonification.pdf
Weinberg, G., & Thatcher, T. (2006). Interactive Sonification: Aesthetics, Functionality and Performance. Leonardo Music Journal, 16(1), 9–12.
Williamson, J., & Murray-Smith, R. (2005). Sonification of probabilistic feedback through granular synthesis. IEEE Multimedia, 12(2), 45–52.
Worrall, D. (2010). Towards the Better Perception of Sonic Data Mappings. In Proceedings of the ACMA Conference (pp. 24–25). Retrieved from http://worrall.avatar.com.au/papers/worrall_ACMA2010.pdf
Worrall, D. (2014). Can Micro-Gestural Inflections Be Used to Improve the Soniculatory Effectiveness of Parameter Mapping Sonifications? Organised Sound, 19(01), 52–59. doi:10.1017/S135577181300040X
Worrall, D. (n.d.). Doctoral thesis [worrallThesis_Complete.pdf]. Retrieved October 23, 2014, from http://www.avatar.com.au/papers/phd/worrallThesis_Complete.pdf
Yeo, W. S., Berger, J., & Lee, Z. (2004). SonART: A framework for data sonification, visualization and networked multimedia applications. In Proceedings of the International Computer Music Conference. Retrieved from https://ccrma.stanford.edu/~zune/sources/projects/sonart/sonart.files/icmc2004.pdf
Ystad, S., et al. (Eds.). (2010). Auditory display: 6th international symposium, CMMR 2009 / ICAD 2009, Copenhagen, Denmark; revised papers. Berlin; Heidelberg; New York, NY: Springer.


Further Interest
AeSon Toolkit for aesthetic interactive sonification. (n.d.). Retrieved September 11, 2014, from http://www.kirstybeilharz.com.au/aeson.html
Anderson, J. (2005). Creating an empirical framework for sonification design. In International Conference on Auditory Display (ICAD2005), Limerick, Ireland. Retrieved from http://dev.icad.org/Proceedings/2005/Anderson2005.pdf
Ballora, M. (2000). Data analysis through auditory display: applications in heart rate variability. McGill University. Retrieved from http://markballora.com/publications/balloradiss.pdf
Ballora, M., Pennycook, B., Ivanov, P. C., Glass, L., & Goldberger, A. L. (2004). Heart rate sonification: A new approach to medical diagnosis. Leonardo, 37(1), 41–46.
Barrett, N., & Mair, K. (2014). Aftershock: A science–art collaboration through sonification. Organised Sound, 19(01), 4–16. doi:10.1017/S1355771813000368
Beilharz, K., Moere, A. V., Stiel, B., Calo, C., Tomitsch, M., & Lombard, A. (2010). Expressive Wearable Sonification and Visualisation: Design and Evaluation of a Flexible Display. Proc. NIME 2010. Retrieved from http://www.infoscape.org/publications/nime10.pdf
Bovermann, T., & Griffiths, D. (2014). Computation as Material in Live Coding. Computer Music Journal, 38(1), 40–53.
Bruckner, H.-P., Theimer, W., & Blume, H. (2014). Real-time low latency movement sonification in stroke rehabilitation based on a mobile platform. In Consumer Electronics (ICCE), 2014 IEEE International Conference on (pp. 264–265). IEEE. Retrieved from http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=6775997
Cheng, H. Y., & Mirkhani, S. A. (2002). The Singing Plant: Analysis of plant-wide disturbances through sonification. Retrieved from http://www.ee.ucl.ac.uk/~afernand/Example2.pdf
Chion, M. (n.d.). Guide To Sound Objects: Pierre Schaeffer and Musical Research. Retrieved from http://monoskop.org/images/0/01/Chion_Michel_Guide_To_Sound_Objects_Pierre_Schaeffer_and_Musical_Research.pdf
Collins, N. (2006). Noises Off: Sound Beyond Music. Leonardo Music Journal, 16(1), 7–8.
Cooper, M., Foote, J., Pampalk, E., & Tzanetakis, G. (2006). Visualization in Audio-Based Music Information Retrieval. Computer Music Journal, 30(2), 42–62.
Davis, T. (2012). Complexity as Practice: A Reflection on the Creative Outcomes of a Sustained Engagement with Complexity. Leonardo, 45(2), 106–112.
Diaz-Merced, W. L., Candey, R. M., Brickhouse, N., Schneps, M., Mannone, J. C., Brewster, S., & Kolenberg, K. (2011). Sonification of Astronomical Data. Proceedings of the International Astronomical Union, 7(S285), 133–136. doi:10.1017/S1743921312000440
Dubus, G. (2012). Evaluation of four models for the sonification of elite rowing. Journal on Multimodal User Interfaces, 5(3-4), 143–156. doi:10.1007/s12193-011-0085-1
Dunn, J., & Clark, M. A. (1999). Life music: the sonification of proteins. Leonardo, 32(1), 25–32.
Effenberg, A., Melzer, J., Weber, A., & Zinke, A. (2005). Motionlab sonify: A framework for the sonification of human motion data. In Information Visualisation, 2005. Proceedings. Ninth International Conference on (pp. 17–23). IEEE. Retrieved from http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1509054
Norros, L., & Valtion teknillinen tutkimuskeskus (VTT) (Eds.). (2009). ECCE 2009: European Conference on Cognitive Ergonomics: Designing beyond the product - understanding activity and user experience in ubiquitous environments. Espoo, Finland: VTT.
Foner, L. N. (1999). Artificial synesthesia via sonification: a wearable augmented sensory system. Mobile Networks and Applications, 4(1), 75–81.
Giannachi, G. (2012). Representing, Performing and Mitigating Climate Change in Contemporary Art Practice. Leonardo, 45(2), 124–131.
Gresham-Lancaster, S., & Sinclair, P. (2012). Sonification and Acoustic Environments. Leonardo Music Journal, 22(1), 67–71.
Grond, F., & Hermann, T. (2012). Singing function: Exploring auditory graphs with a vowel based sonification. Journal on Multimodal User Interfaces, 5(3-4), 87–95. doi:10.1007/s12193-011-0068-2
Grosshauser, T., & Hermann, T. (2009). The sonified music stand–an interactive sonification system for musicians. In Proceedings of the 6th Sound and Music Computing Conference (pp. 233–238). Casa da Musica, Porto, Portugal. Retrieved from http://pdf.aminer.org/000/238/209/acoumotion_an_interactive_sonification_system_for_acoustic_motion_control.pdf
Han, Y. C., & Han, B. (2013). Digiti Sonus. Leonardo, 46(4), 392–393.
Hermann, T., Meinicke, P., & Ritter, H. (2000). Principal curve sonification. In Proceedings of International Conference on Auditory Display (Vol. 2000). Citeseer. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.24.3276&rep=rep1&type=pdf
Hug, D. (2009). Using a systematic design process to investigate narrative sound design strategies for interactive commodities. In Proceedings of the 15th international conference on Auditory Display. Copenhagen, Denmark. Retrieved from http://www.researchgate.net/publication/200056502_Using_a_Systematic_Design_Process_to_Investigate_Narrative_Sound_Design_Strategies_for_Interactive_Commodities/file/d912f50b630ec38866.pdf
Hunt, A., & Wanderley, M. M. (2002). Mapping performer parameters to synthesis engines. Organised Sound, 7(02). doi:10.1017/S1355771802002030
Jensenius, A. R. (2013). Some Video Abstraction Techniques for Displaying Body Movement in Analysis and Performance. Leonardo, 46(1), 53–60.
Kabisch, E., Kuester, F., & Penny, S. (2005). Sonic panoramas: experiments with interactive landscape image sonification. In Proceedings of the 2005 international conference on Augmented tele-existence (pp. 156–163). ACM. Retrieved from http://dl.acm.org/citation.cfm?id=1152428
Kane, B. (2007). L’Objet Sonore Maintenant: Pierre Schaeffer, sound objects and the phenomenological reduction. Organised Sound, 12(01), 15. doi:10.1017/S135577180700163X
Kim, S.-J. (2010). A Critique On Pierre Schaeffer’s Phenomenological Approaches: Based On The Acousmatic And Reduced Listening. Retrieved from http://books.google.com/books?hl=en&lr=&id=YnCcAQAAQBAJ&oi=fnd&pg=PP2&dq=%22into+the%22+%22as+follows:+%E2%80%9CFor+years,+we+have+been%22+%22to+do+phenomenology+is+to+attend%22+%22asserts,+every+phenomenologist+must%22+%22in+phenomenology+proper+and+the%22+%22there+has+been+much+debate+about%22+&ots=IilmTsox3C&sig=DyHGr_rTzu1TjYuZ9eC5lRNhIk4
Knees, P., Pohle, T., & Widmer, G. (2012). Sound/tracks: artistic real-time sonification of train journeys. Journal on Multimodal User Interfaces, 6(1-2), 87–93. doi:10.1007/s12193-011-0089-x
Kostelnick, J. C., McDermott, D., Rowley, R. J., & Bunnyfield, N. (2013). A Cartographic Framework for Visualizing Risk. Cartographica: The International Journal for Geographic Information and Geovisualization, 48(3), 200–224.
Mee, E. B. (2013). Hearing the Music of the Hemispheres. TDR: The Drama Review, 57(3), 148–150.
Nikolaidis, R., Walker, B., & Weinberg, G. (2012). Generative musical tension modeling and its application to dynamic sonification. Computer Music Journal, 36(1), 55–64.
Schaeffer, P. (1964). Traité des objets musicaux (nouvelle édition). Paris: Seuil.
Shinn-Cunningham, B. G., & Streeter, T. (2005). Spatial auditory display: Comments on Shinn-Cunningham et al., ICAD 2001. ACM Transactions on Applied Perception (TAP), 2(4), 426–429.
Shi, X. J., Cai, Y. Y., & Chan, C. W. (2007). Electronic music for bio-molecules using short music phrases. Leonardo, 40(2), 137–141.
Straebel, V., & Thoben, W. (2014). Alvin Lucier’s Music for Solo Performer: Experimental music beyond sonification. Organised Sound, 19(01), 17–29. doi:10.1017/S135577181300037X
Thibert, R., Akbarieh, M., & Tawashi, R. (1988). Morphic features variation of solid particles after size reduction: sonification compared to jet mill grinding. International Journal of Pharmaceutics, 47(1), 171–177.
Tittel, C. (2009). Sound Art as Sonification, and the Artistic Treatment of Features in our Surroundings. Organised Sound, 14(01), 57. doi:10.1017/S1355771809000089
Tuuri, K. (n.d.). Hearing Gestures: Vocalisations as Embodied Projections of Intentionality in Designing Non-Speech Sounds for Communicative Functions. Retrieved October 23, 2014, from https://jyx.jyu.fi/dspace/bitstream/handle/123456789/27256/9789513943677.PDF?sequence=3
Walker, B. N., & Nees, M. A. (2005). Brief training for performance of a point estimation sonification task. In the Proceedings of the International Conference on Auditory Display (ICAD2005) (pp. 6–9). Retrieved from http://www.researchgate.net/publication/228374620_Brief_training_for_performance_of_a_point_estimation_sonification_task/file/9fcfd5100a69d06dc4.pdf
Wilson, A. J. (2009). A symbolic sonification of L-systems. Ann Arbor, MI: MPublishing, University of Michigan Library. Retrieved from http://www.adamjameswilson.info/papers/A_Symbolic_Sonification_of_L-Systems.pdf
Zhao, H., Smith, B. K., Norman, K., Plaisant, C., & Shneiderman, B. (2005). Interactive sonification of choropleth maps. MultiMedia, IEEE, 12(2), 26–35.


Potential Interest

Adderley, W. P., & Young, M. (2009). Ground-breaking: Scientific and Sonic Perceptions of Environmental Change in the African Sahel. Leonardo, 42(5), 404–411.
Baier, G., Hermann, T., & Stephani, U. (2007). Event-based sonification of EEG rhythms in real time. Clinical Neurophysiology, 118(6), 1377–1386. doi:10.1016/j.clinph.2007.01.025
Bertacchini, F., Bilotta, E., Gabriele, L., Pantano, P., & Tavernise, A. (2013). Toward the Use of Chua’s Circuit in Education, Art and Interdisciplinary Research: Some Implementation and Opportunities. Leonardo, 46(5), 457–463.
Blauert, J. (2005). Communication acoustics. Berlin: Springer-Verlag. Retrieved from http://site.ebrary.com/id/10130107
Blow, M. (2013). Solar Work #2: A Solar-Powered Sound Artwork. Leonardo Music Journal, 23(1), 10–11.
Bodle, C. (2006). Sonification/Listening Up. Leonardo Music Journal, 16(1), 51–52.
Broeckmann, A. (2004). Reseau/Resonance: Connective Processes and Artistic Practice. Leonardo, 37(4), 281–284.
Burraston, D. (2012). Environmental Sonification of Rainfall with Long Wire Instruments. Leonardo Music Journal, 22(1), 11–14.
Burraston, D. (2012). Rainwire: Environmental Sonification of Rainfall. Leonardo, 45(3), 288–289.
Cai, Y. Y., Lu, B. F., Fan, Z. W., Chan, C. W., Lim, K. T., Qi, L., & Li, L. (2006). Proteins, Immersive Games and Music. Leonardo, 39(2), 135–137.
Charles, J.-F. (2008). A tutorial on spectral sound processing using Max/MSP and Jitter. Computer Music Journal, 32(3), 87–102.
Childs, E. (2007). ICAD 2006: Global Music - The World by Ear. Computer Music Journal, 31(1), 95–96.
Chion, M. (2004). Le Son. Paris: Armand Colin.
Chion, M., Gorbman, C., & Murch, W. (1994). Audio-vision: sound on screen. New York: Columbia University Press.
Chrysakis, T. (2006). Spatio-Aural Terrains. Leonardo Music Journal, 16(1), 40–42.
Clarke, A. (2013). Orgasms and Oppositions: Dani Ploeger’s ELECTRODE. TDR: The Drama Review, 57(3), 158–163.
Cluett, S. (2006). Toward a Post-Phenomenology of Extra-Musical Sound as Compositional Determinant. Leonardo Music Journal, 16(1), 42–42.
Diaz-Jerez, G. (2011). Composing with Melomics: Delving into the Computational World for Musical Inspiration. Leonardo Music Journal, 21(1), 13–14.
Diebes, J. (2002). Notes On presence: A Music Installation for Phantom Chamber Orchestra. PAJ: A Journal of Performance and Art, 24(2), 34–41.
Döbereiner, L. (2011). Models of constructed sound: Nonstandard synthesis as an aesthetic perspective. Computer Music Journal, 35(3), 28–39.
Dumitriu, A., Tenetz, A., & Lawrence, D. (2010). KryoLab. Leonardo, 43(5), 486–487.
Edelstein, P. (2012). Sonic Objects, Resonance and Chaotics. Leonardo Music Journal, 22(1), 54–54.
Ekman, I., & Rinott, M. (2010). Using Vocal Sketching for Designing Sonic Interactions. Retrieved from http://dl.acm.org/citation.cfm?id=1858171
Ge, D., & Li, Y. (2012). Songs of Genes, by Genes, for Genes. Leonardo, 45(1), 96–97.
Gerhard, D. (2006). The 6th International Conference on Music Information Retrieval (ISMIR 2005). Computer Music Journal, 30(2), 90–92.
Gibson, J. (2006). sLowlife: Sonification of Plant Study Data. Leonardo Music Journal, 16(1), 42–44.
Grond, F., Olmos, A., & Cooperstock, J. R. (2013). Making Sculptures Audible through Participatory Sound Design. Leonardo Music Journal, 23(1), 12–13.
Harp, G. A. (2007). Deconstructing the Genome with Cinema. Leonardo, 40(4), 376–381.
Johnson, C. (2013). World on a Wire: Sound as Sensual Objects. Leonardo Music Journal, 23(1), 75–77.
Jones, R., & Nevile, B. (2005). Creating visual music in jitter: Approaches and techniques. Computer Music Journal, 29(4), 55–70.
Kirke, A., Miranda, E., Chiaramonte, A., Troisi, A. R., Matthias, J., Fry, N., … Bull, M. (2013). Cloud Chamber: A Performance with Real Time Two-Way Interaction Between Subatomic Particles and Violinist. Leonardo, 46(1), 84–85.
Knebusch, J. (2007). The Perception of Climate Change. Leonardo, 40(2), 113–113.
Kuchera-Morin, J. (2011). Performing in Quantum Space: A Creative Approach to N-Dimensional Computing. Leonardo, 44(5), 462–463.
Lim, K. A., & Raphael, C. (2010). InTune: A System to Support an Instrumentalist’s Visualization of Intonation. Computer Music Journal, 34(3), 45–55.
Mangolte, B., Jonas, J., Bishop, C., Shaw, H., Blau, H., Ong, H., … Hentschker, F. (2012). Being Contemporary. PAJ: A Journal of Performance and Art, 34(1), 43–59.
Michel Chion’s “Guide to Sound Objects: Pierre Schaeffer and Musical Research.” (n.d.). Retrieved from http://modisti.com/news/?p=14239
Middleton, J. N., & Dowd, D. (2008). Web-based algorithmic composition from extramusical resources. Leonardo, 41(2), 128–135.
Miranda, E. R., & Brouse, A. (2005). Interfacing the Brain Directly with Musical Systems: On Developing Systems for Making Music with Brain Signals. Leonardo, 38(4), 331–336.
Momeni, A., & Henry, C. (2006). Dynamic independent mapping layers for concurrent control of audio and video synthesis. Computer Music Journal, 30(1), 49–66.
Niemeyer, G. (2005). PING : Poetic Charge and Technical Implementation. Leonardo, 38(4), 312–313.
Ox, J. (2001). Intersenses/Intermedia: A Theoretical Perspective. Leonardo, 34(1), 47–48.
Pearlman, E. (2013). Russia Tiptoes into New Media. PAJ: A Journal of Performance and Art, 35(3), 49–54.
Quay, Y. de, Skogstad, S., & Jensenius, A. (2011). Dance Jockey: Performing Electronic Music by Dancing. Leonardo Music Journal, 21(1), 11–12.
Randerson, J. (2007). Between Reason and Sensation: Antipodean Artists and Climate Change. Leonardo, 40(5), 442–448.
Rodgers, T. (2006). Special Section Introduction: Sound and the Social Organization of Space. Leonardo Music Journal, 16(1), 49–50.
Root-Bernstein, R. S. (2001). Music, creativity and scientific thinking. Leonardo, 34(1), 63–68.
Roshto, B. L., Panouklia, E., Holder, J., & Pryor, S. (2010). Ephemeron — Sculpting a Collective Consciousness and Mapping a Collaborative Process. Leonardo, 43(5), 496–497.
Rubery, M. (2013). Canned Literature: The Book after Edison. Book History, 16(1), 215–245. doi:10.1353/bh.2013.0012
Rudi, J. (2005). Computer music video: A composer’s perspective. Computer Music Journal, 29(4), 36–44.
Simanowski, R. (2010). Digital Anthropophagy: Refashioning Words as Image, Sound and Action. Leonardo, 43(2), 159–163.
Stockholm, J. (2008). Eavesdropping: Network Mediated Performance in Social Space. Leonardo Music Journal, 18(1), 55–58.
Wilson, S., Lorway, N., Coull, R., Vasilakos, K., & Moyers, T. (2014). Free as in BEER: Some Explorations into Structured Improvisation Using Networked Live-Coding Systems. Computer Music Journal, 38(1), 54–64.
Yeo, W. S., Kim, K., Kim, S., & Lee, J. (2012). TAPIR Sound as a New Medium for Music. Leonardo Music Journal, 22(1), 49–51.