Demo-Poster Entry
Monday, June 4
 

11:30am EDT

Poster 1.17
CABOTO: A Graphic-Based Interactive System for Composing and Performing Electronic Music
by Riccardo Marogna


CABOTO is an interactive system for live performance and composition. A graphic score sketched on paper is read by a computer vision system. The graphic elements are scanned following a hybrid symbolic-raw approach: they are recognised and classified according to their shapes, but also scanned as waveforms and optical signals. All this information is mapped into the synthesis engine, which implements different kinds of synthesis techniques for different shapes. In CABOTO the score is viewed as a cartographic map explored by navigators. These navigators traverse the score in a semi-autonomous way, scanning the graphic elements found along their paths. The system tries to challenge the boundaries between the concepts of composition, score, performance, and instrument, since the musical result depends both on the composed score and on the way the navigators traverse it during the live performance.

Exhibitors

Riccardo Marogna

Musician, Technician, Magician, Institute of Sonology, Royal Conservatoire in The Hague
Musician, improviser, composer, born in Verona (Italy), currently based in The Hague (NL). His research is focused on developing an improvisational language in the electro-acoustic scenario, where the electronic manipulations and the acoustic sounds merge seamlessly in the continuum...


Monday June 4, 2018 11:30am - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

11:30am EDT

Poster 1.18
Performance Systems for Live Coders and Non Coders
by Avneesh Sarwate, Ryan Taylor Rose, Jason Freeman & Jack Armitage


This paper explores the question of how live coding musicians can perform with musicians who are not using code (such as acoustic instrumentalists or those using graphical and tangible electronic interfaces). This paper investigates performance systems that facilitate improvisation where the musicians can interact not just by listening to each other and changing their own output, but also by manipulating the data stream of the other performer(s). In the course of performance-led research, four prototypes were built and analyzed using concepts from the NIME and creative collaboration literature. Based on this analysis, it was found that such systems should 1) provide a commonly modifiable visual representation of musical data for both coder and non-coder, and 2) provide some independent means of sound production for each user, giving the non-coder the ability to slow down and make non-realtime decisions for greater performance flexibility.

Exhibitors

Jack Armitage

PhD student, Augmented Instruments Lab, C4DM, QMUL
Jack Armitage is a PhD student in the Augmented Instruments Lab, Centre for Digital Music, Queen Mary University of London. His topic is on supporting craft in digital musical instrument design, supervised by Dr. Andrew McPherson.


Monday June 4, 2018 11:30am - 1:30pm EDT
Moss Arts Center - Orchestra Lobby
 
Wednesday, June 6
 

12:00pm EDT

Demo 3.01
game over - a musical 2D game engine
by Christof Ressi


This demo shows a self-developed game engine intended for use in installations and audio-visual performances. The player is supposed to explore different game worlds and interact freely with their environments and non-player characters. Almost everything the player does has musical consequences. The game worlds are collages of well-known vintage video game genres, like platformer, dungeon crawler, NES 'Mode 7' or isometric pseudo-3D, creating bizarre and confusing scenarios. It is possible to modify the game while it is running (e.g. spawning/destroying/teleporting actors, changing the tile map, triggering events etc.). This adds a level of live coding to a musical performance where the programmer can act as a musical partner. While the game engine is written in C++, the game logic is scripted in Lua. All sound is done in Pure Data using a self-written emulator of the Roland D-110 sound module and various other synthesis and sampling techniques. Maps can be designed with a custom level editor. game_over was developed within the artistic research project "GAPPP" at the Institute of Electronic Music in Graz, Austria and is funded by the "Österreichischer Wissenschaftsfonds" (project number AR364-G24).


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Demo 3.02
Teeth
by Spencer Salazar


Teeth is a composition for solo performer using the Auraglyph musical sketchpad software on an iPad. Using two hand-drawn oscillator waveforms and delays, the performer explores a series of modulated timbres and frequencies. The performer's actions on the tablet are displayed to the audience via an overhead camera, making transparent the connection from the performer's action to sonic and visual representations.


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Demo 3.03
Romp in Chaos
Demo by Edgar Berdahl


The sound of chaos can be joyous! This electroacoustic miniature is an exercise that explores the edge of chaos, which is realized by two digital waveguides resonating against the Peter de Jong chaotic map. For this work, an embedded acoustic instrument was created with five pressure sensors and five potentiometers. As the performer changes the parameters to and fro, the sound romps back and forth between chaotic regimes and more tonal sounds. Long live chaos!
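The Peter de Jong map named above is a standard two-dimensional chaotic map; a minimal sketch of its iteration follows, with illustrative parameter and seed values (the coupling to the digital waveguides in the actual instrument is not shown):

```python
import math

def de_jong_step(x, y, a=1.4, b=-2.3, c=2.4, d=-2.1):
    """One iteration of the Peter de Jong map."""
    return (math.sin(a * y) - math.cos(b * x),
            math.sin(c * x) - math.cos(d * y))

def orbit(n, x=0.1, y=0.1):
    """Iterate n steps; successive points could excite the waveguides."""
    pts = []
    for _ in range(n):
        x, y = de_jong_step(x, y)
        pts.append((x, y))
    return pts
```

Each coordinate is a difference of a sine and a cosine, so the orbit stays bounded in [-2, 2] while wandering chaotically.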

Exhibitors

Edgar Berdahl

Assistant Professor, Louisiana State University


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Demo 3.04
Circles
by Barry Moon


Circles was created in response to the alienating experience of concert music, where the audience is confined to seating far away from the performer, who reads from a score hidden from the audience's sight. I was interested in making the score more interesting to the spectator, and in allowing the audience to move among the performers instead of being confined to seats. Performers each wear a helmet containing a Raspberry Pi computer, a color and motion sensor, a speaker, and a battery for power. Pure Data is used for audio processing. Changes in computer processing are dictated by changes in the colors and movement of the performers picked up by the sensor. Performances by Annie Stevens (percussion) and Kyle Hutchins (saxophone).


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Demo 3.05
Paint the Melody in Virtual Reality
by Kaiming Cheng

Exhibitors

Kaiming Cheng

Undergrad, University of Virginia


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Demo 3.06
Unsichtbares Klavier (invisible piano), virtual controller
by Remmy Canedo


Play an imaginary instrument that simulates the attributes of a grand piano without its physical form.


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Demo 3.07
Musical Chairs
by Matthew Blessing


Play a coffee table, end table, ottoman, and wingback chair - each outfitted with embedded sensors, CPUs, and audio drivers.

Exhibitors

Matthew Blessing

Doctoral Candidate, Louisiana State University


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Demo 3.08
OtoKin installation
by Palle Dahlstedt & Ami Skånberg-Dahlstedt

In OtoKin, an invisible sound space is explored through ear (Oto) and movement (Kinesis). With eyes closed, you enter a high-dimensional acoustic space, where every small body movement matters. Through this re-translation of three-dimensional body action and position into infinite-dimensional sound texture and timbre, you are forced to re-think and re-learn: Position as place, position as posture, posture as timbre, timbre as a bodily construction. The OtoKin sound space is also shared with other users, with added modes of presence, proximity and interaction.

Exhibitors

Palle Dahlstedt

University of Gothenburg, Aalborg University
Palle Dahlstedt (b.1971), Swedish improviser, researcher, and composer of everything from chamber and orchestral music to interactive and autonomous computer pieces, receiving the Gaudeamus Music Prize in 2001. Currently Obel Professor of Art & Technology at Aalborg University...


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Demo 3.09
Body Biopotential Sonification
by Alan Macy


The Body Biopotential Sonification art project explores the idea of human nervous system expression in a movement/dance setting. A participant movement artist's biopotentials are monitored, and the measured electricity, sourced from the artist's physical activity, is greatly amplified and then conditioned to be presented as sound. The project operates on the principle of differential biopotential signal amplification. The detected signals are the participant's Lead I electrocardiogram and associated electromyogram. The visceral sonic environment generated within the confines of the project is established by the participant's measured biopotentials. The project enables a participant to co-create a musical composition sourced from an individual's physiology. Physiological metrics are measured, transformed, and then re-introduced as auditory stimuli to the participant movement artist and the audience.
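The principle of differential amplification invoked above can be sketched in a few lines: two electrode voltages are subtracted so that signal common to both (mains hum, ambient interference) cancels, and only the difference is boosted. The gain value here is illustrative, not the project's hardware setting:

```python
def differential_amplify(v_plus, v_minus, gain=1000.0):
    """Amplify the voltage difference between two electrodes.

    Any component present identically on both inputs (common-mode
    noise) subtracts away; only the biopotential difference remains.
    """
    return gain * (v_plus - v_minus)
```

For example, identical inputs yield zero output regardless of their absolute level, which is what makes millivolt-scale ECG/EMG signals recoverable from a noisy body.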


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Demo 3.10
Music Demo
by Kristina Warren


Arrest (2018), a work for Exo/Rosie. Exo/Rosie is a wearable, analog-digital instrument/persona I created that uses body-to-body connections, such as wrist to wrist, to vary analog audio and digital control output. These closed, covered gestures reflect the carceral state and limited agency around music technology today. Arrest explores the complete body – choreographically, expressively, and socially – as a crucial musical affordance.

Artists

Kristina Warren

Kristina Warren is a composer, improviser, and critical maker whose multimodal practice - from building and playing unique analog-digital instruments, to composing for and with chamber ensembles - explores diverse acts of listening and making noise. Her first solo album, filament...


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Poster 3.01
Low Frequency Feedback Drones: A non-invasive augmentation of the double bass
by Thanos Polymeneas Liontiris

This paper illustrates the development of a Feedback Resonating Double Bass. The instrument is essentially an augmentation of an acoustic double bass using positive feedback. The research aimed to answer the question of how to convert a double bass into a feedback-resonating instrument without invasive methods. The conversion process illustrated here is applicable and adaptable to double basses of any size, without making irreversible alterations to the instrument.


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Poster 3.02
The Orchestra of Speech: a speech-based instrument system
by Daniel Formo


The Orchestra of Speech is a performance concept resulting from a recent artistic research project exploring the relationship between music and speech, in particular improvised music and everyday conversation. As a tool in this exploration, a digital musical instrument system has been developed for “orchestrating” musical features of speech into music, in real time. Through artistic practice, this system has evolved into a personal electroacoustic performance concept.

Exhibitors

Daniel Formo

Research Fellow, Norwegian University of Science and Technology


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Poster 3.03
Surveying the Compositional and Performance Practices of Audiovisual Practitioners
by Anna Weisling, Anna Xambó, Ireti Olowe, Mathieu Barthet


This paper presents a brief overview of an online survey conducted with the objective of gaining insight into compositional and performance practices of contemporary audiovisual practitioners. The survey gathered information regarding how practitioners relate aural and visual media in their work, and how compositional and performance practices involving multiple modalities might differ from other practices. Discussed here are three themes: compositional approaches, transparency and audience knowledge, and error and risk, which emerged from participants' responses. We believe these themes contribute to a discussion within the NIME community regarding unique challenges and objectives presented when working with multiple media.

Exhibitors

Anna Xambó

Postdoctoral Research Assistant, Queen Mary University of London


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Poster 3.04
Sound Opinions: Creating a Virtual Tool for Sound Art Installations through Sentiment Analysis of Critical Reviews
by Anthony T. Marasco


The author presents Sound Opinions, a custom software tool that uses sentiment analysis to create sound art installations and music compositions. The software runs inside the NodeRed.js programming environment. It scrapes text from web pages, pre-processes it, performs sentiment analysis via a remote API, and parses the resulting data for use in external digital audio programs. The sentiment analysis itself is handled by IBM's Watson Tone Analyzer.
 
The author has used this tool to create an interactive multimedia installation, titled Critique. Sources of criticism of a chosen musical work are analyzed, and the negative or positive statements about that composition work to warp and change it. This allows the audience to hear the work only through the lens of its critics, and not in the original form its creator intended.
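As a rough sketch of the pipeline described above — scrape text, score its sentiment, map the score to an audio parameter — the following substitutes a toy word-list scorer for the remote Watson Tone Analyzer call; the word lists and the `warp_amount` mapping are hypothetical stand-ins, not the author's implementation:

```python
# Toy stand-ins for the remote sentiment API (assumption: the real
# system scores text via IBM Watson's Tone Analyzer from Node-RED).
NEGATIVE = {"dull", "tedious", "incoherent", "derivative"}
POSITIVE = {"brilliant", "moving", "inventive", "luminous"}

def sentiment_score(review: str) -> float:
    """Score a review in [-1, 1]: -1 all negative, +1 all positive."""
    words = review.lower().split()
    hits = ([1 for w in words if w in POSITIVE]
            + [-1 for w in words if w in NEGATIVE])
    return sum(hits) / len(hits) if hits else 0.0

def warp_amount(score: float) -> float:
    """Map sentiment to a 0..1 processing depth: the harsher the
    critique, the more heavily the original composition is warped."""
    return (1.0 - score) / 2.0
```

A wholly negative review thus drives the warp parameter to its maximum, so the audience hears the piece at its most distorted.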

Exhibitors

Anthony T. Marasco

Louisiana State University
Embedded instruments, software for emergent media experiences, audiovisual installations, composition, modular synthesizers, the TV show “Perfect Strangers.”


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Poster 3.05
A web-based 3D environment for gestural interaction with virtual music instruments as a STEAM education tool
by Kosmas Kritsis, Aggelos Gkiokas, Carlos Árpád Acosta, Quentin Lamerand, Robert Piéchaud, Maximos Kaliakatsos-Papakostas & Vassilis Katsouros


We present our work in progress on the development of a web-based system for music performance with virtual instruments in a virtual 3D environment, which provides three means of interaction (i.e., physical, gestural, and mixed), using tracking data from a Leap Motion sensor. Moreover, our system is integrated as a creative tool within the context of a STEAM education platform that promotes science learning through musical activities.

 The presented system models string and percussion instruments, with realistic sonic feedback based on Modalys, a physical model-based sound synthesis engine. Our proposal meets the performance requirements of real-time interactive systems and is implemented strictly with web technologies.


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Poster 3.06
CubeHarmonic: A New Interface from a Magnetic 3D Motion Tracking System to Music Performance
by Maria C. Mannone, Eri Kitamura, Jiawei Huang, Ryo Sugawara & Yoshifumi Kitamura


We developed a new musical interface, CubeHarmonic, with the magnetic tracking system IM3D, created at Tohoku University. The IM3D system precisely tracks the positions of tiny, wireless, battery-less, identifiable LC coils in real time. CubeHarmonic is a musical application of the Rubik's cube, with notes on each little piece. Scrambling the cube, we get different chords and chord sequences. The positions of the pieces that contain LC coils are detected through IM3D and transmitted to the computer, which plays the sounds. The central position of the cube is also computed from the LC coils located in the corners of the Rubik's cube, and, depending on this computed central position, we can manipulate overall loudness and pitch, as in theremin playing. This new instrument, whose first idea comes from the mathematical theory of music, can be used as a teaching tool both for math (group theory) and music (music theory, mathematical music theory), as well as a composition device, a new instrument for avant-garde performances, and a recreational tool.
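The centre computation and theremin-like mapping described above can be sketched as follows, assuming the tracker reports each corner coil as an (x, y, z) tuple; the mapping constants are illustrative, not the authors' calibration:

```python
def cube_centre(corners):
    """Estimate the cube centre by averaging the tracked corner coils."""
    n = len(corners)
    return tuple(sum(c[i] for c in corners) / n for i in range(3))

def theremin_map(centre, base_pitch=220.0, max_gain=1.0, scale=0.5):
    """Map centre height to loudness and lateral offset to pitch,
    loosely mimicking theremin control (constants are illustrative)."""
    x, y, z = centre
    gain = min(max_gain, max(0.0, z * scale))
    pitch = base_pitch * (1.0 + x * scale)
    return gain, pitch
```

Moving the whole cube up or sideways then changes loudness and pitch continuously, independently of the chord produced by the scramble.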

Exhibitors

Yoshifumi Kitamura

Tohoku University

Maria Mannone

University of Minnesota


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Poster 3.07
The Whammy Bar as a Digital Effect Controller
by Martin M Kristoffersen & Trond Engum


In this paper we present a novel digital effects controller for electric guitar based upon the whammy bar as a user interface. The goal of the project is to give guitarists a way to interact with dynamic effects control that feels familiar to their instrument and playing style. A 3D-printed prototype has been made. It replaces the whammy bar of a traditional Fender vibrato system with a sensor-equipped whammy bar. The functionality of the present prototype includes separate readings of force applied towards and away from the guitar body, as well as an end knob for variable control. Further functionality includes a hinged system allowing for digital effect control either with or without the mechanical manipulation of string tension. By incorporating digital sensors into the idiomatic whammy bar interface, the device can potentially offer guitarists a high level of control intimacy and thus closer interaction with effects.


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Poster 3.08
Timbre Tuning: Variation in Cello Spectrum Across Pitches and Instruments
by Robert Pond, Alexander Klassen & Kirk McNally


The process of learning to play a string instrument is a notoriously difficult task. A new student of the instrument is faced with mastering multiple, interconnected physical movements in order to become a skillful player. In their development, one measure of a player's quality is their tone, which is the result of the combination of the physical characteristics of the instrument and their technique in playing it. This paper describes preliminary research into creating an intuitive, real-time device for evaluating the quality of tone generation on the cello: a “timbre-tuner” to help cellists evaluate their tone quality. Data for the study was collected from six post-secondary music students, consisting of recordings of scales covering the entire range of the cello. Comprehensive spectral audio analysis was performed on the data set in order to evaluate features suitable for describing tone quality. An inverse relationship was found between the harmonic centroid and the pitch played, which became more pronounced when restricted to the A string. In addition, a model for predicting the harmonic centroid at different pitches on the A string was created. Results from informal listening tests support the use of the harmonic centroid as an appropriate measure of tone quality.
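The harmonic centroid used above as the tone measure can be sketched as the amplitude-weighted mean frequency of the harmonic series; the amplitude values in the example are illustrative, not measured cello data:

```python
def harmonic_centroid(amplitudes, f0=1.0):
    """Amplitude-weighted mean frequency of a harmonic series.

    amplitudes[k-1] is the amplitude of harmonic k (frequency k * f0).
    With f0=1.0 the result is a dimensionless mean harmonic number;
    with a real fundamental in Hz it becomes a frequency in Hz.
    """
    num = sum(k * f0 * a for k, a in enumerate(amplitudes, start=1))
    den = sum(amplitudes)
    return num / den
```

A tone whose energy shifts toward higher partials raises the centroid; the paper's finding is that, on the cello, the centroid falls as the played pitch rises.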


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Poster 3.09
Tributaries of Our Lost Palpability
by Matthew Mosher, Danielle Wood & Tony Obr


This demonstration paper describes the concepts behind Tributaries of Our Lost Palpability, an interactive sonified sculpture. It takes form as a swelling sea anemone, while the sounds it produces recall the quagmire of a digital ocean. The sculpture responds to changing light conditions with a dynamic mix of audio tracks, mapping volume to light level. People passing by the sculpture, or directly engaging it by creating light and shadows with their smartphone flashlights, trigger the audio. At the same time, it automatically adapts to gradual environmental light changes, such as the rise and fall of the sun. The piece was inspired by the searching gestures people make, and the emotions they have, while idly browsing content on their smart devices. It was created through an interdisciplinary collaboration between a musician, an interaction designer, and a ceramicist.

Exhibitors

Matthew Mosher

Assistant Professor, University of Central Florida
Boston native Matthew Mosher is an intermedia artist and mixed methods research professor who creates embodied experiential systems. He received his BFA in Furniture Design from the Rhode Island School of Design in 2006 and his MFA in Intermedia from Arizona State University in 2012...


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Poster 3.10
Embedded Digital Shakers: Handheld Physical Modeling Synthesizers
by Andrew Piepenbrink


We present a flexible, compact, and affordable embedded physical modeling synthesizer which functions as a digital shaker. The instrument is self-contained, battery-powered, wireless, and synthesizes various shakers, rattles, and other handheld shaken percussion. Beyond modeling existing shakers, the instrument affords new sonic interactions including hand mutes on its loudspeakers and self-sustaining feedback. Both low-cost and high-performance versions of the instrument are discussed.


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Poster 3.11
Live Repurposing of Sounds: MIR Explorations with Personal and Crowdsourced Databases
by Anna Xambó, Gerard Roma, Alexander Lerch, Mathieu Barthet & György Fazekas


The recent increase in the accessibility and size of personal and crowdsourced digital sound collections has brought about a valuable resource for music creation. Finding and retrieving relevant sounds in performance leads to challenges that can be approached using music information retrieval (MIR). In this paper, we explore the use of MIR to retrieve and repurpose sounds in musical live coding. We present a live coding system built on SuperCollider enabling the use of audio content from online Creative Commons (CC) sound databases such as Freesound or from personal sound databases. The novelty of our approach lies in exploiting high-level MIR methods (e.g., query by pitch or rhythmic cues) using live coding techniques applied to sounds. We demonstrate its potential through reflection on an illustrative case study and feedback from four expert users. The users tried the system with either a personal or a crowdsourced database and reported its potential in facilitating the tailorability of the tool to their own creative workflows.
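A query-by-pitch retrieval step of the kind described can be sketched as below, assuming each database entry carries a pre-analysed pitch in Hz (Freesound exposes comparable audio descriptors); the tolerance-in-cents match is an illustrative choice, not the paper's exact method:

```python
import math

def cents(f1: float, f2: float) -> float:
    """Interval between two frequencies in cents (100 cents = 1 semitone)."""
    return abs(1200.0 * math.log2(f1 / f2))

def query_by_pitch(db, target_hz, tolerance_cents=100.0):
    """Return sound IDs whose analysed pitch lies within the tolerance.

    db maps sound ID -> pre-analysed pitch in Hz.
    """
    return [sid for sid, pitch in db.items()
            if cents(pitch, target_hz) <= tolerance_cents]
```

In a live coding session, the returned IDs would then be loaded as buffers and sequenced; here the retrieval logic alone is shown.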

Exhibitors

Anna Xambó

Postdoctoral Research Assistant, Queen Mary University of London


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Poster 3.13
The Feedback Trombone: Controlling Feedback in Brass Instruments
by Jeff Snyder, Michael R Mulshine & Rajeev S Erramilli


This paper presents research on control of electronic signal feedback in brass instruments through the development of a new augmented musical instrument, the Feedback Trombone. The Feedback Trombone (FBT) extends the traditional acoustic trombone interface with a speaker, microphone, and custom analog and digital hardware.

Exhibitors

Mike Mulshine

Research Specialist, Princeton University

Jeff Snyder

Director of Electronic Music, Princeton University
Instrument designer, composer, improvisor.


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Poster 3.14
Mechanoise: Mechatronic Sound and Interaction in Embedded Acoustic Instruments
by Eric Sheffield


The use of mechatronic components (e.g. DC motors and solenoids) as both electronic sound source and locus of interaction is explored in a form of embedded acoustic instruments called mechanoise instruments. Micro-controllers and embedded computing devices provide a platform for live control of motor speeds and additional sound processing by a human performer. Digital fabrication and use of salvaged and found materials are emphasized.

Exhibitors

Eric Sheffield

Louisiana State University, Baton Rouge, Louisiana, United States


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Poster 3.15
Do We Speak Sensor? Cultural Constraints of Embodied Interaction
by Jon Pigrem & Andrew P. McPherson


This paper explores the role of materiality in Digital Musical Instruments and questions the influence of tacit understandings of sensor technology. Existing research investigates the use of gesture, physical interaction and subsequent parameter mapping. We suggest that a tacit knowledge of the ‘sensor layer’ brings with it definitions, understandings and expectations that forge and guide our approach to interaction. We argue that the influence of technology starts before a sound is made, and comes not only from intuition of material properties, but also from received notions of what technology can and should do. On encountering an instrument with obvious sensors, a potential performer will attempt to predict what the sensors do and what the designer intends for them to do, becoming influenced by a machine-centred understanding of interaction and not a solely material-centred one. The paper presents an observational study of interaction using non-functional prototype instruments designed to explore fundamental ideas and understandings of instrumental interaction in the digital realm. We show that this understanding influences both gestural language and the ability to characterise an expected sonic/musical response.

Exhibitors

Andrew McPherson

Reader, Queen Mary University of London

Jon Pigrem

Researcher, Queen Mary University of London
Hi! I'm Jon from Queen Mary University of London. I'm a Musician, Artist and Researcher. My current research investigates tacit understandings of instrumental interaction with materials and sensors. Talk to me about: Instrumental interaction - Sensors - NIMEs - DMIs - Electronic Music...


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby

12:00pm EDT

Poster 3.16
Re-engaging the Body and Gesture in Musical Live Coding
by Spencer Salazar & Jack Armitage


At first glance, the practice of musical live coding seems distanced from the gestures and sense of embodiment common in musical performance, electronic or otherwise. This workshop seeks to explore the extent to which this assertion is justified, to re-examine notions of gesture and embodiment in the context of musical live coding performance, to consider historical approaches to synthesizing musical programming and gesture, and to look to the future for new ways of doing so. The workshop will consist firstly of a critical discussion of these issues and related literature. This will be followed by applied practical experiments involving ideas generated during these discussions. The workshop will conclude with a recapitulation and examination of these experiments in the context of previous research and proposed future directions.

Exhibitors

Jack Armitage

PhD student, Augmented Instruments Lab, C4DM, QMUL
Jack Armitage is a PhD student in the Augmented Instruments Lab, Centre for Digital Music, Queen Mary University of London. His topic is on supporting craft in digital musical instrument design, supervised by Dr. Andrew McPherson.


Wednesday June 6, 2018 12:00pm - 1:30pm EDT
Moss Arts Center - Orchestra Lobby
 