Adam Tindale
Faculty of Arts & Science
Adam Tindale is an electronic drummer and digital instrument designer. He is an Associate Professor of Human-Computer Interaction in the Digital Futures Initiative at OCAD University. Adam performs on his E-Drumset: a new electronic instrument that utilizes physical modeling and machine learning with an intuitive physical interface. He completed a Bachelor of Music at Queen's University, a Master's in Music Technology at McGill University, and an Interdisciplinary Ph.D. in Music, Computer Science and Electrical Engineering at the University of Victoria.
Unveiling New Artistic Dimensions in Calligraphic Arabic Script with Generative Adversarial Networks
Proceedings of the ACM on Computer Graphics and Interactive Techniques
Published: December 31st 2024
Designing Wearable Technology for Opera
Proceedings of the 2023 ACM Designing Interactive Systems Conference
Published: December 31st 2023
Documented: Embedding Information onto and Retrieving Information from 3D Printed Objects
Published: December 31st 2021
Documentation for DIY tasks serves as codified project knowledge and helps makers reach new understandings of and appreciation for the artifact. Engaging in reflective processes using the documentation can be challenging for physical objects, as the documentation and the artifact exist separately. We hypothesize that spatially associating the documentation information with the artifact can provide richer contextualization for reflecting on the artifact and the design process. We implemented and evaluated Documented, a web application that helps makers associate documentation with 3D printed objects. Information can be embedded using printed tags spatially placed on the model and accessed using mobile AR. Our study highlights the different strategies participants had for organizing, embedding, and retrieving information. Informed by our results, we discuss how the coupling of the documentation and the artifact can support reflection, and we identify potential barriers that need further investigation.
Physically Colliding with Music
Full-body Interactions with an Audio-only Virtual Reality Interface
Published: December 31st 2019
A Very Real Looper (AVRL) is an audio-only virtual reality (VR) interface inside of which a performer triggers and controls music through full-body movement. Contrary to how musical interfaces in VR are normally used, a performer using AVRL is not disconnected from their surrounding environment through immersion, nor is their body restrained by a head-mounted display. Rather, AVRL utilizes two VR sensors and the Unity game engine to map virtual musical sounds onto physical objects in the real world. These objects help the performer locate the sounds. Using two handheld VR controllers, these sounds can be triggered, looped, acoustically affected, or repositioned in space. AVRL thus combines the affordances of the physical world and a VR system with the reconfigurability of a game engine. This integration results in an expansive and augmented performance environment that facilitates full-body musical interactions.
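As an illustration, here is a minimal sketch of AVRL's central idea: sounds anchored at physical positions and triggered when a tracked controller comes within reach. The anchor names, positions, and radii are hypothetical; the actual system implements this mapping inside the Unity game engine with tracked VR controllers.

```python
import math

# Hypothetical sound anchor: a name for the mapped sound, the physical
# object's position in metres, and an activation radius around it.
class SoundAnchor:
    def __init__(self, name, position, radius=0.3):
        self.name = name
        self.position = position
        self.radius = radius

def triggered_anchors(controller_pos, anchors):
    """Return every anchor whose activation sphere contains the controller."""
    return [a for a in anchors
            if math.dist(controller_pos, a.position) <= a.radius]

# Example: a loop mapped onto a chair and a drone mapped onto a table.
anchors = [SoundAnchor("bass_loop", (0.0, 0.8, 1.2)),
           SoundAnchor("drone", (1.5, 1.0, -0.5), radius=0.4)]
for hit in triggered_anchors((0.1, 0.75, 1.1), anchors):
    print(f"trigger {hit.name}")  # a real system would start the mapped sound
```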
Flocking: A Framework for Declarative Music-Making on the Web
Published: December 31st 2014
Theoretical aesthetics
Journal of Professional Communication
Published: December 31st 2014
Extending the Nexus Data Exchange Format (NDEF) Specification
Published: December 31st 2014
The Nexus Data Exchange Format (NDEF) is an Open Sound Control (OSC) namespace specification designed to make connection and message management tasks easier for OSC-based networked performance systems. New extensions to the NDEF namespace improve both connection and message management between OSC client and server nodes. Connection management between nodes now features human-readable labels for connections and a new message exchange for pinging connections to determine their status. Message management now has improved namespace synchronization via a message count exchange and by the ability to add, remove, and replace messages on connected nodes.
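A hedged sketch of the kind of exchange this describes appears below, using the python-osc library as a stand-in for whatever OSC library a real node would use. The /ndef/... addresses are illustrative guesses, not the published NDEF namespace.

```python
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 57120)  # a hypothetical server node

# Connection management: attach a human-readable label to the connection,
# then ping it to determine whether it is still alive.
client.send_message("/ndef/connect/label", "laptop-to-synth")
client.send_message("/ndef/connect/ping", 1)

# Message management: request the remote node's message count; a mismatch
# with the local count signals that the namespaces need re-synchronizing
# via the add/remove/replace messages.
client.send_message("/ndef/messages/count", 0)
client.send_message("/ndef/messages/add", ["/synth/filter/cutoff", "f"])
```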
Time Tremors
Developing Transmedia Gaming for Children
Published: December 31st 2013
Time Tremors is a transmedia experience for children aged 8-14 that crosses television, web, locative media, and mobile apps. It is a collection game in which players search for objects from history supposedly scattered throughout time and space: hidden and invisible to the human eye, but detectable and collectable using a variety of mobile and online broadband technologies. Extending the game into locative augmented reality and mobile play was an applied research challenge that required narrative continuity while ensuring safe play.
Wearable haptic gaming using vibrotactile arrays
Published: December 31st 2013
In this paper we explore the design, layout and configuration of wrist-wearable, haptic gaming interfaces, which involve visual and vibrotactile spatial and temporal patterns. Our goal is to determine overall layouts and spatial and temporal resolutions on the wrist suitable for interactive tactile stimuli. Our approach is to first explore the simplest of configurative patterns that are intended to encircle the wrist, and then study their affordances. We describe various informal user studies we have employed to explore and test issues that arose.
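A minimal sketch of the simplest encircling pattern the paper starts from, assuming a band of index-addressable actuators; drive_motor() and the actuator count are hypothetical stand-ins for the real haptic hardware interface.

```python
import time

NUM_MOTORS = 6  # illustrative actuator count around the wrist

def drive_motor(index, intensity):
    # placeholder: a real system would command the haptic hardware here
    print(f"motor {index}: intensity {intensity:.1f}")

def encircling_pulse(step_s=0.15, intensity=1.0, laps=2):
    """Pulse each motor in turn so the sensation travels around the wrist."""
    for step in range(NUM_MOTORS * laps):
        idx = step % NUM_MOTORS
        drive_motor(idx, intensity)  # pulse on
        time.sleep(step_s)           # the temporal resolution under study
        drive_motor(idx, 0.0)        # pulse off

encircling_pulse()
```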
The JunctionBox Interaction Design Toolkit: Making interaction programming easier to allow for sketching with sound
Published: December 21st 2013
Node and Message Management with the JunctionBox Interaction Toolkit
Published: December 31st 2012
Message mapping between control interfaces and sound engines is an important task that could benefit from tools that streamline development. A new Open Sound Control (OSC) namespace called Nexus Data Exchange Format (NDEF) streamlines message mapping by offering developers the ability to manage sound engines as network nodes and to query those nodes for the messages in their OSC address spaces. By using NDEF, developers will have an easier time managing nodes and their messages, especially for scenarios in which a single application or interface controls multiple sound engines. NDEF is currently implemented in the JunctionBox interaction toolkit but could easily be implemented in other toolkits.
JunctionBox for Android: An Interaction Toolkit for Android-based Mobile Devices ...
Published: December 21st 2012
http://innovis.cpsc.ucalgary.ca/Publications/_Fyfe2012:LAC ...
JunctionBox: A Toolkit for Creating Multi-Touch Sound Control Interfaces
Published: December 31st 2011
JunctionBox is a new software toolkit for creating multi-touch interfaces for controlling sound and music. More specifically, the toolkit has special features which make it easy to create TUIO-based touch interfaces for controlling sound engines via Open Sound Control. Programmers using the toolkit have a great deal of freedom to create highly customized interfaces that work on a variety of hardware.
Training Surrogate Sensors in Musical Gesture Acquisition Systems
IEEE Transactions on Multimedia
Published: December 31st 2011
Capturing the gestures of music performers is a common task in interactive electroacoustic music. The captured gestures can be mapped to sounds, synthesis algorithms, visuals, etc., or used for music transcription. Two of the most common approaches for acquiring musical gestures are: 1) “hyper-instruments” which are “traditional” musical instruments enhanced with sensors for directly detecting the gestures and 2) “indirect acquisition” in which the only sensor is a microphone capturing the audio signal. Hyper-instruments require invasive modification of existing instruments which is frequently undesirable. However, they provide relatively straightforward and reliable sensor measurements. On the other hand, indirect acquisition approaches typically require sophisticated signal processing and possibly machine learning algorithms in order to extract the relevant information from the audio signal. The idea of using direct sensor(s) to train a machine learning model for indirect acquisition is proposed in this paper. The resulting trained “surrogate” sensor can then be used in place of the original direct invasive sensor(s) that were used for training. That way, the instrument can be used unmodified in performance while still providing the gesture information that a hyper-instrument would provide. In addition, using this approach, large amounts of training data can be collected with minimum effort. Experimental results supporting this idea are provided in two detection contexts: 1) strike position on a drum surface and 2) strum direction on a sitar.
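The training loop behind a surrogate sensor can be sketched briefly. The snippet below uses scikit-learn with stand-in random feature vectors; a real system would extract audio features (e.g., MFCCs) from each detected strike and take its labels from the direct sensor.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_strikes, n_features = 500, 13
X = rng.normal(size=(n_strikes, n_features))  # audio features per strike
y = rng.integers(0, 3, size=n_strikes)        # zones read off the direct sensor

# Train on sensor-labelled data; the fitted model is the "surrogate sensor"
# that stands in for the physical sensor once it is removed.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
surrogate = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", surrogate.score(X_te, y_te))
```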
The E-Drum: A Case Study for Machine Learning in New Musical Controllers
Conference on Interdisciplinary Musicology
Published: December 21st 2011
Advancing the art of electronic percussion
Published: December 21st 2009
The goal of this project is to create a new instrument: the E-Drumset. This new interface addresses the lack of expressivity in current electronic percussion devices. The project combines Electrical Engineering to implement hardware and digital signal processing, Computer Science to implement musical and mapping software, and Music to devise new playing techniques and ways to combine them into a pedagogy and language of transmission. Like an acoustic drumset, the E-Drumset consists of different components that can be arranged together as a whole: where an acoustic drumset can be thought of as a collection of pedals, drums, and cymbals, the E-Drumset consists of the E-Pedal, E-Drum, and E-Cymbal. The E-Drumset combines established sensor technologies with newly developed ones such as acoustically excited physical models and timbre-recognition-based instruments; these new technologies are discussed and applied to situations beyond the E-Drumset. Building a new controller is not enough: it must be thoroughly tested in musical situations, taking into account feedback from musicians (both the player and other members of the ensemble) during the evaluation of the instrument. Clear and attainable technical guidelines have not yet been devised for the E-Drumset. In the case of the radiodrum, a spatial controller, improvements can be summarized as better resolution in space and time; in the case of the E-Drumset, the goal is to offer percussionists a flexible interface, since electronic drums are often the bottleneck in bandwidth. There is no clear answer to questions such as how low latency must be to satisfy a drummer, an issue that will be explored through the project. The aim is to provide percussionists with an interface at which they can sit down and use their existing skills. By capturing the great variety of gesture available to the expert, the E-Drumset allows the percussionist to explore the full range of controllers between acoustic and electronic instruments. ...
Strike-A-Tune: Fuzzy Music Navigation Using a Drum Interface.
Published: December 31st 2007
A Hybrid Method for Extended Percussive Gesture
Published: December 31st 2007
This paper describes a hybrid method to allow drummers to expressively utilize electronics. Commercial electronic drum hardware is made more expressive by replacing the sample playback "drum brain" with a physical modeling algorithm implemented in Max/MSP. Timbre recognition techniques identify striking implement and location as symbolic data that can be used to modify the parameters of the physical model.
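A hedged sketch of this routing: symbolic classifier output selects parameter presets that are pushed to the physical model over OSC. The addresses and preset values below are illustrative only; in the paper the model runs inside Max/MSP.

```python
from pythonosc.udp_client import SimpleUDPClient

model = SimpleUDPClient("127.0.0.1", 7400)  # hypothetical port for the model

# Symbolic classifier output -> physical model parameters (made-up values).
PRESETS = {
    ("stick", "center"): {"damping": 0.2, "brightness": 0.8},
    ("stick", "rim"):    {"damping": 0.1, "brightness": 1.0},
    ("brush", "center"): {"damping": 0.6, "brightness": 0.3},
}

def on_strike(implement, location, velocity):
    """Push the preset for the recognized strike into the model, then excite it."""
    for param, value in PRESETS[(implement, location)].items():
        model.send_message(f"/drum/{param}", value)
    model.send_message("/drum/excite", velocity)

on_strike("stick", "rim", 0.9)
```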
Virtual instrument control for music therapy
Proceedings of the European Conference for the Advancement of Assistive Technology
Published: December 21st 2007
Learning Indirect Acquisition of Instrumental Gestures using Direct Sensors
Published: December 31st 2006
Sensing instrumental gestures is a common task in interactive electroacoustic music performances. The sensed gestures can then be mapped to sounds, synthesis algorithms, visuals, etc. Two of the most common approaches for acquiring these gestures are: 1) hybrid instruments, which are "traditional" musical instruments enhanced with sensors that directly detect gestures, and 2) indirect acquisition, in which the only measurement is the acoustic signal and signal processing techniques are used to acquire the gestures. Hybrid instruments require modification of existing instruments, which is frequently undesirable; however, they provide relatively straightforward and reliable measuring capability. On the other hand, indirect acquisition approaches typically require sophisticated signal processing and possibly machine learning algorithms in order to extract the relevant information from the audio signals. In this paper the idea of using direct sensors to train a machine learning model for indirect acquisition is explored. This approach has two main advantages: 1) large amounts of training data can be collected with minimum effort, and 2) once the indirect acquisition system is trained, no sensors or modifications to the playing instrument are required. Case studies described in this paper include 1) strike position on a snare drum and 2) strum direction on a sitar.
The KiOm: A Paradigm for Collaborative Controller Design
Published: December 21st 2006
LogoRhythms: Introductory Audio Programming for Computer Musicians in a Functional Language Paradigm
Published: December 31st 2005
Teaching computer music presents opportunities and challenges at both secondary and university levels by bringing together students with widely varying exposures to and interests in mathematics and computer programming. Visual languages like Max/MSP are popular with many musicians, but the idiom doesn't necessarily transfer well to text languages such as Java or C++, languages that might be used in a wider variety of programming problems. Our design challenge with LogoRhythms was to create a forgiving text-based API that allows the neophyte programmer to explore programming and low-level digital audio manipulation. Since any musical composition is essentially a novel program, the opportunities for custom software are endless and the programming task can be given as a creative endeavor. LogoRhythms encourages a functional programming style. Examples are provided showing lists and higher-order functions used to create simple harmonies and melodies, with a discussion of how to balance abstracting elegance with "abstracting elusiveness."
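A small Python analogue of the style the abstract describes, treating melodies as lists built with higher-order functions; note(), the pitch values, and the print-based playback are illustrative stand-ins for LogoRhythms primitives.

```python
def note(midi_pitch, beats=1.0):
    """A note as plain data: (pitch, duration) -- a stand-in primitive."""
    return (midi_pitch, beats)

def transpose(interval, melody):
    """Higher-order style: shift every pitch in a melody by an interval."""
    return [note(p + interval, b) for (p, b) in melody]

def harmonize(melody, intervals=(0, 4, 7)):
    """Stack transpositions of a melody into a simple triadic harmony."""
    return [transpose(i, melody) for i in intervals]

theme = [note(60), note(62), note(64, 2.0)]  # C, D, E
for voice in harmonize(theme):               # root, third, and fifth voices
    print(voice)  # a real system would hand each voice to the audio layer
```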
Wearable sensors for real-time musical signal processing
Published: December 31st 2005
This paper describes the use of wearable sensor technology to control parameters of audio effects for real-time musical signal processing. Traditional instrument performance techniques are preserved while the system modifies the resulting sound based upon the movements of the performer. Gesture data from a performing artist is captured using three-axis accelerometer packages and converted to MIDI (Musical Instrument Digital Interface) messages using microcontroller technology. ChucK, a new programming language for on-the-fly audio signal processing and sound synthesis, is used to collect and process synchronized gesture data and audio signals from the traditional instrument being performed. Case studies using the wearable sensors in a variety of locations on the body (head, hands, feet, etc.) with a number of different traditional instruments (tabla, sitar, drumset, turntables, etc.) are presented.
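A minimal sketch of the sensor-to-MIDI mapping described above, assuming a stream of three-axis samples; read_accelerometer() and the scaling factor are hypothetical, and the mido library stands in for the microcontroller's MIDI path.

```python
import math
import mido

def read_accelerometer():
    # placeholder for one (x, y, z) sample from the wearable sensor
    return (0.1, -0.4, 0.9)

def motion_to_cc(sample, cc_number=1):
    """Scale acceleration magnitude into the 0-127 MIDI CC range."""
    magnitude = math.sqrt(sum(a * a for a in sample))
    value = min(127, int(magnitude * 64))  # illustrative scaling factor
    return mido.Message("control_change", control=cc_number, value=value)

msg = motion_to_cc(read_accelerometer())
print(msg)  # a live rig would send this via mido.open_output().send(msg)
```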
A Comparison of Sensor Strategies for Capturing Percussive Gestures
Published: December 31st 2005
Drum controllers designed by researchers and commercial companies use a variety of techniques for capturing percussive gestures. It is challenging to obtain both quick response times and low-level data (such as position) that contain expressive information. This research is a comprehensive study of current methods to evaluate the available strategies and technologies. This study aims to demonstrate the benefits and detriments of the current state of percussion controllers as well as yield tools for those who would wish to conduct this type of study in the future.
Indirect acquisition of percussion gestures using timbre recognition
Proc. Conf. on Interdisciplinary Musicology (CIM)
Published: December 21st 2005
Retrieval of Percussion Gestures Using Timbre Classification Techniques.
Published: December 31st 2004
Classification of snare drum sounds using neural networks
Published: December 21st 2004
Haptic pattern representation using music technologies
Associate Professor
Digital Futures
Ontario College of Art and Design

Permanent Instructor
Media Arts & Digital Technologies
Alberta College of Art + Design