Ali Momeni

Read

MOD: A Portable Instrument for Mixing Analog and Digital Drawing for Live Cinema

Posted November 8, 2017 & filed under Read.

Abstract

This paper describes the design and fabrication of MOD (Mobile Object for Drawing), a portable instrument for combining analog and digital drawing. MOD is intended for live performance and content creation efforts that mix common analog drawing interfaces (i.e. paper, transparency, pencil, marker) with digital cameras (webcams, scientific imaging cameras, digital magnifiers and microscopes), custom software (for keying, thresholding, looping, layering) and digital projectors. The iteration of the instrument described here combines all of these components into a single portable battery-powered package that embeds the computation on a small Linux computer, includes a small laser projector, and integrates custom tactile controllers. The intended uses of this instrument include experimental performance and rapid content creation; the instrument is intended to be suitable for formal (concert hall, theater) and informal (street performance, busking, parade, protest) settings, classrooms and maker spaces.
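
As an illustration of the kind of keying and thresholding stage the abstract mentions, the following Python sketch composites a live camera view of a drawing over a digital layer using OpenCV. It is a minimal stand-in under assumed names and thresholds, not MOD's actual software.

```python
# Illustrative sketch (not MOD's code): threshold a camera view of a drawing
# and key the ink over a digital layer, as in the abstract's keying stage.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # webcam or document camera aimed at the drawing surface
layer = np.zeros((480, 640, 3), dtype=np.uint8)  # digital layer behind the ink

while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.resize(frame, (640, 480))
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Threshold: dark marker strokes on light paper become the key mask.
    _, mask = cv2.threshold(gray, 96, 255, cv2.THRESH_BINARY_INV)
    mask3 = cv2.merge([mask, mask, mask])
    # Keyed composite: ink pixels pass through; everything else shows the layer.
    out = np.where(mask3 > 0, frame, layer)
    cv2.imshow("composite", out)  # in MOD this output would feed the projector
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```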


Citation

Momeni, A., McNamara, D. (2018, March). MOD: A Portable Instrument for Mixing Analog and Digital Drawing for Live Cinema. In Proceedings of the Eleventh International Conference on Tangible, Embedded, and Embodied Interaction. ACM.

Download PDF

Dranimate: Paper Becomes Tablet, Drawing Becomes Animation

Posted January 16, 2016 & filed under Read.

Abstract

Dranimate is an interactive animation system that allows users to rapidly and intuitively rig and control animations based on a still image or drawing using hand gestures. Dranimate combines two complementary methods of shape manipulation: bone-joint-based physics simulation and the as-rigid-as-possible deformation algorithm. Dranimate also introduces a number of designed interactions created around the metaphor of an image on a tablet screen replacing a physical drawing. The interactions focus the user's attention on the animated content, as opposed to the computer keyboard, mouse, or tablet surface, while enabling natural and intuitive interactions with personalized digital content.
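
The gesture-to-rig coupling described above might be sketched as follows: a hypothetical Python illustration that maps tracked hand landmarks to deformation-handle targets for a 2D puppet. The landmark layout, handle positions and scaling are assumptions; this is not Dranimate's actual code.

```python
# Hypothetical sketch (not Dranimate's code): fingertip landmarks, taken
# relative to the wrist, displace deformation handles around rest positions.
import numpy as np

# Handles placed on the drawing (e.g. head and two limbs), in image coordinates.
REST_HANDLES = np.array([[320.0, 100.0], [200.0, 360.0], [440.0, 360.0]])

def handles_from_hand(landmarks: np.ndarray, scale: float = 150.0) -> np.ndarray:
    """Map a wrist plus three fingertips (4 x 2, normalized 0..1) to handle targets."""
    wrist = landmarks[0]
    tips = landmarks[1:4] - wrist       # fingertips relative to the wrist
    return REST_HANDLES + scale * tips  # displaced targets for the deformer

# A synthetic "hand pose" stands in for a tracker's per-frame output.
pose = np.array([[0.5, 0.5], [0.5, 0.3], [0.35, 0.6], [0.65, 0.6]])
targets = handles_from_hand(pose)
print(targets)  # each frame, feed these targets to the as-rigid-as-possible solver
```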

Citation

Download PDF

ArtBytes: A Mobile App for Mixing Art Appreciation with Art Creation

Posted January 16, 2016 & filed under Read.

Abstract

ArtBytes is a mobile app designed to accompany art seekers and makers to museums and galleries. The app emphasizes continuity and dialogue across a museum goer's visits to different galleries, museums and exhibitions over time. During the visit to an exhibition, the app allows visitors to archive works of art they appreciate, in addition to specific elements within each work that are meaningful to the viewer. After the visit, the app provides opportunities for creative interaction with the specific visual elements within an artwork; these opportunities include composition of new works through collages, as well as curation and presentation of these compositions to other users, in real life (i.e. not online) and outside of the gallery or museum space, using augmented reality techniques. The app aims to help art seekers better understand their own taste, increase access to works of art, extend art consumption by engaging art seekers in art-making activities, and leverage crowds in helping art seekers discover new aesthetic experiences within and outside of the museum context.

Citation

Download PDF

Dranimate: Rapid real-time gestural rigging and control of animation

Posted August 26, 2015 & filed under Read.

Abstract

Dranimate is an interactive animation system that allows users to rapidly and intuitively rig and control animations based on a still image or drawing, using hand gestures. Dranimate combines two complementary methods of shape manipulation: bone-joint-based physics simulation, and the as-rigid-as-possible deformation algorithm. Dranimate also introduces a number of designed interactions that focus the user's attention on the animated content, as opposed to the computer keyboard or mouse.

Citation

Momeni, Ali and Zach Rispoli (co-author). "Dranimate: Rapid real-time gestural rigging and control of animation", demo presented at ACM Symposium on User Interface Software and Technology (UIST). Charlotte, NC, 2015.

Download PDF

MagPad: A Near Surface Augmented Reading System for Physical Paper and Smartphone Coupling

Posted August 26, 2015 & filed under Read.

Abstract

In this paper, we present a novel near-surface augmented reading system that brings digital content to physical papers. Our system allows a collocated mobile phone to provide augmented content based on its position on top of the paper. Our system utilizes the built-in magnetometer of a smartphone, together with six constantly spinning magnets that generate designed patterns of magnetic flux, to detect the 2D location of the phone and render dynamic interactive content on the smartphone screen. The proposed technique can be implemented on most mobile platforms without external sensing hardware.
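
One plausible reading of the sensing scheme, sketched below with synthetic data: if each spinning magnet is driven at a distinct rotation frequency, the amplitude of each frequency in the magnetometer stream falls off with distance to that magnet, and a 2D position can be fit against a falloff model. The frequencies, geometry and dipole-like falloff here are assumptions for illustration (three magnets rather than the paper's six), not the paper's implementation.

```python
# Hedged illustration, not MagPad's implementation: recover a 2D position by
# matching per-magnet tone amplitudes against a 1/r^3 (dipole-like) falloff.
import numpy as np

FS = 100.0                # magnetometer sample rate in Hz (assumed)
FREQS = [5.0, 7.0, 11.0]  # per-magnet spin frequencies in Hz (illustrative)
MAGNETS = np.array([[0.0, 0.0], [0.2, 0.0], [0.1, 0.15]])  # magnet positions (m)

def amplitudes(samples: np.ndarray) -> np.ndarray:
    """Amplitude of each magnet's spin frequency in a one-second window."""
    spectrum = np.fft.rfft(samples)
    bins = np.fft.rfftfreq(len(samples), d=1.0 / FS)
    return np.array([np.abs(spectrum[np.argmin(np.abs(bins - f))]) for f in FREQS])

def locate(samples: np.ndarray) -> np.ndarray:
    """Grid-search the position whose predicted falloff best matches the tones."""
    amps = amplitudes(samples)
    best, best_err = None, np.inf
    for x in np.linspace(0.0, 0.2, 41):
        for y in np.linspace(0.0, 0.2, 41):
            r = np.linalg.norm(MAGNETS - [x, y], axis=1) + 1e-3
            pred = 1.0 / r**3
            pred *= amps.sum() / pred.sum()  # normalize overall gain
            err = np.sum((pred - amps) ** 2)
            if err < best_err:
                best, best_err = np.array([x, y]), err
    return best

# Synthetic check: a phone at (0.05, 0.10) sees each tone scaled by its falloff.
t = np.arange(int(FS)) / FS
true_r = np.linalg.norm(MAGNETS - [0.05, 0.10], axis=1) + 1e-3
signal = sum((1.0 / r**3) * np.sin(2 * np.pi * f * t) for f, r in zip(FREQS, true_r))
print(locate(signal))  # should land near (0.05, 0.10)
```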

Citation

Xu, Ding, Ali Momeni and Erik Brockmeyer (2nd author). "MagPad: A Near Surface Augmented Reading System for Physical Paper and Smartphone Coupling", demo presented at ACM Symposium on User Interface Software and Technology (UIST). Charlotte, NC, 2015.

Download PDF

Caress: An Enactive Electro-acoustic Percussive Instrument for Caressing Sound

Posted May 16, 2015 & filed under Read.

Abstract

This paper documents the development of Caress, an electroacoustic percussive instrument that blends drumming and audio synthesis in a small and portable form factor. Caress is an octophonic miniature drum-set for the fingertips that employs eight acoustically isolated piezo-microphones, coupled with eight independent signal chains that excite a unique resonance model with audio from the piezos. The hardware is designed to be robust and quickly reproducible (parametric design and machine fabrication), while the software aims to be lightweight (low CPU requirements) and portable (multiple platforms, multiple computing architectures). Above all, the instrument aims for the level of control intimacy and tactile expressivity achieved by traditional acoustic percussive instruments, while leveraging real-time software synthesis and control to expand the sonic palette. This instrument, as well as this document, is dedicated to the memory of the late David Wessel, pioneering composer, performer, researcher, mentor and all-around Yoda of electroacoustic music.
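
A minimal sketch of the resonance-model idea, assuming a small bank of two-pole resonators excited by a percussive input; it stands in for one of the eight piezo-driven signal chains and is not the instrument's actual DSP. Mode frequencies, decay times and gains are illustrative.

```python
# Sketch of a resonance model: a bank of two-pole resonators driven by an
# excitation signal, as one of Caress's piezo signal chains might be.
import numpy as np
from scipy.signal import lfilter

SR = 44100
MODES = [(220.0, 1.2, 1.0), (551.0, 0.8, 0.5), (1123.0, 0.4, 0.25)]  # Hz, T60 s, gain

def resonate(excitation: np.ndarray) -> np.ndarray:
    out = np.zeros_like(excitation)
    for freq, t60, gain in MODES:
        r = np.exp(-6.91 / (t60 * SR))          # pole radius from the decay time
        w = 2.0 * np.pi * freq / SR
        a = [1.0, -2.0 * r * np.cos(w), r * r]  # two-pole resonator denominator
        out += gain * lfilter([1.0], a, excitation)
    return out

# Excitation: a short noise burst, roughly what a fingertip tap on a piezo yields.
tap = np.zeros(SR)
tap[:64] = np.random.randn(64) * 0.1
audio = resonate(tap)  # a decaying, pitched response to the tap
```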

Citation

Momeni, A. (2015). Caress: An Enactive Electro-acoustic Percussive Instrument for Caressing Sound. NIME 2015.

Download PDF

ml.lib: Robust, Cross-platform, Open-source Machine Learning for Max and Pure Data

Posted May 16, 2015 & filed under Read.

Abstract

This paper documents the development of ml.lib: a set of open-source tools designed for employing a wide range of machine learning techniques within two popular real-time programming environments, namely Max and Pure Data. ml.lib is a cross-platform, lightweight wrapper around Nick Gillian's Gesture Recognition Toolkit, a C++ library that includes a wide range of data processing and machine learning techniques. ml.lib adapts these techniques for real-time use within popular data-flow IDEs, allowing instrument designers and performers to integrate robust learning, classification and mapping approaches within their existing workflows. ml.lib has been carefully designed to allow users to experiment with and incorporate machine learning techniques within an interactive arts context with minimal prior knowledge. A simple, logical, consistent and scalable interface has been provided across over sixteen externals in order to maximize learnability and discoverability. A focus on portability and maintainability has enabled ml.lib to support a range of computing architectures (including ARM) and operating systems such as Mac OS, GNU/Linux and Windows, making it the most comprehensive machine learning implementation available for Max and Pure Data.

GitHub: https://github.com/cmuartfab/ml-lib
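
ml.lib itself runs as externals inside Max and Pure Data patches, so there is no Python API; the sketch below only mirrors the add-examples, train, then map workflow the externals expose, with scikit-learn's SVC standing in for the Gesture Recognition Toolkit's classifiers. The comments paraphrase that workflow rather than quote the exact interface.

```python
# A Python analogue (not ml.lib's API) of the train-then-map workflow.
from sklearn.svm import SVC

examples, labels = [], []

def add(label, features):
    """Analogous to adding a labeled training example to an ml.lib classifier."""
    examples.append(features)
    labels.append(label)

# Two toy gesture classes from a two-axis controller.
add(1, [0.10, 0.20]); add(1, [0.15, 0.25]); add(1, [0.12, 0.18])
add(2, [0.80, 0.90]); add(2, [0.85, 0.80]); add(2, [0.90, 0.95])

model = SVC().fit(examples, labels)   # analogous to a 'train' message
print(model.predict([[0.82, 0.85]]))  # analogous to 'map' -> class 2
```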

Citation

Bullock, J., Momeni, A. (2015). ml.lib: Robust, Cross-platform, Open-source Machine Learning for Max and Pure Data. NIME 2015.

Download PDF

Scalpel: A Shortcut to Inspiration

Posted May 15, 2012 & filed under Read.

Abstract

“Scalpel is a biannual publication created for an exclusive audience of 600 top executives at Pernod Ricard. Its role is to inspire creative collaboration with the best emerging talent in photography, film, art, technology, music, fashion, design, retail and gastronomy.

Scalpel brings together twelve thought leaders, or 'Surgeons', from the worlds of photography, film, art, technology, music, fashion, design, retail and gastronomy. Surgeons present a profile of their top five up-and-coming talents who they predict will break through in the next 24 months, to inspire creative collaborations between Pernod Ricard and the very best emerging creative talents.

Scalpel is a practical tool. It includes a directory so that the talent featured can be contacted and creative collaborations formed. The publication is supported by a website, events, workshops and an annual creative excellence award.”

More info here.

Citation

"Robin Meier and Ali Momeni." Scalpel: A shortcut to inspiration. 3.2 (2012): 50-51. Text.

Download PDF

The Liminal Surface: An Interactive Table-top Environment for Hybridized Music-Theater Performance

Posted February 15, 2010 & filed under Read.

Abstract

This paper documents the development of a new instrument for the creation of experimental music theater. This environment, known as the liminal surface, uses a portable “table-top” design to integrate audio, video, analog and digital sensors, and computer-based control of external media (i.e. musical robotics). This environment will enable the composition of a series of new works exploring interactive computer music, intermodal relationships, and collaborative performance on a visually stimulating and technologically sophisticated platform.

Citation

Bithel, David, and Ali Momeni. "The Liminal Surface: An Interactive Table-top Environment for Hybridized Music-Theater Performance." Proceedings from the 12th Biennial Symposium on Arts and Technology, March 4-6, 2010. Ammerman Center for Art & Technology, Connecticut College.

Download PDF

Dynamic Independent Mapping Layers for Concurrent Control of Audio and Video Synthesis

Posted February 15, 2005 & filed under Read.

Abstract

The work in the present article is primarily motivated by a desire for intimate and expressive control over creative processes implemented in real-time performance software. We seek a manner of control that offers a "low entry fee with no ceiling on virtuosity" and allows expressive control of musical and visual control structures (Wessel and Wright 2001); like many colleagues, we believe that the answer lies in enriching the approach to mapping (see (Winkler 1995), (Rovan, Wanderley et al. 1997), (Arfib, Couturier et al. 2002), (Hunt, Wanderley et al. 2002)). Our notion of a dynamic independent visual mapping layer concerns any independent system with time-variable behavior that takes data input from the user and produces output to drive audio/video synthesis. This modification can be a change of dimensionality as well as what is commonly considered "mapping": changes in numerical ranges, interpretation of "triggers" for setting off events, and mathematical analysis and modification of the input, be they one-to-one, convergent, or divergent (Rovan, Wanderley et al. 1997). This modification, however, can be more complex if the mapping system is dynamic, that is, if it changes over time. Notably, the internal behavior of the system can produce output variation without variation in the user input. The system is visual because we choose mapping spaces that have clear graphical foundations: our two examples, mass-spring physical models and interpolation systems in perceptual spaces, both have clear visual interpretations that we believe are a significant strength of this approach.
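
A minimal sketch of the mass-spring case, assuming a damped one-dimensional spring rather than the article's actual models: the performer's input only moves the spring's anchor, while the mass's own dynamics keep producing output variation (overshoot, oscillation) even while the input holds still.

```python
# Sketch of a dynamic mapping layer: a damped mass-spring between user input
# and a synthesis parameter. All constants are illustrative.
class SpringMap:
    def __init__(self, k=40.0, damping=2.0, dt=0.01):
        self.k, self.c, self.dt = k, damping, dt
        self.x, self.v = 0.0, 0.0  # mass position and velocity

    def step(self, target: float) -> float:
        accel = self.k * (target - self.x) - self.c * self.v
        self.v += accel * self.dt
        self.x += self.v * self.dt
        return self.x  # drives, say, a filter cutoff or grain density

m = SpringMap()
# The input jumps to 1.0 and then stays constant; the output keeps evolving.
trace = [m.step(1.0) for _ in range(100)]
print(round(max(trace), 2), round(trace[-1], 2))  # overshoots 1.0, then settles
```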


Citation

Momeni, Ali and Cyrille Henry. "Dynamic Independent Mapping Layers for Concurrent Control of Audio and Video Synthesis." Computer Music Journal 30:1, 2005.

Download PDF

Facilitating Collective Musical Creativity

Posted February 15, 2005 & filed under Read.

Abstract

We present two projects that facilitate collective music creativity over networks. One system is a participative social music system on mobile devices. The other is a collaborative music mixing environment that adheres to the Creative Commons license [1]. We discuss how network and community infrastructures affect the creative musical process, and the implications for artists creating new content for these formats. The projects described are real-world examples of collaborative systems as musical works.

Citation

Tanaka, A., Tokui, N., and Momeni, A. Facilitating Collective Musical Creativity. Proceedings of ACM Multimedia, 2005.

Download PDF

Composing Instruments: Inventing and Performing with Generative Computer-based Instruments

Posted February 15, 2005 & filed under Read.

Abstract

This dissertation describes music composition as an act of composing instruments. The building blocks of such instruments are discussed: the fundamentally interdisciplinary approach, the role of gesture, the role of real-time generative software, the mappings between gesture and generative processes, and the interaction between performer and instrument. A real-time performance instrument that was composed to accompany the opera Takemitsu: My Way of Life is described. Key constraints imposed by this project are identified, namely: the need for the real-time electronic sound to blend and relate musically to the rest of the music, the need to create a stateless and playable instrument, and the need for an instrument that is robust, adaptable, and portable. Design and compositional decisions that address these constraints are proposed and the actual implementation is discussed. As a contrasting example of a composed instrument, a second project is presented: an interactive installation named …in memory of Leah Deni, created in memory of Leah Deni. This project serves as an example of the same compositional interest in instrument building and interactivity, but applied to an installation setting where the performer is the audience member. Connections between the conceptual and technological aspects of the installation are drawn. Finally, a set of software modules for real-time creative work named _aLib is presented. The modules in _aLib (a set of abstractions for the Max/MSP environment) were used extensively in the described instruments and will hopefully make a contribution to the real-time computer performance community.

Citation

Momeni, A., Composing Instruments: Inventing and Performing with Generative Computer-based Instruments, PhD Dissertation, in Music. 2005, University of California: Berkeley. p. 51.

Download PDF

Characterizing and Controlling Musical Material Intuitively with Graphical Models

Posted February 15, 2003 & filed under Read.

Abstract

In this paper, we examine the use of spatial layouts of musical material for live performance control. Emphasis is given to software tools that provide for the simple and intuitive geometric organization of sound material, sound processing parameters, and higher-level musical structures.
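
As a hedged sketch of that geometric organization: presets placed at locations in a 2D space, with parameter values interpolated from a performer's position among them. Inverse-distance weighting is used here for illustration; the paper's own interpolation scheme may differ, and the locations and parameters are invented.

```python
# Sketch: interpolate synthesis parameters from a position in a 2D preset layout.
import numpy as np

LOCATIONS = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])     # preset positions
PARAMS = np.array([[100.0, 0.1], [400.0, 0.9], [250.0, 0.5]])  # e.g. freq, brightness

def interpolate(point: np.ndarray, power: float = 2.0) -> np.ndarray:
    d = np.linalg.norm(LOCATIONS - point, axis=1)
    if np.any(d < 1e-9):               # exactly on a preset: return it verbatim
        return PARAMS[np.argmin(d)]
    w = 1.0 / d**power                 # nearer presets weigh more
    return (w[:, None] * PARAMS).sum(axis=0) / w.sum()

print(interpolate(np.array([0.5, 0.33])))  # a blend weighted toward the third preset
```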

Citation

Momeni, A. and D. Wessel, "Characterizing and Controlling Musical Material Intuitively with Graphical Models." (2003) Proceedings of the New Interfaces for Musical Expression Conference, Montreal, Canada.

Download PDF

Analysis of Luciano Berio’s Points on a Curve to Find

Posted February 15, 2002 & filed under Read.

Abstract

I shall present my analysis of this piece in three sections. First I will discuss the construction of the solo piano part; in relation to the title of the work, this part signifies "the curve". Since the piano part is the formal backbone of the piece, I shall include my consideration of the work's form in this section. Next I shall discuss the rest of the ensemble, i.e. "the points". Finally I will discuss the co-evolution of the solo piano and the ensemble.

Citation

Momeni, A. Analysis of Luciano Berio's Points on a Curve to Find. 2002.

Download PDF

Analysis of Steve Reich’s Drumming and his use of African polyrhythms

Posted February 15, 2001 & filed under Read.

Abstract

In the summer of 1970, Steve Reich went to Ghana to study drumming. With a travel grant from the Special Projects division of the Institute of International Education, he made his way to Accra in order to study with Gideon Alorworye, the resident master drummer of the Ghana Dance Ensemble. Due to illness, he returned after only five weeks. He spent the following year working almost exclusively on the ensemble piece called Drumming. At first glance, Drumming appears to draw on Reich's non-western musical influences more than any other of his compositions to date. The ensemble of instrumentalists sharing their time between drums, mallet instruments and singing testifies to the composer's attraction to African traditions, as does the 12/8 rhythmic cell, reminiscent of an African bell pattern, that accounts for the entire work's material. However, listening to Steve Reich's Drumming with an ear that is thirsty for African polyrhythms is a recipe for misunderstanding and disappointment. The sort of strict polyrhythm found throughout central and west African music is not at all the point of this piece. There is a drastic disparity between the complexity of the rhythmic material in traditional African music and the single rhythmic cell present in Drumming. Furthermore, the multi-leveled construction of African polyrhythms often acts as a vehicle for the master drummer to flaunt his command over the pulse: with great ease, he is able to play just a few milliseconds ahead of the bell pattern, or ever so slightly behind the low drum. This form of interaction is entirely absent from Drumming. The comparison begs the question: what did Reich learn by going to Ghana?

Citation

Momeni, A. Analysis of Steve Reich's Drumming and his use of African polyrhythms. 2001.

Download PDF

Managing Complexity with Explicit Mapping of Gestures to Sound Control with OSC.

Posted February 15, 2001 & filed under Read.

Abstract

We present a novel use of the OpenSoundControl (OSC) protocol to represent the output of a gestural controller as well as the input to sound synthesis processes. With this scheme, the problem of mapping gestural input into sound synthesis control becomes a simple translation from OSC messages into other OSC messages. We provide examples of this strategy and show benefits including increased encapsulation and program clarity.
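
The translation strategy can be illustrated with the python-osc package (a modern library that postdates the paper): the mapping layer receives controller OSC messages and re-emits synthesis-control OSC messages. The addresses, ports and scaling below are illustrative assumptions.

```python
# Sketch of OSC-to-OSC mapping: every incoming controller message is
# translated into a synthesis-control message and forwarded.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

synth = SimpleUDPClient("127.0.0.1", 9001)  # synthesis process (assumed address)

def translate(address: str, *args) -> None:
    """e.g. '/controller/slider/1 0.5' -> '/synth/osc/1/freq 550.0' (illustrative)."""
    if not args:
        return
    channel = address.rsplit("/", 1)[-1]
    value = float(args[0])  # slider value assumed to lie in 0..1
    synth.send_message(f"/synth/osc/{channel}/freq", 100.0 + 900.0 * value)

dispatcher = Dispatcher()
dispatcher.set_default_handler(translate)  # route all incoming messages
BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher).serve_forever()
```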

Citation

Wright, M., A. Freed, A. Lee, T. Madden, and A. Momeni (2001), "Managing Complexity with Explicit Mapping of Gestures to Sound Control with OSC." Proceedings of the 2001 International Computer Music Conference, Havana, Cuba, pp. 314-317.

Download PDF

An XML-based SDIF Stream Relationships Language

Posted February 15, 2000 & filed under Read.

Abstract

We introduce the SDIF Stream Relationships Language (“SDIF-SRL”), a formal language for describing the relationships among streams of an SDIF file.

Citation

Wright, M., A. Chaudhary, A. Freed, S. Khoury, A. Momeni and D. Wessel (2000), "An XML-based SDIF Stream Relationships Language." Proceedings of the International Computer Music Conference, Berlin, Germany.

Download PDF

About

Ali Momeni is into dynamic systems and moving targets; he works with kinetics, electronics, software, sound, light, people, plants and animals. His creative output ranges from sculptures and installations, to urban interventions and music theater performance. Read more here.

Contact

b a t c h k u at gmail dot com

© 2023 Ali Momeni.