Posts By: batchku
I am honored to be invited by Kickstarter to make a contribution to their election issue, a series of online postings, essays, art works and discussion starters. My contribution is a recipe for an urban project, in the style of those listed in my Manual for Urban Projection, intended for the days leading up to the election. You can check it out along with the other contributions here.
Honored to be invited to contribute to an upcoming Duke University Press book by Paul Miller (aka DJ Spooky) and Nancy Hightower, called Digital Fictions: The Future of Storytelling, a text that includes contributions from about 23 writers, artists, architects, journalists, and documentary makers writing about the most recent and groundbreaking ways multimodal writing, art, film and pedagogy are changing the landscape of 21st century narratives.
Honored to receive a grant from the Grable Foundation to further develop SocialVR and move toward its integration into Pittsburgh-area schools and after-school programs. The grant will allow Aparna Wilder, director of community outreach for SocialVR, to lead planning and strategy development for the upcoming six months.
I am honored to be invited to participate in the NEA and Kennedy Center convening In Pursuit of the Creative Life: The Future of Arts and Creativity in America, taking place Friday, November 18, 2016 in Washington, DC.
I am honored to accept the invitation by Jax Deluca, Media Arts Director of Visual Arts at the National Endowment for the Arts, to join the Media Arts Advisory Panel. Looking forward to working together with other panel members to review submissions and make recommendations. I will be focusing on the category of creation- and education-based activities such as production of new work, facilities access, residency programs, workshops, publications, and the development of web portals or mobile apps.
I have been invited to attend the PWL (People We Love) Camp, October 14–16, 2016, at Kickstarter's Brooklyn headquarters. Honored to be included and looking forward to meeting all the creative makers and innovators.
My bug exhibition “All Around Us” that is on display at the Wood Street Galleries in Pittsburgh will be featured in an interview for The Scientist, “the magazine for life science professionals—a publication dedicated to covering a wide range of topics central to the study of cell and molecular biology, genetics, and other life-science fields” based in Ontario.
Here is a link to the article.
I’ve been invited by Dr. Carlos Castellanos to present my work at the Digital/Experimental Media Lab at Kansas State University, as part of the Leonardo LASER lecture series. I will also install a work at the Chapman Gallery on campus.
I’ve been invited by co-chairs Dave Colangelo and Michael Longford to present my urban projection work, including my Manual for Urban Projection, at the annual Media Architecture Summit. This event will take place September 29 to October 1, 2016 in Toronto.
I’ve been invited to create a new site-specific, large-scale participatory projection work to be performed at the annual Kickstarter Festival. The performance is scheduled for July 30, 2016 at Fort Greene Park in Brooklyn. The performance will apply Dranimate to children’s drawings, as Zach Rispoli and I did at the Maker Faire last year at the Children’s Museum of Pittsburgh, to light up the park with surround projection mapping onto the park’s prominent monument. This performance will repurpose our mobile projection carts, built for the Statuevision project.
CCRMA is the Center for Computer Research in Music and Acoustics at Stanford University.
I am honored to be invited by Kim Domanski, Public Art Coordinator at the Baltimore Office of Promotion & The Arts, to serve as a jury member for the second annual Light City Baltimore exhibit, which will take place in the spring of 2017 and include dozens of illuminated sculptures, projections and performance works, seen by several hundred thousand visitors.
“All Around Us” listed among “The month’s best bets in The ‘Burgh” by Pittsburgh Magazine
Thank you CBS for this listing.
Dranimate, a gestural animation and digital puppeteering tool created with Zach Rispoli, is now a Delaware C-Corporation. We have received a gap-filling grant from Carnegie Mellon’s Technology Transfer Office to engage fastforward.sh in creating our first product: a tablet app that brings Dranimate to the masses.
This position paper investigates the challenges and opportunities in providing access to machine learning (ML) tools to high school and early undergraduate students working on “Physical Computing” projects. By co-creating a library of ML tools for two popular interactive art programming IDEs (Max and PureData), and by introducing them in the classroom to early undergraduate students, the author has been developing and exploring the necessary scaffolding for creating new tools. These experiences point to several critical needs for successful integration of ML tools in classrooms with non-experts: notably, experiential examples, minimally viable software-hardware systems, and “one-click-install” cross-platform and embeddable software packages.
Currently under review for CHI 2016 Workshop on "Human Centered Machine Learning"
Dranimate is an interactive animation system that allows users to rapidly and intuitively rig and control animations based on a still image or drawing using hand gestures. Dranimate combines two complementary methods of shape manipulation: bone-joint-based physics simulation and the as-rigid-as-possible deformation algorithm. Dranimate also introduces a number of designed interactions created around the metaphor of an image on a tablet screen replacing a physical drawing. The interactions focus the user’s attention on the animated content, as opposed to the computer keyboard, mouse, or tablet surface, while enabling natural and intuitive interactions with personalized digital content.
Currently under review for CHI 2016 Demo
ArtBytes is a mobile app designed to accompany art seekers and makers to museums and galleries. The app emphasizes continuity and dialogue across a museum goer’s visits to different galleries, museums and exhibitions over time. During the visit to an exhibition, the app allows visitors to archive works of art they appreciate, in addition to specific elements within each work that are meaningful to the viewer. After the visit, the app provides opportunities for creative interaction with the specific visual elements within an art work; these opportunities include composition of new works through collages, as well as curation and presentation of these compositions to other users, in real life (i.e. not online) and outside of the gallery or museum space, using augmented reality techniques. The app aims to help art seekers better understand their own taste, increase access to works of art, extend art consumption by engaging art seekers in art-making activities, and leverage crowds in helping art seekers discover new aesthetic experiences within and outside of the museum context.
Currently under review for CHI 2016 Workshop on "Involving the Crowd in Future Museum Experience Design"
Dranimate is an interactive animation system that allows users to rapidly and intuitively rig and control animations based on a still image or drawing, using hand gestures. Dranimate combines two complementary methods of shape manipulation: bone-joint-based physics simulation, and the as-rigid-as-possible deformation algorithm. Dranimate also introduces a number of designed interactions that focus the user’s attention on the animated content, as opposed to the computer keyboard or mouse.
Momeni, Ali and Zach Rispoli (co-author).“Dranimate: Rapid real-time gestural rigging and control of animation”, demo presented at ACM Symposium on User Interface Software and Technology (UIST). Charlotte, NC, 2015.
In this paper, we present a novel near-surface augmented reading system that brings digital content to physical paper. Our system allows a collocated mobile phone to provide augmented content based on its position on top of the paper. The system utilizes the built-in magnetometer of a smartphone, together with six constantly spinning magnets that generate designed patterns of magnetic flux, to detect the 2D location of the phone and render dynamic interactive content on the smartphone screen. The proposed technique could be implemented on most mobile platforms without external sensing hardware.
Xu, Ding, Ali Momeni and Erik Brockmeyer (2nd author). “MagPad: A Near Surface Augmented Reading System for Physical Paper and Smartphone Coupling”, demo presented at ACM Symposium on User Interface Software and Technology (UIST). Charlotte, NC, 2015.
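The sensing idea in the MagPad abstract can be sketched in a few lines: each point above the pad sees a distinctive magnetic-flux signature, so position can be recovered by comparing a live magnetometer reading against a pre-recorded calibration grid. This is a hedged illustration of one plausible estimator, not the paper's actual algorithm, and the signature values below are toy data.

```python
import math

# Hypothetical MagPad-style position sensing: compare a live magnetometer
# reading (Bx, By, Bz) against a calibration grid of stored signatures.

def build_calibration_grid():
    """Fake calibration data: (x, y) cell -> flux signature (Bx, By, Bz).
    In practice these would be recorded by sweeping the phone over the pad."""
    grid = {}
    for x in range(5):
        for y in range(5):
            # toy signature, distinct per cell
            grid[(x, y)] = (math.sin(x), math.cos(y), x * 0.1 + y * 0.2)
    return grid

def locate(reading, grid):
    """Return the grid cell whose stored signature is closest to `reading`."""
    def dist(sig):
        return sum((a - b) ** 2 for a, b in zip(sig, reading))
    return min(grid, key=lambda cell: dist(grid[cell]))

grid = build_calibration_grid()
print(locate((math.sin(2), math.cos(3), 0.8), grid))
```

A real implementation would also have to filter sensor noise and disambiguate the time-varying flux from the spinning magnets, which is what makes the six-magnet design interesting.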
The Manual for Urban Projection was written up on Prosthetic Knowledge.
Link to article…
Mary Abbe of the Star Tribune did a nice write-up of the Gutless Warrior in her pre-Northern Spark article about the night festival.
This paper documents the development of Caress, an electroacoustic percussive instrument that blends drumming and audio synthesis in a small and portable form factor. Caress is an octophonic miniature drum-set for the fingertips that employs eight acoustically isolated piezo-microphones, coupled with eight independent signal chains that excite a unique resonance model with audio from the piezos. The hardware is designed to be robust and quickly reproducible (parametric design and machine fabrication), while the software aims to be light-weight (low-CPU requirements) and portable (multiple platforms, multiple computing architectures). Above all, the instrument aims for the level of control intimacy and tactile expressivity achieved by traditional acoustic percussive instruments, while leveraging real-time software synthesis and control to expand the sonic palette. This instrument as well as this document are dedicated to the memory of the late David Wessel, pioneering composer, performer, researcher, mentor and all-around Yoda of electroacoustic music.
Momeni, A. (2015). Caress: An Enactive Electro-acoustic Percussive Instrument for Caressing Sound. NIME 2015.
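The resonance-model half of Caress's signal chain can be sketched as follows. This assumes a resonance model built from damped two-pole resonators, a common construction; the frequency and decay values are illustrative, not taken from the instrument.

```python
import math

# A single damped two-pole resonator, excited by an input signal (in the
# instrument, audio from a piezo pickup). A resonance model would run a
# bank of these in parallel; one is enough to show the idea.

def resonator(signal, freq, decay, sr=44100):
    """Two-pole filter: y[n] = x[n] + 2*r*cos(w)*y[n-1] - r^2*y[n-2]."""
    w = 2 * math.pi * freq / sr
    r = math.exp(-1.0 / (decay * sr))      # pole radius from decay time (s)
    a1, a2 = 2 * r * math.cos(w), -r * r
    y1 = y2 = 0.0
    out = []
    for x in signal:
        y = x + a1 * y1 + a2 * y2
        out.append(y)
        y1, y2 = y, y1
    return out

# A single tap (an impulse) keeps ringing at the resonator's frequency.
impulse = [1.0] + [0.0] * 99
ringing = resonator(impulse, freq=440.0, decay=0.5)
print(any(abs(v) > 0.1 for v in ringing[50:]))
```

The appeal of this structure for a percussive instrument is that the exciter is the player's actual touch, so the tactile nuance of the strike survives into the synthesized sound.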
This paper documents the development of ml.lib: a set of open-source tools designed for employing a wide range of machine learning techniques within two popular real-time programming environments, namely Max and Pure Data. ml.lib is a cross-platform, lightweight wrapper around Nick Gillian’s Gesture Recognition Toolkit, a C++ library that includes a wide range of data processing and machine learning techniques. ml.lib adapts these techniques for real-time use within popular data-flow IDEs, allowing instrument designers and performers to integrate robust learning, classification and mapping approaches within their existing workflows. ml.lib has been carefully designed to allow users to experiment with and incorporate machine learning techniques within an interactive arts context with minimal prior knowledge. A simple, logical, consistent and scalable interface has been provided across over sixteen externals in order to maximize learnability and discoverability. A focus on portability and maintainability has enabled ml.lib to support a range of computing architectures—including ARM—and operating systems such as Mac OS, GNU/Linux and Windows, making it the most comprehensive machine learning implementation available for Max and Pure Data.
Bullock, J., Momeni, A. (2015). ml.lib: Robust, Cross-platform, Open-source Machine Learning for Max and Pure Data. NIME 2015.
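The train-then-map workflow that ml.lib brings to Max and Pure Data can be mirrored in plain Python: accumulate labeled examples, train, then classify live input. The sketch below uses a hand-rolled nearest-neighbor classifier; the method names echo ml.lib's message vocabulary but are illustrative, not its actual API.

```python
# Minimal sketch of the ml.lib-style workflow: add labeled examples,
# train, then map live feature vectors to a class label. The classifier
# here is 1-nearest-neighbor, the simplest stand-in for GRT's learners.

class TinyClassifier:
    def __init__(self):
        self.examples = []          # (label, feature-vector) pairs
        self.trained = False

    def add(self, label, features):
        self.examples.append((label, features))

    def train(self):
        # a real learner would fit a model here; 1-NN just stores examples
        self.trained = len(self.examples) > 0

    def map_input(self, features):
        assert self.trained, "call train() first"
        def dist(example):
            return sum((a - b) ** 2 for a, b in zip(example[1], features))
        return min(self.examples, key=dist)[0]

clf = TinyClassifier()
clf.add("open-hand", [0.9, 0.8])    # made-up gesture features
clf.add("fist", [0.1, 0.2])
clf.train()
print(clf.map_input([0.85, 0.75]))  # nearest labeled example wins
```

In a patch, the same three steps happen by sending `add`, `train`, and live data messages to an ml.lib external, which keeps the machine learning inside the performer's existing data-flow workflow.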
telepuppet.tv (collaboration with Nima Dehghani) featured on the European Commission’s Europe Créative page…
A nice announcement for Telepuppet.tv for Connecting Cities: Participatory City 2014. Includes a video of the puppets we used!
A pretty comprehensive review and explanation of the Telepuppet.tv installation as part of Participatory City 2014 for Connecting Cities. It includes a media library with photos and video! Read it here.
A write-up on Telepuppet.tv as part of Participatory City 2014 for MediaLab-Prado’s Digital Façade. Read more here.
An announcement of Telepuppet.tv at Nuit Blanche Brussels, including a link for a live stream of the event! Read it here.
Arts Numériques has a nice press report of “Telepuppet.tv” as part of Nuit Blanche Brussels. Read it here.
“Gutless Warrior” featured in the Free State Festival at the Lawrence Arts Center. Read more here.
“Two artists have harnessed the flying power of mosquitoes to create sound.”
An interview and web article by the BBC on my work Truce with Robin Meier. The interview was broadcast on the radio, and an online article and a beautifully filmed and edited video live on the web.
“Scalpel is a biannual publication created for an exclusive audience of 600 top executives at Pernod Ricard. Its role is to inspire creative collaboration with the best emerging talent in photography, film, art, technology, music, fashion, design, retail and gastronomy.
Scalpel brings together twelve thought leaders, or ‘Surgeons’, from the worlds of photography, film, art, technology, music, fashion, design, retail and gastronomy. Surgeons present a profile of their top five up-and-coming talents who they predict will break through in the next 24 months, to inspire creative collaborations between Pernod Ricard and the very best emerging creative talents.
Scalpel is a practical tool. It includes a directory so that the talent featured can be contacted and creative collaborations formed. The publication is supported by a website, events, workshops and an annual creative excellence award.”
More info here.
"Robin Meier and Ali Momeni." Scalpel: A shortcut to inspiration. 3.2 (2012): 50-51. Text.
In a Q&A entitled “Maestro of the Swarm” [Nature 481, 144 (12 January 2012)], my other half Robin Meier answers Laura Spinney’s questions about our collaborative works with insects, namely Truce and Tragedy of the Commons.
A nice review of “The tragedy of the commons” written by Julia Dusserre-Telmon on Vivre Paris. It includes a photo of the piece.
A nice write-up on “The tragedy of the commons” in Arts Programme. It includes a photo of the piece.
Art writer Violaine Boutet de Monvel contributed a nice write-up of The Tragedy of the Commons to ArtReview no. 53 (October 2011).
Write-up about our work in the Dynasty exhibition.
“Une génération qui doute,” published on 9 July 2010 in Le Journal des Arts.
A short feature by Damien Delille on the Dynasty exhibit.
A great article by Katherine Knorr about the Dynasty exhibit. The article was published on 15 June 2010 in the New York Times.
Short piece, “Virtuose du Son Biotech,” published in 2010 about our work at Dynasty.
Piece about the Dynasty exhibit written by Elena Bordignon in Vogue on 11 June 2010.
Our work on “Truce” is mentioned in a nice piece about Dynasty published by Le Point.
Cycling 74 interviewed me on May 25, 2010.
Read the interview here.
This paper documents the development of a new instrument for the creation of experimental music theater. This environment, known as the liminal surface, uses a portable “table-top” design to integrate audio, video, analog and digital sensors, and computer-based control of external media (i.e. musical robotics). This environment will enable the composition of a series of new works exploring interactive computer music, intermodal relationships, and collaborative performance on a visually stimulating and technologically sophisticated platform.
Bithel, David and Momeni, Ali. Proceedings from the 12th Biennial Symposium on Arts and Technology, March 4–6, 2010. Ammerman Center for Art & Technology, Connecticut College.
Basic Electronics Resources
ITP (Tom Igoe) Sensor Reports – plus implementation code for PIC microcontrollers
Protolab Sensor Tutorial – uses Arduino microcontrollers for examples
Electronics Retailers and Surplus
The work in the present article is primarily motivated by a desire for intimate and expressive control over creative processes implemented in real-time performance software. We seek a manner of control that offers a “low entry fee with no ceiling on virtuosity” and allows expressive control of musical and visual control structures (Wessel and Wright 2001); like many colleagues, we believe that the answer lies in enriching the approach to mapping (see (Winkler 1995), (Rovan, Wanderley et al. 1997), (Arfib, Couturier et al. 2002), (Hunt, Wanderley et al. 2002)). Our notion of a dynamic independent visual mapping layer concerns any independent system with time-variable behavior that takes data input from the user and produces output to drive audio/video synthesis. This modification can be a change of dimensionality as well as what is commonly considered “mapping”: changes in numerical ranges, interpretation of “triggers” for setting off events, and mathematical analysis and modification of the input, be they one-to-one, convergent, or divergent (Rovan, Wanderley et al. 1997). This modification, however, can be more complex if the mapping system is dynamic, that is, it changes over time. Notably, the internal behavior of the system can produce output variation without variation in the user input. The system is visual because we choose mapping spaces that have clear graphical foundations: in the case of our two examples, mass-spring physical models and interpolation systems in perceptual spaces both have clear visual interpretations that we believe are a significant strength of this approach.
Ali Momeni, Cyrille Henry. Dynamic Independent Mapping Layers. Computer Music Journal, 30:1. 2005.
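The mass-spring case from the abstract can be sketched very compactly: the performer's input sets the anchor of a damped spring, and the mass's position, not the raw input, drives synthesis. The constants below are illustrative; the point is that the mapping layer has its own time-varying state, so the output keeps moving even when the input is held still.

```python
# A dynamic mapping layer as a damped spring: input -> spring target,
# mass position -> synthesis parameter.

def spring_step(pos, vel, target, k=0.2, damping=0.9):
    """One step of a damped spring pulling `pos` toward `target`."""
    vel = damping * (vel + k * (target - pos))
    return pos + vel, vel

pos, vel = 0.0, 0.0
trace = []
for _ in range(80):              # input held constant at 1.0 ...
    pos, vel = spring_step(pos, vel, target=1.0)
    trace.append(pos)

# ... yet the mapped output keeps evolving: it overshoots, then settles.
print(max(trace) > 1.0, abs(trace[-1] - 1.0) < 0.2)
```

The overshoot-and-settle trajectory is exactly the kind of output variation without input variation that the paper attributes to dynamic mapping layers.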
We present two projects that facilitate collective music creativity over networks. One system is a participative social music system on mobile devices. The other is a collaborative music mixing environment that adheres to the Creative Commons license. We discuss how network and community infrastructures affect the creative musical process, and the implications for artists creating new content for these formats. The projects described are real-world examples of collaborative systems as musical works.
Tanaka, A., Tokui, N., and Momeni, A. Facilitating Collective Musical Creativity. Proceedings of ACM Multimedia, 2005.
This dissertation describes music composition as an act of composing instruments. The building blocks of such instruments are discussed: the fundamentally interdisciplinary approach, the role of gesture, the role of real-time generative software, the mappings between gesture and generative processes, and the interaction between performer and instrument. A real-time performance instrument that was composed to accompany the opera Takemitsu: My Way of Life is described. Key constraints imposed by this project are described, namely: the need for the real-time electronic sound to blend and relate musically to the rest of the music, the need to create a stateless and playable instrument, and the need for an instrument that is robust, adaptable, portable. Design and compositional decisions that address these constraints are proposed and the actual implementation is discussed. As a contrasting example of a composed instrument, a second project is presented: an interactive installation named …in memory of Leah Deni created in memory of Leah Deni. This project serves as an example of the same compositional interest in instrument building and interactivity, but applied to an installation setting where the performer is the audience member. Connections between the conceptual and technological aspects of the installation are drawn. Finally, a set of software modules for real-time creative work named _aLib is presented. The modules in _aLib (a set of abstractions for the Max/MSP environment) were used extensively in the described instruments and will hopefully make a contribution to the real-time computer performance community.
Momeni, A., Composing Instruments: Inventing and Performing with Generative Computer-based Instruments, PhD Dissertation, in Music. 2005, University of California: Berkeley. p. 51.
In this paper, we examine the use of spatial layouts of musical material for live performance control. Emphasis is given to software tools that provide for the simple and intuitive geometric organization of sound material, sound processing parameters, and higher-level musical structures.
Momeni, A. and D. Wessel , "Characterizing and Controlling Musical Material Intuitively with Graphical Models." (2003) Proceedings of the New Interfaces for Musical Expression Conference, Montreal, Canada.
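The core of the spatial-layout idea can be sketched as preset interpolation: each sound preset is pinned to a point in a 2D plane, and moving a cursor blends the presets by distance. The inverse-distance weighting below is one of several schemes one could use; the paper develops its own graphical models, so treat this as a hedged illustration.

```python
# Spatial preset interpolation: presets live at 2D points, and the cursor
# position produces a distance-weighted blend of their parameters.

def interpolate(cursor, presets, eps=1e-9):
    """presets: list of ((x, y), {param: value}); returns blended params."""
    weights = []
    for (x, y), params in presets:
        d2 = (x - cursor[0]) ** 2 + (y - cursor[1]) ** 2
        weights.append((1.0 / (d2 + eps), params))   # closer = heavier
    total = sum(w for w, _ in weights)
    blended = {}
    for w, params in weights:
        for name, value in params.items():
            blended[name] = blended.get(name, 0.0) + (w / total) * value
    return blended

presets = [((0.0, 0.0), {"cutoff": 200.0}),
           ((1.0, 0.0), {"cutoff": 2000.0})]
mid = interpolate((0.5, 0.0), presets)
print(round(mid["cutoff"]))      # halfway between the two presets
```

What makes this a performance interface rather than a preset menu is that a single gesture in the plane moves many synthesis parameters coherently at once.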
I shall present my analysis of this piece in three sections. First I will discuss the construction of the solo piano part. In relation to the title of the work, this part signifies “the curve”. Since the piano part is the formal backbone of the piece, I shall include my consideration of the work’s form in this section. Next I shall discuss the rest of the ensemble, i.e. “the points”. Finally I will discuss the co-evolution of the solo piano and the ensemble.
Momeni, A. Analysis of Luciano Berio's Points on a Curve to Find. 2002.
In the summer of 1970, Steve Reich went to Ghana to study drumming. With a travel grant from the Special Projects division of the Institute of International Education, he made his way to Accra in order to study with Gideon Alorworye, the resident master drummer of the Ghana Dance Ensemble. Due to illness he returned after only five weeks. He spent the following year almost exclusively on the ensemble piece called Drumming. At first glance, Drumming appears to draw on Reich’s non-western musical influences more than any other of his compositions to date. The ensemble of instrumentalists sharing their time between drums, mallet instruments and singing testifies to the composer’s attraction to African traditions; as does the 12/8 rhythmic cell–reminiscent of an African bell pattern–that accounts for the entire work’s material. However, listening to Steve Reich’s Drumming with an ear that is thirsty for African polyrhythmics is a recipe for misunderstanding and disappointment. The sort of strict polyrhythmics found throughout central and west African music is not at all the point of this piece of music. There is a drastic disparity between the complexity of the rhythmic material in traditional African music and the single rhythmic cell present in Drumming. Furthermore, the multi-leveled construction of African polyrhythmics often acts as a vehicle for the master drummer to flaunt his command over the pulse: with great ease, he is able to play just a few milliseconds ahead of the bell pattern, or ever so slightly behind the low drum. This form of interaction is entirely absent from Drumming. The comparison begs the question: what did Reich learn by going to Ghana?
Momeni, A. Analysis of Steve Reich's Drumming and his use of African polyrhythms. 2001.
We present a novel use of the OpenSoundControl (OSC) protocol to represent the output of a gestural controller as well as the input to sound synthesis processes. With this scheme, the problem of mapping gestural input into sound synthesis control becomes a simple translation from OSC messages into other OSC messages. We provide examples of this strategy and show benefits including increased encapsulation and program clarity.
Wright, M., A. Freed, A. Lee, T. Madden, and A. Momeni. (2001), “Managing Complexity with Explicit Mapping of Gestures to Sound Control with OSC.” Proceedings of the 2001 International Computer Music Conference, Habana, Cuba, pp. 314-317.
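Under this scheme, mapping gesture to sound becomes pure message translation: each incoming gestural OSC message is rewritten as one or more synthesis OSC messages. The sketch below shows that translation step with plain tuples standing in for OSC packets; the addresses and scaling curves are made-up examples, not from the paper.

```python
# OSC-message-to-OSC-message mapping: a rule table rewrites gestural
# addresses into synthesis addresses, transforming argument values on
# the way through.

def translate(address, args):
    """Map one gestural OSC message to a list of synthesis OSC messages."""
    rules = {
        # gestural address     -> (synthesis address, value transform)
        "/glove/finger/bend":    ("/synth/filter/cutoff",
                                  lambda v: 200.0 + 3800.0 * v),
        "/glove/hand/height":    ("/synth/amp",
                                  lambda v: v * v),   # illustrative curve
    }
    if address not in rules:
        return []                # unmapped gestures pass silently
    out_address, transform = rules[address]
    return [(out_address, [transform(v) for v in args])]

print(translate("/glove/finger/bend", [0.5]))
```

The encapsulation benefit the abstract claims follows directly: the gesture device and the synthesizer only ever see OSC, so either side can be swapped out by editing the rule table.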
We introduce the SDIF Stream Relationships Language (“SDIF-SRL”), a formal language for describing the relationships among streams of an SDIF file.
Wright, M., A. Chaudhary, A. Freed, S. Khoury, A. Momeni and D. Wessel (2000), "An XML-based SDIF Stream Relationships Language." Proceedings of the International Computer Music Conference, Berlin, Germany.