Whilst researching performance projects that utilize emerging media technology, I discovered this highly successful and imaginative piece. "Scrapple", by Golan Levin, is an installation consisting of a table onto which everyday objects are placed; these objects form the basis for music creation. The table acts as an "active score": the system constantly scans its surface and responds in real time to the objects placed on it, so the table itself becomes the music notation. The work consists of a Windows PC, custom-designed software, a 3-metre-long table topped with a dry-erase board, and a digital video camera combined with a projector.

The part of the work that the user performs on is the table. The installation responds to anything on its surface, allowing a great deal of scope: it reacts to the variety of objects already provided, and people can also write directly on the board. The camera continuously monitors the table from above. The creator calls objects on the table "sound events"; rhythm is created by scanning the table lengthwise, which sequences these sound events in time, while pitch is determined by an object's position across the width of the table, with the frequencies produced covering eight octaves from low to high. Each object's character also matters: darker objects produce louder sounds, and wider objects cover a wider range of the frequency spectrum. This allows for greater musical expression, as the piece copes well with dynamics and can also produce note clusters and dissonance. In addition, scanning the table every few seconds creates a loop, and the tempo of this loop can be adjusted between 1 and 1000 bpm.
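To make the mapping concrete, the description above could be sketched roughly as follows. This is purely my own illustration of the idea, not Levin's actual code; all function and parameter names here are assumptions.

```python
# Hypothetical sketch of Scrapple's score mapping, as described above:
# the table's length is time, its width is pitch (eight octaves), and an
# object's darkness and width control loudness and frequency spread.
# All names and scalings are my own assumptions, not the real software.

def object_to_sound_event(x_frac, y_frac, darkness, width_frac,
                          loop_secs=4.0, low_midi=24, octaves=8):
    """Map a detected object to a (start time, pitch, amplitude, spread) event.

    x_frac:     0..1 position along the table's length -> time in the loop
    y_frac:     0..1 position across the width -> pitch, low to high
    darkness:   0..1, darker objects sound louder
    width_frac: 0..1, wider objects span more of the spectrum
    """
    start_time = x_frac * loop_secs           # lengthwise scan sequences rhythm
    pitch = low_midi + y_frac * octaves * 12  # eight octaves of range
    amplitude = darkness                      # darker objects = louder
    spread = width_frac * octaves * 12        # wider objects = broader band
    return start_time, pitch, amplitude, spread

# An object halfway along the table, a quarter of the way up, fairly dark:
event = object_to_sound_event(0.5, 0.25, 0.8, 0.1)
```

Re-evaluating all objects on every scan pass is what lets the loop respond immediately when something is moved or redrawn.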
This work is an excellent example of augmented reality, achieved through visuals projected onto the table. These visuals contribute in a number of ways. A continuously moving, glowing bar represents the scan across the table and is linked to the system so that, at exactly the moment the bar passes over an object, the corresponding sound is produced. There is also a grid that marks time and pitch, and a variety of grids can be chosen; for example, the user can select one that plays triplets or pentatonic scales, once again highlighting the musical freedom of the piece. Once a piece has been detected it glows, enhancing the effect for the user, and this glow fades gradually, acting as a visual display of the timing and tempo of the piece. These graphics show the user the effect of their actions: the current state of the piece is always displayed, and the grid makes precision possible, so it is possible either to compose with accurate results or to improvise.

The technology used is also impressive. The software is self-made, showing the importance the creator placed on the piece doing exactly what it was planned to do. The camera constantly monitors the table, and the creator states that "barrel distortion" has to be removed from each captured image; the perspective also has to be corrected so that the image is geometrically accurate. After this, audio synthesis takes place, which is combined simultaneously with the graphics and projected back onto the table in real time.
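For readers unfamiliar with barrel distortion: the standard way to model it is a radial polynomial, where a point's displacement from the image centre grows with its distance from that centre. Below is a minimal sketch of the one-coefficient version of that textbook model; I do not know what correction Scrapple actually implements, so this is illustrative only (and sign conventions for the coefficient vary between implementations).

```python
# Sketch of simple radial distortion correction using the standard
# one-coefficient model: p_out = p_in * (1 + k1 * r^2), where r is the
# distance from the distortion centre. This is the textbook model, not
# necessarily the correction Scrapple itself uses.

def undistort_point(xd, yd, k1, cx=0.0, cy=0.0):
    """Map an image point (xd, yd) through the radial model.

    Coordinates are measured relative to the distortion centre (cx, cy).
    With k1 > 0 here, points near the edge are pushed outward, which
    compensates for barrel distortion's inward compression of the edges.
    """
    x, y = xd - cx, yd - cy
    r2 = x * x + y * y              # squared distance from the centre
    scale = 1.0 + k1 * r2           # radial scaling factor
    return cx + x * scale, cy + y * scale

# A point near the image edge is moved further out than one at the centre:
edge = undistort_point(0.8, 0.0, k1=0.1)
centre = undistort_point(0.0, 0.0, k1=0.1)
```

Perspective correction is a separate step: a planar homography that maps the camera's oblique view of the tabletop back to a true rectangle before analysis.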
Overall, "Scrapple" is, in my opinion, a very successful piece. It appeals both visually and in terms of the sounds produced. One of its key attractions is the attention paid to how musical it is; the creator has clearly made this a priority. The sound combined with the graphics completely engages the user: the graphics respond in real time to the user's every action on the piece and give immediate feedback. This is why it is an excellent example of augmented reality.
http://www.flong.com/projects/scrapple/
Tuesday, 27 April 2010
Monday, 26 April 2010
Update on performance technology project
Last Wednesday we successfully managed to have the project up and running. It consists of two Max/MSP patches and a Logic file. The Logic file has a looped recording of the instrument I have chosen (Absolute Resonance), along with a number of adjustable settings. These settings are contained within an insert (Auto Filter) and are cutoff modulation, resonance, dry signal and rate. Each of these had to be routed in Logic Pro so that communication between it and Max/MSP was possible.

The first Max/MSP patch traces the colour of the screen at the point where the mouse is hovering. Based on this, it generates numbers on four scales, which are then output to Logic to adjust the settings on the Auto Filter insert mentioned above. The second Max/MSP patch contains the movie file and allows the mouse to appear over the movie when the screen is maximized. The colours in the movie change gradually, and the sound from Logic follows this accurately when the mouse is left at a single point on the screen. The user can also move the mouse, which produces more drastic changes in sound, because the colour changes are more dramatic when doing this. In this way, the user can either simply watch the installation and experience the sounds that come naturally, or take a more active role in it by using the mouse. The room that the installation takes place in will be empty apart from the projector and the screen, to maximize the effect, and the lights will probably be off or dimmed significantly.
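The colour-to-parameter step in the first patch can be summarised in pseudocode-style Python. To be clear, this is a loose re-sketch of the idea, not the actual patch: which channel drives which Auto Filter setting, and the 0-127 output range, are assumptions on my part.

```python
# Hypothetical re-sketch of the first Max/MSP patch's job: sample the
# RGB pixel colour under the mouse and rescale it to four control values
# for the Logic Auto Filter insert (cutoff modulation, resonance, dry
# signal, rate). Channel assignments and ranges are illustrative only.

def colour_to_controls(r, g, b):
    """Map an RGB pixel (0-255 per channel) to four 0-127 control values."""
    brightness = (r + g + b) / 3 / 255       # overall lightness, 0..1
    cutoff_mod = round(r / 255 * 127)        # red drives cutoff modulation
    resonance  = round(g / 255 * 127)        # green drives resonance
    dry_signal = round(b / 255 * 127)        # blue drives dry signal level
    rate       = round(brightness * 127)     # brightness drives the rate
    return cutoff_mod, resonance, dry_signal, rate

# A warm orange pixel produces high cutoff modulation, no dry signal:
controls = colour_to_controls(255, 128, 0)
```

Because the movie's colours shift gradually, these four values drift smoothly when the mouse is still, and jump when the user moves it to a differently coloured region, which matches the behaviour described above.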