New acoustic guitar microphone: the iRig Acoustic condenser microphone


By David Cox

First published in

The first ‘contact microphones’, as they were called in the 1970s, came in small plastic boxes and were disk-shaped. With them came a strange putty-like material, a bit like a cross between Silly Putty and Blu-Tack.

The idea was that you put the mic near the 6 o’clock position on the wood beneath the sound hole of your acoustic guitar and connected the other end to your amp. Of course, being a microphone, it would invariably feed back at the earliest opportunity. It worked, but only just.

The best amplified acoustic guitars have long since had their own mics built into them, with the electronics virtually embedded in the bodywork and a jack near where the guitar strap goes.

Now comes a removable acoustic condenser mic that works like a charm, is very well built and, crucially, comes with dedicated software that enables the player to sound as good as the best in the business.

According to IK Multimedia’s website, the device’s inception was inspired in part by a well-known feature-length documentary on flamenco master Paco de Lucía. Made specifically for the acoustic player, the mic is designed to work in conjunction with dedicated apps that take the nylon- or steel-string acoustic sound and ‘sculpt’ it and further process it for an optimized sound.

MEMS (Micro-Electro-Mechanical System) microphone technology replicates the positioning of a high-quality microphone with an omnidirectional polar pattern, placed just inside the sound hole where the output is optimal. This is combined with a “calibration” process that optimizes the guitar sound as if it were being miked externally. The result is a complete ‘tone picture’ of the guitar rather than simply a recording of the physical vibrations of the wood and strings.
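IK Multimedia doesn’t publish its calibration algorithm, but the underlying idea is simple: measure how the instrument-plus-mic combination deviates from a flat response, then apply the inverse as EQ. A hypothetical sketch of that principle (the band names and values here are my own invention, purely for illustration):

```python
# Illustrative only: this is NOT IK Multimedia's actual algorithm, just a
# sketch of how a calibration pass can flatten a measured response.

TARGET_DB = 0.0  # a flat target response, in dB

def correction_curve(measured_db):
    """Given measured per-band levels in dB, return the gain (in dB)
    to apply per band so the result approaches the flat target."""
    return {band: TARGET_DB - level for band, level in measured_db.items()}

# A guitar whose sound-hole mic placement boosts the low mids and dulls the top:
measured = {"low": 4.5, "low_mid": 6.0, "high_mid": -2.0, "high": -5.5}
eq = correction_curve(measured)

# The correction is simply the negation of the measured deviation:
assert eq["low_mid"] == -6.0   # cut the boosted low mids
assert eq["high"] == 5.5       # restore the dulled top end
```

In practice a real calibration works on a fine-grained frequency response rather than four coarse bands, but the cut-what’s-boosted, boost-what’s-cut logic is the same.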

iRig Acoustic is packaged with AmpliTube Acoustic FREE (download on the App Store), the new acoustic-specific AmpliTube app designed for processing and recording acoustic guitar and ukuleles.

First, when used with iRig Acoustic, the AmpliTube Acoustic app features a calibration and setup process that measures and optimizes the frequency response of your acoustic instrument and provides the “sweet spot” sonic clarity, tonal characteristics and projection that you normally get in the studio with an expensive high-end studio condenser microphone positioned in just the right spot. iRig Acoustic and AmpliTube Acoustic deliver that ultimate level of acoustic realness and character as an optimized system for a fraction of the cost.

The tone studio offers emulations of popular acoustic amplifiers; two solid state amps and one tube amp, complete with integrated effects sections as well as guitar pedal effects, including a live performance “Feed Kill” feedback eliminator, an acoustic compressor, graphic EQ and parametric EQ; a 12-string emulator, a “Bass Maker” octave pedal, plus a “Body Modeler” that transforms the sound of your guitar into that of another style of guitar. That’s pretty amazing, really.

One thing is for sure. It is more than the simple contact microphones of the ’70s.

RRP: $49.99

For more information visit:

Films online about Earth and Man – Anthropocinema!


By David Cox

The Holocene is over and the Anthropocene defines our epoch. Mankind’s irreparable and irreversible influence on the face of the planet will totally define its fate from now on. Several films take as their central idea man’s impact on the planet.

Koyaanisqatsi, “Life out of Balance”, the epic 1982 documentary by Godfrey Reggio, with music composed by Philip Glass and cinematography by Ron Fricke, shows the impact of man on his own environment and culture through a cornucopian montage of time-lapse, slow motion and panoramic vistas of spectacular yet fragile landscapes. I first saw it on film at the Valhalla Cinema, a repertory theater in Richmond, Melbourne, whose audience was mainly students and switched-on counterculture types. I remember the kaleidoscopic, cascade-like film playing to row after row of amazed Melbournites.

The semi ad-hoc way Koyaanisqatsi was made complements its wild theme-based structure. It is a symphony of shots that leaves you with the sense that the world is mad with development and that our impact as a species on the planet is without limit or direction.

Three Hopi prophecies sung by a choral ensemble during the latter part of the “Prophecies” movement are translated just prior to the end credits:

• “If we dig precious things from the land, we will invite disaster.”

• “Near the day of Purification, there will be cobwebs spun back and forth in the sky.”

• “A container of ashes might one day be thrown from the sky, which could burn the land and boil the oceans.”

Wax: Or the Discovery of Television by the Bees, an experimental science fiction film by David Blair (1991), was prescient in its vision of a world in which a Middle Eastern war, photography, mathematics and geometry had resulted from a fusion of communication between bees and humans.

Wax arrived at a time when both its means of production and the themes it was addressing converged elegantly via the then brand-new dimension of the Internet. I was able to count on one hand the number of people I could email when the film was released, and the terror and possibilities of new modes of communication we all felt in those early days are beautifully embodied in the film. William Burroughs himself wanders through the film, ambassador of all that is juxtaposed and otherworldly, and it is fitting that he should preside in this world, which seems to speak to our neoliberal wasteland today, devoid as it is rapidly becoming of its UBER influence over Alles.

Powers of Ten by the Eames Studio (1977) took the time to show the relationship of Earth to its planetary neighbors while at the same time revealing the makeup of human matter at the atomic scale.

This mind-boggling animated journey into scalar depiction and scientific humanist relativism became the mainstay for many a high school and college study session. It ponders the big questions about our place in the universe and the universe in us. It was not the first film to examine the universe from the point of view of relative exponential scale (Cosmic Zoom predated it by several years) but it was certainly the first to do so in a way that precisely understood the relationship between all this cosmic measurement and the role of companies like IBM who distributed the film, and the way that such corporate sponsorship of the eternal would come to define the world in which we live today. The Anthropocene is nothing if not brought to you by the Biggest of the Big Players, then as now.

Stalker (1979) by Andrei Tarkovsky is noted for its stark use of gritty, earthy close-ups of mud, swamps and the very material makeup of the planet itself. It follows a journey led by the ‘Stalker’ (Aleksandr Kaidanovsky), who takes his two clients, a melancholic writer (Anatoli Solonitsyn) seeking inspiration and a professor (Nikolai Grinko) seeking scientific discovery, to a place known as the ‘Zone’, which contains a place with the supposed ability to fulfill a person’s innermost desires.

The three travel through unnerving areas filled with the cast-off material of modern society. They yell at each other, and on confronting the ‘Zone’ it would appear that it is in fact alive. Traversing the ‘Zone’ can be felt but not really seen. The Cacophony Society’s Carrie Galbraith has said that the original “Burning Man” event was in fact one of several “Zone Trips” inspired by Stalker (number 4, in fact), and the idea of a sentient earth receptive to the thoughts of those who engage with it is entirely consistent with the ideas of utopian groups that offer alternative uses for federal desert land, such as the Center for Land Use Interpretation. The contemporary Burning Man is a far cry from the ad hoc aims of those who interpreted the same site for “Zone Trip Number Four”, and the world is worse for it.

Don’t look up to heaven for transcendence; look down, at the shit, the mud, the earth, the swamp and all the fine-grained individual particles of dirt and muck that make up our lives on this most finite of planets. For the effect of man on the earth should be measured thus, the better to take account of all that has been moved out of place in the name of modernity, and all that has unfolded since.

The documentary Manufactured Landscapes, directed by Jennifer Baichwal, is about the work of photographer Edward Burtynsky, whose work concerns itself with the impact of massive manufacturing plants on the earth’s environment. Enormous factories and large-scale infrastructure programs, many of them in mainland China, form the basis of this extraordinary film about the bigger picture of global trade, its scarring effect on the surface of the earth, and the demands it makes on those caught in its seemingly unstoppable flows.

Taken together, the above films make for an elegant mini film festival on the Anthropocene – call it Anthropocinema. Thankfully, most are online for free.


Wax: Or the Discovery of Television by the Bees:

Powers of Ten – Eames Studio (1977)

Stalker (1979) by Andrei Tarkovsky
Part 1

Part 2

Manufactured Landscapes, directed by Jennifer Baichwal (trailer)

Article: After Anthropocinema by Mohammad Salemy at the Brooklyn Rail website.

Real portability comes to a studio-quality music interface device


By David Cox

First Published on December 14, 2015 2:17 PM MST

  • iRig Pro Duo Music Portable Interface Device (used with permission of IK Multimedia)


iRig Pro Duo USB MIDI Interface

iRig Pro Duo is a studio-quality 24-bit audio box that is very easy to use, solidly built and, effectively, a bridge between the real world of microphones, guitars, basses and keyboards and those sometimes precious, finicky phones, pads, tablets and laptops, and all the variations in between.

Guitars and keyboards are from the 20th Century. They are heavy, solid, require effort to use and lift and carry. Data is about now. It is invisible, glows in the dark and is all around us. The iRig Pro Duo offers a kind of high speed tunnel between the two worlds. It is an Analog to Digital Converter for creative audio artists on the move, and the Swiss Army Knife version of one at that.

There are two channels with XLR/TRS combo audio jacks and phantom power, so you can use high-end condenser microphones as well as guitars, basses, keyboards and any MIDI controller out there. This reminds me very strongly of the famous Zoom H4n field recorder, which has a similar setup for audio in and can also be used as a USB interface.

Connections include Lightning, USB OTG and USB, and it comes with the proper cables, which is a great thing given its true portability.

Dual-channel recording is possible, with each channel having its own input gain control, so each channel can be given the correct input level while recording.

MIDI machine

With iRig Pro DUO, you don’t just have a superb portable interface for audio; you can also hook up your favorite MIDI controllers. This is thanks to its included TRS to MIDI-DIN cables and dedicated MIDI in/out jacks, so you can control MIDI-compatible software (or send MIDI data to MIDI-compatible hardware, like synthesizers, drum machines and samplers) with plug and play simplicity.
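The MIDI data travelling over those in/out jacks is nothing exotic: standard MIDI 1.0 messages are just short byte sequences. A minimal sketch of what a controller actually sends (no library needed; the helper names are my own):

```python
# Sketch of raw MIDI 1.0 channel-voice messages, the kind of data that
# flows through the interface's MIDI in/out jacks.

def note_on(channel, note, velocity):
    """Build a 3-byte MIDI Note On message (status byte 0x90 + channel)."""
    assert 0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127
    return bytes([0x90 | channel, note, velocity])

def note_off(channel, note):
    """Note Off is status 0x80 + channel, here with zero release velocity."""
    assert 0 <= channel <= 15 and 0 <= note <= 127
    return bytes([0x80 | channel, note, 0])

# Middle C (note 60) on channel 1 (index 0), struck fairly hard:
msg = note_on(0, 60, 100)
assert msg == b"\x90\x3c\x64"
```

Every synthesizer, drum machine or sampler on the other end of a MIDI-DIN cable is simply parsing triples like these.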

High-quality studio condenser microphones require 48V of external power to function. It’s part of what makes them sound so good, crisp and accurate. With iRig Pro DUO, you’ll have that 48V of phantom power at your fingertips. Just plug in your high-quality condenser mic, flip iRig Pro DUO’s phantom power switch and record to your heart’s content.

Monitoring & output

iRig Pro DUO comes with two 1/4” TRS balanced analog audio outputs for your speakers.

These outputs provide clear audio signal via onboard output drivers.

You can also use direct monitoring to hear either the incoming “dry” signals or the processed signal coming back from your device or computer. There is an output amplifier as well, so you can hear your mix back nice and loud via headphones. iRig Pro DUO can be powered by 2 AA batteries for use with iPhone or iPad. Alternatively, plug in a USB or Micro USB OTG cable and iRig Pro DUO is powered by the connected computer or Android device, or by the AA batteries. Camped out in the studio? Plug it into the wall with a DC power adapter (not included) for extended recording sessions.


• Truly mobile dual input audio interface for iPhone, iPad, Android, Mac and PC

• Simultaneous dual track recording interface for all instruments

• Ultra-compact housing for extreme portability

• Dual identical XLR/TRS combo input jacks

• Dual ultra-low noise studio-quality IK preamps

• Individual input gain controls

• 48V phantom power

• Self-powered (2 AA batteries), device powered or DC power adapter (not included)

• 24-bit AD-DA converters

• Dual 1/4” switchable TRS balanced outs

• 1/8” 3.5mm Headphone out w/ level control

• MIDI IN/OUT jacks

• Ultra-compact housing fits in the palm of your hand

• Comes with mini-DIN to Lightning, Micro USB OTG and standard USB cables

• Designed and made in Italy
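The “24-bit AD-DA converters” in the spec list above are what set the interface’s theoretical dynamic range: each bit of resolution adds roughly 6.02 dB. A quick back-of-envelope calculation:

```python
import math

def dynamic_range_db(bits):
    """Theoretical dynamic range of an ideal n-bit converter, in dB."""
    return 20 * math.log10(2 ** bits)

assert round(dynamic_range_db(16), 1) == 96.3   # CD-quality audio
assert round(dynamic_range_db(24), 1) == 144.5  # a 24-bit converter
```

Real-world figures are lower once preamp noise and analog circuitry are accounted for, but the headroom gain over 16-bit recording is substantial either way.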

iRig Pro Duo is the MIDI box to take with you the next time you want to do some recording in the park, or to play outside with your keyboards with the band somewhere out in the open. It’s super-versatile, highly adaptable and has uses that I’m sure even its inventors have yet to discover.

RRP: $199.99

For more information visit:

AmpliTube 4 – the Amplifier Modeling Resource for Guitarists


Most of the guitar amplifiers I’ve owned have had stenciled numbers on the side and been passed on from five other bands before landing in my garage or bedroom. Amps are like that; they’re a bit like old cars or motorcycles. They arrive, are used, then disappear into that great gig domain whence you may see them again, or you may not. Would that it were possible to invoke them all again. And all the other amps you could never own, for reasons of economy or rarity, or any other reason. Then there are the great guitar riffs and patterns and solos of all time.

Part of what makes these unique is the sound of the amplifier the guitars were played through on the records and at the concerts. The distinctive configuration of amplifier and effects of each major guitarist is as unique as anything they played and in many ways as crucial and central to their sound as the notes themselves. Jimi Hendrix and his epic Marshall stacks. Fresh-faced early Beatle George Harrison and his modest, undersized Vox amps. I had a Vox guitar once. A cherry-red Junior. I loved that thing. I had no amp, so I played it through my dad’s stereo system.

Think of the crisp metallic treble reverberation of Hendrix’s Fender Stratocaster on The Wind Cries Mary, or Andy Summers’s distant chorus-and-echo Fender Telecaster sound on the Police’s Walking on the Moon: these are as much about the way the guitarist set up the amplifier and pedal effects as they are about the chords played. This is so obvious it goes without saying, except that until now amp and effects configurations have largely been the province of physical guitar amps and effects pedals.

AmpliTube 4, a major upgrade to the world’s most powerful guitar and bass tone studio for Mac/PC, is here, and it will take you to a level of hyper-realism and customization of tone you never knew was possible.

Here are some of the specs:

  • Hyper-Realistic tone
  • 3D Cab Room w/ selectable room simulations
  • Dual Mic placement on any speaker
  • Individual speaker selection
  • Speaker interaction modeling
  • Cabinet mixer for microphones, room, DI and master level
  • New British Series Amps
  • Power Amp/Speaker dynamic response
  • Acoustic Simulator
  • Effects loop slot between pre and power amp
  • Universal effects placement
  • Rack effects can be used as stomp effects
  • Stomp effects can be used in rack section
  • 8-track DAW/Recorder
  • 4-track Looper
  • UltraTuner
  • Built-in Custom Shop

AmpliTube 4 is also a guitar and bass tone studio for Mac/PC that works as a standalone application and as a plug-in for your favorite DAW. AmpliTube recreates the entire guitar/bass signal chain from instrument to recording device.
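AmpliTube’s actual DSP is proprietary, but conceptually an amp model is exactly that signal chain: a series of stages applied to the signal in order. A toy sketch of the idea, with a preamp gain stage, tube-style soft clipping, and a crude one-pole low-pass filter standing in for the speaker cabinet (all names and values here are illustrative, not IK’s code):

```python
import math

# Toy model of a guitar signal chain: preamp -> tube-style saturation -> cab.
# Illustrative only; a real amp sim uses far more sophisticated filters.

def preamp(samples, gain):
    """Linear preamp gain stage."""
    return [s * gain for s in samples]

def soft_clip(samples):
    """tanh gives the smooth, compressed saturation of an overdriven tube."""
    return [math.tanh(s) for s in samples]

def cabinet(samples, alpha=0.5):
    """A one-pole low-pass standing in for a speaker cab's high-end roll-off."""
    out, prev = [], 0.0
    for s in samples:
        prev = alpha * s + (1 - alpha) * prev
        out.append(prev)
    return out

dry = [0.0, 0.2, 0.6, -0.4]
wet = cabinet(soft_clip(preamp(dry, gain=4.0)))
assert all(-1.0 <= s <= 1.0 for s in wet)  # clipped output stays in range
```

Chaining small stage functions like this is also why such software can reorder effects freely: the “effects loop slot between pre and power amp” in the spec list is just an insertion point in the chain.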

So it is entirely possible now for me to reproduce exactly Chris Squire’s metallic bass sound on Siberian Khatru from Close to the Edge, for example, or Simple Minds’ Charlie Burchill’s shimmering twin Jazz Chorus amps with echo, like the ones I saw him play in Melbourne, Australia in 1983. I remember that Glittering Prize sounded like it was coming out of a guitar factory.

What’s particularly interesting with AmpliTube 4 is that you can also “make” your own amplifier cabinets and modify the ones that come standard. It’s called the “Cab Room”. It has five separate customization sections: cabinet selection with size adjustment, where you can alter the speaker cabinet to go with the amp; speaker selection, where you can change the speakers; a microphone selection and placement area for finding the best place to put the mics; different virtual room types, where you can try out different surrounding recording spaces; and a mixer that lets players balance the levels of the mics, the room ambience, the direct amp signal and the overall main mix.

AmpliTube also has full MIDI support, which means you can use it with external controllers.

AmpliTube can work as a 64-bit plug-in for the most popular DAW (such as Pro Tools, Logic, GarageBand, Cubase, Live, Reaper) or can be used in stand-alone mode in Mac OS X and Windows. The plug-in and standalone versions offer the same function and sound, but the standalone version also offers a built-in 8-track recorder and DAW, plus a 4-track looper so you can capture your ideas at the moment of inspiration. Both the standalone and plug-in versions are included when you install AmpliTube.


It arrives out of the box with the ability to add equipment to your rig such as amps, pedals and cabinets from Fender®, MESA/Boogie®, Ampeg®, Orange®, Soldano™, Groove Tubes®, Gallien-Krueger®, Jet City Amplification™, THD®, T-Rex®, Fulltone®, Z.Vex®, Carvin®, Morley®, Wampler Pedals®, Dr. Z®, ENGL® and more. I’m loving using AmpliTube 4, as I could never conceivably own all the amps represented in its vast database, nor would I want to. But the ability to model them means I can call upon them any time. It’s like having my own private guitar store at my fingertips.

RRP: $149.99

Click here for further information

The Selfie Stick on Steroids – iKlip Grip



First published on September 17, 2015 12:39 PM MST

By David Cox

iKlip Grip

IK Multimedia

Sometimes a product comes along that tries to do lots of things at once and finds a way for all of those functions to complement each other nicely. Such is the case with IK Multimedia’s iKlip Grip. Someone has put a lot of time into this device, and the time and energy has paid off. Many of us working with smartphones as professional audiovisual recording devices are often in need of a kind of universal, hyper-portable mount for tabletop interviews, field recordings and shots of oneself doing things. The iKlip Grip could be the answer for many such everyday situations out there in the field.

The iKlip Grip offers a range of functions in one. It’s essentially a device that holds your smartphone and lets you operate it at a distance. Yes, at its core it is a so-called ‘selfie stick’, but a very well-made one. In addition to being a selfie stick, however, it is also a tripod mount and, ingeniously, a handle that enables the user to carry their phone upright and facing outward like a ping pong paddle.

Back in film school, the hip kids had grips like these custom made for their Bolex 16mm cameras, but today an iPhone 6 has double the resolution of 16mm, so this type of device is a neat inheritor of the tradition. The iKlip Grip can be taken apart and put into a bag, and the grip that holds the phone can accommodate the latest devices, such as the iPhone 6. Phones like these are professional tools now, and the build quality of this device matches that found in many contemporary phones and tablets.

The Italian-made device has an ingenious, secure, spring-loaded expandable bracket that allows you to adjust the position of your phone at any angle for recording video, audio or images. There is also a neat bracket with the must-have standard 1/4″-20 tripod screw thread, so it can be mounted on any standard tripod, which from a film-making point of view is a must. You can flip out small ovoid-shaped ‘legs’ at the base of the iKlip Grip that let you turn it into a modest tripod as well, although extending it this way makes it a little unstable, and it is best to leave it in the lowest of the ‘telescoped’ positions. This folding-petal element is really good, but my one criticism is that the petals open too easily. It would be nice to have some means of keeping them closed when not in use. I use a rubber band, but some kind of clip or latch would be useful here, or even velcro on the insides of the ‘petals’ to keep them closed in your bag.

The iKlip Grip has a telescoping detachable tube that extends your reach by a full 17.5″, so positioning your camera, phone or recorder in the right spot is even easier. Plus, iKlip Grip has a pivoting ball-joint attachment point that provides a full 90° of angle adjustment plus 360° of rotation. This allows the user to get just the right angle and position for a shot. iKlip Grip also comes with a Bluetooth smartphone shutter* remote control that lets users remotely activate the shutter button in video and photo apps. The remote is universally compatible with iOS and Android and can be operated up to 10 meters from the device, which is very helpful for capturing live performances, family vacation photos, movies and, of course, the ubiquitous ‘selfie’.

So what is the iKlip Grip? A well-made extendable selfie stick and iPhone tripod that also works as a grip for your phone. It’s a selfie stick on steroids.


• 4-in-1 video stand: handle, tripod, monopod, tripod adapter

• Includes remote Bluetooth shutter*

• Adjustable angle

• Extends up to 50cm / 19.7″

• Standard 1/4” UNC threaded ball mount

• Works with smartphones with screens from 3.5″ to 6″ with case on

• Works with digital cameras and small video cameras like GoPro

• Works with handheld audio field recorders

RRP: $59.99

For more information see:

The Language of Virtual Reality


First published on September 10, 2015 9:00 AM MST

By David Cox

Dactyl Nightmare, a VR arcade game. Virtuality was a line of virtual reality gaming machines produced by the Virtuality Group in the early 1990s.


First Published in Otherzine as “Mise-en-Experience”


We are all spied upon, archived, forgotten. What are the new meanings that define this aesthetics of experience? What are the aesthetics of framing, of ‘that contained within the experience’? If the film-maker has become the creator of experiences, what is to be ‘contained within’ the experience?

With virtual reality, the décor is constructed with the tools used to generate the content of the experience for the user/viewer. The entirety of this experience is likely to have been either recorded from the world using 360-degree cameras, such as the GoPro mounts that resemble cubes with cameras facing every direction, or generated using 3D CGI and game-engine imagery.
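Footage from those cube-of-cameras rigs is typically stitched into an equirectangular frame, where yaw (longitude) maps linearly to x and pitch (latitude) to y. A minimal sketch of where a viewing direction lands in such a frame (the function name and frame size are my own, for illustration):

```python
# Sketch of the equirectangular mapping commonly used to store 360-degree
# video: the full sphere of view directions is unrolled onto a 2:1 image.

def direction_to_pixel(yaw_deg, pitch_deg, width, height):
    """Map a view direction to equirectangular pixel coordinates.
    yaw in [-180, 180), pitch in [-90, 90] (90 = straight up)."""
    x = (yaw_deg + 180.0) / 360.0 * width
    y = (90.0 - pitch_deg) / 180.0 * height
    return int(x) % width, min(int(y), height - 1)

# Looking straight ahead lands dead center of a 4096x2048 frame:
assert direction_to_pixel(0, 0, 4096, 2048) == (2048, 1024)
# Looking straight up lands on the top row:
assert direction_to_pixel(0, 90, 4096, 2048) == (2048, 0)
```

A head-mounted display does the inverse: it takes the wearer’s head orientation and crops a field of view out of this unrolled sphere, which is what makes ‘in-sphere’ and ‘out-of-sphere’ meaningful framing terms.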

Contemporary visual culture increasingly embraces and celebrates the notion of complete “experience”. Just as the panorama of the 19th century enveloped the visitor in a 180-degree scene, today’s virtual reality and augmented reality seek to provide a complete, enveloping, melding of data, audiovisual information, sight, sound, and the concurrent real-time combination of these with everyday activity. The world is pregnant with buildings, streets, people, and objects supposedly seamlessly integrated with their data and their own archives, yet ironically the more connected subjects are, the less in touch with each other the same subjects appear to be. Populations have never had more connectivity, yet have never more resembled disassociated zombies of retina touchscreen distraction.

Drones, the so-called Internet of Things (IoT) and artificial intelligences now mediate everything. Spies, spooks, whistleblowers and nefarious actors hide in the shadows or “bare all” for download in the bright light of accountability. Data is not just a fact of contemporary life for anyone participating in the Western world; it is a requirement, an obligation. Augmenting one’s experience is rapidly becoming the domain in which aesthetics unfolds as well. Where the screen and the stage once framed ‘that contained within’, or ‘mise en’, now the entirety of experience itself is the framework of the act of aesthetic organization.

Oculus Rift brochure.

At a Virtual Reality film-making conference recently, I heard panelists talking about ‘in-sphere’ and ‘out-of-sphere’ as a correlate for ‘in-frame’ and ‘out-of-frame’. The sphere, or rather that fishbowl-like region into which our heads and sensibilities are placed when we put on a head-mounted display such as Oculus Rift or Google Cardboard is the ‘stage’ where ‘mise-en-experience’ takes place.

VR experiences may be recorded using 360-degree cameras, or constructed using game engines such as Unity and Unreal. In either case, the viewer is expected to remove herself from the ordinary experience of life and cross over into a documentary, a fiction, or some combination of the two. Either way, in this artificial panoramic realm, the notion of the ‘scene’ presents new ways of being considered.

The geo-spatial world around us, unadorned, might, with Augmented Reality (AR) and wearable computers for example, be added onto with audiovisual and data features pertinent to locale, the viewer’s history, her trajectory through time and space, her sense of her self in relation to others likewise ‘connected’. Head-mounted displays, wearable computers, and technologies that meld the past with the present and the physical with the nonphysical are coming in fast. Sensors scan everything. Metadata about metadata joins a sea of associations in a never-ending flowchart of patterns of ideas. A vast, intricate spiderweb universe of everyone sharing everyone’s secrets and banal facts, all visual, all sensed, all parsed by algorithms, is envisioned. I can put on my headset and see objects all around me. And yours. And you mine. We can share it all.

The concept of ‘that contained within the scene’ in terms of virtual reality encompasses the user’s own sense of the entirety of experience. When she puts on the Oculus Rift headset she is really gaining a sense of the totality of everything around her that is placed within the same realm, much as she would experience a place as a tourist on holiday. What she is being invited to understand, enjoy and appreciate is the experience. And with VR/AR she is able to see and listen to everything that she is experiencing.

Virtual Boy VR Gaming System.

Like videogame designers, VR experience designers are looking to create something “in the round”, not only recorded in the audiovisual sense but also in what we might call the experiential surround. Virtual reality experience designers are looking to create experiences that literally encapsulate the entirety of being-in-the-moment. VR cameras are like any other cameras and can only be placed in certain locales. One can jump-cut from one camera to another. Cameras can move. Users often complain about motion sickness, but that is a discussion for another time. The point is the extensive spherical scope of VR and its ability to turn the scene into experience and vice versa.

Beyond the simple documentary nuances of the everyday that people might undergo in the course of the average day, VR can record the sense of what it was like to inhabit a place. Instead of a framed shot, we might now be said to be getting a sphered shot.

When the camera is moved, so is the whole eye-and-head platform, such that everything around can be seen as the camera moves. CGI and ‘reality’ can be combined such that, for example, when one looks up or down it’s possible for the ‘ground’ to appear to actually give way, or for the ‘sky’ to splinter into pieces. Virtual reality and augmented reality designers are really looking to create personal moments and situations.

‘Mise-en-experience‘ is about planning and considering the inclusion of every aspect of the virtual immersion event. Just as Banksy felt he could best articulate his disillusionment and anger at the dismal world of today with the theme park art installation Dismaland, such a gesture also might have been accomplished using Virtual Reality.

Thundering Turbo, 1982.

Tomytronic 3D Thundering Turbo game system by Tomy (1982)

According to game designer Scott Rogers, “Everything I learned about game design I learned from Disneyland”. In both Disneyland and Dismaland there are “weenies” – central architectural features like the princess castle and the Matterhorn – large structures that attract the user/viewer/visitor. There are meaningfully placed paths between these attractions. And it is the path-ing as much as the ‘weenies’ that determines the effectiveness of the experience. The VR/game/theme park/art installation designer is the designer of “the dynamics of being there”: the designer of what we might call the ‘dance of the user’ in their imaginary place, the sense of motion through that place.

This complex and dynamic relationship between the user and her environment, and the relationship of that user to other users in a shared space, is of course ‘cybernetics’, identified by Norbert Wiener in the 1940s. But in terms of VR and its effects on people as they move around and manipulate their 3D VR avatars online in real time, it is also what Jaron Lanier in the early 1990s described as body music: the post-symbolic language of users’ gestures. This, too, is part of ‘mise-en-experience‘.

Why place objects in a virtual scene and limit oneself to the rules that govern physics and expectation in the normal world? Why use natural boundaries or spatial boundaries in the way they would normally apply at all?

Lessons learned from game design, architecture and urban planning, dance and other arts that involve time, space and motion can assist here. From games comes the notion of the spatial and temporal boundary: placing limits on where a person can go in order to manage the experience, but also to manage memory (computer memory, not human memory). Temporal constraints – time limits – can help frame the sense of there being goals, challenges and rules, bringing VR within the context of ludology. What is the victory condition? Then there is the field of music: the mathematical breaking down of time into fractions, where the control of air-pressure waveforms over time is also part of the general sensibility. VR and AR can borrow from music the patterning of events over time to create a sweep of moments, much the way the great composers arranged events to create emotional tones. Consider the VR experience that is the Beatles’ White Album, each song a mini-movie or VR scene, complete with characters, settings and events.

Bubl camera, 2015 – a Kickstarter-funded 360-degree camera used for consumer-level VR photography and filmmaking.

VR and AR aesthetics borrow heavily from cinema itself, particularly animation, with its breaking down of time into individual frames. It’s no accident that many virtual reality movie productions are made using the same tools often used in video game design, such as Unity and Unreal. Both of these tools allow 3D models to be created elsewhere, in packages like 3ds Max and Maya where they are also animated, and then imported into experiences that may well share the ‘sphere-space.’ Views of the real world can thus be meshed in experience with unique imported objects.

At the VR conference in San Jose this year, a panel on VR filmmaking emphasized the ‘problem of nausea’. As VR goes mainstream, the rush to commercialize it as an extension of cinema, or a kind of ‘cinema-in-the-round’, does little to honor the medium’s origins in the research labs of the late 1970s and early 1980s. Back then, the goal was to use the medium to discover entirely new categories of experience. But since the economic lure of mainstream entertainment is both risk-averse and extremely strong, pundits would sooner see a return on investment by retooling VR for already proven genre-fiction uses (space marines, extreme sports, adventure heroes in the Third World) than take a chance on something that genuinely breaks the mold. But we shall see. Perhaps there will be a paradigm shift, a new group of experiences that will bring us the Bruce Conner of VR, or the Brothers Quay of augmented reality.

It could well be that the massive archives of existing film merge with the online databases in the development of new hybrid media forms. A digitized future lies in how truly creative and experimental filmmakers and VR designers will combine these worlds.

CASE STUDY – Archival footage from 8 years ago of this street corner in San Francisco is being superimposed over my view of it now. I can walk to the very spot where it was filmed and see it from the very same angle.

Like Dziga Vertov, the revolutionary Russian constructivist filmmaker of the 1920s who placed his camera everywhere he could and drew attention to the fact in the very form and structure of films like “The Man with the Movie Camera”, I am “Kino Eye” all over again. All the information about the conditions on that day is available to me, and all of it about today, too. And all the people in the footage. And their relatives. And all the other people who have seen the footage. And what they thought. And what I think. And on and on and on. Then there is the 3D data about the buildings and the pipes under the road; the flight patterns of the planes above.

iRig UA – Guitar Interface for Android


First published August 11, 2015 1:03 PM MST

By David Cox

The first universal guitar effects processor and interface for all Android devices

These days everything hinges on the interface, whether it’s the dashboard of your car or the glass surface of your smart device. It’s no different with your guitar. To get your guitar to “talk” to your Android smartphone or ‘phablet’, you need something to translate all those guitar licks into digital signals so they can be used by your effects apps and/or recording software. And that’s where IK Multimedia’s iRig UA comes in.

But the problem with such devices has always been latency: the delay between playing a note on the guitar and hearing the processed sound. Decrease the latency and you improve the experience.

As more and more people plug electric guitars into smartphones, the challenge of getting a clean, low-latency signal onto the phone for recording or performance remains. The software and hardware of the phone itself have to do much of the heavy lifting, so any help they can get is welcome. iRig UA pairs with AmpliTube UA to deliver solid sonic performance on any smartphone or tablet running Android 4.2 or higher with support for host mode/USB OTG, including devices from Samsung, Sony, Motorola, LG, HTC, Xiaomi and other popular manufacturers.

iRig UA uses a built-in digital signal processor (DSP) that solves the issue of inconsistent OS latency on the Android platform by moving all audio processing to the external accessory. iRig UA’s DSP has been designed to work with a companion app, AmpliTube UA, which comes with iRig UA and is powered by the DSP. AmpliTube provides a collection of virtual sound-processing equipment to customize your guitar sound. A versatile system, iRig UA can also be used as a digital recording interface when connected to a Samsung Professional Audio compatible device or a smartphone or tablet with Android 5.0.

This pairing of iRig UA and AmpliTube UA is perfect for when you need to practice in transit. iRig UA features a 1/4” input for a guitar, bass or other line-level instrument, a micro-USB to OTG cable and a 1/8” headphone output with volume control. I for one am grateful for this suite of inputs and outputs; I like to play along with .mp3s from other devices.

When playing my MIM Fender Strat through a Samsung Galaxy S using AmpliTube, the signal was surprisingly clean and robust. With some delay and reverb and a combo of chorus and distortion, it felt just like playing through the battery-powered metal BOSS stomp boxes of years ago. In fact it feels and sounds almost analog. Why? Because the processing is done in hardware.

iRig UA’s on-board digital signal processor works in conjunction with AmpliTube UA, a special version of IK’s powerful guitar and bass multi-effects processor designed specifically for use with iRig UA. Because the processing is handled on iRig UA itself and not on the Android device, it can provide consistently near-zero latency (down to 2 ms round-trip total latency) independent of the make and model of your connected smartphone or tablet.
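To put that 2 ms figure in perspective, the arithmetic is simple: buffering latency is buffer length divided by sample rate, counted once on the way in and once on the way out. Here's a quick sketch (the buffer sizes are my illustrative assumptions, not IK's published internals):

```python
def round_trip_latency_ms(buffer_frames: int, sample_rate_hz: int) -> float:
    """Buffering latency: one input buffer plus one output buffer of equal size."""
    one_way_seconds = buffer_frames / sample_rate_hz
    return 2 * one_way_seconds * 1000.0

# A tight 48-frame buffer at 48 kHz gives 2 ms round trip -- the figure quoted above.
print(round_trip_latency_ms(48, 48000))    # 2.0
# A typical in-OS software path with 512-frame buffers is over 21 ms, easily audible.
print(round_trip_latency_ms(512, 48000))
```

This is why an external DSP helps: it can run small, fixed buffers regardless of how the phone's own audio stack is configured.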

IK Multimedia is getting very good at the small, portable pocket guitar interface market, and this new push into the Android marketplace speaks to the rising tide of players using Galaxies and other Android-powered smart devices. Listening to guitarists and what they want has also clearly informed the decisions that led to iRig UA.


  • Works on any smartphone or tablet that supports Android 4.2 or higher and host mode/USB OTG
  • Near-zero latency digital FX processing
  • Digital audio recording on Android 5.0 and Samsung Professional Audio devices
  • 32-bit digital signal processor
  • 24-bit converter with 44.1/48 kHz sample rate
  • High-definition, low-noise preamp
  • Includes AmpliTube UA
  • Analog aux input for play-along and practice
  • Headphone output and volume control
  • Multicolor LED
  • Ultra-compact and lightweight design
  • Micro-USB port

RRP $99.99

Click here for more information

iRig Mic Studio: A mighty powerful microphone in a tiny package


By David Cox

First published July 28, 2015 12:05 PM MST

New small condenser microphone offers quality and portability

Condenser microphone technology has advanced hugely in the last several years, to the point where microphones that would only have been found in studios 10 years ago are freely available on the open market, in the hands of people who would otherwise have needed to rent recording facilities at great expense.

IK Multimedia

This now 15-year-old laptop recording revolution has given rise to a whole new range of amazing peripherals, of which this microphone from IK Multimedia, the iRig Mic Studio, is a perfect example. The mic is exquisitely crafted from the finest quality metal parts, and it comes with a beautifully machined stand with a small tripod, a handsome leather-fabric pouch and all the parts you need to start recording.

I’m working on an opera right now about the space program, and I’m traveling to lots of people’s houses around San Francisco. I need a studio-quality mic that fits in my backpack and can be pulled out and used at a moment’s notice. What I really like are the gain controls on the outside, which let me adjust the levels for the singer’s volume ‘on the fly’. With rock(et) opera I depend on being able to adjust quickly for the singer’s range in settings where I seldom have much control over dynamics or acoustics. Soprano and baritone vocal levels can vary very widely, so the distance from the mic, and how loud the singer is in relation to the mic’s response, is everything. This mic has not let me down yet. The nearest comparison for me is the Yeti, and this mic proves the better solution in terms of sound quality. And it is smaller.

Make professional studio-quality recordings on the go

iRig Mic Studio, IK Multimedia’s ultra-portable large-diaphragm digital condenser microphone, has been released for iPhone, iPad, iPod touch, Mac, PC and Android. It packs a 1” diameter condenser capsule into a compact housing that can be used to make professional-quality recordings on the go.


  • Professional studio microphone with large-diaphragm capsule
  • Ultra-compact size that’s easy to carry around
  • High-quality 1” back electret condenser capsule
  • 24-bit converter with 44.1/48 kHz sampling rate
  • Low-noise, high-definition preamp
  • Integrated headphone output
  • Multicolor LED status and level indicator
  • Onboard gain control and headphone level control
  • Comes with a full suite of IK apps
  • Includes portable tripod tabletop stand
  • Includes Lightning, Micro-USB OTG and USB cables
  • 30-pin cable available separately
  • Available in black or silver version

At the heart of the mic is a large 1” diameter back electret condenser capsule, a 24-bit audiophile-grade A/D converter (with 44.1/48 kHz sample rate) and a built-in low-noise, high-definition preamp. These, combined with its 133 dB SPL rating, allow for recording at practically any sound pressure level – it can capture anything from the human voice up to an amplified electric guitar and louder.
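As a rough sanity check on those numbers (my own back-of-envelope arithmetic, not IK's spec sheet): the theoretical dynamic range of an ideal N-bit converter is 6.02N + 1.76 dB, so a 24-bit converter has headroom well beyond the mic's 133 dB SPL ceiling.

```python
def ideal_dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of an ideal N-bit A/D converter: 6.02N + 1.76 dB."""
    return 6.02 * bits + 1.76

print(ideal_dynamic_range_db(24))  # ~146.2 dB, comfortably above the 133 dB SPL rating
print(ideal_dynamic_range_db(16))  # ~98.1 dB (CD-quality, by comparison)
```

Real converters fall short of the ideal figure, but the gap between ~146 dB theoretical and the 133 dB SPL spec suggests the capsule and preamp, not the converter, set the practical limit.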

I’ve received the best results just going straight into GarageBand using the supplied USB cable. Mic Room is a new universal microphone-modeling app for iOS that works with iRig Mic Studio to give it the sonic characteristics of many classic mics.

The controls include a gain control knob and a multicolor LED level indicator. It also includes a headphone output with its own level control for onboard monitoring directly from iRig Mic Studio itself. And, for better positioning while recording, iRig Mic Studio comes with a sturdy and portable tabletop tripod stand.

iRig Mic Studio comes with a female micro-USB port and cables: Micro-USB to Lightning for iPhone, iPad and iPod touch; micro-USB to micro-USB OTG for Android (requires either an Android 5 or Samsung Professional Audio device); and micro-USB to USB for Mac and PC. A micro-USB to 30-pin cable is available separately for older iPhone, iPad and iPod touch models.

Using apps with the iRig Mic Studio.

For immediate recording, iRig Mic Studio comes equipped with a powerful suite of vocal apps that match its cross-platform compatibility. iPhone, iPad and iPod touch users can enjoy VocaLive, a powerful effects processor and multi-track recording app featuring a selection of 12 professional real-time vocal effects. EZ Voice for iPhone, iPad and iPod touch and EZ Voice for Android are streamlined, easy-to-use sing-along apps that make it easy for vocalists to practice with any song in their music library. iRig Recorder for iPhone, iPad and iPod touch and iRig Recorder for Android are straightforward apps for field recording, podcasting, note taking and more. Mic Room, the microphone modeling app for iPhone and iPad, rounds out the suite.

When you get this mic you really can open it up and start working straight away, as I did. A multicolor LED shows when it’s on and gives you a solid sense of signal level, and you can hear the quality through the mic’s own headphone socket right away. This is a great mic (and I’ve heard a few in my years) and it comes highly recommended.

iRig Mic Studio (silver or black) is available now from dealers worldwide or via the online store for only $/€179.99. Get Mic Room on the App Store for $/€7.99 for iPhone and iPad.

Visit this site for more information

IK Multimedia’s Mic Room – Microphone Modeling for iOS


Mic Room, IK Multimedia

First published in

By David Cox

The idea of a mic room is a luxurious concept: a room full of the very best microphones, each with its special qualities, plugged into a fancy mixing board, ready for the recording of a radio play, a song or an opera. But when might most people ever have access to such a thing? A mic room is the stuff of professional recording lore, the realm of rock stars, top producers and big record contracts. But no longer. Such is the state of modern technology that microphone modeling can happen on a modern phone or iOS device.

Mic Room is the perfect companion to IK’s digital and analog mics like the latest iRig Mic Studio condenser. When paired with such microphones, Mic Room gives you complete access to a selection of the best dynamic, condenser and ribbon-type microphones that are everyday tools in A-list music studios all over the world.

I tried Mic Room with my iRig Field microphone on my iPhone 6+ and was able to try a full range of available mics. One of these was ‘old telephone mic’ – a rather classic 1930s sound synonymous with broadcasts from the golden age of radio. Another modeled mic was the Bottle 563, based on the famous cylindrical-with-a-sphere-on-top Neumann® CMV-563. With its rich, full sound, this classic mic instantly reminded me how great this software would be for recording radio plays or operas, such as my rocket opera, “Cosmonauts on the Moon”.

This ‘meta mic’ aspect is a truly amazing experience as the notion of using one kind of mic to access the digital equivalents of so many classic mics takes recording to a whole new level. To use the software, you simply plug in your favorite digital or analog microphone from IK (you can also use your iPhone/iPad’s built-in mic), select which model you want it to sound like and you’ll be able to instantly tap into the inspiring sound of many timeless classics. You get tried and true dynamic workhorses, rich and velvety tube condensers, ultra-smooth ribbons and even more unusual creative mic types.


• Powerful microphone modeling app

• Nine mic models (1 upon software registration and 2 more after registering an IK hardware mic)

• Expandable with more mic models via in-app purchase

• Companion app for IK’s range of digital and analog microphones

• Also works with your iPhone or iPad’s built-in mic and Apple Headset

• Adjustable input level

• Level meter

• Master bypass switch

• Inter-App Audio and Audiobus compatibility

• Universal app for iPhone, iPad and iPod touch

• Free version also available

The native iPad version will be available soon

Mic Room is available either as a free version (with 2 mic models included and others available via in-app purchase) or as a full version with 9 models included and the ability to add more for free via hardware registration, as well as via in-app purchase.

Available as a free version or as a full version for $7.99

Click here for more information

Interview: Professor Steve Mann, about Augmented World Expo 2015


First published May 27, 2015 7:14 PM MST

By David Cox

Augmented World Expo 2015 – POWER TO THE PEOPLE!


SuperPowers to the People: an introduction to an audio interview with Professor Steve Mann (see link at end of article). The augmented reality conference AWE 2015 is coming up, and its theme is “Superpowers to the People”. As usual the buzz is around Meta AR, the Kickstarter-based firm that developed a headset and developer kit built around Unity. Since 2013, the first year of META’s development, the device has grown considerably from its prototype, a 3D-printed-housing variation on the Epson “Moverio” glasses on which it was originally based.

META’s innovation was to add a Kinect- or Leap Motion-style tracker to the front, bridge-of-the-nose area, so that the computer knows where to place objects in your field of view from your ‘point of eye’ (POE), to use the jargon. The tracker also ‘sees’ your hand and interprets it as the device with which objects are manipulated, moved and transformed.

Steve Mann, Chief Scientist at META AR, is a true pioneer of both wearable computing and augmented reality, and has been building his own wearable devices since 1974. I first met him in 1995 at the MIT Media Lab on a research visit.

A strong believer in personal freedom, Mann holds that wearable computing, especially the ability to manage one’s personal space as it pertains to the recorded image, is a path to democracy. He views technology like META as a great equalizer in the war against surveillance. Against the top-down vector of ‘surveillance’ he posits ‘sousveillance’ – ‘seeing from below’.

Simply put, if we are all wearing devices that enable us to view each other, this effectively neutralizes the one-way vector of power that cameras in the hands of the powerful make possible. Of course, for sousveillance to become feasible, a social consensus needs to be in place first. But one step toward this, to be sure, is affordable universal wearable technology that facilitates customization and ease of use. The wearer should truly be able to configure their field of view and the nature of everything augmented over that field of view. With META AR (AKA Spaceglasses), at least the version that has been available to developers since 2014, the tracking technology works well enough to permit this, as do the developer tools, based as they are around the free 3D and 2D game engine Unity.

I interviewed Steve Mann in the lead-up to Augmented World Expo 2015, where he will deliver a speech on the history of augmented reality as well as hold workshops on META viewing tools. Mann spoke of the difference between what he called the “Big AR” of the early 1960s – the type popularized by Ivan Sutherland and the famous “Sword of Damocles” head-mounted display built in cold-war-era research labs. These were large, tethered rigs tied by cables to mainframe computers, hooked up to cumbersome-looking binocular visors the size of bike handlebars.

Mann’s own “Little AR”, by contrast, developed in the late 1970s when he was but twelve years old and built from more or less found materials, was aimed squarely at empowering the individual, who, thus untethered, could walk around and have his or her data made available either in motion or in situ.

As the number of AR headsets proliferates almost exponentially and the market becomes saturated, veterans like Steve Mann are in a position to lay down some of the guiding principles of what makes an AR ecosystem of user content provision successful. One of the defining characteristics is openness: the user’s ability to configure their own resources. If a system is closed, it undermines the whole basis of a meaningful AR – hence the failure of Google Glass, according to Mann, as he outlines during my interview (see link below).

Google Glass exudes privacy. Privacy of sight. Privacy of seeing. Through its utterly closed ecosystem of use and apps, it stands in stark contrast to the notion of a democratic and participatory role for what should be as free and open to use as the low-cost pay-as-you-go cellphone. We have a long way to go before any system of AR truly delivers ‘power to the people’, but the lowering of costs is a matter of time. A language of AR and a syntax of use, both dependent on the right management of tools and education, are key here. This is where policy comes in. The relationship of the UK government to the Raspberry Pi Foundation comes to mind: massive subsidy in order to promote broad literacy and creative expression in the population. We need an Arduino-style AR revolution – a Pi-AR, if you will. If Lenin urged Dziga Vertov to make an ‘art of twelve kopeks’, we today need an AR of fifteen dollars.

And the user must be able to customize to their own specifications as much as possible, right down to the hardware where possible. The iPhone and the iPad are closed models that render the user a consumer of prepackaged services. AR also offers a new aesthetic opportunity: a new set of social relations defined by meaningful relationships between data, places and people. The experimental possibilities of drifting through open fields of participatory urban space, and of moving toward new ways of working and living together through those less managed open spaces, might then be realized – a non-neoliberal, technologically mediated commons in which AR assists in the development of newly reimagined urban possibility.

Interacting through this environment, both figuratively and literally, we need to encourage democratic and participatory models of use for AR. Just as Bruce Sterling identified the SPIME – an object trackable through time, space and virtual space – an augmented subject can consider herself self-consciously a spime, in that she occupies the real world and the virtual world simultaneously, her data influencing her decisions and actions as her body occupies space. It is with the proliferation and deployment of very low-cost wearable computers, based on interoperability and the principle of the user as subject, that augmented reality is beginning to mature as a medium and as a technology. And just as with any technological shift, a new language should logically follow. These and other concepts will be discussed by Steve Mann as part of the general theme of this year’s AWE 2015, “Superpowers to the People”.

From cinema came the language of the close-up, the long shot and the jump cut, and from computers came the save-as, the cut-and-paste and the selection box. AR is sure to bring its own language, with terms such as “flowspace” (the space in which the subject moves such that their data moves with them meaningfully) and objects-as-interface (reaching out to a door handle with AR can have the effect of unlocking the door). This interplay and overlap of things, places and people with the information pertinent to them, all the time, in real time, will spawn its own terminology and lingo. It is a performative language partly of theater, of urban planning, of cinema, of dance and of manners. Borrowing from filmmaking, we might call the experience of augmented reality – with its floating objects in space and holographic objects interacting with the world around us – a kind of mise-en-scène and directorial scene-blocking in real time. Everyone a director of their own real-time experience.

New ways of seeing are thus required, to quote John Berger. The age-old Renaissance principle of what Mann calls the ‘point of eye’ – the exact position of the iris where the world we view converges on our gaze – needs to be rethought all over again. It’s one thing to have all the data of the world around you converge on your eyes alone; quite another to consider these tools for the population beyond yourself and your own personal needs.

Can we strip away the sense of entitlement, ownership and control from the singular point of view of the typical user as depicted in the PR materials of augmented reality, and, perhaps through the very same tools, replace it with a new set of ways of viewing the world – less possessive, more inclusive, more considerate of the needs of the planet and its all-too-fragile membrane of a surface? Along with the need for a new language of AR comes the need for a new language of being in the world, which such technologies might just help usher in. If so, Professor Steve Mann is just the kind of progressively minded visionary whose pioneering work in the field gives him the right, quite literally, to light the way.

I interviewed Steve Mann on May 15th, 2015

Here is the link to the audio interview

A link to Augmented World Expo 2015