Augmented World Expo 2016



By David Cox

Augmented World Expo, to be held from the 1st to the 2nd of June, is an extraordinary event that over the past four years I’ve watched grow into a major hub of ideas and activity. Augmented Reality (AR) itself was once a relatively ‘fringe’ notion, the preserve of the research departments of major tech universities, the R&D sections of companies and the basements of serious hackers. In 1998, when I was a visiting scholar at the MIT Media Lab, AR and wearables were the kind of thing you would read about in trade journals: the type of technology used by Boeing employees to help them wire fuselages (a use to which they are still put, by the way), by ‘cyborgs’ building communities of advertising-free wireless networks, like Prof. Steve Mann’s students at the University of Toronto, or, alas, by the military.

People who built wearables and developed AR back then were generally super-advanced researchers and hackers with the hardware chops to source and assemble embedded computer components that were obscure, difficult and expensive to obtain. Thad Starner’s classic Tin Lizzy wearable computer design at MIT was among the first attempts to establish a standard form factor in the mid-1990s, for example. Such machines required the builder to kluge batteries from camcorders and custom-wire them to stacked, Linux-installed dedicated embedded computers, the sort usually sold to boat builders and light aircraft makers. Wearable computing folks built one-handed chording keyboards and molded them to their hands using special heat-pliable surgical plastics. The one-off headsets were built from components salvaged from other devices, like camcorder viewfinders, or ordered from obscure companies that normally only did business with large organizations buying in quantities of tens and fifties. This stuff was unique, rare, and you needed to be a jack-of-all-trades to do it well. You needed to be obsessed. Today you can buy a complete wearable computer on eBay for about 500 bucks. Or you can find instructions to make one for half that with a Raspberry Pi, a LattePanda, a BeagleBoard or an Arduino.

Yes, it is all different some eighteen years later. Today, AR has reached into more and more lives by virtue of the simple and total prevalence of the post-iPhone smart device: tablets, smartphones, smart watches and those small portable embedded computers you see at the Maker Faire. IP addresses apply to everything, it seems, and even socks and keys and belt buckles might have an RFID tag and a website to monitor their position, telemetry and everything else. Today the so-called IoT (Internet of Things) is more than an idea; it is a concept sufficiently widespread to justify its own conferences worldwide and the deployment of a whole new category of IP addressing. The sheer volume of inexpensive Chinese-sourced components and labor, and the ability this brings to manufacture products on a limited basis close to cost, all point to a new set of realities for the AR and wearable computing world. Hence the explosion in popularity and availability that can justify an event as big and as bold as Augmented World Expo 2016, AR showcase to the world.

I spoke last Friday to Ori Inbar, cofounder and executive producer of this year’s Augmented World Expo 2016 – Superpowers to the People! convention at the Santa Clara Convention Center. AWE2016 promises to attract a record crowd of upwards of 4,000 people, arriving to see the very latest in augmented reality hardware, software, ideas and trends. AWE was also held in Asia last year, in what was the first-ever augmented reality event of its type in the region; it showcased many new startups and companies and attracted over 2,000 people.

The AWE convention this year in Silicon Valley has taken double the exposition floor space at the Santa Clara Convention Center, and much of that will be focused on what’s known as the enterprise end of the market: the commercial and industrial uses of augmented reality. This is the use of headsets, other devices and software for medical, industrial and official, large-scale, big-dollar applications.

Architecture firms, the armed services, any group that can buy big and spend big and needs “fleets” of AR units for groups of people who need data about the building of things, or the viewing of real-time audiovisual data-based phenomena. For example: welders who need data about what they are making. Builders who can see instructions about what they are constructing without recourse to paper plans. Doctors who can have data about a patient superimposed over them while doing surgery. Drone pilots who need to see both what the drone sees and all the other information about what the camera is doing onboard. Actual plane pilots who need 3D floating information about flight controls over the view through the cockpit window, “Iron Man” style. These are the ‘enterprise’ buyers: large groups with deep pockets who need lots of units for whole groups of users, who in turn need training in the use of those units.

Then there is the consumer market. That’s really regular people like you and me, the ‘people’ of the convention’s name, who buy AR apps for our phones, or possibly a set of glasses for lifestyle or productivity software. For this market there are firms like Meta, who make the famous “Spaceglasses”. In 2012 Meta was a startup, 3D printing its headsets, which were then based on the high-quality, prosumer-level “Moverio” glasses by Epson. The distance tracking (enabling the user to appear to ‘pick up’ virtual objects) was done using a modded Leap Motion sensor and some very well put together custom software.

Google Glass came and went from the market in the space of several years, but as Bruce Sterling (longtime regular of AWE and its keynote speaker for many years) has noted, Glass was not so much an AR device as an annotated reality system. It popped information above and to the right of the viewer, as if someone were constantly putting up small virtual post-it notes. There was often little to no relationship between what one was seeing in reality and the information displayed, as there is with true AR. Perhaps this is one of the reasons why it never took off. This, and the fact that Google underestimated how the population would react to being video recorded by Glass wearers as though privacy simply did not matter. It is likely that more subtle variations on the Glass concept, less intrusive in terms of social relations, may well present themselves this year.

Today Meta has the backing of serious money and is about to put out a computer-connected headset that looks like a cross between a futuristic motorcycle visor and a prop from a science fiction movie. It lets people pass glowing 3D objects to each other, and scale, rotate and pick up those objects. Meta’s aim is to bypass the keyboard and mouse altogether and offer computer users a completely gesture-based interface in which 3D data floats hologram-style in front of the face and is manipulated by one’s hands.

Another dimension of the consumer Augmented Reality market is wearable technology: fitness wristbands like the Fitbit, and the Apple Watch category, serving the ‘quantified self’ idea of personal telemetry. The gadgets of wearable tech, and the market for the data associated with them, are enough to justify Target stores now having a whole “Wearables” section in their consumer electronics departments.

Ori Inbar says that 2016 will be the year in which we see lots of new hardware and software that is “well past the gimmicky stage” prevalent several years ago. Among the gimmicks we might include the “scan and see” systems: I’m thinking here of technologies such as Aurasma and Layar, which were simply smartphone apps offering the scanning of printed documents. Today more serious (presumably real-time, data-driven) applications will be on offer. I have nothing against Layar and Aurasma, but apart from popping up some visual data over a printed image, there is little these apps do that is of actual, direct use to the consumer, little that adds value to a life experience. It is not indispensable, in other words.

The Waygo translator tool, by contrast, is a good example of an app that translates written Chinese to English in real time and pops up information about that translation for the user. This is what we might call an ‘active’ AR smartphone app: one that processes what it sees and provides the user with information in a way that could only take place by means of AR.
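To make the ‘active AR’ idea concrete, here is a minimal sketch of the capture-recognize-translate-overlay loop such an app performs. Everything here is a stand-in: the function names, the tiny phrasebook and the fake OCR result are my own illustrations, not Waygo’s actual API.

```python
# Sketch of an 'active' AR loop: recognize text in a camera frame,
# translate it, and return overlay labels to draw at the same position.
# All names here are hypothetical illustrations, not Waygo's real API.

PHRASEBOOK = {"出口": "Exit", "入口": "Entrance", "小心": "Caution"}

def recognize(frame):
    """Stand-in for OCR: returns (text, bounding_box) pairs found in the frame."""
    return frame  # in a real app this would run character recognition

def translate(text):
    """Dictionary lookup standing in for a real translation engine."""
    return PHRASEBOOK.get(text, text)

def augment(frame):
    """Produce overlay labels pinned to the location of the source text."""
    return [(translate(text), box) for text, box in recognize(frame)]

# A fake 'frame' in which OCR found one sign at pixel box (40, 60, 120, 90):
overlays = augment([("出口", (40, 60, 120, 90))])
```

The point of the sketch is the pinning: because each translation carries the bounding box of the original sign, the English can be drawn over the Chinese in place, which is what separates ‘active’ AR from a post-it-note annotation system.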

Another important development on display this year is smart fabrics. What are smart fabrics? They are a technology on the rise as clothing and apparel converge more and more with smart devices and the cloud. Where programmers look at a fabric, they often see a busing system for channeling data. Today fabrics can be used as surfaces for display, for input, and even for feedback in the form of pressure to the user as a means of interaction with virtual data. Fabrics that are worn can be bioluminescent, their threads woven with the properties associated with deep-sea fish and glowing insects. This is the brave new world of the intersection of biotech with digital media.

Fabrics might well, for example, serve as foldable, cuttable displays. A fabric could literally be a screen. It’s like projecting a movie onto a dress made of movie-screen material, only there is no projector: the dress is the display. Flaschen Taschen, an LED array screen by the San Francisco hackerspace Noisebridge, is a good example of this type of development at a relatively low resolution, and was ‘all over’ the Maker Faire this year.

A comprehensive demonstration display of smart fabrics will be on show at this year’s Augmented World Expo so anyone attending will be treated to that also. The relationship between augmented reality and virtual reality will also be at the forefront this year.

It’s going to be great.

See you there.

Augmented World Expo

1st – 2nd June 2016

Santa Clara Convention Center

AWE2016 – Superpowers to the People! Website

Superpowers to the People – Augmented World Expo 2015


By David Cox

SuperPowers to the People: Augmented World Expo 2015: an introduction to an audio interview with Professor Steve Mann (see link at end of article). The augmented reality conference AWE2015 is coming up, and its theme is “Superpowers to the People”. As usual, the buzz is around Meta AR, the Kickstarter-based firm that developed a headset and developer kit built around Unity. Since 2013, its first year of development, META has grown considerably from a 3D-printed housing prototype variation on the Epson “Moverio” glasses on which it was originally based.

META’s innovation was to add a Kinect- or Leap Motion-style tracker to the front, bridge-of-the-nose area, so that the computer knows where to place objects in your field of view relative to your ‘point of eye’ (POE), to use the jargon. The tracker also knows to ‘see’ your hand and to interpret it as the device with which objects are manipulated, moved and transformed.
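To give a flavor of what “knowing where to place objects from your point of eye” involves, here is a toy sketch of the standard pinhole projection that maps a 3D point in the headset’s coordinate frame to a 2D display position. This is a generic illustration of the geometry, not META’s actual code; the focal length and display size are invented for the example.

```python
# A toy pinhole projection: given a virtual object's position in the
# headset's coordinate frame (metres, with +z straight ahead of the
# 'point of eye'), compute where it lands on a display of given size.
# Simplified illustration only; focal_px, width and height are invented.

def project(point, focal_px=800.0, width=1280, height=720):
    x, y, z = point
    if z <= 0:
        return None  # behind the eye: nothing to draw
    u = width / 2 + focal_px * x / z   # horizontal pixel coordinate
    v = height / 2 - focal_px * y / z  # vertical pixel (screen y grows down)
    return (u, v)

# An object 2 m ahead and 0.5 m to the right appears right of centre:
position = project((0.5, 0.0, 2.0))  # (840.0, 360.0)
```

Hand tracking then closes the loop: when the tracker reports your fingertip at roughly the same 3D position as a virtual object, the software can treat that as a ‘grab’.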

Steve Mann, Chief Scientist at META AR, is a true pioneer of both wearable computing and Augmented Reality, and has been building his own wearable devices since 1974. I first met him in 1995 at the MIT Media Lab on a research visit.

A strong believer in personal freedom, Mann believes that wearable computing, especially the ability to manage one’s personal space as it pertains to the recorded image, is a path to democracy. He views technology like META as a great equalizer in the war against surveillance. Against the top-down vector of ‘surveillance’ he posits ‘sousveillance’, which is ‘seeing from below’.

Simply put, if we are all wearing devices that enable us to view each other, this effectively neutralizes the one-way vector of power that cameras in the hands of the powerful make possible. Of course, in order for sousveillance to become feasible, there needs to be a social consensus in place first. But one step toward this, to be sure, is an affordable, universal wearable technology that facilitates customization and ease of use. The wearer truly should be able to configure their field of view and the nature of all that is augmented over that field of view. With META AR (AKA Spaceglasses), at least the version that has been made available to developers since 2014, the tracking technology works well enough to permit this, as do the developer tools, based as they are around the free 3D and 2D game engine Unity.

I interviewed Steve Mann in the lead-up to Augmented World Expo 2015, where he will be delivering a speech on the history of Augmented Reality as well as holding workshops on META viewing tools. Mann spoke of the difference between what he called the “Big AR” of the 1960s – the type popularized by Ivan Sutherland and the famous “Sword of Damocles” head-mounted display, built at Harvard during the cold war. These were large, tethered rigs tied by cables to mainframe computers, hooked up to cumbersome-looking binocular visors the size of bike handles.

Mann’s own “Little AR”, by contrast, developed in the mid-1970s when he was but twelve years old and built from more or less found materials, was aimed squarely at empowering the individual, who, thus untethered, could walk around and have his or her data made available either in motion or in situ.

As the number of AR headsets proliferates almost exponentially and the market becomes saturated, veterans like Steve Mann are in a position to lay down some guiding principles as to what makes an AR ecosystem of user content provision successful. One of the defining characteristics is openness: the configurability by users of their own resources. If a system is closed, it undermines the whole basis of a meaningful AR – hence the failure of Google Glass, according to Mann, as he outlines during my interview (see link below).

Google Glass exudes privacy. Privacy of sight. Privacy of seeing. And through its utterly closed ecosystem of use and apps, it stands in stark contrast to the notion of a democratic and participatory role for what should be as free and open to use as the low-cost, pay-as-you-go cellphone. We have a long way to go before any system of AR is truly of ‘power to the people’, but the lowering of costs is a matter of time. A language of AR and a syntax of use, both dependent upon the correct management of tools and education in their use, are key here. This is where policy comes in. The relationship of the UK government to the Raspberry Pi Foundation comes to mind: massive subsidy in order to promote broad literacy and creative expression in the population. We need an Arduino-style AR revolution. A pi-AR, if you will. If Lenin urged Dziga Vertov to make an ‘art of twelve kopeks’, we today need an AR of fifteen dollars.

And the user must be able to customize to their own specifications as much as possible, right down to the hardware where possible. The iPhone and the iPad are closed models, rendering the user a consumer of prepackaged services. AR also offers a new opportunity for aesthetics: a new set of social relations defined by interesting, meaningful relationships based on data, places and people. The experimental possibilities of drifting through open fields of participatory urban space, and of moving toward new ways of working and living together through those less managed open spaces, might become possible. A non-neoliberal, technologically mediated commons, in which AR assists in the development of newly reimagined urban possibility.

Interacting through this environment, both figuratively and literally, we need to encourage democratic and participatory models of use for AR. Just as Bruce Sterling identified the SPIME – an object trackable through time, space and virtual space – an augmented subject can consider herself self-consciously a spime, in that she occupies the real world and the virtual world simultaneously, her data influencing her decisions and actions as her body occupies space. It is with the proliferation and deployment of very low-cost wearable computers, based on interoperability and the principle of the user as subject, that Augmented Reality is beginning to mature as a medium and as a technology. And just as with any new technological shift, a new language should logically follow. These and other concepts will be discussed by Steve Mann as part of the general theme of this year’s AWE2015, which is “Superpowers to the People”.

From cinema came the language of the close-up, the long shot and the jump cut, and from computers came the save-as, the cut-and-paste and the selection box. AR is sure to bring with it its own language, with terms like “flowspace” (the space in which the subject moves such that their data moves with them meaningfully) and objects-as-interface (reaching out to a door handle with AR can have the effect of unlocking the door). This dance of the interplay and overlap of things, places and people with the information pertinent to them, all the time, in real time, will spawn its own new kind of terminology and lingo. It is the performative language partly of theater, of urban planning, of cinema, of dance and of manners. From the world of filmmaking we might call the experience of Augmented Reality, with its floating-objects-in-space and holographic dancing-objects-interacting-with-the-world-around-us, a kind of mise-en-scène and directorial scene blocking in real time. Everyone a director of their own real-time experience.

New ways of seeing are thus required, to quote John Berger, where the age-old Renaissance principle of what Mann calls the ‘point of eye’ – the exact position of the iris where the world we view converges on our gaze – needs to be rethought all over again. It’s one thing to have all the data of the world around you converge on your eyes only. Quite another to consider these tools for the population beyond yourself and your own personal needs.

Can we strip away from the singular point of view of the typical user, as depicted in the PR materials of Augmented Reality, his sense of entitlement and ownership and control, and, perhaps through the very same tools, replace them with a new set of ways of viewing the world: less possessive, more inclusive, more considerate of the needs of the planet and its all-too-fragile membrane of a surface? Along with the need for a new language of AR is a new language of being in the world, which such technologies might just help usher in. If so, Professor Steve Mann is just the kind of progressively minded visionary whose pioneering work in the field gives him the right, quite literally, to light the way.

I interviewed Steve Mann on May 15th, 2015

Here is the link to the audio interview

A link to Augmented World Expo 2015

iLoud Portable Speaker


By David Cox

Years ago, portable speakers were heavy, cumbersome affairs. If there were batteries at all, they generally were not rechargeable and ran out of juice quickly.

The iLoud portable speaker

IK Multimedia, iLoud


The whole point of small speakers was to have powered amplification where you needed it: outside, or in situations where you could not plug in easily. The 6x AAA battery-powered Roland Microcube and its ilk filled a niche for a while about 10 years ago, for guitarists and keyboardists, and did the job pretty well, but these were really solid mini stage amps, scaled down for small cafes and busking, not really suitable for, say, DJing in galleries or at a party. If you were trying to play your iPad through them, it was like using a loudhailer – not much subtlety to the highs and lows, but okay if you were ripping it like Kurt Cobain. The alternative really was to bring a small hi-fi, but that again is a different kind of experience, not really a self-contained speaker as such, and you’re still plugged into that wall socket.

But now both speaker technology AND battery technology have advanced such that very powerful and very high quality speakers can be manufactured that pack a fairly hefty wallop when it comes to delivery of sound and bass response, while at the same time leaving a relatively small footprint. Studio monitor speakers, once the sole preserve of high end recording booths have escaped into the laptop bags and DJ kits of the smart device generation and have joined the plethora of hardware of peripherals that accompany the sample driven music performance world of today.

IK Multimedia has launched iLoud®, the first portable stereo speaker designed for studio-monitor quality on the go, now available from music instrument and consumer electronics retailers worldwide. The battery-operated iLoud combines superior power, pristine frequency response and amazing low end in an ultra-portable design that makes it the perfect alternative to studio speakers for music creation, composition and playback on the go.

Loud Clear and Bassy, like a Lo-Rider at Night in San Francisco’s Mission District going by Low and Slow my Brother.

The iLoud speaker is indeed very loud. In fact, it’s 2 to 3 times louder than speakers of comparable size – a blasting 40W RMS of power. But iLoud is extremely clear at all volume levels thanks to an onboard DSP processor and a bi-amped, 4-driver array of high-efficiency neodymium loudspeakers that provide accurate, even response across the entire frequency spectrum for unbelievable realism of sound. For deep bass response, iLoud’s bass-reflex design allows frequencies to go down to 50 Hz, an amazing low end for this small an enclosure.
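A quick bit of arithmetic puts that “2 to 3 times” figure in perspective: differences between amplifiers are usually expressed in decibels, where the level difference is ten times the base-10 log of the power ratio. The 15 W comparison figure below is my own assumption for illustration, not a measured competitor spec.

```python
# Converting amplifier power ratios into decibels (10 * log10).
# The 15 W comparison speaker is a hypothetical stand-in, not a real spec.
import math

def power_ratio_db(p1_watts, p2_watts):
    """Difference in level, in dB, between two amplifier powers."""
    return 10 * math.log10(p1_watts / p2_watts)

# iLoud's quoted 40 W vs a hypothetical 15 W portable of similar size:
gain = power_ratio_db(40, 15)  # about +4.3 dB
```

A doubling of electrical power is only about +3 dB, so “2 to 3 times louder” in wattage terms is a modest but clearly audible step up, which matches how the iLoud sounds against smaller portables.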

I’ve been using the iLoud for a few days now with Netflix and DVDs and have been amazed at how much I can actually hear on these movie soundtracks that would otherwise remain hidden. I’m talking about very densely mixed films like Ip Man (both 1 and 2) and that true litmus test for all movie sound design perfectionists, Dennis Hopper’s 1988 Gang-vs-Gang-vs-Cops film Colors (play it LOUD!!). For more on why this film is so important for understanding the importance of film sound, see this excellent article by Philip Brophy.

iLoud is the ideal speaker for musicians and audiophiles who demand an accurate reproduction of a wide range of musical styles from rock, hip-hop and electronic dance music, to more nuanced and sonically demanding genres like jazz, classical and acoustic.

Portability and the types of gigs this implies.

The iLoud speaker is powered by a high-performance Li-ion rechargeable battery with smart power-management features that reduce its power consumption so that it can be used for up to 10 hours without recharging. This makes iLoud an ideal portable speaker solution for mobile musicians. I find it fits in a backpack very easily, opening up the sort of gigs that previously would have required a different way of thinking about transport. I’m thinking: playing soft electric guitar via the iPhone at the cafe table or in the backseat of a car. Or playing keyboard with a movie soundtrack and a data projector to a group of 20 visitors in a small gallery – or on the sidewalk, or in the alleyway with the barbecue and the beer buckets.

The Real Innovation – Wired and Wireless Connectivity

iLoud supports Bluetooth operation for wireless audio streaming anywhere and everywhere from a mobile device such as an iPhone, iPad, iPod touch, Android smartphone or tablet for casual listening. For sound sources like MP3 players that do not have Bluetooth capabilities, the iLoud also has a stereo 1/8″ mini-jack input for connecting line-level devices such as home stereos, DJ gear, mixers, MP3 players, and more.

Plug and Play Convenience

iLoud also offers the ability to connect a guitar, bass or dynamic microphone directly to the speaker and process the sound with a multitude of real-time effects apps on iOS devices. It features the same circuitry as IK’s iRig – the most popular mobile interface of all time – and allows users to plug in guitars or other instruments and access AmpliTube or other audio apps on their mobile device for practicing, performing and recording. The input also accommodates dynamic microphones, making it possible to run an app like IK’s VocaLive for real-time vocal effects and recording.

I recommend the iLoud for the experience of having a well-made and truly portable RECHARGEABLE (very important) speaker that is truly studio quality with you whenever you need it. And watch “Colors” with it when you get a chance. LOUD!!

Pricing and Availability

iLoud is priced at $299.99/€239.99 (excl. tax) and is available now from the IK network of music and electronic retailers around the world.

For more information, go to:

For a comprehensive collection of videos that showcases iLoud’s feature set, go to:

Gamebridge – the weekly classes in Unity at Noisebridge Hackerspace



By David Cox

It’s a rainy night in the Mission. People move back and forth along Mission Street. There is the smell of burritos, tobacco, perfume, and the effervescent sense that something is happening. There is rain; there is marijuana wafting up and down the street. There are cafes and nightclubs. There are taxis and cop cars crawling up and down the street. There is Gamebridge.

It began three or four years ago at Noisebridge Hackerspace, between 18th and 17th streets in the Mission District, a space that enables those without the means to build and construct electronic inventions to access resources, to converge and to share tools. It’s been a hub of activity for anyone interested in putting ideas together: building something, making a robot, 3D printing an object, using fabrics, recycling computers, designing games, or simply using a soldering iron when they don’t have the equipment at home. If a hackerspace game club (an adhocracy by its very nature) can be said to have an organizer, it is definitely the sharply intelligent, quickly spoken Canadian programmer Alex Peake, who has a background as a game developer. Peake peppers his descriptions of processes with vivid metaphors and always has a great visual concept to illustrate his ideas. He has an amazing passion for games, for programming and for teaching, and is one of the best in the business. Brennan Hatton and Bud Leiser also contribute, with equally passionate delivery, detailed lessons that keep the Gamebridge regulars glued to the screen and to their own laptops in equal measure for hours at a time.

People bring high-end laptops with them (MacBook Pros are a favorite). Unity will run on Windows, Mac or Linux machines, which is to its great credit. Gamebridge regulars can number up to ten or twelve a week. They come to hear Alex, Brennan and Bud demonstrate how to use Unity with C# and JavaScript, generally to build game environments or perhaps to improve collaborative workflow methods. Unity desktops are projected via the Noisebridge data projector and people follow careful instructions step by step. There are lots of free resources for Unity, and the API is very extensive as well, which makes it easy for people to access, even relative neophytes like me. There’s a great sense of shared community in Gamebridge too, and everyone is willing to pitch in and do a mini-talk or a class or put forward an idea.

Someone has ordered pizza. It arrives steaming, filling the space with the scent of tomato paste, warm melting cheese and garlic. There is cold Diet Coke, and then the paper cups and napkins are broken out and discussions happen. It’s a great scene and everyone has something to offer.


One of the crowning triumphs of Gamebridge recently is a collaboratively developed augmented reality project called SimBridge, in which the entire Noisebridge space itself has been replicated in virtual 3D space, so that it’s possible to move through it online while wearing a headset. While you are in the building, you can hold up a portable device like an iPhone or tablet and see the same space superimposed over the real one; this 3D game-like metadata annotates the real space, telling you what its sections are and what they are for. It also enables people to share a virtual representation of Noisebridge at a distance. These and other experiments push Noisebridge forward at a time when virtual reality and augmented reality are starting to really push the boundaries of what’s possible with the new technology.

Nobody really expects to make money out of this. The whole thing is really grassroots. This is the spirit of the original ‘homebrew computer clubs’ of the 1970s and it’s about experimentation and ideas for their own sake. To that extent it’s a utopian testing ground and it is made up largely of young people with laptops and passion.

It’s a great thing. It’s Gamebridge.

Trash amp, guitar amp


By David Cox

When hackers make things that are normally the preserve of professionals, the results can often be disappointing. Today, however, because of amazing advances in, and access to, very good electronic components, the home-hacked product might exceed the kind that used to be sold only through dedicated manufacturers.

One such device is the humble practice amplifier for electric guitar – or the sort of amp you would use with an MP3 player or any kind of audio signal. What is a trash amp, you might ask? It is a small consumer item that was on display at the Maker Faire at the San Mateo fairgrounds in May this year.

The “trash” in the name comes from the fact that the amplifier assembly can be embedded into any small, portable drinks container, probably one used once for a beverage. Trash Amps have patented their basic idea: hardware circuitry powerful enough to deliver a loud, solid signal, powered by a rechargeable 5V battery, yet small and portable enough to fit into a soda can or a mason jar – you know, those glass containers used throughout the 20th century to preserve fruits and jellies and jams.

Getting a loud, solid sound for guitar amplification out of such tiny containers would seem counterintuitive (a mason jar?), but the speakers used, the hardware powering them, and the acoustic effect of the small space of a can or jar combine to really pack a punch.

I’ve been using my mason jar trash amp in a small room, and the speaker, driven by one of the small 5-volt rechargeable batteries of the kind you associate with recharging a smartphone, is more than enough to deliver a sound that fills the room. It is so loud, in fact, that I have to keep the volume on my Fender Strat or my MP3 player well below maximum.

The trash amp is not quite as loud as a Pignose, but it is much louder than a small portable cylinder speaker you might get for your iPhone. How is the trash amp possible? I would suggest it is the result of huge advances in a) portable power delivery – cheap, rechargeable, efficient batteries; b) very good speakers that are cheap and loud enough to do the job; and of course c) the ease with which all of this has become available through mass production as a result of Chinese manufacturing of portable devices generally.

The trash amp is an elegant, simple and “obvious in retrospect” idea that is a great talking point. If nothing else, the thing looks interesting. At $50, a completed Mason Jar model is inexpensive enough to purchase ready-built, but if you want to try out your soldering technique you can buy the components as a kit, close to cost, and build your own.

I’ve had nothing but fun with my Mason jar Trash Amp, and every time I use it people ask me about it. The little LED lights inside it – one blue and one orange – glow nicely inside the Mason jar like a small lamp. It’s comforting.

This feels like a bit of a classic, much like the tiny Pignose amp was back in the 1970s. I think the Trash Amps folks are going to do well.


  • Doubles as a practice electric guitar amp
  • Automatically turns on when music plays and off when it stops to conserve battery life
  • 2.4W amplifier with 2 inch full range driver
  • 3.5mm cord included, so you’re ready to rock straight out of the box
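That auto power feature can be pictured as a simple envelope-threshold rule. The sketch below is a hypothetical software model of the behavior described in the feature list – the real amp does this in its analog circuitry, and the threshold and timeout values here are invented for illustration.

```python
# Hypothetical model of an auto power on/off feature: wake when the
# input signal crosses a threshold, sleep after sustained silence.
WAKE_THRESHOLD = 0.02   # normalized envelope level that wakes the amp (invented)
SLEEP_AFTER = 30.0      # seconds of silence before powering down (invented)

def amp_power_states(levels, sample_period,
                     threshold=WAKE_THRESHOLD, sleep_after=SLEEP_AFTER):
    """Return the amp's on/off state for each envelope reading.

    levels: sequence of signal envelope readings
    sample_period: seconds between readings
    """
    quiet_time = sleep_after  # start powered down
    states = []
    for level in levels:
        if abs(level) >= threshold:
            quiet_time = 0.0          # music playing: reset the silence timer
        else:
            quiet_time += sample_period
        states.append(quiet_time < sleep_after)  # on until silence outlasts timeout
    return states
```

With a 20-second sampling period, for example, a burst of signal followed by silence keeps the modeled amp on for one more reading before it powers down.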

Trashamps website:

Virtual Reality and the likely return of the Movie Intermission


By David Cox

A memory of my travels as a young filmmaker with a suitcase of 16mm film prints, traveling around the Pacific Northwest. I was hanging out with Bob Basanich (sometime guitarist for Negativland and co-founder of the Olympia Film Ranch) when touring with my films in 1992. Bob was, among other things at the time, the 35mm projectionist at a drive-in theater outside Olympia, Washington. During intermission he projected some excellent old 1950s Intermission and Snack Time! slides, as well as others advertising long-defunct car dealerships and other local businesses that he’d found in the projection booth. He also screened a short film he’d made from the offcuts of 35mm he’d found lying around in the same booth.

All this in between the main show, which was an Alien 2 and Predator 2 double feature. Shooting stars also appeared above the screen that night amidst the Douglas firs…

Intermissions are no longer part of the mainstream cinema experience, but I suspect they will make their return with Virtual Reality, if only to give the wearer of the headset a break to perceive the world around them. Virtual Reality movies must include everything in their field of view, including the film maker, who we are told must ‘blend in’ to the environment if they are not to be obvious to the viewer. With no frame there is nowhere to hide. Time, too, must disappear if the illusion is to be maintained.

So how best to transition from the VR realm to that of the world beyond the headset? Fade-outs, wake-up calls? What might the industry of ads become in the VR world – one shudders to think… As for the intermission, those tantalizing 1950s appeals to go to the snack bar of the drive-in theaters of 60 years ago could now be the 3D injunction to press the virtual button to order a pizza, or some other delivery item. Or to visit the kitchen to get a branded item that unlocks part 2 of the production you are watching. Gamifying the VR surround experience could be the average consumer’s dystopian future, if the usual people in the media production industries have their way.

It remains to be seen.

The Original Videogame Museum in California Will Be Moving to a New Home


By David Cox

Oakland, California – October 5, 2015 – The Museum of Art and Digital Entertainment (The MADE) is gearing up to move into its new home. This non-profit videogame museum is nearing the end of its Kickstarter campaign for $50,000, all of which will be used to move the facility into a larger, more accessible space.

The MADE was one of the original Kickstarter success stories, originally raising $20,000 on the site in 2011. Those funds were used to open America’s first dedicated, all-playable, open to the public videogame museum focusing on home and console games.

Now four years old, the MADE has outgrown its current location and is seeking to move into a space double its current size, elsewhere in Oakland. The all-volunteer museum is aiming to raise $50,000, which will fund the renovation of, and move into, a new location.

“We’ve done a lot of great work here, behind City Hall in Oakland, but it’s time to expand in every way,” said Alex Handy, founder and director of the MADE. “Our tournaments are standing room only and our collection grows every single weekend through new donations. We’re excited about the prospect of showing everything off in a better suited location.”

The MADE aims to preserve the history of videogames through playable exhibits and free programming classes. In its four-year history, the museum has trained over 400 students in skills ranging from Scratch, C and Android development to Photoshop, Unity, Presonus and ProTools.

Henry Lowood, Curator for History of Science & Technology Collections at Stanford University Libraries and founding member of The MADE’s board of directors, said that “Digital games without a doubt have become one of the central creative media available for entertainment, art and other forms of expression. So much so that contemporary cultural history is difficult to talk about without including digital games. As a result, not only will the history of this medium be lost if we do not preserve the history of digital games, but there is more at stake: we will be unable to provide a complete cultural history of our times.”

To this end, in its four-year history the MADE has worked to preserve and relaunch Habitat, the first graphical MMO, for the Commodore 64; has preserved the long-lost 1996 GamePro TV show; and has worked with the EFF to change copyright law around the preservation of old videogames.

The MADE’s Kickstarter is online at:…

About The Museum of Art and Digital Entertainment (MADE)

Founded in 2010, The MADE is an all-volunteer organization created by Alex Handy, a video game journalist and technology archaeologist based in Oakland, California. In 2008 Mr. Handy unearthed a 25-year-old parcel of long-lost Atari 2600 and ColecoVision games at a flea market in that city, spurring his creation of the museum. The MADE is an IRS-recognized 501(c)(3) not-for-profit organization. Its EIN is 26-4570976. The MADE raised its initial $20,000 on the crowd-funding site Kickstarter, and has used those funds to pay for rent, Internet and insurance at its facilities in downtown Oakland. That facility opened in October of 2011. Since that time, the MADE has released many lost videos from the industry, worked with the EFF on copyright law, and started an effort to relaunch Habitat, the first graphical MMO. The MADE is 100% volunteer operated.

The MADE is open weekends from noon to 6 PM. Admission is free. More info is online at


610 16th St.

Suite 230 (Second floor)

Oakland, CA 94612

Dial #0230 to be buzzed in



Alex Handy

Director, The MADE


The Museum of Art and Digital Entertainment (The MADE)

New acoustic guitar microphone iRig Acoustic Condenser Microphone


By David Cox

First published in

The first ‘contact microphones’, as they were called in the 1970s, came in small plastic boxes and were disk-shaped. With them came a strange putty-like material that was a bit like a cross between Silly Putty and Blu-Tack.

The idea was that you put the mic near the 6 o’clock position on the wood beneath the sound hole of your acoustic guitar and connected the other end to your amp. Of course, being a microphone, it would invariably feed back at the soonest opportunity. It worked, but only just.

The best amplified acoustic guitars have long since had their own mics built into them, with the electronics virtually embedded into the bodywork and a jack near where the guitar strap goes.

Now comes a removable acoustic condenser mic that works like a charm, is very well built and, crucially, comes with dedicated software that enables the player to sound as good as the best in the business.

According to IK Multimedia’s website, the device’s inception was inspired in part by a well-known feature-length documentary on flamenco master Paco de Lucía. Made specifically for the acoustic player, the mic is designed to work in conjunction with dedicated apps that take the nylon- or steel-string acoustic sound and ‘sculpt’ and further process it for an optimized sound.

MEMS (Micro-Electro-Mechanical System) microphone technology replicates the positioning of a high-quality microphone with an omnidirectional polar pattern, placing the microphone just inside the sound hole where the output is optimal. This is combined with a “calibration” process that optimizes the guitar sound as if it were being miked externally. The result is a complete ‘tone picture’ of the guitar rather than simply a recording of the physical vibrations of the wood and strings.

iRig Acoustic is packaged with AmpliTube Acoustic FREE (download on the App Store), the new acoustic-specific AmpliTube app designed for processing and recording acoustic guitar and ukuleles.

First, when used with iRig Acoustic, the AmpliTube Acoustic app features a calibration and setup process that measures and optimizes the frequency response of your acoustic instrument and provides the “sweet spot” sonic clarity, tonal characteristics and projection that you normally get in the studio with an expensive high-end studio condenser microphone positioned in just the right spot. iRig Acoustic and AmpliTube Acoustic deliver that ultimate level of acoustic realness and character as an optimized system for a fraction of the cost.
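The calibration idea can be sketched in code. What follows is a simplified illustration of the general technique – measure the instrument’s per-band frequency response, then derive correction gains toward a flat target – and not IK Multimedia’s actual algorithm; the band edges and target level are invented for the example.

```python
import numpy as np

def calibration_gains(sample, rate, bands, target_db=0.0):
    """Estimate per-band correction gains (in dB) that would flatten the
    measured response toward target_db. bands is a list of (lo, hi) in Hz."""
    spectrum = np.abs(np.fft.rfft(sample))                # magnitude spectrum
    freqs = np.fft.rfftfreq(len(sample), d=1.0 / rate)    # bin center frequencies
    gains = {}
    for lo, hi in bands:
        in_band = (freqs >= lo) & (freqs < hi)
        level_db = 20 * np.log10(spectrum[in_band].mean() + 1e-12)
        gains[(lo, hi)] = target_db - level_db  # boost weak bands, cut strong ones
    return gains
```

A band where the instrument’s measured response is weak gets a positive gain, and a booming band gets a cut – which is roughly what “optimizing the frequency response” of a particular guitar amounts to.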

The tone studio offers emulations of popular acoustic amplifiers – two solid state amps and one tube amp, complete with integrated effects sections – as well as guitar pedal effects, including a live-performance “Feed Kill” feedback eliminator, an acoustic compressor, a graphic EQ and a parametric EQ; a 12-string emulator; a “Bass Maker” octave pedal; plus a “Body Modeler” that alters the sound of your guitar into another style of guitar. That’s pretty amazing, really.

One thing is for sure: it has come a long way from the simple contact microphones of the ’70s.

RRP: $49.99

For more information visit:

Films online about Earth and Man – Anthropocinema!


By David Cox

The Holocene is over and the Anthropocene defines our epoch. Mankind’s irreparable and irreversible influence on the face of the planet will define its fate from now on. Several films take as their central idea man’s impact on the planet.

Koyaanisqatsi, “Life out of Balance”, the epic 1982 documentary by Godfrey Reggio, with music composed by Philip Glass and cinematography by Ron Fricke, shows the impact of man on his own environment and culture through a cornucopian montage of time-lapse, slow motion and panoramic vistas of spectacular yet fragile landscapes. I first saw it on film at the Valhalla Cinema, a repertory theater in Richmond, Melbourne, whose audience was mainly students and switched-on counterculture types. I remember the kaleidoscopic, cascade-like film playing to row after row of amazed Melburnians.

The semi-ad-hoc way Koyaanisqatsi was made complements its wild, theme-based structure. It is a symphony of shots that leaves you with the sense that the world is mad with development, and that our impact as a species on the planet is without limit or direction.

Three Hopi prophecies sung by a choral ensemble during the latter part of the “Prophecies” movement are translated just prior to the end credits:

• “If we dig precious things from the land, we will invite disaster.”

• “Near the day of Purification, there will be cobwebs spun back and forth in the sky.”

• “A container of ashes might one day be thrown from the sky, which could burn the land and boil the oceans.”

Wax: Or the Discovery of Television Among the Bees, an experimental science fiction film by David Blair (1991), was prescient in its vision of a world in which a Middle Eastern war, photography, mathematics, and geometry had all resulted from communication between bees and humans.

Wax arrived at a time when both its means of production and the themes it was addressing converged elegantly via the then brand-new dimension of the Internet. I was able to count on one hand the number of people I could email when the film was released, and the terror and possibilities of new modes of communication we all felt in those early days are beautifully embodied in the film. William Burroughs himself wanders through the film, ambassador of all that is juxtaposed and otherworldly, and it is fitting that he should preside in this world, which seems to speak to our neoliberal wasteland today, devoid as it is rapidly becoming of its UBER influence over Alles.

Powers of Ten by the Eames Studio (1977) took the time to show the relationship of Earth to its planetary neighbors, while at the same time revealing the makeup of human matter at the atomic scale.

This mind-boggling animated journey into scalar depiction and scientific humanist relativism became a mainstay of many a high school and college study session. It ponders the big questions about our place in the universe and the universe in us. It was not the first film to examine the universe from the point of view of relative exponential scale (Cosmic Zoom predated it by several years), but it was certainly the first to do so in a way that precisely understood the relationship between all this cosmic measurement and the role of companies like IBM, who distributed the film, and the way that such corporate sponsorship of the eternal would come to define the world in which we live today. The Anthropocene is nothing if not brought to you by the Biggest of the Big Players, then as now.

Stalker (1979) by Andrei Tarkovsky is noted for its stark use of gritty, earthy close-ups of mud, swamps, and the very material makeup of the planet itself. It shows a journey led by the ‘Stalker’ (Aleksandr Kaidanovsky), who takes his two clients – a melancholic writer (Anatoli Solonitsyn) seeking inspiration and a professor (Nikolai Grinko) who seeks scientific discovery – to a place known as the ‘Zone’, which contains a place with the supposed ability to fulfill a person’s innermost desires.

The three travel through unnerving areas filled with the cast-off material of modern society. They yell at each other, and on confronting the ‘Zone’ it would appear that it is in fact alive. Traversing the ‘Zone’ can be felt but not really seen. The Cacophony Society’s Carrie Galbraith has said that the original Burning Man event was in fact one of several “Zone Trips” inspired by Stalker – number 4, in fact – and the idea of a sentient earth receptive to the thoughts of those who engage with it is entirely consistent with the ideas of utopian groups who offer alternative uses for Federal desert land, such as the Center for Land Use Interpretation. The contemporary Burning Man is a far cry from the ad hoc aims of those who interpreted the same site for “Zone Trip Number Four”, and the world is worse for it.

Don’t look up to heaven for transcendence; look down, at the shit, the mud, the earth, the swamp and all the fine-grained individual particles of dirt and muck that make up our lives on this most finite of planets. For the effect of man on the earth should be measured thus, the better to take account of all that has been moved out of place in the name of modernity, and all that has unfolded since.

The documentary Manufactured Landscapes, directed by Jennifer Baichwal, is about the work of photographer Edward Burtynsky, whose work concerns itself with the impact of massive manufacturing plants on the earth’s environment. Enormous factories and large-scale infrastructure programs, many of them in mainland China, form the basis of this extraordinary film about the bigger picture of global trade, its scarring effect on the surface of the earth, and the demands it makes on those caught in its seemingly unstoppable flows.

Taken together, the above films make for an elegant mini film festival on the Anthropocene – call it Anthropocinema. Thankfully, most are online for free.


Wax: Or the Discovery of Television Among the Bees:

Powers of Ten – Eames Studio (1977)

Stalker (1979) by Andrei Tarkovsky
Part 1

Part 2

Manufactured Landscapes, directed by Jennifer Baichwal (trailer)

Article: After Anthropocinema by Mohammad Salemy at the Brooklyn Rail website.

Real portability comes to a studio-quality music interface device


By David Cox

First Published on December 14, 2015 2:17 PM MST

  • iRig Pro Duo Music Portable Interface Device (used with permission of IK Multimedia)

IK Multimedia, YouTube,

iRig Pro Duo USB MIDI Interface

iRig Pro Duo is a studio-quality 24-bit audio box that is very easy to use, solidly built and is, effectively, a bridge between the real world of microphones, guitars, basses and keyboards and those sometimes precious, finicky phones, pads, tablets and laptops, and all the variations in between.

Guitars and keyboards are from the 20th Century. They are heavy, solid, require effort to use and lift and carry. Data is about now. It is invisible, glows in the dark and is all around us. The iRig Pro Duo offers a kind of high speed tunnel between the two worlds. It is an Analog to Digital Converter for creative audio artists on the move, and the Swiss Army Knife version of one at that.

There are two channels with XLR/TRS combo audio jacks and phantom power, so you can use high-end condenser microphones as well as guitars, basses, keyboards and any MIDI controller out there. This reminds me very strongly of the famous Zoom H4n field recorder, which has a similar setup for audio in, and which can also be used as a USB interface.

Connections include Lightning, USB-OTG and USB, and it comes with the proper cables, which is a great thing given its true portability.

Dual-channel recording is possible, with each channel having its own input gain control, so each channel can be given the correct input signal while recording.

MIDI machine

With iRig Pro DUO, you don’t just have a superb portable interface for audio; you can also hook up your favorite MIDI controllers. This is thanks to its included TRS to MIDI-DIN cables and dedicated MIDI in/out jacks, so you can control MIDI-compatible software (or send MIDI data to MIDI-compatible hardware, like synthesizers, drum machines and samplers) with plug and play simplicity.
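The MIDI data traveling over those cables is just a stream of bytes in the standard MIDI protocol. As a rough illustration of what “sending MIDI data” means at the wire level (this is generic MIDI, not IK’s software), a Note On message is a status byte – 0x90 plus the channel number – followed by two 7-bit data bytes:

```python
def note_on(channel, note, velocity):
    """Build a standard MIDI Note On message: 0x90 | channel, note, velocity."""
    assert 0 <= channel < 16 and 0 <= note < 128 and 0 <= velocity < 128
    return bytes([0x90 | channel, note, velocity])

def note_off(channel, note):
    """Build a standard MIDI Note Off message: 0x80 | channel, note, velocity 0."""
    assert 0 <= channel < 16 and 0 <= note < 128
    return bytes([0x80 | channel, note, 0])

# Middle C (note 60) on channel 1 at moderate velocity:
msg = note_on(0, 60, 100)  # bytes 0x90, 0x3C, 0x64
```

Three bytes like these are what a synthesizer, drum machine or sampler actually receives when a controller is played through the interface.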

High-quality studio condenser microphones require 48V of external power to function. It’s part of what makes them sound so good, crisp and accurate. With iRig Pro DUO, you’ll have that 48V of phantom power at your fingertips. Just plug in your high-quality condenser mic, flip iRig Pro DUO’s phantom power switch and record to your heart’s content.

Monitoring & output

iRig Pro DUO comes with two 1/4” TRS balanced analog audio outputs for your speakers.

These outputs provide clear audio signal via onboard output drivers.

You can also use direct monitoring to hear either the direct incoming “dry” signal or the processed signal coming back from your device or computer. There is an output amplifier as well, so you can hear your mix back nice and loud via headphones. iRig Pro DUO can be powered with two AA batteries for use with iPhone or iPad. In addition, you can plug in a USB or Micro-USB OTG cable, and iRig Pro DUO is then powered by the connected computer or Android device, or by the AA batteries. Camped out in the studio? Plug it into the wall with a DC power adapter (not included) for extended recording sessions.


• Truly mobile dual input audio interface for iPhone, iPad, Android, Mac and PC

• Simultaneous dual track recording interface for all instruments

• Ultra-compact housing for extreme portability

• Dual identical XLR/TRS combo input jacks

• Dual ultra-low noise studio-quality IK preamps

• Individual input gain controls

• 48V phantom power

• Self-powered (2 AA batteries), device powered or DC power adapter (not included)

• 24-bit AD-DA converters

• Dual 1/4” switchable TRS balanced outs

• 1/8” 3.5mm Headphone out w/ level control

• MIDI IN/OUT jacks

• Ultra-compact housing fits in the palm of your hand

• Comes with mini-DIN to Lightning, Micro USB OTG and standard USB cables

• Designed and made in Italy

iRig Pro Duo is the box to take with you the next time you want to do some recording in the park, or to play your keyboards with the band somewhere out in the open. It’s super-versatile, highly adaptable, and has uses which I’m sure even its inventors have yet to discover.

RRP: $199.99

For more information visit: