modulus

Trash amp, guitar amp

Aug
02

By David Cox

When hackers make things that are normally the preserve of professionals, the results can often be disappointing. Today, however, thanks to amazing advances in components and easy access to very good electronic parts, the home-hacked product can exceed the kind that used to be sold only by dedicated manufacturers.

One such device is the humble practice amplifier: the sort of amp you would use with an electric guitar, an MP3 player or any other audio signal. What is a trash amp, you might ask? It is a small consumer item that was on display at the Maker Faire at the San Mateo fairgrounds in May this year.

The “trash” in the name comes from the fact that the amplifier assembly can be embedded in almost any portable drinks container: something small, light and probably used only once to hold a beverage. Trash Amps have patented their basic idea: circuitry powerful enough to deliver a loud, solid signal, powered by a rechargeable 5V battery, yet small enough to fit inside a soda can or a Mason jar. You know, those glass containers that were used throughout the 20th century to preserve fruits, jellies and jams.

Getting a loud, solid guitar sound out of such tiny containers would seem counterintuitive (a Mason jar?), but the speaker, the hardware driving it and the acoustic effect of the small space inside a can or jar combine to really pack a punch.

I’ve been using my Mason jar Trash Amp in a small room, and the speaker, driven by one of those small 5-volt rechargeable batteries of the kind you associate with recharging a smartphone, is more than enough to fill the room with sound. It is so loud, in fact, that I have to keep the volume on my Fender Strat or my MP3 player well below its potential maximum.

The trash amp is not quite as loud as a Pignose, but it is much louder than the small portable cylinder speakers you might buy for your iPhone. How is the trash amp possible? I would suggest it is the result of huge advances in a) portable power delivery, with cheap, efficient, rechargeable batteries, b) very good speakers that are inexpensive yet loud enough to do the job, and of course c) the ease with which all of this has become available through mass production, a result of Chinese manufacturing of portable devices generally.

The trash amp is an elegant, simple and “obvious in retrospect” idea that makes a great talking point. If nothing else, the thing looks interesting. At $50, a completed Mason jar model is inexpensive enough to buy ready-made, but if you want to try out your soldering technique you can buy the components as a kit, close to cost, and build your own.

I’ve had nothing but fun with my Mason jar Trash Amp, and every time I use it people ask me about it. The little LEDs inside, one blue and one orange, glow nicely inside the jar like a small lamp. It’s comforting.

This feels like a bit of a classic, much like the tiny Pignose amp was back in the 1970s. I think the Trash Amps guys are going to do well.

Features:

  • Doubles as a practice electric guitar amp
  • Automatically turns on when music plays and off when it stops to conserve battery life (see the sketch after this list for how such signal sensing might work)
  • 2.4W amplifier with 2 inch full range driver
  • 3.5mm cord included, so you’re ready to rock straight out of the box
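
The auto power feature above boils down to simple signal sensing: watch the input level, switch on when it crosses a threshold, and switch off after a stretch of silence. Here is a minimal Python sketch of that kind of logic; the threshold, the timeout and the read_block/set_power hooks are illustrative assumptions, not Trash Amps' actual (analog) circuit.

```python
# Hypothetical sketch of the signal-sensing auto power logic described above.
# Thresholds, timings and the audio hooks are assumptions for illustration.

import time

SIGNAL_THRESHOLD = 0.02   # assumed RMS level that counts as "music playing"
IDLE_TIMEOUT_S   = 30.0   # assumed silence period before powering down


def rms(samples):
    """Root-mean-square level of a block of audio samples (floats in -1..1)."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5


def auto_power_loop(read_block, set_power):
    """Turn the amp on when signal is present, off after a quiet period.

    read_block() -> list of float samples; set_power(bool) switches the amp.
    Both are stand-ins for whatever the real hardware would provide.
    """
    powered = False
    last_signal = time.monotonic()
    while True:
        block = read_block()
        if rms(block) >= SIGNAL_THRESHOLD:
            last_signal = time.monotonic()
            if not powered:
                set_power(True)
                powered = True
        elif powered and time.monotonic() - last_signal > IDLE_TIMEOUT_S:
            set_power(False)
            powered = False
```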

Trashamps website:

http://www.trashamps.com/

Virtual Reality and the likely return of the Movie Intermission

Aug
02

By David Cox

A memory from my travels as a young film maker, carrying a suitcase of 16mm prints around the Pacific Northwest. I was hanging out with Bob Basanich (sometime guitarist for Negativland and co-founder of the Olympia Film Ranch) while touring with my films in 1992. Bob was, among other things at the time, the 35mm projectionist at a drive-in theater outside Olympia, Washington. During intermission he projected some excellent old 1950s Intermission and Snack Time! slides, as well as others advertising long-defunct car dealerships and other local businesses that he’d found in the projection booth. He also screened a short film he’d made from the offcuts of 35mm he’d found lying around in the same booth.

All this came in between the main show, an Alien 2 and Predator 2 double bill. Shooting stars appeared above the screen that night amidst the Douglas firs…

Intermissions are no longer part of the mainstream cinema experience, but I suspect they will make their return with Virtual Reality, if only to give the wearer of the headset a break to perceive the world around them. Virtual Reality movies must include everything in their field of view, including the film maker, who we are told must ‘blend in’ to the environment if they are not to be obvious to the viewer. With no frame there is nowhere to hide. Time, too, must disappear if the illusion is to be maintained.

So how best to transition from the VR realm to the world beyond the headset? Fade-outs, wake-up calls? What might the advertising industry become in the VR world? One shudders to think… As for the intermission, those tantalizing 1950s appeals to visit the snack bar at the drive-in theaters of 60 years ago could now become a 3D injunction to press a virtual button to order a pizza, or some other delivery item. Or to visit the kitchen to get a branded item that unlocks part 2 of the production you are watching. Gamifying the VR surround experience could be the average consumer’s dystopian future, if the usual people in the media production industries have their way.

It remains to be seen.

The Original Videogame Museum in California Will Be Moving to a New Home

Aug
02

By David Cox

Oakland, California – October 5, 2015 – The Museum of Art and Digital Entertainment (The MADE) is gearing up to move into its new home. This non-profit videogame museum is nearing the end of its Kickstarter campaign for $50,000, all of which will be used to move the facility into a larger, more accessible space.

The MADE was one of the original Kickstarter success stories, originally raising $20,000 on the site in 2011. Those funds were used to open America’s first dedicated, all-playable, open to the public videogame museum focusing on home and console games.

Now four years old, the MADE has outgrown its current location and is seeking to move into a space double its current size, elsewhere in Oakland. The all-volunteer museum is aiming to raise $50,000, which will fund the renovation of, and move into, a new location.

“We’ve done a lot of great work here, behind City Hall in Oakland, but it’s time to expand in every way,” said Alex Handy, founder and director of the MADE. “Our tournaments are standing room only and our collection grows every single weekend through new donations. We’re excited about the prospect of showing everything off in a better suited location.”

The MADE aims to preserve the history of videogames through playable exhibits and free programming classes. In its four year history, the museum has trained over 400 students in skills ranging from Scratch, C and Android development, to Photoshop, Unity, Presonus and ProTools.

Henry Lowood, Curator for History of Science & Technology Collections at Stanford University Libraries and founding member of The MADE’s board of directors, said that “Digital games without a doubt have become one of the central creative media available for entertainment, art and other forms of expression. So much so that contemporary cultural history is difficult to talk about without including digital games. As a result, not only will the history of this medium be lost if we do not preserve the history of digital games, but there is more at stake: we will be unable to provide a complete cultural history of our times.”

To this end, in its four-year history the MADE has worked to preserve and relaunch Habitat, the first graphical MMO, for the Commodore 64, and the long-lost 1996 GamePro TV show, and has worked with the EFF to change copyright law around the preservation of old videogames.

The MADE’s Kickstarter is online at: https://www.kickstarter.com/projects/themade/the-museum-of-art-and-digit…

About The Museum of Art and Digital Entertainment (MADE)

Founded in 2010, The MADE is an all-volunteer organization created by Alex Handy, a video game journalist and technology archaeologist based in Oakland, California. In 2008 Mr. Handy unearthed a 25-year-old parcel of long-lost Atari 2600 and ColecoVision games at a flea market in that city, spurring his creation of the museum. The MADE is an IRS-recognized 501(c)(3) not-for-profit organization. Its EIN is 26-4570976. The MADE raised its initial $20,000 on the crowd-funding site Kickstarter.com, and has used those funds to pay for rent, Internet and insurance at its facility in downtown Oakland. That facility opened in October 2011. Since that time, the MADE has released many lost videos from the industry, worked with the EFF on copyright law, and started an effort to relaunch Habitat, the first graphical MMO. The MADE is 100% volunteer operated.

The MADE is open weekends from noon to 6 PM. Admission is free. More info is online at https://themade.org/posts/1385

The MADE

610 16th St.

Suite 230 (Second floor)

Oakland, CA 94612

Dial #0230 to be buzzed in

510-210-0291

Contacts:

Alex Handy

Director, The MADE

510-282-4840

alex@themade.org

The Museum of Art and Digital Entertainment (The MADE)

http://www.themade.org

New acoustic guitar microphone: the iRig Acoustic condenser microphone

Aug
02

By David Cox

First published in Examiner.com

The first ‘contact microphones’, as they were called in the 1970s, came in small plastic boxes and were disk-shaped. With them came a strange putty-like material that was a bit like a cross between Silly Putty and Blu-Tack.

The idea was that you put the mic near the 6 o’clock position on the wood beneath the sound hole of your acoustic guitar and connected the other end to your amp. Of course, being a microphone, it would invariably feed back at the earliest opportunity. It worked, but only just.

The best amplified acoustic guitars have long since had their own mics built into them, with the electronics virtually embedded into the bodywork and a phono jack near where the guitar strap goes.

Now comes a removable acoustic condenser mic that works like a charm, is very well built and, crucially, comes with dedicated software that enables the player to sound as good as the best in the business.

According to IK Multimedia’s website, the device’s inception was inspired in part by a well-known feature-length documentary on flamenco master Paco de Lucía. Made specifically for the acoustic player, the mic is designed to work in conjunction with dedicated apps that take the nylon- or steel-string acoustic sound and ‘sculpt’ it, processing it further for an optimized sound.

MEMS (micro-electro-mechanical systems) microphone technology replicates the positioning of a high-quality microphone with an omnidirectional polar pattern, placed just inside the sound hole where the output is optimal. This is combined with a “calibration” process that optimizes the guitar sound as if it were being miked externally. The result is a complete ‘tone picture’ of the guitar rather than simply a recording of the physical vibrations of the wood and strings.

iRig Acoustic is packaged with AmpliTube Acoustic FREE (download on the App Store), the new acoustic-specific AmpliTube app designed for processing and recording acoustic guitar and ukuleles.

When used with iRig Acoustic, the AmpliTube Acoustic app features a calibration and setup process that measures and optimizes the frequency response of your acoustic instrument, providing the “sweet spot” sonic clarity, tonal characteristics and projection that you normally get in the studio from an expensive high-end condenser microphone positioned in just the right spot. iRig Acoustic and AmpliTube Acoustic deliver that level of acoustic realness and character as an optimized system for a fraction of the cost.
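
Conceptually, that calibration step is a measure-then-correct pass: record the instrument, estimate its frequency response, and build a compensating EQ curve. Here is a minimal Python sketch of the general idea; the flat target, the gain limits and the hypothetical load_recording helper are my assumptions for illustration, not IK Multimedia’s actual algorithm.

```python
# Hypothetical calibration sketch: estimate an instrument's average spectrum
# and build a gain curve that nudges it toward a flat target response.
# Illustration of the general idea only, not IK's actual algorithm.

import numpy as np


def average_spectrum(samples, fft_size=4096):
    """Average magnitude spectrum of a mono recording (NumPy array, -1..1)."""
    window = np.hanning(fft_size)
    frames = [samples[i:i + fft_size] * window
              for i in range(0, len(samples) - fft_size, fft_size // 2)]
    mags = np.abs(np.fft.rfft(np.array(frames), axis=1))
    return mags.mean(axis=0)


def calibration_curve(measured, max_boost_db=12.0):
    """Per-bin gains (dB) that pull the measured response toward flat."""
    reference = measured.mean()                      # assumed flat target
    gains_db = 20 * np.log10(reference / np.maximum(measured, 1e-9))
    return np.clip(gains_db, -max_boost_db, max_boost_db)


# Usage sketch: record a short strumming pass, then apply the curve as an EQ.
# recording = load_recording("calibration_take.wav")   # hypothetical helper
# eq_gains = calibration_curve(average_spectrum(recording))
```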

The tone studio offers emulations of popular acoustic amplifiers (two solid-state amps and one tube amp, complete with integrated effects sections) as well as guitar pedal effects, including a live-performance “Feed Kill” feedback eliminator, an acoustic compressor, a graphic EQ and a parametric EQ, a 12-string emulator, a “Bass Maker” octave pedal, plus a “Body Modeler” that alters the sound of your guitar into another style of guitar. That’s pretty amazing, really.

One thing is for sure: it is a long way beyond the simple contact microphones of the ’70s.

RRP: $49.99

For more information visit:

http://www.ikmultimedia.com/products/irigacoustic

Films online about Earth and Man – Anthropocinema!

Aug
02

By David Cox

The Holocene is over and the Anthropocene defines our epoch. Mankind’s irreparable and irreversible influence on the face of the planet will define its fate from now on. Several films take man’s impact on the planet as their central idea.

Koyaanisqatsi (“Life Out of Balance”), the epic 1982 documentary by Godfrey Reggio, with music composed by Philip Glass and cinematography by Ron Fricke, shows the impact of man on his own environment and culture through a cornucopian montage of time-lapse, slow motion and panoramic vistas of spectacular yet fragile landscapes. I first saw it on film at the Valhalla Cinema, a repertory theater in Richmond, Melbourne, whose audience was mainly students and switched-on counterculture types. I remember the kaleidoscopic, cascade-like film playing to row after row of amazed Melbournites.

The semi-ad-hoc way Koyaanisqatsi was made complements its wild, theme-based structure. It is a symphony of shots that leaves you with the sense that the world is mad with development and that our impact as a species on the planet is without limit or direction.

Three Hopi prophecies sung by a choral ensemble during the latter part of the “Prophecies” movement are translated just prior to the end credits:

• “If we dig precious things from the land, we will invite disaster.”

• “Near the day of Purification, there will be cobwebs spun back and forth in the sky.”

• “A container of ashes might one day be thrown from the sky, which could burn the land and boil the oceans.”

Wax, or the Discovery of Television Among the Bees, an experimental science fiction film by David Blair (1991), was prescient in its vision of a world in which a Middle Eastern war, photography, mathematics and geometry had all resulted from a fusion of communication between bees and humans.

Wax arrived at a time when both its means of production and the themes it was addressing converged elegantly via the then brand-new dimension of the Internet. I was able to count on one hand the number of people I could email when the film was released, and the terror and possibility of new modes of communication we all felt in those early days are beautifully embodied in the film. William Burroughs himself wanders through the film, an ambassador of all that is juxtaposed and otherworldly, and it is fitting that he should preside over this world, which seems to speak to our neoliberal wasteland today, devoid as it is rapidly becoming of its UBER influence over Alles.

Powers of Ten by the Eames Studio (1977) took the time to show the relationship of Earth to its planetary neighbors while at the same time revealing the makeup of human matter at the atomic scale.

This mind-boggling animated journey into scalar depiction and scientific humanist relativism became the mainstay for many a high school and college study session. It ponders the big questions about our place in the universe and the universe in us. It was not the first film to examine the universe from the point of view of relative exponential scale (Cosmic Zoom predated it by several years) but it was certainly the first to do so in a way that precisely understood the relationship between all this cosmic measurement and the role of companies like IBM who distributed the film, and the way that such corporate sponsorship of the eternal would come to define the world in which we live today. The Anthropocene is nothing if not brought to you by the Biggest of the Big Players, then as now.

Stalker (1979) by Andrei Tarkovsky is noted for its stark use of gritty, earthy close-ups of mud, swamps and the very material makeup of the planet itself. It shows a journey led by the ‘Stalker’ (Aleksandr Kaidanovsky), who takes his two clients, a melancholic writer (Anatoli Solonitsyn) seeking inspiration and a professor (Nikolai Grinko) seeking scientific discovery, to a place known as the ‘Zone’, which contains within it a place with the supposed ability to fulfill a person’s innermost desires.

The three travel through unnerving areas filled with the cast-off material of modern society. They yell at each other, and on confronting the ‘Zone’ it would appear that it is in fact alive. Traversing the ‘Zone’ can be felt but not really seen. The Cacophony Society’s Carrie Galbraith has said that the original Burning Man event was in fact one of several ‘Zone Trips’ inspired by Stalker (number 4, in fact), and the idea of a sentient earth receptive to the thoughts of those who engage with it is entirely consistent with the ideas of utopian groups who offer alternative uses for federal desert land, such as the Center for Land Use Interpretation. The contemporary Burning Man is a far cry from the ad hoc aims of those who interpreted the same site for ‘Zone Trip Number Four’, and the world is worse for it.

Don’t look up to heaven for transcendence, look down; at the shit, the mud, the earth, the swamp and all the fine grained individual particles of dirt and muck that make up our lives on this most finite of planets. For the effect-of-man-on-the-earth should be measured thus, the better to take account of all that has been moved out of place in the name of modernity, and all that has unfolded since.

The documentary Manufactured Landscapes, directed by Jennifer Baichwal, is about the work of photographer Edward Burtynsky, whose work concerns itself with the impact of massive manufacturing plants on the earth’s environment. Enormous factories and large-scale infrastructure programs, many of them in mainland China, form the basis of this extraordinary film about the bigger picture of global trade, its scarring effect on the surface of the earth, and the demands it makes on those caught up in its seemingly unstoppable flows.

Taken together, the above films make for an elegant mini film festival on the Anthropocene: call it Anthropocinema. Thankfully, most are online for free.

Koyaanisqatsi:

https://vimeo.com/46358393

Wax, or the Discovery of Television Among the Bees:

https://www.youtube.com/watch?v=aDzl6SBCX0c

Powers of Ten – Eames Studio (1977)

https://www.youtube.com/watch?v=0fKBhvDjuy0

Stalker (1979) by Andrei Tarkovsky
Part 1

https://www.youtube.com/watch?v=JYEfJhkPK7o

Part 2

https://www.youtube.com/watch?v=hUHBgqx8YP8

Manufactured Landscapes, directed by Jennifer Baichwal (trailer)

https://www.youtube.com/watch?v=r327NuF6GQM

Article: After Anthropocinema by Mohammad Salemy at the Brooklyn Rail website.

http://www.brooklynrail.org/2014/07/criticspage/after-anthropocinema

Real portability comes to a studio-quality music interface device

Aug
02

By David Cox

First Published on December 14, 2015 2:17 PM MST

  • iRig Pro Duo Music Portable Interface Device (used with permission of IK Multimedia)

iRig Pro Duo USB MIDI Interface

iRig Pro Duo is a studio-quality 24-bit audio box that is very easy to use and solidly built; it is, effectively, a bridge between the real world of microphones, guitars, basses and keyboards and those sometimes precious, finicky phones, pads, tablets and laptops, and all the variations in between.

Guitars and keyboards are from the 20th Century. They are heavy, solid, require effort to use and lift and carry. Data is about now. It is invisible, glows in the dark and is all around us. The iRig Pro Duo offers a kind of high speed tunnel between the two worlds. It is an Analog to Digital Converter for creative audio artists on the move, and the Swiss Army Knife version of one at that.

There are two channels with XLR/TRS combo audio jacks and phantom power, so you can use high-end condenser microphones as well as guitars, basses, keyboards and any MIDI controller out there. This reminds me strongly of the famous Zoom H4n field recorder, which has a similar setup for audio in and which can also be used as a USB interface.

Connections include Lightning, USB OTG and standard USB, and it comes with the proper cables, which is a great thing given its true portability.

Dual-channel recording is possible, with each channel having its own input gain control, so each channel can be given the correct input level while recording.

MIDI machine

With iRig Pro DUO, you don’t just have a superb portable interface for audio; you can also hook up your favorite MIDI controllers. This is thanks to its included TRS to MIDI-DIN cables and dedicated MIDI in/out jacks, so you can control MIDI-compatible software (or send MIDI data to MIDI-compatible hardware, like synthesizers, drum machines and samplers) with plug and play simplicity.
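
To give a sense of what controlling MIDI-compatible software or hardware looks like from the computer side, here is a minimal Python sketch using the third-party mido library (my choice for illustration; any MIDI library would do) to send a note out through whatever MIDI output port an attached interface exposes.

```python
# Minimal sketch of sending MIDI notes out through an attached interface.
# Assumes the third-party 'mido' library (pip install mido python-rtmidi)
# and that the interface shows up as a named MIDI output port.

import time
import mido

# List the available output ports and open the first one; with an
# iRig-style interface connected, its port name would appear here.
print(mido.get_output_names())
port = mido.open_output(mido.get_output_names()[0])

# Play middle C for half a second.
port.send(mido.Message('note_on', note=60, velocity=100))
time.sleep(0.5)
port.send(mido.Message('note_off', note=60))
port.close()
```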

High-quality studio condenser microphones require 48V of external power to function. It’s part of what makes them sound so good, crisp and accurate. With iRig Pro DUO, you’ll have that 48V of phantom power at your fingertips. Just plug in your high-quality condenser mic, flip iRig Pro DUO’s phantom power switch and record to your heart’s content.

Monitoring & output

iRig Pro DUO comes with two 1/4” TRS balanced analog audio outputs for your speakers.

These outputs provide a clear audio signal via onboard output drivers.

You can also use direct monitoring to hear either the incoming “dry” signals or the processed signal coming back from your device or computer. There is an output amplifier as well, so you can hear your mix back nice and loud via headphones. iRig Pro DUO can be powered by two AA batteries for use with an iPhone or iPad. Alternatively, plug in a USB or micro-USB OTG cable and iRig Pro DUO is powered by the connected computer or Android device, or by the AA batteries. Camped out in the studio? Plug it into the wall with a DC power adapter (not included) for extended recording sessions.

Features

• Truly mobile dual input audio interface for iPhone, iPad, Android, Mac and PC

• Simultaneous dual track recording interface for all instruments

• Ultra-compact housing for extreme portability

• Dual identical XLR/TRS combo input jacks

• Dual ultra-low noise studio-quality IK preamps

• Individual input gain controls

• 48V phantom power

• Self-powered (2 AA batteries), device powered or DC power adapter (not included)

• 24-bit AD-DA converters

• Dual 1/4” switchable TRS balanced outs

• 1/8” 3.5mm Headphone out w/ level control

• MIDI IN/OUT jacks

• Ultra-compact housing fits in the palm of your hand

• Comes with mini-DIN to Lightning, micro-USB OTG and standard USB cables

• Designed and made in Italy

iRig Pro Duo is the box to take with you the next time you want to do some recording in the park, or to play your keyboards outside with the band somewhere in the open. It’s super-versatile, highly adaptable and has uses that I’m sure even its inventors have yet to discover.

RRP: $199.99

For more information visit:

http://www.ikmultimedia.com/products/irigproduo/

AmpliTube 4 – the Amplifier Modeling Resource for Guitarists

Aug
02

Most of the guitar amplifiers I’ve owned have had stenciled numbers on the side and been passed on from five other bands before landing in my garage or bedroom. Amps are like that; they are a bit like old cars or motorcycles. They arrive, are used, then disappear into that great gig domain from whence you may see them again, or you may not. Would that it were possible to invoke them all again, along with all the other amps you could never own for reasons of economy or rarity, or any other reason. Then there are the great guitar riffs and patterns and solos of all time.

Part of what makes those riffs unique is the sound of the amplifier the guitar was played through on the records and at the concerts. The distinctive configuration of amplifier and effects of each major guitarist is as unique as anything they played, and in many ways as crucial and central to their sound as the notes themselves. Jimi Hendrix and his epic Marshall stacks. Fresh-faced early Beatle George Harrison and his modest, undersized Vox amps. I had a Vox guitar once, a cherry-red Junior. I loved that thing. I had no amp, so I played it through my dad’s stereo system.

Think of the crisp, metallic treble reverberation of Hendrix’s Fender Stratocaster on “The Wind Cries Mary”, or Andy Summers’ distant chorus-and-echo Fender Telecaster sound on the Police’s “Walking on the Moon”: these are as much about the way the guitarist set up the amplifier and pedal effects as they are about the chords played. This is so obvious it goes without saying, except that until now amp and effects configurations have largely been the province of physical guitar amps and physical effects pedals.

AmpliTube 4, a major upgrade to the world’s most powerful guitar and bass tone studio for Mac/PC, is here, and it will take you to a level of hyper-realism and customization of tone you never knew was possible.

Here are some of the specs:

  • Hyper-Realistic tone
  • 3D Cab Room w/ selectable room simulations
  • Dual Mic placement on any speaker
  • Individual speaker selection
  • Speaker interaction modeling
  • Cabinet mixer for microphones, room, DI and master level
  • New British Series Amps
  • Power Amp/Speaker dynamic response
  • Acoustic Simulator
  • Effects loop slot between pre and power amp
  • Universal effects placement
  • Rack effects can be used as stomp effects
  • Stomp effects can be used in rack section
  • 8-track DAW/Recorder
  • 4-track Looper
  • UltraTuner
  • Built-in Custom Shop

AmpliTube 4 works as a standalone application and as a plug-in for your favorite DAW. It recreates the entire guitar/bass signal chain from instrument to recording device.
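
Conceptually, a modeled signal chain like this is an ordered series of processing stages, each transforming a buffer of samples on its way from instrument to recording device. Below is a toy Python sketch of that structure; the stage names and the simple gain, clipping and smoothing math are illustrative stand-ins, not AmpliTube’s actual DSP.

```python
# Toy model of a guitar signal chain: each stage takes a block of samples
# and returns a processed block. The stages here are illustrative stand-ins,
# not AmpliTube's actual amp or cabinet models.

import numpy as np


def stomp_overdrive(samples, drive=8.0):
    """Crude overdrive: boost, then soft-clip with tanh."""
    return np.tanh(samples * drive)


def amp_tone(samples, gain=0.8):
    """Stand-in for a preamp/tone stage: simple gain."""
    return samples * gain


def cabinet(samples):
    """Stand-in for a speaker cabinet: a short smoothing (low-pass-ish) filter."""
    kernel = np.ones(8) / 8.0
    return np.convolve(samples, kernel, mode="same")


def run_chain(samples, stages):
    """Run a buffer through the chain, instrument -> recording device."""
    for stage in stages:
        samples = stage(samples)
    return samples


# Usage: a one-second 110 Hz "string" through pedal -> amp -> cab.
rate = 44100
t = np.linspace(0, 1, rate, endpoint=False)
dry = 0.3 * np.sin(2 * np.pi * 110 * t)
wet = run_chain(dry, [stomp_overdrive, amp_tone, cabinet])
```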

So it is entirely possible now for me to reproduce Chris Squire’s metallic bass sound on “Siberian Khatru” from Close to the Edge, for example, or Simple Minds’ Charlie Burchill’s shimmering twin Jazz Chorus amps with echo, like the ones I saw him play through in Melbourne, Australia in 1983. I remember that “Glittering Prize” sounded like it was coming out of a guitar factory.

What’s particularly interesting with AmpliTube 4 is that you can also “make” your own amplifier cabinets and modify the ones that come standard. It’s called the “Cab Room”. It has five separate customization sections: cabinet selection with size adjustment, where you can alter the speaker cabinet to go with the amp; a speaker selection, where you can change the speakers; a microphone selection and placement area for finding the best place to put the mics; different virtual room types, where you can try out different surrounding recording spaces; and a mixer that lets players balance the levels of the mics, the room ambience, the direct amp signal and the overall main mix.

AmpliTube also has full MIDI support, which means you can use it with external controllers.

AmpliTube can work as a 64-bit plug-in for the most popular DAW (such as Pro Tools, Logic, GarageBand, Cubase, Live, Reaper) or can be used in stand-alone mode in Mac OS X and Windows. The plug-in and standalone versions offer the same function and sound, but the standalone version also offers a built-in 8-track recorder and DAW, plus a 4-track looper so you can capture your ideas at the moment of inspiration. Both the standalone and plug-in versions are included when you install AmpliTube.

It arrives out of the box with the ability to add equipment to your rig, such as amps, pedals and cabinets from Fender®, MESA/Boogie®, Ampeg®, Orange®, Soldano™, Groove Tubes®, Gallien-Krueger®, Jet City Amplification™, THD®, T-Rex®, Fulltone®, Z.Vex®, Carvin®, Morley®, Wampler Pedals®, Dr. Z®, ENGL® and more. I’m loving using AmpliTube 4, as I could never conceivably own all the amps represented in its vast database, nor would I want to. But the ability to model them means I can call upon them any time. It’s like having my own private guitar store at my fingertips.

RRP: $149.99

Click here for further information

The Selfie Stick on Steroids – iKlip Grip

Aug
02

 

First published in Examiner.com on September 17, 2015 12:39 PM MST

By David Cox

iKlip Grip

IK Multimedia

Sometimes a product comes along that tries to do lots of things at once and finds a way for all of those functions to complement each other nicely. Such is the case with IK Multimedia’s iKlip Grip. Someone has put a lot of time into this device, and the time and energy have paid off. Many of us working with smartphones as professional audiovisual recording devices often need a kind of universal, hyper-portable mount for tabletop interviews, field recordings, and shots of oneself doing things. The iKlip Grip could be the answer for many such everyday situations out in the field.

The iKlip Grip offers a range of functions in one. It’s essentially a device that holds your smartphone and lets you operate it at a distance. Yes, at its core it is a so-called ‘selfie stick’, but a very well made one. In addition to being a selfie stick, it is also a tripod mount and, ingeniously, a handle that enables the user to carry their phone upright and facing outward like a ping pong paddle.

Back in film school the hip kids had grips like these custom-made for their Bolex 16mm cameras, but today an iPhone 6 has double the resolution of 16mm, so this type of device is a neat inheritor of that tradition. The iKlip Grip can be taken apart and put into a bag, and the grip that holds the phone can accommodate the latest iPhone 6-class devices; these phones are professional tools right now, and the build quality of the iKlip Grip matches that found in many contemporary phones and tablets.

The Italian-made device has an ingenious, expandable, secure spring-loaded bracket that allows you to adjust the position of your phone to any angle for recording video, audio or images. There is also a neat mount with the must-have standard 1/4″-20 tripod screw thread, so it can be fitted to any standard tripod, which from a film-making point of view is essential. You can flip out small ovoid ‘legs’ at the base of the iKlip Grip to turn it into a modest tripod as well, although extending it this way makes it a little unstable, and it is best to leave it in the lowest of the ‘telescoped’ positions. This kind of fold-open petal element is really good, but one criticism I have is that the petals open too easily. It would be nice to have some means of keeping them closed when not in use. I use a rubber band, but some kind of clip or latch would be useful here, or even velcro on the insides of the ‘petals’ to keep them closed in your bag.

The iKlip Grip has a telescoping detachable tube that extends your reach by a full 17.5″, so positioning your camera, phone or recorder in the right spot is even easier. Plus, iKlip Grip has a pivoting ball-joint attachment point that provides a full 90° of angle adjustment plus 360° rotation of the device, allowing the user to get just the right angle and position for a shot. iKlip Grip also comes with a Bluetooth smartphone shutter* remote control that lets users remotely activate the shutter button in video and photo apps. The remote is universally compatible with iOS and Android and can be operated up to 10 meters away from the device, which is very helpful for capturing live performances, family vacation photos, movies and, of course, the ubiquitous ‘selfie’.

So what is the iKlip Grip? A well-made extendable selfie stick and iPhone tripod that also works as a grip for your phone. It’s a selfie stick on steroids.

Features

• 4-in-1 video stand: handle, tripod, monopod, tripod adapter

• Includes remote Bluetooth shutter*

• Adjustable angle

• Extends up to 50cm / 19.7″

• Standard 1/4” UNC threaded ball mount

• Works with smartphones with screens from 3.5″ to 6″ with case on

• Works with digital cameras and small video cameras like GoPro

• Works with handheld audio field recorders

RRP: $59.99

For more information see:

http://www.ikmultimedia.com/products/iklipgrip/

The Language of Virtual Reality

Aug
02

First Published in Examiner.com on September 10, 2015 9:00 AM MST

By David Cox

Dactyl Nightmare, a VR arcade game. Virtuality, in the early 1990s, was a line of virtual reality gaming machines produced by the Virtuality Group.

First Published in Otherzine as “Mise-en-Experience”

by David Cox

We are all spied upon, archived, forgotten. What are the new meanings that define this aesthetics of experience? What are the aesthetics of framing, of ‘that contained within the experience’? If the film-maker has become the creator of experiences, what is to be ‘contained within’ the experience?

With Virtual Reality, the décor is whatever is constructed with the tools used to generate the content of the experience for the user/viewer. The entirety of this experience is likely to have been either recorded in the world using 360-degree cameras, such as the GoPro mounts that resemble cubes with cameras facing in every direction, or generated using 3D CGI and game-engine imagery.

Contemporary visual culture increasingly embraces and celebrates the notion of complete “experience”. Just as the panorama of the 19th century enveloped the visitor in a 180-degree scene, today’s virtual reality and augmented reality seek to provide a complete, enveloping, melding of data, audiovisual information, sight, sound, and the concurrent real-time combination of these with everyday activity. The world is pregnant with buildings, streets, people, and objects supposedly seamlessly integrated with their data and their own archives, yet ironically the more connected subjects are, the less in touch with each other the same subjects appear to be. Populations have never had more connectivity, yet have never more resembled disassociated zombies of retina touchscreen distraction.

Drones, the so-called Internet of Things (IoT) and artificial intelligences now mediate everything. Spies, spooks, whistleblowers and nefarious actors hide in the shadows or “bare all” for download in the bright light of accountability. Data is not just a fact of contemporary life for anyone participating in the western world; it is a requirement, an obligation. Augmenting one’s experience is rapidly becoming the domain in which aesthetics unfolds as well. Where the screen and the stage once framed ‘that contained within’, or ‘mise en’, now the entirety of experience itself is the framework of the act of aesthetic organization.

Oculus Rift brochure.

At a Virtual Reality film-making conference recently, I heard panelists talking about ‘in-sphere’ and ‘out-of-sphere’ as a correlate for ‘in-frame’ and ‘out-of-frame’. The sphere, or rather that fishbowl-like region into which our heads and sensibilities are placed when we put on a head-mounted display such as Oculus Rift or Google Cardboard is the ‘stage’ where ‘mise-en-experience’ takes place.

VR experiences may be recorded using 360-degree cameras, or constructed using tools such as the game engines Unity and Unreal. In either case, the viewer is expected to remove herself from the ordinary experience of life and cross over into a documentary, a fiction, or some combination of the two. Either way, in this artificial panoramic realm the notion of the ‘scene’ presents itself in new ways.

The unadorned geo-spatial world around us might, with Augmented Reality (AR) and wearable computers for example, be overlaid with audiovisual and data features pertinent to locale, the viewer’s history, her trajectory through time and space, her sense of herself in relation to others likewise ‘connected’. Head-mounted displays, wearable computers, and technologies that meld the past with the present and the physical with the nonphysical are coming in fast. Sensors scan everything. Metadata about metadata joins a sea of associations in a never-ending flowchart of patterns of ideas. A vast, intricate spiderweb universe of everyone sharing everyone’s secrets and banal facts, all visual, all sensed, all parsed by algorithms, is envisioned. I can put on my headset and see objects all around me. And yours. And you mine. We can share it all.

The concept of ‘that contained within the scene’ in terms of virtual reality encompasses the user’s own sense of the entirety of experience. When she puts on the Oculus Rift headset she is really gaining a sense of the totality of everything around her that is placed within the same realm, much as she would experience a place if she were a tourist on holiday. What she is being invited to understand, enjoy and appreciate is the experience. And with VR/AR she is able to see and listen to everything that she is experiencing.

Virtual Boy VR Gaming System.

Like videogame designers, VR experience designers are looking to create something that is “in the round”, not only recorded in the audiovisual sense but also in what we might call the experiential surround. Virtual Reality experience designers are looking to create experiences that literally encapsulate the entirety of being-in-the-moment. VR cameras are like any other cameras and can only be placed in certain locations. One can jump-cut from one camera to another. Cameras can move. Users often complain about motion sickness, but that is a discussion for another time. The point is the extensive spherical scope of VR and its ability to turn the scene into experience and vice versa.

Beyond the simple documentary nuances of the everyday that people might undergo in the course of the average day, VR can record the sense of what it was like to inhabit a place. Instead of a framed shot, we might now be said to be getting a sphered shot.

When the camera is moved, so is the viewer’s whole eye-and-head platform, such that everything around can be seen as the camera moves. CGI and ‘reality’ can be combined such that, for example, when one looks up or down it is possible for the ‘ground’ to appear to actually give way, or for the ‘sky’ to splinter into pieces. Virtual Reality and Augmented Reality designers are really looking to create personal moments and situations.

‘Mise-en-experience‘ is about planning and considering the inclusion of every aspect of the virtual immersion event. Just as Banksy felt he could best articulate his disillusionment and anger at the dismal world of today with the theme park art installation Dismaland, such a gesture also might have been accomplished using Virtual Reality.

Thundering Turbo, 1982.

Tomytronic 3D Thundering Turbo game system by Tomy (1982)

According to game designer Scott Rogers, “Everything I learned about game design I learned from Disneyland”. In both Disneyland and Dismaland there are “weenies”: central architectural features like the princess castle and the Matterhorn, large structures that draw the user/viewer/visitor toward them. There are meaningfully placed paths between these attractions, and it is the pathing as much as the “weenies” that determines the effectiveness of the experience. The VR/game/theme park/art installation designer is the designer of “the dynamics of being there”: of what we might call the ‘dance of the user’ in their imaginary place, the sense of motion through that place.

This complex and dynamic relationship between the user and her environment, and the relationship of that user to other users in a shared space, is of course ‘cybernetics’, identified by Norbert Wiener in the 1940s. But in terms of VR and its effects on people as they move around and manipulate their 3D VR avatars online in real time, it is also what Jaron Lanier in the early 1990s described as ‘body music’: the post-symbolic language of users’ gestures. This, too, is part of ‘mise-en-experience‘.

Why place objects in a virtual scene and limit oneself to the rules that govern physics and expectation in the normal world? Why use natural boundaries or spatial boundaries in the way they would normally apply at all?

Lessons learned from game design, architecture and urban planning, dance and other arts that involve time, space and motion can assist here. From games comes the notion of the spatial and temporal boundary: placing limits on where a person can go in order to manage the experience, but also to manage memory (computer memory, not human memory). Temporal constraints such as time limits can help frame the sense of there being goals, challenges and rules, bringing VR within the context of ludology. What is the victory condition? Then there is the field of music: the mathematical breaking down of time into fractions, where the control of air pressure, notation and waveforms over time is also part of the general sensibility. VR and AR can borrow from music the patterning of events over time to create a sweep of moments, much the way the great composers arranged events to create emotional tones. Consider the VR experience that is the Beatles’ White Album, each song a mini-movie or VR scene, complete with characters, settings, events.

Bubl-camera, 2015.

Bubl camera: a Kickstarter-funded 360-degree camera used for consumer-level VR photography and film making.

VR and AR aesthetics borrow heavily from cinema itself, particularly animation, with its fractious breaking down of time into individual frames. It is not by accident that many Virtual Reality movie productions are made using the same tools often used in videogame design, such as Unity and Unreal. Both of these tools allow 3D models to be created elsewhere, in resources like 3ds Max and Maya, where they are also animated, and then imported into experiences that may well share the ‘sphere-space’. Views of the real world can thus be meshed in experience with unique imported objects.

At the VR conference in San Jose this year, a panel on VR film making emphasized the ‘problem of nausea’. As VR goes mainstream, the rush to commercialize it as an extension of cinema, a kind of ‘cinema-in-the-round’, does little to acknowledge the medium’s origins in the research labs of the late 1970s and early 1980s. Back then, the goal was more to use the medium to discover entirely new categories of experience. But since the economic lure of mainstream entertainment is both risk-averse and extremely strong, pundits would sooner see a return on investment by retooling VR for already proven genre-fiction uses (space marines, extreme sports, adventure heroes in the Third World) than dare to take a chance on something that genuinely breaks the mold. But we shall see. Perhaps there will be a paradigm shift, a new group of experiences that will bring us the Bruce Conner of VR, or the Brothers Quay of Augmented Reality.

It could well be that the massive archives of existing film merge with the online databases in the development of new hybrid media forms. A digitized future lies in how truly creative and experimental filmmakers and VR designers will combine these worlds.

CASE STUDY – Archival footage from 8 years ago about this street corner in San Francisco, where I am standing, is being superimposed over my view of it now. I can walk to the very same spot to see where it was filmed, from the very same angle.

Like Dziga Vertov, the revolutionary Russian constructivist film maker of the 1920s who placed his camera everywhere he could and drew attention to the fact in the very form and structure of films like “The Man with the Movie Camera”, I am “Kino Eye” all over again. All the information about the conditions on that day is available to me, and all the information about today, too. And all the people in the footage. And their relatives. And all the other people who have seen the footage. And what they thought. And what I think. And on and on and on. Then there is the 3D data about the buildings and the pipes under the road; the flight patterns of the planes above.

iRig UA – Guitar Interface for Android

Aug
02

First published in Examiner.com August 11, 2015 1:03 PM MST

By David Cox

The first universal guitar effects processor and interface for all Android devices

These days everything hinges on the interface, whether it’s the dashboard of your car or the glass surface of your smart device. It’s no different with your guitar: to get it to “talk” to your Android smartphone or ‘phablet’, you need something to translate all those guitar licks into digital signals so they can be used by your effects apps and/or recording software. And that’s where IK Multimedia’s iRig UA comes in.

But the problem with such devices has always been latency: the delay between the guitar’s signal and the sound you hear when you listen back. Decrease the latency and you improve the experience.

As more and more people play electric guitar, the challenge of getting a clean signal onto a smartphone for recording or performance remains. The software and the hardware of the phone itself have to do much of the heavy lifting, and any help they can get is greatly appreciated. Minimize latency and you improve performance; everything is better. iRig UA pairs with AmpliTube UA to deliver solid sonic performance on any smartphone or tablet running Android 4.2 or higher with host mode/USB OTG. Devices it works with include those from Samsung, Sony, Motorola, LG, HTC, Xiaomi and other popular manufacturers.

iRig UA uses a built-in digital signal processor (DSP) that solves the issue of inconsistent OS latency on the Android platform. It’s able to do this by moving all processing to an external accessory. iRig UA’s DSP has been designed to work with a companion app, AmpliTube UA. The app, which comes with iRig UA, is powered by the DSP. AmpliTube provides a collection of virtual sound processing equipment to customize your guitar sound. A versatile system, iRig UA can also be used as a digital recording interface when connected to a Samsung Professional Audio compatible device or smartphone or tablet with Android 5.0.

This pairing of iRig UA and AmpliTube UA is perfect for when you need to practice in transit. iRig UA features a 1/4” input for a guitar, bass or other line-level instrument, a micro-USB to OTG cable and an 1/8” headphone output with volume control. I for one am grateful for this suite of inputs and outputs. I like to play along with .mp3s from other devices.

When playing my MIM Fender Strat through a Samsung Galaxy S using Amplitube, the signal was surprisingly clean and robust. I had some delay and reverb and a combo of chorus and distortion and I found that it felt just like playing through battery-powered metal BOSS stomp boxes from years ago. In fact it feels and sounds almost analog. Why? It uses hardware-based processing.

iRig UA’s on-board digital signal processor works in conjunction with AmpliTube UA, a special version of IK’s powerful guitar and bass multi-effects processor designed specifically for use with iRig UA. The processing is handled on iRig UA itself and not on the Android device, it’s able to provide consistent near-zero latency performance (down to 2 ms round-trip total latency) that’s independent of the make and model of your connected smartphone or tablet.

IK Multimedia is getting very good at the small, portable, pocket guitar interface market, and this new push into the Android marketplace speaks to the rising tide of players using Galaxies and other Android-powered smart devices. Listening to guitarists and what they want has also helped shape the decisions that led to iRig UA.

Features:

Works on any smartphone or tablet that supports Android 4.2 or higher and host mode/USB OTG

Near-zero latency digital FX processing

Digital audio recording on Android 5.0 and Samsung Professional Audio Devices

32-bit digital signal processor

24-bit converter with 44.1/48KHz sample rate

High-definition, low-noise preamp

Includes AmpliTube UA

Analog aux input for play-along and practice

Headphone output and volume control

Multicolor LED

Ultra-compact and lightweight design

Micro-USB port

RRP $99.99

Click here for more information