modulus

AmpliTube 4 – the Amplifier Modeling Resource for Guitarists

Aug
02

Most of the guitar amplifiers I’ve owned have had stenciled numbers on the side and been passed down from five other bands before landing in my garage or bedroom. Amps are like that; they are a bit like old cars or motorcycles. They arrive, they are used, and then they disappear into that great gig domain from whence you may see them again, or you may not. Would that it were possible to invoke them all again. And all the other amps that you could never own for reasons of economy or rarity, or any other reason. Then there are the great guitar riffs and patterns and solos of all time.

With these riffs comes, as part of their uniqueness, the sound of the amplifier the guitars were played through on the records and at the concerts. The distinctive sound of each major guitarist’s configuration of amplifier and effects is as unique as anything they played, and in as many ways as crucial and central to their sound as the notes themselves. Jimi Hendrix and his epic Marshall stacks. Fresh-faced early Beatle George Harrison and his modest undersized Vox amps. I had a Vox guitar once. A cherry-red Junior. I loved that thing. I had no amp, so I played it through my dad’s stereo system.

Think of the crisp, metallic treble reverberation of Hendrix’s Fender Stratocaster on The Wind Cries Mary, or Andy Summers’s distant chorus-and-echo Fender Telecaster sound on the Police’s Walking on the Moon: these are as much about the way the guitarist set up the amplifier and pedal effects as they are about the chords played. This is so obvious it goes without saying, except that until recently amp and effects configurations have been largely the province of physical guitar amps and effects pedals.

AmpliTube 4, a major upgrade to the world’s most powerful guitar and bass tone studio for Mac/PC, is here, and it will take you to a level of hyper-realism and customization of tone you never knew was possible.

Here are some of the specs:

  • Hyper-Realistic tone
  • 3D Cab Room w/ selectable room simulations
  • Dual Mic placement on any speaker
  • Individual speaker selection
  • Speaker interaction modeling
  • Cabinet mixer for microphones, room, DI and master level
  • New British Series Amps
  • Power Amp/Speaker dynamic response
  • Acoustic Simulator
  • Effects loop slot between pre and power amp
  • Universal effects placement
  • Rack effects can be used as stomp effects
  • Stomp effects can be used in rack section
  • 8-track DAW/Recorder
  • 4-track Looper
  • UltraTuner
  • Built-in Custom Shop

AmpliTube 4 is also a guitar and bass tone studio for Mac/PC that works as a standalone application and as a plug-in for your favorite DAW. AmpliTube recreates the entire guitar/bass signal chain from instrument to recording device.

So it is now entirely possible for me to reproduce exactly Chris Squire’s metallic bass sound on Siberian Khatru from Close to the Edge, for example, or Simple Minds’ Charlie Burchill’s shimmering twin Jazz Chorus amps with echo, like the ones I saw him play in Melbourne, Australia in 1983. I remember that Glittering Prize sounded like it was coming out of a guitar factory.

What’s particularly interesting with AmpliTube 4 is that you can also actually “make” your own amplifier cabinets and modify the ones that come standard. It’s called the “Cab Room”. It has five separate customization sections: cabinet selection with size adjustment, where you can alter the speaker cabinet to go with the amp; speaker selection, where you can change the speakers; a microphone selection and placement area for finding the best place to put the mics; different virtual room types, where you can try out different surrounding recording spaces; and a mixer that allows players to mix the levels of the mics, the room ambience, the direct amp signal and the overall main mix.

AmpliTube also has full MIDI support, which means you can use it with external controllers.
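For the curious, that MIDI support boils down to the host receiving short binary messages from a hardware controller and mapping them to on-screen parameters. The sketch below builds a standard MIDI Control Change message in plain Python; the byte layout is from the MIDI 1.0 specification, but the particular CC-to-parameter mapping in the comment is purely a hypothetical example, not AmpliTube’s actual mapping.

```python
def midi_control_change(channel: int, controller: int, value: int) -> bytes:
    """Build a standard 3-byte MIDI Control Change message.

    channel 0-15, controller 0-127, value 0-127 (MIDI 1.0 byte layout).
    """
    if not (0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127):
        raise ValueError("argument out of MIDI range")
    status = 0xB0 | channel  # 0xB0 is the Control Change status nibble
    return bytes([status, controller, value])

# A foot pedal mapped (hypothetically) to CC #7 on channel 1, value 100:
msg = midi_control_change(channel=0, controller=7, value=100)
```

A host that supports MIDI learn simply listens for a message like this and binds that controller number to whatever knob you last touched.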

AmpliTube can work as a 64-bit plug-in for the most popular DAW (such as Pro Tools, Logic, GarageBand, Cubase, Live, Reaper) or can be used in stand-alone mode in Mac OS X and Windows. The plug-in and standalone versions offer the same function and sound, but the standalone version also offers a built-in 8-track recorder and DAW, plus a 4-track looper so you can capture your ideas at the moment of inspiration. Both the standalone and plug-in versions are included when you install AmpliTube.

It arrives out of the box with the ability to add equipment to your rig, such as amps, pedals and cabinets from Fender®, MESA/Boogie®, Ampeg®, Orange®, Soldano™, Groove Tubes®, Gallien-Krueger®, Jet City Amplification™, THD®, T-Rex®, Fulltone®, Z.Vex®, Carvin®, Morley®, Wampler Pedals®, Dr. Z®, ENGL® and more. I’m loving using AmpliTube 4, as I could never conceivably own all the amps represented in its vast database, nor would I want to. But the ability to model them means I can call upon them any time. It’s like having my own private guitar store at my fingertips.

RRP: $149.99

Click here for further information

The Selfie Stick on Steroids – iKlip Grip

Aug
02

 

First published in Examiner.com on September 17, 2015 12:39 PM MST

By David Cox

iKlip Grip

IK Multimedia

Sometimes a product comes along that tries to do lots of things at once and finds a way for all of these functions to complement each other nicely. Such is the case with IK Multimedia’s iKlip Grip. Someone has put a lot of time into this device, and the time and energy has paid off. Many of us working with smartphones as professional audiovisual recording devices are often in need of a kind of universal, hyper-portable mount for tabletop interviews, field recordings, and shots of oneself doing things. The iKlip Grip could be the answer for many such everyday situations out there in the field.

The iKlip Grip offers a range of functions in one. It’s essentially a device that holds your smartphone and lets you operate it at a distance. Yes, it is at its core a so-called ‘selfie stick’, but a very well made one. In addition to being a selfie stick, however, it is also a tripod mount and, ingeniously, a handle that enables the user to carry their phone upright and facing outward like a ping pong paddle.

Back in film school the hip kids had grips like these custom-made for their Bolex 16mm cameras, but today an iPhone 6 has double the resolution of 16mm, so this type of device is a neat inheritor of that tradition. The iKlip Grip can be broken down and packed into a bag, and the adjustable grip that holds the phone can accommodate the latest large phones such as the iPhone 6. These devices are professional tools now, and the build quality of the iKlip Grip matches that found in many contemporary phones and tablets.

The Italian-made device has an ingenious, secure spring-loaded bracket that expands to let you position your phone at any angle for recording video, audio or images. The bracket also carries the standard 1/4″-20 tripod screw thread, so it can be mounted to any standard tripod, which from a film-making point of view is a must. You can flip out small ovoid-shaped ‘legs’ at the base of the iKlip Grip to turn it into a modest tripod as well, although extending it this way makes it a little unstable, and it is best to leave it in the lowest of the ‘telescoped’ positions. This petal-like fold-out element works really well; my one criticism is that the petals open too easily. It would be nice to have some means of keeping them closed when not in use. I use a rubber band, but some kind of clip or latch would be useful here, or even velcro on the insides of the ‘petals’ to keep them closed in your bag.

The iKlip Grip has a telescoping detachable tube that extends your reach by a full 17.5″, so positioning your camera, phone or recorder in the right spot is even easier. Plus, iKlip Grip has a pivoting ball-joint attachment point that provides a full 90° angle adjustment plus 360° rotation of the device. This allows the user to get just the right angle and position for a shot. iKlip Grip also comes with a Bluetooth smartphone shutter* remote control that lets users remotely activate the shutter button in video and photo apps. The remote is universally compatible with iOS and Android and can be operated up to 10 meters away from the device, which is very helpful for capturing live performances, family vacation photos, movies and, of course, the ubiquitous ‘selfie’.

So what is the iKlip Grip? A well-made extendable selfie stick and iPhone tripod that also works as a grip for your phone. It’s a selfie stick on steroids.

Features

• 4-in-1 video stand: handle, tripod, monopod, tripod adapter

• Includes remote Bluetooth shutter*

• Adjustable angle

• Extends up to 50cm / 19.7″

• Standard 1/4” UNC threaded ball mount

• Works with smartphones with screens from 3.5″ to 6″ with case on

• Works with digital cameras and small video cameras like GoPro

• Works with handheld audio field recorders

RRP: $59.99

For more information see:

http://www.ikmultimedia.com/products/iklipgrip/

The Language of Virtual Reality

Aug
02

First Published in Examiner.com on September 10, 2015 9:00 AM MST

By David Cox

Dactyl Nightmare, a VR arcade game for Virtuality, a line of virtual reality gaming machines produced by Virtuality Group in the early 1990s.

First Published in Otherzine as “Mise-en-Experience”

by David Cox

We are all spied upon, archived, forgotten. What are the new meanings that define this aesthetics of experience? What are the aesthetics of framing – of ‘that contained within the experience’? If the film-maker has become the creator of experiences, what is to be ‘contained within’ the experience?

With Virtual Reality, the décor is constructed with the tools used to generate the content of the experience for the user/viewer. The entirety of this experience is likely to have been either recorded from an experience in the world using 360-degree cameras, such as the GoPro mounts that resemble cubes with cameras facing every direction, or generated using 3D CGI and game-engine imagery.

Contemporary visual culture increasingly embraces and celebrates the notion of complete “experience”. Just as the panorama of the 19th century enveloped the visitor in a 180-degree scene, today’s virtual reality and augmented reality seek to provide a complete, enveloping, melding of data, audiovisual information, sight, sound, and the concurrent real-time combination of these with everyday activity. The world is pregnant with buildings, streets, people, and objects supposedly seamlessly integrated with their data and their own archives, yet ironically the more connected subjects are, the less in touch with each other the same subjects appear to be. Populations have never had more connectivity, yet have never more resembled disassociated zombies of retina touchscreen distraction.

Drones, the so-called Internet of Things (IoT), and artificial intelligences now mediate everything. Spies, spooks, whistleblowers and nefarious actors hide in the shadows or “bare all” for download in the bright light of accountability. Data is not just a fact of contemporary life for anyone participating in the western world; it is a requirement, an obligation. Augmenting one’s experience is rapidly becoming the domain in which aesthetics is unfolding as well. Where the screen and the stage once framed ‘that contained within’ or ‘mise en’, now the entirety of experience itself is the framework of the act of aesthetic organization.

Oculus Rift brochure.

At a Virtual Reality film-making conference recently, I heard panelists talking about ‘in-sphere’ and ‘out-of-sphere’ as a correlate for ‘in-frame’ and ‘out-of-frame’. The sphere, or rather that fishbowl-like region into which our heads and sensibilities are placed when we put on a head-mounted display such as Oculus Rift or Google Cardboard is the ‘stage’ where ‘mise-en-experience’ takes place.

VR experiences may be recorded using 360-degree cameras, or constructed using game engines such as Unity and Unreal. In such cases, the viewer is expected to remove herself from the ordinary experience of life and cross over into a documentary, a fiction or some combination of the two. Either way, in this artificial panoramic realm the notion of the ‘scene’ presents new ways of being considered.

The geo-spatial world around us unadorned might, with Augmented Reality or AR and wearable computers for example, be added onto with audiovisual and data features pertinent to locale, the viewer’s history, her trajectory though time and space, her sense of her self in relation to others likewise ‘connected’. Head mounted displays, wearable computers, and technologies that meld the past with the present, the physical with the nonphysical are coming in fast. Sensors scan everything. Metadata about metadata joins a sea of associations in a never-ending flowchart of patterns of ideas. A vast intricate spiderweb universe of everyone sharing everyone’s secrets and banal facts, all visual, all sensed, all parsed by algorithms is envisioned. I can put on my headset and see objects all around me. And yours. And you mine. We can share it all.

The concept of ‘that contained within the scene’ in terms of virtual reality encompasses the user’s “own” sense of the entirety of experience. When she puts on the Oculus Rift headset she is really gaining a sense of the totality of everything around her that is placed within the same realm, much as she would experience a place if she were a tourist on holiday. What she is being invited to understand, enjoy and appreciate is the experience. And with VR/AR she is able to see and listen to everything that she is experiencing.

Virtual Boy VR Gaming System.

Like videogame designers, VR experience designers are looking to create something that is “in the round”, not only recorded in the audiovisual sense but also in what we might call the experiential surround. Virtual Reality experience designers are looking to create experiences that literally encapsulate the entirety of being-in-the-moment. VR cameras are like any other cameras and can only be placed in certain locales. One can jump cut from one camera to another. Cameras can move. Users often complain about motion sickness, but that is a discussion for another time. The point is the extensive spherical scope of VR and its ability to make the scene into experience and vice versa.

Beyond the simple documentary nuances of the everyday that people might undergo in the course of the average day, VR can record the sense of what it was like to inhabit a place. Instead of a framed shot, we might now be said to be gaining a sphered shot.

When the camera is moved, so is the experience: the whole eye-and-head platform moves such that everything around can be seen as the camera travels. CGI and ‘reality’ can be combined such that, for example, when one looks up or looks down it’s possible for the ‘ground’ to appear to actually give way, or for the ‘sky’ to splinter into pieces. Virtual Reality and Augmented Reality designers are really looking to create personal moments and situations.

‘Mise-en-experience‘ is about planning and considering the inclusion of every aspect of the virtual immersion event. Just as Banksy felt he could best articulate his disillusionment and anger at the dismal world of today with the theme park art installation Dismaland, such a gesture also might have been accomplished using Virtual Reality.

Thundering Turbo, 1982.

Tomytronic 3D Thundering Turbo game system by Tomy (1982)

According to game designer Scott Rogers, “Everything I learned about game design I learned from Disneyland”. In both Disneyland and Dismaland there are “weenies” – central architectural features like the princess castle and the Matterhorn, large structures that attract the user/viewer/visitor. There are meaningfully placed paths between these attractions. And it is the pathing as much as the ‘weenies’ that determines the effectiveness of the experience. The VR/game/theme park/art installation designer is the designer of “the dynamics of being there”: the designer of what we might call the ‘dance of the user’ in their imaginary place, the sense of motion through that place.

This complex and dynamic relationship between the user and her environment, and the relationship of that user to other users in a shared space, is of course ‘cybernetics’, identified by Norbert Wiener in the 1940s. But in terms of VR and its effects on people as they move around and manipulate their 3D VR avatars online in real time, it is also what Jaron Lanier in the early 1990s described as ‘body music’: the post-symbolic language of users’ gestures. This, too, is part of ‘mise-en-experience‘.

Why place objects in a virtual scene and limit oneself to the rules that govern physics and expectation in the normal world? Why use natural boundaries or spatial boundaries in the way they would normally apply at all?

Lessons learned from game design, architecture and urban planning, dance and other arts that involve time, space and motion can assist here. From games comes the notion of the spatial and temporal boundary: placing limits on where a person can go in order to manage the experience, but also to manage memory (computer memory, not human memory). Temporal constraints – time limits – can help frame the sense of there being goals, challenges and rules, bringing VR within the context of ludology. What is the victory condition? Then there is the field of music: the mathematical breaking down of time into fractions, where the control of air-pressure waveforms over time is also part of the general sensibility. VR and AR can borrow from music the patterning of events over time to create a sweep of moments, much the way the great composers arranged events to create emotional tones. Consider the VR experience that is the Beatles’ White Album, each song a mini-movie or VR scene, complete with characters, settings, events.

Bubl-camera, 2015.

Bubl camera – Kickstarter-funded 360-degree camera used for consumer-level VR photography and film-making.

VR and AR aesthetics borrow heavily from cinema itself, particularly animation, with its breaking down of time into individual frames. It’s no accident that many Virtual Reality movie productions are made using the same tools often used in video game design, such as Unity and Unreal. Both of these tools allow 3D models to be created elsewhere, in resources like 3DS Max and Maya, where they are also animated, and then imported into experiences that may well share the ‘sphere-space’. Views of the real world can thus be meshed in experience with unique imported objects.

At the VR conference in San Jose this year, a panel on VR film-making emphasized the ‘problems of nausea’. As VR goes mainstream, the rush to commercialize it as an extension of cinema, or a kind of ‘cinema-in-the-round’, does little to honor the medium’s origins in the research labs of the late 1970s and early 1980s. Back then, the desire to use the medium to discover entirely new categories of experience was more the goal. But since the economic lure of mainstream entertainment is both risk-averse and extremely strong, pundits would sooner see a return on investment by retooling VR for already proven genre-fiction uses (Space Marines, Extreme Sports, Adventure Heroes in the Third World) than dare to take a chance on something that genuinely breaks the mold. But we shall see. Perhaps there will be a paradigm shift, a new group of experiences that will bring us the Bruce Conner of VR, or the Brothers Quay of Augmented Reality.

It could well be that the massive archives of existing film merge with the online databases in the development of new hybrid media forms. A digitized future lies in how truly creative and experimental filmmakers and VR designers will combine these worlds.

CASE STUDY – Archival footage from 8 years ago about where I am on this street corner in San Francisco is being superimposed over my view of it now. I can walk to the very same spot to see where it was filmed, from the very same angle.

Like Dziga Vertov, the revolutionary Russian constructivist film-maker of the 1920s who placed his camera everywhere he could and drew attention to the fact in the very form and structure of films like “The Man with the Movie Camera”, I am “Kino Eye” all over again. All the information about the conditions on that day is available to me, and all the information about today, too. And all the people in the footage. And their relatives. And all the other people who have seen the footage. And what they thought. And what I think. And on and on and on. Then there is the 3D data about the buildings and the pipes under the road; the flight patterns of the planes above.

IRig UA – Guitar Interface for Android

Aug
02

First published in Examiner.com August 11, 2015 1:03 PM MST

By David Cox

The first universal guitar effects processor and interface for all Android devices

These days everything hinges on the interface, whether it’s the dashboard of your car or the glass surface of your smart device. It’s no different with your guitar: to get your guitar to “talk” to your Android smartphone or ‘phablet’, you need something to translate all those guitar licks into digital signals so they can be used by your effects apps and/or recording software. And that’s where IK Multimedia’s iRig UA comes in.

But the problem with such devices has always been latency: the delay between the signal of the guitar and the sound that you hear when you listen to it. Decrease the latency and you improve the experience.

As more and more people play the electric guitar, the challenge of getting a clean signal onto a smartphone for recording or performance remains. The software and the hardware of the phone itself have to do much of the heavy lifting, and any help they can get is greatly appreciated. Minimize latency and you improve performance; everything is better. iRig UA pairs with AmpliTube UA to deliver solid sonic performance when used with any smartphone or tablet running Android 4.2 or higher with host mode/USB OTG support. Compatible Android devices include those from Samsung, Sony, Motorola, LG, HTC, Xiaomi and other popular manufacturers.

iRig UA uses a built-in digital signal processor (DSP) that solves the issue of inconsistent OS latency on the Android platform. It’s able to do this by moving all processing to an external accessory. iRig UA’s DSP has been designed to work with a companion app, AmpliTube UA. The app, which comes with iRig UA, is powered by the DSP. AmpliTube provides a collection of virtual sound processing equipment to customize your guitar sound. A versatile system, iRig UA can also be used as a digital recording interface when connected to a Samsung Professional Audio compatible device or smartphone or tablet with Android 5.0.

This pairing of iRig UA and AmpliTube UA is perfect for when you need to practice in transit. iRig UA features a 1/4” input for a guitar, bass or other line-level instrument, a micro-USB to OTG cable and an 1/8” headphone output with volume control. I for one am grateful for this suite of inputs and outputs. I like to play along with .mp3s from other devices.

When I played my MIM Fender Strat through a Samsung Galaxy S using AmpliTube, the signal was surprisingly clean and robust. I had some delay and reverb and a combo of chorus and distortion, and I found that it felt just like playing through the battery-powered metal BOSS stomp boxes of years ago. In fact it feels and sounds almost analog. Why? Because it uses hardware-based processing.

iRig UA’s on-board digital signal processor works in conjunction with AmpliTube UA, a special version of IK’s powerful guitar and bass multi-effects processor designed specifically for use with iRig UA. Because the processing is handled on iRig UA itself and not on the Android device, it’s able to provide consistent near-zero-latency performance (down to 2 ms round-trip total latency) that’s independent of the make and model of your connected smartphone or tablet.
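To put that 2 ms figure in context, a large share of round-trip latency is simply audio buffering: each buffer of samples takes a fixed time to pass through at a given sample rate. The sketch below is my own back-of-envelope illustration, not IK’s published breakdown; the 48-sample buffer size is an assumed example.

```python
def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int) -> float:
    """Milliseconds for one audio buffer to pass through at a given sample rate."""
    return 1000.0 * buffer_samples / sample_rate_hz

# At 48 kHz, a (hypothetical) 48-sample buffer adds 1 ms in each direction,
# so input plus output buffering alone accounts for about 2 ms round trip.
one_way = buffer_latency_ms(48, 48000)   # 1.0 ms
round_trip = 2 * one_way                 # 2.0 ms
```

The same arithmetic explains why software-only processing on a phone feels sluggish: an OS audio stack that forces, say, 1024-sample buffers would add over 40 ms of round-trip delay at 48 kHz before any effects run.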

IK Multimedia is getting very good at the small, portable pocket guitar interface market, and this new push into the Android marketplace speaks to the rising tide of players using Galaxies and other such Android smart devices. Listening to guitarists and what they want has also helped in the decisions that led to iRig UA.

Features:

Works on any smartphone or tablet that supports Android 4.2 or higher and host mode/USB OTG

Near-zero latency digital FX processing

Digital audio recording on Android 5.0 and Samsung Professional Audio Devices

32-bit digital signal processor

24-bit converter with 44.1/48 kHz sample rate

High-definition, low-noise preamp

Includes AmpliTube UA

Analog aux input for play-along and practice

Headphone output and volume control

Multicolor LED

Ultra-compact and lightweight design

Micro-USB port

RRP $99.99

Click here for more information

iRig Mic Studio: A mighty powerful microphone in a tiny package

Aug
02

By David Cox

First published in Examiner.com July 28, 2015 12:05 PM MST

New small condenser microphone offers quality and portability

Condenser microphone technology has advanced hugely in the last several years, to the point where microphones that would have been found only in studios 10 years ago are freely available in the open marketplace, in the hands of people who otherwise would have needed to rent recording facilities costing a fortune.

IK Multimedia

This now 15-year-old laptop recording revolution has given rise to a whole new range of amazing peripherals, of which this microphone from IK Multimedia, the iRig Mic Studio, is a perfect example. Exquisitely crafted, finest-quality metal parts make up the mic, and it comes with a beautifully machined microphone stand with a small tripod. It also comes with a beautiful leather-fabric pouch as well as all the parts you need to get started recording.

I’m working on an opera right now about the space program, and I’m traveling to lots of people’s houses around San Francisco. I need a studio-quality mic that fits in my backpack and can be pulled out and used at a moment’s notice. What I really like are the gain controls on the outside, which let me adjust the levels for the singer’s volume ‘on the fly’. With rock(et) opera I depend on being able to adjust quickly to the range of the singer, in settings where I seldom have much control over dynamic range and acoustics. Soprano and baritone vocal levels can vary very widely, so the singer’s distance from the mic, and how loud he or she is in relation to the mic’s response, is everything. This mic has not let me down yet. The nearest comparison for me is the Yeti, and this mic proves a better solution in terms of sound quality. And it is smaller.

Make professional studio-quality recordings on the go

iRig Mic Studio, IK Multimedia’s ultra-portable large-diaphragm digital condenser microphone, has been released for iPhone, iPad, iPod touch, Mac, PC and Android. It packs a 1”-diameter condenser capsule into a compact housing that can be used to make professional-quality recordings on the go.

Features

  • Professional studio microphone with large-diaphragm capsule
  • Ultra-compact size that’s easy to carry around
  • High-quality 1” back electret condenser capsule
  • 24-bit converter with 44.1/48 kHz sampling rate
  • Low-noise, high-definition preamp
  • Integrated headphone output
  • Multicolor LED status and level indicator
  • Onboard gain control and headphone level control
  • Comes with a full suite of IK apps
  • Includes portable tripod tabletop stand
  • Includes Lightning, Micro-USB OTG and USB cables
  • 30-pin cable available separately
  • Available in black or silver version

Inside the mic is a large 1”-diameter back electret condenser capsule, a 24-bit audiophile-grade A/D converter (with 44.1/48 kHz sample rate) and a built-in low-noise, high-definition preamp. These, combined with its 133 dB maximum SPL rating, allow for recording at almost any sound pressure level: it can capture anything from the human voice up to an amplified electric guitar and louder.
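For readers who like to see where those headline numbers come from: an ideal converter’s theoretical dynamic range is about 6.02 dB per bit, and an SPL figure in dB is a logarithmic ratio against the standard 20 µPa reference pressure. A quick sketch of the arithmetic (standard acoustics formulas, not IK’s own specifications method):

```python
import math

def dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of an ideal n-bit converter (~6.02 dB per bit)."""
    return 20 * math.log10(2 ** bits)

def spl_to_pascals(db_spl: float, p_ref: float = 20e-6) -> float:
    """Convert a dB SPL figure back to sound pressure in pascals (ref 20 µPa)."""
    return p_ref * 10 ** (db_spl / 20)

range_24bit = dynamic_range_db(24)   # roughly 144.5 dB of theoretical range
loud = spl_to_pascals(133)           # roughly 89 Pa of sound pressure
```

So a 24-bit converter has headroom to spare for a capsule rated to 133 dB SPL, which is cranked-amplifier loud.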

I’ve had the best results just going straight into GarageBand using the supplied USB cable. Mic Room is a new universal microphone-modeling app for iOS that works with iRig Mic Studio to give it the sonic characteristics of many classic mics.

The controls include a gain control knob and a multicolor LED level indicator. It also includes a headphone output with its own level control for onboard monitoring directly from iRig Mic Studio itself. And, for better positioning while recording, iRig Mic Studio comes with a sturdy and portable tabletop tripod stand.

iRig Mic Studio comes with a female micro-USB port and cables: Micro-USB to Lightning for iPhone, iPad and iPod touch; micro-USB to micro-USB OTG for Android (requires either an Android 5 or Samsung Professional Audio device); and micro-USB to USB for Mac and PC. A micro-USB to 30-pin cable is available separately for older iPhone, iPad and iPod touch models.

Using apps with the iRig Mic Studio.

For immediate recording, iRig Mic Studio comes equipped with a powerful suite of vocal apps that match its cross-platform compatibility. iPhone, iPad and iPod touch users will be able to enjoy VocaLive, a powerful effects processor and multi-track recording app that features a selection of 12 professional real-time vocal effects. EZ Voice for iPhone, iPad and iPod touch and EZ Voice for Android are streamlined and easy-to-use sing-along apps that make it easy for vocalists to practice with any song in their music library. iRig Recorder for iPhone, iPad and iPod touch and iRig Recorder for Android are straightforward apps for field recording, podcasting, note taking and more. Rounding out the suite is Mic Room, the microphone-modeling app for iPhone and iPad.

When you get this mic you really can open it up and get to work straight away, as I did. A green light shows you when it’s on, giving you a solid sense of available signal strength. You can hear the quality through the mic’s own headphone socket right away. This is a great mic (and I’ve heard a few in my years) and it comes highly recommended.

iRig Mic Studio (silver or black) is available now from dealers worldwide or via the online store for only $/€179.99. Get Mic Room on the App Store for $/€7.99 for iPhone and iPad.

Visit this site for more information

IK Multimedia’s Mic Room – Microphone Modeling for iOS

Aug
02

Mic Room, IK Multimedia

First published in Examiner.com by David Cox

The idea of a mic room is a luxurious concept. A room full of the very best microphones, each with its special qualities, plugged into a fancy mixing board, ready to enable the recording of a radio play, a song or an opera. But when might most people ever have access to such a thing? A mic room is the stuff of professional recording lore. The realm of rock stars, top producers, big record contracts. But no longer. Such is the state of modern technology that microphone modeling can happen on the modern phone or IOS device.

Mic Room is the perfect companion to IK’s digital and analog mics like the latest iRig Mic Studio condenser. When paired with such microphones, Mic Room gives you complete access to a selection of the best dynamic, condenser and ribbon-type microphones that are everyday tools in A-list music studios all over the world.

I tried Mic Room with my iRig Field microphone on my iPhone 6+ and was able to try a full range of available mics. One of these was ‘old telephone mic’ – a rather classic 1930s sound synonymous with broadcasts from the golden age of radio. Another modeled mic was Bottle 563, based on the famous cylindrical-with-a-sphere-on-top Neumann® CMV-563. With its rich, full sound, this classic mic instantly reminded me of how great this software would be for recording radio plays or operas, such as my Rocket Opera, “Cosmonauts on the Moon”.

This ‘meta mic’ aspect is a truly amazing experience as the notion of using one kind of mic to access the digital equivalents of so many classic mics takes recording to a whole new level. To use the software, you simply plug in your favorite digital or analog microphone from IK (you can also use your iPhone/iPad’s built-in mic), select which model you want it to sound like and you’ll be able to instantly tap into the inspiring sound of many timeless classics. You get tried and true dynamic workhorses, rich and velvety tube condensers, ultra-smooth ribbons and even more unusual creative mic types.

Features

• Powerful microphone modeling app

• Nine mic models (1 upon software registration and 2 more after registering an IK hardware mic)

• Expandable with more mic models via in-app purchase

• Companion app for IK’s range of digital and analog microphones

• Also works with your iPhone or iPad’s built-in mic and Apple Headset

• Adjustable input level

• Level meter

• Master bypass switch

• Inter-App Audio and Audiobus compatibility

• Universal app for iPhone, iPad and iPod touch

• Free version also available

The native iPad version will be available soon

Mic Room is available as either a free version (with 2 mic models included and other models available via in-app purchase) or as a full version with 9 models included and the ability to add more for free via hardware registration, or via in-app purchase.

Available as a free version or as a full version for $7.99

Click here for more information

Interview: Professor Steve Mann, about Augmented World Expo 2015

Aug
02

First Published in Examiner.com on May 27, 2015 7:14 PM MST

By David Cox

Augmented World Expo 2015 – POWER TO THE PEOPLE!

AWE2015

SuperPowers to the People: Augmented World Expo 2015: an introduction to an audio interview with Professor Steve Mann (see link at end of article). The augmented reality conference AWE2015 is coming up and its theme is “Superpowers to the People”. As usual, the buzz is around Meta AR, the Kickstarter-based firm that developed a headset and developer kit based around UNITY. Since 2013, its first year of development, META has grown considerably from a 3D-printed prototype housing based on the Epson “Moverio” glasses.

META’s innovation was to add a Kinect- or Leap Motion-style tracker at the bridge-of-the-nose area, which gives the computer the basis for knowing where to place objects in your field of view from your ‘point of eye’ (POE), to use the jargon. The tracker also knows how to ‘see’ your hand and to interpret it as the device with which objects are manipulated, moved and transformed.

Steve Mann, Chief Scientist at META AR is a true pioneer of both wearable computing and Augmented Reality, and has been building his own wearable devices since 1974. I first met him in 1995 at the MIT Media Lab on a research visit.

A strong believer in personal freedom, Mann believes that wearable computing, especially the ability to manage one’s personal space as it pertains to the recorded image is a path to democracy. He views technology like META as a great equalizer in the war against surveillance. Against the top-down vector of ‘surveillance’ he posits ‘sousveillance’ which is ‘seeing from below’.

Simply put, if we are all wearing devices that enable us to view each other, this effectively neutralizes the one-way vector of power that cameras in the hands of the powerful make possible. Of course, for sousveillance to become feasible, a social consensus needs to be in place first. One step toward this, to be sure, is an affordable, universal form of wearable technology that facilitates customization and ease of use. The wearer should truly be able to configure their field of view and the nature of everything augmented over that field of view. With META AR (AKA Spaceglasses), at least the version that has been made available to developers since 2014, the tracking technology works well enough to permit this, as do the developer tools, based as they are around the free 3D and 2D game engine UNITY.

I interviewed Steve Mann in the lead up to Augmented World Expo 2015, where he will be delivering a speech on the history of Augmented Reality as well as holding workshops on META viewing tools. Mann spoke of the difference between what he called the “Big AR” of the 1960s – the type popularized by Ivan Sutherland and the famous “Sword of Damocles” head-mounted display built at Harvard during the Cold War. These were large, tethered rigs tied by cables to mainframe computers, hooked up to cumbersome-looking binocular visors the size of bike handles.

Mann’s own “Little AR” by contrast, developed in the late 1970s when he was but twelve years old and built from more or less found materials, was aimed squarely at empowering the individual, who thus untethered could walk around, and have his or her data made available to him or her either in motion or in situ.

As the number of AR headsets today proliferates almost exponentially and the market becomes saturated, veterans like Steve Mann are in a position to lay down some of the guiding principles as to what makes an AR ecosystem of user-provided content successful. One of the defining characteristics is openness: the user’s ability to configure their own resources. If a system is closed, it undermines the whole basis of a meaningful AR – hence the failure of Google Glass, according to Mann, as he outlines during my interview (see link below).

Google Glass makes seeing private: privatized sight, privatized seeing. Through its utterly closed ecosystem of use and apps, it stands in stark contrast to the notion of a democratic and participatory role for what should be as free and open to use as the low-cost pay-as-you-go cellphone. We have a long way to go before any system of AR is truly ‘power to the people’, but the lowering of costs is only a matter of time. A language of AR and a syntax of use, both dependent on the right tools and education in their use, are key here. This is where policy comes in. The relationship of the UK government to the Raspberry Pi Foundation comes to mind: massive subsidy in order to promote broad literacy and creative expression in the population. We need an Arduino-style AR revolution. A Pi-AR, if you will. If Lenin urged Dziga Vertov to make an ‘art of twelve kopeks’, we today need an AR of fifteen dollars.

And the user must be able to customize to their own specifications as much as possible, right down to the hardware where possible. The iPhone and the iPad are closed models, rendering the user a consumer of prepackaged services. AR also offers a new aesthetic opportunity: a new set of social relations defined by interesting, meaningful relationships based on data, places and people. The experimental possibilities of drifting through open fields of participatory urban space, and of moving toward new ways of working and living together through those less managed open spaces, might be possible. A non-neoliberal, technologically mediated commons, in which AR assists in the development of newly reimagined urban possibility.

Interacting through this environment, both figuratively and literally, we need to encourage democratic and participatory models of use for AR. Just as Bruce Sterling identified the SPIME – an object trackable through time, space and virtual space – an augmented subject can consider herself self-consciously a spime, in that she occupies the real world and the virtual world simultaneously, her data influencing her decisions and actions as her body occupies space. It is with the proliferation and deployment of very low-cost wearable computers, based on interoperability and the principle of the user as subject, that Augmented Reality is beginning to mature as a medium and as a technology. Just as with any new technological shift, a new language should logically follow. These and other concepts will be discussed by Steve Mann as part of the general theme of this year’s AWE2015, which is “Superpowers to the People”.

From cinema came the language of the close-up, the long shot and the jump cut, and from computers came the save-as, the cut-and-paste and the selection box. AR is sure to bring with it its own language, with terms such as “flowspace” (the space in which the subject moves such that their data moves with them meaningfully) and objects-as-interface (reaching out to a door handle with AR can have the effect of unlocking the door). Thus a kind of dance of the interplay and overlap of things, places and people with the information pertinent to them, all the time, in real time, will spawn its own new terminology and lingo. It is a performative language, partly of theater, urban planning, cinema, dance and manners. From the world of filmmaking we might call the experience of Augmented Reality – with its floating objects in space and holographic objects interacting with the world around us – a kind of mise-en-scène and directorial scene-blocking in real time. Everyone a director of their own real-time experience.

New ways of seeing are thus required, to quote John Berger, where the age-old Renaissance principle of what Mann calls the ‘point of eye’ – the exact position of the iris where the world we view converges on our gaze – needs to be rethought all over again. It’s one thing to have all the data of the world around you converge on your eyes only; quite another to consider these tools for the population beyond yourself and your own personal needs.

Can we strip away from the singular point of view of the typical user as depicted in the PR materials of Augmented Reality his sense of entitlement and ownership and control, and perhaps through the very same tools, replace them with a new set of ways of viewing the world, less possessive, more inclusive, more considerate of the needs of the planet and its all-too-fragile membrane of a surface? Along with the need for a new language of AR comes a new language of being in the world, which such technologies might just help usher in. If so, Professor Steve Mann is just the kind of progressively minded visionary whose pioneering work in the field gives him the right, quite literally, to light the way.

I interviewed Steve Mann on May 15th, 2015

Here is the link to the audio interview

A link to Augmented World Expo 2015

Media Magazine “INCITE” Issue 5 “Blockbuster” Published

Aug
02

Screen Shot 2016-08-02 at 2.08.41 AM

By David Cox

Cover Showing ILM Annual Company Ritual, INCITE Journal, Fall 2014

INCITE Journal

INCITE Issue #5: BLOCKBUSTER

Fall 2014

ISSN 2163-9701

  • Edited by Peter Nowogrodzki
  • Founding Editor and Publisher: Brett Kashmere
  • Art Director: Eliza Koch
  • Contributing Editors: Christina Battle, David Burnham, Walter Forsberg, Peter Nowogrodzki

INCITE is also available online at www.incite-online.net

INCITE media and technology journal number five’s theme is “Blockbuster”. A central topic is simulation and entertainment in contemporary culture.

The Game of the Future by Anna Ialeggio is about a collection of plywood Middle Eastern ‘everytowns’ built in the California desert near Barstow. The subject of the excellent documentary Full Battle Rattle (2008), this distributed collection of towns is operated by the US military and is used for role-play-based on-site training. The towns, Ialeggio explains, sport, in rather basic plywood and rudimentary form, more or less everything the US needs for its troops to pretend that they are already in the Middle East.

Amy Sloper’s Lab Bags is an article about the fall of film processing labs. It lovingly discusses the promotional illustrations found on the side of those plastic bags that accompanied cans of film sent to and from processing labs. There are several illustrated in the magazine. The witty and stylish designs speak to a time when film professional spoke to film professional about dependability and reliability of service. Now these illustrations are collected as precious ephemera of a disappeared historical moment. Lab Bags and its testament to this doomed form of industrial photochemical film culture is poignant indeed.

The ghosts of old media thus fade, and new ghosts take their place. INCITE nails it again with On Mimetic Polyalloy by Gregory Kalliche, which shines a spotlight on the now-legendary Bay Area special effects company Industrial Light and Magic and its annual corporate bonding ritual, performed somewhere in the desert.

The ILM team forms a giant circle, Kalliche explains, and members pour molten metal into cavities in the desert sand. The team-building metal-pouring gesture visually echoes one of the very first digital motion-picture special-effects moments – the creation of the humanoid T-1000 Terminator in the film Terminator 2 (1991).

In this James Cameron sci-fi action blockbuster epic, a humanoid robot (whose default disguise among humans is as a sly police officer) is made up of liquid metal that can sample or replicate any object or life-form nearby, occasionally also dispatching them with edged or pointed weapons at will.

In the title story Blockbuster, Roger Beebe chronicles his experience as a hip independent video rental store owner in a small town. As a customer loyalty incentive, he offered a free rental if customers were willing to cut up their Blockbuster video rental card and put the pieces in a large glass jar on his counter. This active boycott helped him prevail even long after the pre-streaming version of Netflix did in the Blockbuster chain once and for all.

Kevin B. Lee’s article Premaking a Chinese Hollywood Blockbuster: Transcultural Flows and the Culture of Anticipation in “Transformers: Age of Extinction” is a comprehensive analysis of the ways in which the demands (perceived and otherwise) of the Chinese market have affected production and content decisions in the latest Transformers movie, particularly the depiction of the Transformer robot character Grimlock, shown as a metal T-Rex in the original animated 1980s TV series, who now resembles a Chinese dragon. In this latest Michael Bay blockbuster, Lee sees, in the film and its attendant publicity trailers, evidence of emerging hybrids. Each superpower, Lee argues, by seeking a commercial advantage via the use of the massive movie blockbuster form for its own ends, is inadvertently or otherwise, colonizing the others.

Turing Complete User by Olia Lialina discusses the role of coding skills in software use. She frames the history of the user interface in a broader cultural studies and media-archeological framework. Lialina cites Cory Doctorow, who warns of the “Coming War on General Purpose Computation”. In short, devices and the ‘cloud’ services they rely upon assume and reinforce a world in which the user is less and less a subject of independent agency and more a passive subject facing an increasingly predetermined series of experiences in someone else’s advertising-driven game.

Lialina writes:

In general the WWW, outside of Facebook, is an environment open for interpretation. An effort must be made to educate users about themselves. There should be an understanding of what it means to be a user of an “all purpose automatic digital computing system”

INCITE #5 covers a range of topics characterizing our current time as one of deep ambivalence and fascination with the effects of blockbuster entertainment on our collective psyche and vice versa. It’s a rich compendium of theory and ideas and comes at just the right time for scholars and movie buffs alike. There are pieces on everything from Hollywood’s ongoing effects upon our psyches, to the reality of illusion and superheroes.

INCITE is clearly one of the most happening film journals today and reminds me of 21c, from Melbourne in the mid-nineties. It’s transnational, transcultural and interdisciplinary, and, as in the UK, the digital and the real are starting to create some strange spin-off hybrids. INCITE is tapping the zeitgeist energy beautifully.

Visit the journal website here.

The Ideas of the Renaissance Persist in Games, Simulations

Aug
02

First published by David Cox on January 21, 2015, 2:35 PM MST

The Mona Lisa by Leonardo Da Vinci

Photo by Pascal Le Segretain/Getty Images

Piero della Francesca, the famed Renaissance painter and architect, built arcane secrets into his pictures. Trained in the then very new technique of perspective painting, Piero integrated systems of Euclidean geometry into the formal composition of his paintings. He even included ‘secret’ messages in the subject matter, such as five-sided pentangles, which to those in the know at the time related to the presumed relationship between man, God and the universe. In some pictures, only recently developed techniques have enabled scholars to unlock the secret messages embedded in his paintings. The pictures were ciphers and cryptograms which referred back to the social conditions under which they were made, in order to flatter those who could identify the codes. These conventions were considered part of what it meant to be an educated Renaissance artisan.

The cryptographic, geometric and perspective-driven cosmologies integrated into his work and that of others around the same time – Leonardo da Vinci, Giotto – involved high levels of mathematical abstraction, themselves at the time ‘redeemed’ from Greek antiquity. Using a system which would today be called ‘ray tracing’, and which would now be done using 3D graphics software, Piero was able to calculate the appearance of objects in 3D space by numerically transposing the positions of, say, parts of a human head tilted at an angle. The extraordinary feat was to mathematically conceptualize the body as a fluid dynamic system whose spatial and positional appearance on the canvas could be represented by numbers. The numbers could then be used, quite separate from their real-life referent, to calculate the appearance of the same subject from any angle.
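Piero’s numerical transposition is, in modern terms, perspective projection. As a minimal illustrative sketch (the function and numbers here are mine, not anything from Piero or this article), a point in 3D space lands on the picture plane simply by dividing its horizontal and vertical positions by its depth:

```python
# Minimal perspective projection: map a 3D point onto a 2D picture plane.
# "focal" is the distance between the eye and the picture plane.

def project(x, y, z, focal=1.0):
    """Project the 3D point (x, y, z) onto the z = focal picture plane."""
    if z <= 0:
        raise ValueError("point must be in front of the eye (z > 0)")
    return (focal * x / z, focal * y / z)

# A point twice as far away lands half as far from the center of the canvas:
near = project(1.0, 1.0, 2.0)   # (0.5, 0.5)
far = project(1.0, 1.0, 4.0)    # (0.25, 0.25)
```

The same division-by-depth rule, applied point by point, is what lets a computer graphics program (or a sufficiently patient Renaissance painter) render the same subject from any chosen angle.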

Just as computers now are used as much as cameras to deliver moving pictures to our screens, the common conceptual link between the two technologies is the abstract ‘plane’ upon which the perspective image is imagined to fall. One of Piero’s most famous images is a tilted head, a detail from his painting The Flagellation. The position of the head was one of many he could have settled on when he painted the picture; the subject was not present when it was painted. Rather, the image of the subject had been abstractly transposed numerically by Piero, first into his memory, then onto paper and from paper onto canvas. A computer graphics artist can choose to show a 3D model of a dinosaur or spaceship from any angle, and because computer 3D graphics rely on the centrality of the first-person view of the universe, any graphic can be made to co-habit the orthographic domain of photography.

Film montage emerged from a certain vantage point, a peculiarly 20th Century one. The idea of disjointed, clashing meanings was in common circulation in Europe in the early 20th Century. The political payload which accompanied the aesthetics of montage was powerful indeed. The photomontage images of John Heartfield in Germany in the 1920s were culture jams in the extreme. The proliferation of photographs in print publishing enabled political satire to find expression through the surgical cuts of scalpel on photograph, and the practice of cutting, pasting and reworking still images had its parallel in the development of film editing in Russia. The Eisenstein technique was to make images clash up against each other and, in colliding, give rise to combative new images. This art of montage was the aesthetics of context migration. With film editing, new meanings could be divined from the intersection where images collided in time. With photomontage, the spatial field of the photograph itself was the terrain of a clash of opposites, where powerful hybrids of image with image could occur.

Linking these technologies was the idea that spaces could be traversed without effort, or that technology could mediate space. Photography and cinema aim to place the viewer somewhere other than where they actually are – to transport them, in fact. Cinema and photography both employ spatial fields of view: the Euclidean geometric breakdown of space into geometric forms. Inside a camera, light falls on the film plane and is recorded photochemically, by means of a mechanical shutter.

Aircraft are similarly about the manipulation of forces, which are therefore relatively simple to translate into code for the purposes of making a simulation. Variables like thrust, pitch, yaw, elevation and speed represent the chaos of the movement of air over the wings, of the propeller through the air. Cartography likewise affords a commanding view of the surroundings: empires make maps before invading. The British Empire’s first step prior to setting up India as a giant cheap manufacturing and supply colony was to divide the country up into triangular segments, the better to map it. Longitude meant conceptual ownership.

The Space Race and the Cold War represented the fusing of political and technological imperatives toward a unified Imperial assertion of Superpower supremacy. The quest for space took on a religious overtone in both the USA and the USSR; both elevated space exploration as the pinnacle expression of modernist progress; to boldly go and get “go fever”. It is no accident that Tom Wolfe should valorize the extremes of 1960s expansionism on both the left and right.

The central view predominated in the 1960s much as it had done since the Renaissance. The privileged point of view of the Medici-funded artist was paralleled 400 years later by the NASA- or USSR-backed astronaut. The prize brought back to civilization from the Space Race was the unique view: the space photograph of the Earth, the moon panorama taken from a space suit or the Lunar Module cockpit. Neil Armstrong as Michelangelo’s David. Officialdom needs time and space measured, divided, controlled.

Joseph Nicephore Niepce (creator of the first fixed photograph) was something of a photochemistry hacker, an experimenter with cameras, chemicals and surfaces. Exposure to light and the chemical fixing of the camera obscura’s image was the aim of the first photographers. The very first ‘fixed’ photo was of his own courtyard; Niepce needed to leave the camera somewhere it could sit undisturbed for the long exposure.

Babbage’s Difference Engine had already been designed (though never completed in his lifetime) by the time the first fixed photo was made. Computers have long been closely linked to the conceptualisation of space – Charles Babbage’s famous unfinished prototypes, the Difference Engine of the 1820s and the Analytical Engine of the 1830s, were developed in response to the British Government’s need for better navigational tables for mercantile shipping. The Colossus computer, developed in the UK to crack Nazi radio codes, found itself mainly decoding co-ordinate information such as Atlantic submarine positions, and the like.

The miniaturization of electronic components, which resulted in the development of the personal computer by counterculture hippies in the mid-1970s, was itself the result of the military-industrial complex’s need for small parts for use in missile navigation and space travel. Mapping, architecture and urban planning also play a large role in the development of video games, whose elaborate labyrinths of play and dynamics in turn find eerie expression in the layout and appearance of the contemporary themed shopping precincts of our major cities.

Strategy and games both require abstractions of space, and of the dynamics which take place within them. The Situationist International’s project was that of reclaiming a rapidly modernizing Paris from the clutches of commercialization after its liberation in 1945. Against the sterile rationalist planning of inner-city housing and retail areas they proposed radical alternative uses for cities, which emphasized a sense of free play and advocated a system of activities in art and architecture, film and writing which would ultimately render work and all forms of social control obsolete.

Early parlour toys dallied with sex and the licentious – zoetropes, praxinoscopes and other visual tricks were often delivery mechanisms for lurid porn fantasies and devil images, rather like the proliferation of video recorders in the early 1980s: the boom in initial VCR sales stemmed largely from the newly created home porn video market. The industrial revolution was starting to produce identifiable domestic scientific entertainment forms – the home microscope (a latter-day home computer) offered views into other worlds, the microscopic and the microphotographic. Microphotographs were tiny photos to be viewed through microscopes.

These images are ghostly, even phantasmagoric. At the Sony Center in San Francisco in 1999, before it became the Metreon center as it is now, my wife and I were able to have a moving white-light hologram of us kissing embossed into a card about the size of a credit card. The image of us turning and kissing moves as one angles the card from side to side under a light. To take the hologram, a video camera on a kind of four-foot-long conveyor belt scanned our faces over a period of five seconds as we kissed. The resultant frames were then processed in an adjacent lab, which converted them into a reflective white-light moving-image hologram the size of a large postage stamp. In a sense, the technology of the space/time-based arts like cinema and the space-recording arts like photography have converged to enable moving holograms which record events, albeit short-span ones, and present those events in movie-like images which can be seen in ordinary white light.

The old Sony version of the Metreon has long since given way to its more mundane shopping-mall variant, but I often wonder what happened to the utopian impulse behind the white-light hologram stand that was there when it first opened.

The Renaissance is still with us.

iRig Mic HD microphone

Aug
02

First Published by David Cox in Examiner.com

The Digital Handheld iRig Mic HD microphone

IK Multimedia

iRig Mic HD microphone

The new consumer-level multimedia iRig Mic HD microphone is innovative in that it records voices and sounds in very high quality, in digital form, without any conversion needed before the signal reaches the computer, tablet or smart device. This is significant because doing the analog-to-digital conversion on board the microphone itself effectively turns the microphone into a digital signal processing device in its own right. iRig Mic HD has a 24-bit A/D converter, a 44.1/48 kHz sampling rate and a low-noise/high-definition preamp. The preamp means the signal is ‘good and fat’ by the time it hits the software or recording app.
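To put the 24-bit spec in perspective, a quick back-of-the-envelope calculation (mine, not IK’s) shows why bit depth matters: each bit of converter resolution adds roughly 6 dB of theoretical dynamic range, so 24-bit capture has far more headroom than 16-bit CD audio.

```python
import math

def dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of an ideal n-bit A/D converter, in dB."""
    return 20 * math.log10(2 ** bits)

print(round(dynamic_range_db(16), 1))  # 96.3 dB (16-bit, CD quality)
print(round(dynamic_range_db(24), 1))  # 144.5 dB (a 24-bit converter)
```

Real-world converters fall short of these ideal figures, but the extra bits still mean quiet sources can be captured without the noise floor swallowing them.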

Previously you needed to buy a special audio recorder to get this functionality, with the audio signal recorded directly onto a storage device of some kind, such as a microSD card. I’m thinking of the “Zoom” type recording devices, for example the H1 and H4n field recorders. Alternatively you could buy high-end analog microphones for use with programs such as GarageBand, Pro Tools and other onboard computer recording studios. The problem there is that those microphones require some kind of analog-to-digital conversion hardware. This is an extra step which frankly gets in the way of a good recording idea. So this microphone, in being able to record high-quality sound digitally right from the get-go, is in a class of its own. This is probably why Apple sells it as a high-profile item alongside other peripherals, and why the exclusive silver-colored microphone is sold at the Apple Store separately from the black one, which you can get from IK Multimedia on its own.

I’ve been using the iRig Mic HD digital microphone for several days now in a variety of settings, and I found that it works just as well as a field-recorder microphone, recording audio effects for things such as video and film productions, as it does for multitrack recording of songs, or for interviews when I’m conducting discussions for my blog. It feels nice to hold and is well made, with a nice smooth aluminum finish. There is an LED light which flashes to indicate the signal input level, a handy way of telling whether your signal is too loud or too soft, and a small dial on the mic which lets you adjust the input level and set it in accordance with the volume settings suitable for the kind of recording you are doing. It comes complete with a microphone stand holder with a threaded screw fitting, as well as a plastic adapter to support cables which end in a micro-USB male connector. Two cables come with the microphone: one is the newer Lightning iPhone cable and the other is a standard USB cable; an optional third cable is the Apple iPhone 4-type standard non-Lightning connector. I recommend the IK Multimedia iRig Mic HD microphone. It is another tool in the toolkit for outdoor and indoor high-quality digital audio recording for laptop, tablet and smartphone use.

Features

  • Handheld digital condenser microphone for capturing audio on the go
  • Plugs directly into the digital input on iPhone, iPad, iPod touch and Mac/PC via its included 1.5m (59”) Lightning and USB cables
  • High-quality 24-bit, audiophile-grade A/D converter
  • 44.1 – 48 kHz sampling rate
  • Gain control with multicolor LED indicator
  • Handheld design — also compatible with standard mic stands
  • App/software bundle*, mic clip and carry bag included
  • iRig Recorder, VocaLive and AmpliTube (iOS & Mac/PC versions) included
  • Available in black or in an exclusive silver version (only available at the Apple Store)
  • 30-pin cable available separately

More information here:

http://www.ikmultimedia.com/products/irigmichd/
