SIGNAL FLOWS, ROUTING AND PATCHBAYS

CM Labs CM64 programmable recallable analog patchbays

This post sounds pretty tedious!   Signal flows, routing and patchbays, indeed.

Well, it’s pretty useful stuff, but you don’t need a patchbay in most home studios these days, since most traditional outboard boxes (meaning effects units, compressors, EQs, etc) now come in the form of a plug-in inside your DAW, so you probably don’t have a lot of physical hardware to interconnect.

Accordingly, I’ll only touch on patchbays briefly.  They can still be useful to some of us.  You will, however, use signal flows and routing all the time.

It can be a real problem if you don’t keep your power cables away from your other types of cables, so route them carefully away from audio cables.  If a power cable must cross an audio cable, have it do so at right angles to minimize induced hum and noise.  The ability to induce hum falls away with the square of distance, so even a few inches of separation is adequate if you’re really cramped for space.
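
To put a rough number on that inverse-square claim, here is a tiny back-of-envelope sketch.  The reference distance and units are arbitrary illustrations of my own choosing, not measurements:

```python
# Rough illustration of the inverse-square idea: doubling the separation
# between a power cable and an audio cable cuts induced hum to a quarter.
def relative_hum(distance_cm, reference_cm=2.0):
    """Hum relative to what you'd get at the (arbitrary) reference distance."""
    return (reference_cm / distance_cm) ** 2

for d in (2, 4, 8, 16):
    print(f"{d:>2} cm separation -> {relative_hum(d):.0%} of the hum at 2 cm")
```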

Tidy cables are a challenge, and always will be until there are no more wires used in studios.  We’re getting there slowly.

What you really need to understand is how signals move around the studio, and what different protocols are in use.

What does Mic Level or Line Level mean and what cables are used for these signals?  How loud should my mic or line input signal be when it reaches the DAW inputs?

What about digital formats such as AES or S/PDIF?  What is WordClock and do I need it?  What is unbalanced, other than me?

Well, there are four main types of signal that move around a studio – they are either analog audio, digital audio, MIDI or control signals.  I’ll deal with MIDI in blogs between May 9 and May 13.

Today, we look at control signals and audio signals.

ANALOG AUDIO SIGNALS

In using the interface to route analog signals into your DAW, you will connect a mic, line or instrument level signal to your interface.

Of course, a mic level signal is what a microphone generates as its output.  This needs to be amplified to bring it up to the level expected at the recorder.

A pre-amp is used to do this.  It may be stand-alone (a specific piece of hardware for the purpose called a pre-amplifier, e.g. Neve 1073 or UA 610) or it may be a part of another device – say built into the interface input channels or a mixing console’s mic input channels.
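
Just to put rough numbers on how much amplification that is, here is a quick bit of dB arithmetic.  The example levels (a dynamic mic around -50 dBu, professional line level at +4 dBu) are typical ballpark figures I have chosen for illustration, not values from any particular mic:

```python
# Gain needed to lift a mic-level signal to line level, in dB.
# -50 dBu is a plausible output for a dynamic mic on a moderate source;
# +4 dBu is the usual professional line level. Both figures are illustrative.
mic_level_dbu = -50.0
line_level_dbu = 4.0

gain_needed_db = line_level_dbu - mic_level_dbu
print(f"Preamp gain required: {gain_needed_db:.0f} dB")  # -> 54 dB
```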

On the way to the preamp, the signal is normally sent down an XLR cable with a female socket at one end that mates with the male pins on the mic body.  Male pins deliver an output, and female sockets receive it – I wonder where they got that idea?

The interface end of this cable from the mic will be either another XLR connector (male, of opposite gender to the end connected to the mic so as to mate with a receptive socket on the interface) or it will be a 1/4″ jack connector.  The jack may have one black ring around it, or it may have two rings.

One ring on the jack means the jack is delivering an unbalanced signal, and two rings means it is instead balanced.

The 1/4″ jack may therefore be either of two types, namely a balanced or an unbalanced connection.   Where two conductors (a signal wire plus a ground/shield) are connected to a plug, the connection is unbalanced, but where three are connected (hot, cold and ground) the connection is said to be balanced.

The advantage of balanced connections is that they can reject hum and radio frequency interference (RFI), giving you a noticeably cleaner result in most situations.  However, a balanced cable may not be the best choice in certain circumstances.
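
The rejection trick is easy to see in a toy model: the hot and cold wires carry the signal in opposite polarity, any interference lands on both wires equally, and the balanced input subtracts one wire from the other.  This little sketch just mimics that subtraction with made-up numbers:

```python
# Toy model of a balanced connection rejecting induced hum.
signal = [0.5, -0.2, 0.8, -0.6]     # the wanted audio (arbitrary samples)
hum    = [0.3,  0.3, 0.3,  0.3]     # interference hits both wires equally

hot  = [s + h for s, h in zip(signal, hum)]    # signal + hum
cold = [-s + h for s, h in zip(signal, hum)]   # inverted signal + same hum

# A balanced input takes the difference: the hum cancels, the signal remains.
received = [(h - c) / 2 for h, c in zip(hot, cold)]
print(received)   # -> [0.5, -0.2, 0.8, -0.6], the hum is gone
```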

In a home situation, ideally you’ll use balanced XLR connections or balanced TRS 1/4″ jack connections for all of your analog audio connections.  These cables are a little more expensive but cables are no place to skimp if you want high-quality results.

Any older hi-fi equipment (cassette decks, radio tuners, hi-fi amps, etc) may have an analog RCA connector, often called a phono jack.  This is unusual these days.  It’s an unbalanced connection, having two conductors only.  The cable typically used for it is called co-axial because its two conductors share a common axis – a central signal wire surrounded by a concentric shield.

DIGITAL AUDIO SIGNALS

Digital signals will be in the formats AES/EBU (carried on three-pin XLR cables), S/PDIF (co-axial or optical) or ADAT (optical).  The ADAT optical connector carries eight channels of audio; the same physical connector is also used for two-channel optical S/PDIF.  All these signal protocols work in one direction only, meaning there is a connector for inputs and another connector for outputs.   Connector pins transmit audio, and sockets receive it.  Most DAW interfaces will use ADAT because it uses less physical space on the unit and has the added bonus that, being a light-pulse system, it cannot pick up stray hum, buzzes or radio noise.  It sounds pristine.  S/PDIF and AES can pick up interference because they are electrical connections rather than optical.

Optical connections have their own problems, particularly reflection at the connectors causing data errors, but this is very unusual in practice today.

Oddly enough, the very same “RCA” connector that was typically used for old-school hi-fi analog audio connections is also used for carrying digital audio in the S/PDIF format.  The cable used is different, however: co-axial S/PDIF needs a cable with a 75 ohm impedance to carry digital audio reliably, whereas ordinary analog RCA interconnects are not built to that specification.

The same story applies to AES/EBU over XLR, which expects a 110 ohm cable.  This is why you can safely use a 110 ohm digital AES XLR-to-XLR cable to plug in a microphone, but you shouldn’t rely on a standard mic cable – which is not built to the 110 ohm spec – to carry digital audio.

The controlled impedance does not significantly affect the analog mic signal, so you can safely swap a dead mic cable for a digital audio XLR cable.  If you try to use a mic cable to carry digital, however, the digital audio may not transfer reliably from the one device to the other, particularly over longer runs.

A common symptom of data errors or clocking problems with digital audio you are recording is clicks and pops in the audio at random intervals.  These are easy to miss, but they ruin the recording.  Make sure your clock settings (see WordClock, below) are correct for the transfer you are making, or you will get clicks and pops in your audio – or, at worst, you will not be able to pass audio between the devices at all and will hear awful noise blasting through your system’s speakers.  Be careful with clock settings.  Frequent but random clicks and pops usually mean the two devices have different sample rates set.  Set both to the same rate and the problems should vanish.
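
To see why a sample-rate mismatch causes trouble, it helps to count samples.  A sketch, assuming one device is set to 48 kHz and the other to 44.1 kHz (figures chosen purely for illustration):

```python
# If the sender runs at 48 kHz and the receiver expects 44.1 kHz,
# the receiver is handed far more samples per second than it can place.
sender_rate = 48_000      # samples per second
receiver_rate = 44_100

surplus_per_second = sender_rate - receiver_rate
print(f"Surplus samples every second: {surplus_per_second}")   # 3900

# The mismatch piles up quickly over a 4-minute song:
song_seconds = 4 * 60
print(f"Surplus over the song: {surplus_per_second * song_seconds:,}")  # 936,000
```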

Sony and Philips collaborated in the 1980s, pooling resources to invent the digital audio format S/PDIF, which stands for Sony/Philips Digital Interface Format.  It can be co-axial (using RCA connectors) or optical (using a TOSLINK connector).

Audio can also be passed digitally between equipment, and such equipment has a digital audio connection (or several) for input, output, or both.  In some cases, multiple cables are used together to achieve higher sample rates (a so-called Double-Wire connection).  There are several types of digital connection; the ADAT connector deserves a closer look.

The ADAT connector is an optical format connection.  A full ADAT connection carries eight channels of audio simultaneously, and multitrack devices typically use it that way.  The exact same physical connector is also used for stereo optical S/PDIF on two-channel devices, such as a CD player with an optical output.  Some devices refer to the ADAT format connector as Lightpipe, but it is the same thing by a different name, just as Intel’s Light Peak and Apple’s Thunderbolt are essentially the same technology.

CONTROL SIGNALS

Most control signals are simply electrical pulses that vary between two settings (often ON and OFF) at a fixed rate, allowing remote control of devices.

MIDI does this too, but it is unique, being vastly more sophisticated than any other control signal in use, both in its capabilities and in its sheer variety of applications.  I’ll talk about MIDI at some length at the end of the week.  MIDI is a well-developed control language rather than a simple control signal.

Control signals are easy enough to understand – they are non-audio signals used to pass commands or data between instruments or pieces of equipment.  They can be analog or digital, and both types are common.  They let you control the actions of devices, and often you’ll simply set devices up to tell each other what to do automatically, by selecting a parameter value or sending a command from the front panel of the device in question.  Making the physical connection and turning on the required feature at the device is often all you need to do – some units allow remote control of individual features, but mostly control signals are used simply to synchronize playback between different devices.

There are various types of control signal in a studio, from simple old-school CV (Control Voltage) control of a synth to the modern digital clocking scheme WordClock (which is only used with digital equipment).  Any of you who are analog tape users will still be using SMPTE timecode.  SMPTE is an acronym for the Society of Motion Picture and Television Engineers, and the timecode named after it is a control protocol.  It’s a time-based audio signal you record onto one track of a multitrack analog tape machine, which then plays it back at a fixed speed, allowing other equipment to read it and synchronize their own transport controls (Play, Stop, etc) with the tape machine playing back the SMPTE code.  This allows two or more recorders to play in time together.
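
SMPTE timecode locations are written hours:minutes:seconds:frames, so converting one into an absolute position is simple arithmetic once you know the frame rate.  A minimal sketch (25 fps chosen just as an example rate; drop-frame formats need extra handling):

```python
# Convert a SMPTE-style timecode string (HH:MM:SS:FF) to elapsed seconds.
def timecode_to_seconds(tc: str, frames_per_second: int = 25) -> float:
    hours, minutes, seconds, frames = (int(part) for part in tc.split(":"))
    return hours * 3600 + minutes * 60 + seconds + frames / frames_per_second

print(timecode_to_seconds("01:23:45:12"))   # -> 5025.48 seconds at 25 fps
```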

You won’t need to use SMPTE if you are using a DAW unless you want to synchronize a tape recorder’s transport control operations with the DAW’s own transport.  Unlikely, today, since a DAW is so powerful and can emulate the sound of analog tape pretty effectively with the right plug-ins.

In a DAW, you’ll use MIDI for almost all of these “control” tasks.  MIDI can pass musical data around as well as control signals, but it does so in the form of commands and other data that tell a device to play, rather than being (or even representing) actual recorded audio.

I’ll talk about that for some days starting May 9, as I said earlier.  MIDI is incredibly useful and deep, despite its venerable age.

If you connect any two devices with a digital cable (especially including your DAW interface) you will almost certainly want to know about WordClock, an essential control signal in any studio with several digital audio devices.  If you are simply making a single connection to a DAW digital input and nothing more complicated, and you have set the DAW to lock to the incoming input, you won’t need a standalone WordClock generator.  If you do ANYTHING else, you will need one.  These are often called Master Clocks, and a common studio example is the Apogee Big Ben.  They act as a digital clock master for all devices in the studio which follow digital clocking – anything that passes digital audio in or out, in other words.

WORD CLOCK

WordClock only appears on digital devices, and it does not appear on all of them by any means.  Any sophisticated digital audio device will typically have WC if it needs to pass data, so you will see it mostly if you buy more sophisticated or expensive hardware.  Used equipment often has a WC connector.  Some synth modules have them, such as the Roland XV-5080.   Old R-DAT (Rotary Digital Audio Tape, more casually just DAT) players often have them.  If it inputs or outputs digital audio, it might well have a WC connector.

What is WordClock, exactly?  There are a few flavours, but the concept is the same in all cases.  This is because all digital audio is time-based – it always has a specific sample rate of so many samples per second that is agreed upon for a session.

WC synchronizes the data delivery from one unit to another, so that in any one second of time exactly the same number of samples has been dealt with on both machines, from exactly the same start position relative to the session or project timeline.
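
Here is why a shared clock matters, as a back-of-envelope calculation.  Suppose two devices are both nominally set to 48 kHz but are running on their own internal crystals that differ by 50 parts per million – a made-up but plausible tolerance:

```python
# Two free-running 48 kHz clocks that differ by 50 ppm slowly walk apart.
nominal_rate = 48_000          # samples per second
ppm_error = 50                 # difference between the two internal clocks

drift_per_second = nominal_rate * ppm_error / 1_000_000
print(f"Drift: {drift_per_second:.1f} samples/second")          # 2.4
print(f"After a minute: {drift_per_second * 60:.0f} samples")   # 144
# With a single WordClock master, both devices count samples in lockstep
# and this drift never accumulates.
```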

You MUST use WordClock at all times if you pass audio over a digital connection from one WC device to another (i.e. using AES or S/PDIF co-axial or ADAT optical connections instead of the alternative analog cables).  The audio itself is not passed over the WC cable – it goes via the digital out on the sending device to the digital in connector on the receiving device.

If all you do is play digital audio on a device and record its digital output into your DAW, you should find that the DAW will simply lock to the incoming digital data stream automatically.  Devices must always lock to each other over a digital connection, or their timing will drift apart and errors will start to happen.

WARNING: ALWAYS MUTE the send to your amp and speakers before connecting a control signal cable, whatever the type.

If you have not set the WordClock settings appropriately on each device before interconnecting them with a digital audio cable, you will simply get a dreadful hash of noise (and I mean appallingly unbearable screaming torture) blasting at full volume into your amp and speakers the instant you try to listen to the device at the DAW’s input (by selecting it from the input drop-down menu on the DAW track you are planning to record to).

Although it seems like a risky thing to be using from the above description, a WordClock device will not pass usable audio if the settings are improperly made – which is when the awful noise occurs – and that is why you HAVE to use it correctly to pass audio in multiple WC device set-ups.

The ends of a WC cable look much like a household cable-TV connection, and the connector type is a BNC.  The connector on the back of the device may be marked WC, WordClock, or WC Out.  Bi-directional devices will have a WC In and a WC Out, and in some cases a WC Thru.

If you have three or more WC devices, you will probably start thinking about a WordClock distribution amplifier, particularly if you expect to add more.  This is a box which accepts an incoming WC signal, and then spits out a regenerated, refreshed version of the signal, still in perfect synchronization with the original.  The box usually provides multiple duplicates of the signal to send on to other devices, typically five to seven extra outputs.

I use an Apogee Big Ben as the master WC device most of the time, which itself provides multiple outputs.  Since I have a significant number of digital devices, I have had to use a Sonifex RedBox WordClock distribution amplifier to refresh one of the Big Ben’s seven outputs another seven times.  It is actually very simple and reliable in operation once you decide on the right settings and set the devices accordingly.  You won’t need that kind of firepower in the typical home setting, but I wanted to demonstrate that it is not very difficult to use even when the interconnections are a bit convoluted.

One WC device is ALWAYS in charge (you choose which device as needs arise and tell it to act as the Master), no matter how many devices are receiving or sending the WordClock, and all other devices using WordClock will follow the Master’s timing pulses in the data stream, synchronizing all digital data flows.

Learn WordClock and you can relax about connecting digital audio devices together.  Ignore it at your peril!

ROUTING

Routing simply means how you move audio signals from one “place” to another.

A buss is what we call the virtual equivalent of the cable in a DAW.  A buss is a pathway for audio to travel.  You assign things to a buss, and off they go.

In the analog world, busses are used all the time, but in that case it is an internal cabling arrangement – a wiring circuit carrying audio from one place to another.  Mixing consoles use busses to move audio around the sections of the console, from inputs, to metering, to subgroups, to outputs, and so on.

This can be done in analog or digital formats, of course.  Once in your DAW, you’re working digitally.

Say you put a single SM57 mic right in front of your guitar amp, a few inches away for a little air, or right up smack against the cloth at the edge of the best-sounding cone, whichever you prefer.

You route the sound to the DAW input via the mic connected to the DAW.  The DAW channel deals with the signal according to what you do in the channel – you switch in the EQ, then boost the high mids at 4kHz and cut the low mids at 350Hz, say.  Oh yeah, you like the sound, and it doesn’t seem to obscure other sounds already recorded in any way you can’t tweak later.

When you plug that mic into the interface, it actually goes to an internal analog preamplifier, which boosts it to line level.  The signal is then converted to digital audio by an A/D converter and sent on to the DAW input channel for any further processing, then routed from the channel output to the recorder (the DAW) as you record on that track.
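
As a crude picture of what the A/D converter is doing, here is a sketch that samples a sine wave and rounds each sample to a 16-bit integer value.  The rate, bit depth and frequency are arbitrary choices for the illustration, not a description of any particular converter:

```python
import math

# Crude sketch of sampling and quantizing an analog waveform.
sample_rate = 48_000                    # samples per second
bit_depth = 16
full_scale = 2 ** (bit_depth - 1) - 1   # 32767 for 16-bit audio

def sample_sine(frequency_hz=440.0, n_samples=8):
    """Return the first few quantized samples of a full-scale sine wave."""
    return [
        round(full_scale * math.sin(2 * math.pi * frequency_hz * n / sample_rate))
        for n in range(n_samples)
    ]

print(sample_sine())   # -> [0, 1886, 3766, ...] eight quantized integer samples
```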

Now you want the sound to join the others in the main stereo mix, so you can hear it there, and you have the output for that track routed to the main L/R mix (you can choose if you are monitoring live input or playing back existing audio on the track).  This is routing.

Routing is simply telling the signal where to go.  In the simplest case it is a level and a pan control, and in more complex environments like a DAW you will find numerous possibilities on your Channel’s Output drop-down menu.

If you had three mics on three singers, for example, then normally you would want to route each mic to a separate track.  If, instead, you wanted to simply record the whole thing on two tracks as a stereo mix (this is not advisable on a DAW, as you have lots of tracks!) you would select L/R as the output for each channel, and then use the pan control of each channel to place the sound in that stereo image.
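
Conceptually, routing those three channels to the stereo mix is just “multiply by a pan weight, then sum”, which the little sketch below mimics.  None of this is how any particular DAW is implemented internally; it is just the idea of an L/R buss, using an equal-power pan as an example:

```python
import math

# Toy model of three channels panned and summed onto a stereo L/R buss.
def pan_weights(pan):
    """pan: -1.0 = hard left, 0.0 = centre, +1.0 = hard right (equal-power)."""
    angle = (pan + 1) * math.pi / 4
    return math.cos(angle), math.sin(angle)   # (left gain, right gain)

channels = [
    {"name": "singer 1", "sample": 0.40, "pan": -0.5},
    {"name": "singer 2", "sample": 0.30, "pan":  0.0},
    {"name": "singer 3", "sample": 0.25, "pan": +0.5},
]

left = right = 0.0
for ch in channels:
    l_gain, r_gain = pan_weights(ch["pan"])
    left += ch["sample"] * l_gain
    right += ch["sample"] * r_gain

print(f"L buss: {left:.3f}  R buss: {right:.3f}")
```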

On old-school analog mixers, say an 8-group mixer, you would find buttons marked 1/2, 3/4, 5/6, 7/8, and another L/R button that lets you route directly to the main stereo mix.  An eight-group mixer is simply one with eight subgroups; budget models usually have four groups instead.  Groups are an optional stop on the journey between the individual channels’ outputs and the main stereo L/R fader.  It can be very useful to group a bunch of signals together, say the drumkit mics, simply so you can turn them up or down simultaneously as one group just by moving their group fader.

You group and route signals together in much the same way when you use the same aux send buss on more than one channel, say when sending multiple tracks to a reverb.

The 1/2 in the list above is the first of four stereo subgroup pairs in this case, not the main stereo mix.  You would then route the four sub-group faders so that they fed the main stereo buss – if you wanted to include their audio in the mix.  This is not always wanted, which is why you get routing buttons.
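
The same summing idea explains why a subgroup fader is handy: all the drum mics are first summed onto the group buss, and a single group gain then scales them together on the way to the stereo mix.  A minimal sketch of just that scaling, with arbitrary channel values:

```python
# Toy model: several drum mics summed to one subgroup, controlled by one fader.
drum_mics = {"kick": 0.5, "snare": 0.4, "overhead L": 0.2, "overhead R": 0.2}

group_fader = 0.8                       # one control for the whole kit
group_sum = sum(drum_mics.values())     # the subgroup buss

to_main_mix = group_sum * group_fader   # what the main L/R buss receives
print(f"Drum group feeding the mix: {to_main_mix:.2f}")   # 1.04
```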

Obviously, routing takes place in both the analog and the digital environment.  You don’t get far without it.

PATCHBAYS

These can be analog or digital.  They can be huge and complicated or they can be basic and functionally transparent.  The ones I recommend for songwriters are the Samson range, because they cost very little and are well-built and easy to use and understand.

They might look fearsome, but don’t worry!  They are there to make things easier, and they do.

You don’t have to change connections in a studio constantly, but some of them do get changed every day.  You might want to connect a device a friend has provided or patch in an analog effects unit for a song.  To do this, you have to make connections.

Plugging and unplugging ad nauseam wears out expensive cable connectors, and sockets or pins on the devices themselves.  It can be hard to do because the connections are often at the back of hardware, not on the front.  If you rack-mount things, even more hassles can arise when you want to patch stuff in as external send/return devices for your DAW in different ways on different mixes.  It becomes an awkward pain in no time.

Patchbays make things convenient and functional by bringing all your connections up in one place.  A modern patch-bay for a home studio lets you wire line-level devices (but not microphones or instruments) to the back of the patchbay.  Most patchbays are arranged as two rows of 1/4″ TRS jack sockets across the front of the unit, with two matching rows across the back.  The back is where you plug in the ins and outs from your devices, at which point they become available at the front of the unit (when set to do so) for short cables to be inserted to make connections between different sockets on the front of the patchbay.

You might leave your effects and devices permanently wired to the patchbay, and then just put very short cables in the front of the patchbay when you want to interconnect two devices, say an interface output to a hardware compressor input with one cable then back to an interface input from the output of the compressor with another cable.  One pair of cables at the front of the patchbay lets you do this once the patchbay rear is wired up.  You leave the rear wired permanently and work daily with only the front of the patchbay.  Very handy.
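
If it helps to picture it, the rear of the patchbay is a fixed wiring map and the front patch cables are just temporary links on top of it.  A toy model, with all the device and jack names invented purely for illustration:

```python
# Toy model of a patchbay: the rear wiring is permanent, front cables are not.
rear_wiring = {
    "front 1A": "interface output 1",    # each front jack mirrors a rear jack
    "front 1B": "compressor input",
    "front 2A": "compressor output",
    "front 2B": "interface input 5",
}

# Two short cables on the front of the bay make the send/return loop:
front_cables = [("front 1A", "front 1B"), ("front 2A", "front 2B")]

for src, dst in front_cables:
    print(f"{rear_wiring[src]}  -->  {rear_wiring[dst]}")
# interface output 1  -->  compressor input
# compressor output  -->  interface input 5
```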

There are digital patchbays for AES signals by Z-Systems that work very well.  I have a few of those.  They simply use button pressing rather than cables to route signals around an internal matrix.

That’s it for today, and I’ll see you tomorrow for “Your Monitoring Environment”.
