MIDI PROGRAMMING – SYNTHS AND SOUND DESIGN

 

The Roland JP-8080, a popular tabletop rack-mountable synth module

First, a note to the daily followers of this blog – thanks for following this blog and my apologies to you for my late posting – I had a big bout of dental surgery yesterday.  I’m sore but glad it’s over, and I’m catching up on the blogging now, so do stick around.

A word to plug-in haters.  Welcome to the 21st century, blues pianists and jazz organ cats!   What do you mean, you don’t want to learn about bleeping synths?  Why be a Luddite?

It’s synth heaven nowadays, and anyone can make sounds of astonishing variety and complexity.  You can do virtually anything to incoming audio that you can imagine, and you can generate incredible soundscapes filled with motion and interest, ever-evolving.  You can even have the things write music for you using music generation algorithms of one sort or another, such as the process at the heart of the KORG Karma.

Given the right tools and a firm grip on synthesis jargon and techniques, if you can think it, you can do it, for the most part.

If you already love using synths and you’re bored with the presets, here are a few things you might do with a synth, working from scratch to suit your own desires.

You might easily make a drum kit – roll your own snares, kicks, claps, hi-hats, whatever!  Make fat, funky, righteous bass sounds that shake the house.  If you must try to shift your internal organs with sound, it’s possible  ;-) but inadvisable.  Don’t try that on your small monitors.

How about shimmering pianos laced with dreamy choirs of voices and fizzing filter sweeps?

In fact, entire careers are built on sticking to presets designed by the experts, though you won’t learn a lot that way beyond a guided tour of what works and what doesn’t.  Sticking with what works is smart, of course, but it’s also playing it safe and not breaking new ground.  The really smart move is to learn everything you can, and there’s no end to the usefulness of synthesis in music production.

It’s phenomenal how often synthesis techniques are used as part of the mix process, even in recordings you would swear did not include a synth component in any way.

For example, many a kick drum is set up on playback to trigger a sine wave oscillator playing a very low note through a noise gate timed to the rhythm feel.  The synth note is best chosen at a pitch too low for human ears to place with any accuracy.
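To make that concrete, here is a minimal sketch of the trick in Python, assuming the kick track is already loaded as a mono NumPy array.  The names (kick, sample_rate) and the threshold and mix values are illustrative, not taken from any particular gate or plug-in:

```python
import numpy as np

def gated_sub_sine(kick, sample_rate, freq_hz=40.0, threshold=0.2, mix=0.5):
    """Layer a low sine under a kick track; the sine passes only while
    the kick's level holds the gate open."""
    t = np.arange(len(kick)) / sample_rate
    sub = np.sin(2.0 * np.pi * freq_hz * t)          # very low sine, ~40 Hz
    gate = (np.abs(kick) > threshold).astype(float)  # crude level detector
    return kick + mix * gate * sub                   # gated sub mixed underneath
```

A real gate would add attack, hold and release smoothing rather than switching hard at a threshold, but the principle is the same: the sub-sine only sounds while the kick is sounding.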

Drums have differing pitches to some extent, depending on how hard you hit them, how tightly they are tensioned, and how evenly the drumhead has been tuned.  They can also vibrate sympathetically with neighbouring drums, or even with other instruments like pianos, acoustic guitars, etc.

Just sing near the soundhole of an acoustic guitar or mandolin to see how easy it is to set the resonating chamber afire with sound.   The strike of a drum will do this if other things are nearby and free to resonate in sympathy.

All this means that the trick sounds fine no matter what pitch the kick drum sounds at.  Pitch is largely a moving target in a drum performance, although each drum has a fixed fundamental and a series of overtones that dominate for reasons of mechanical construction.  Bending the head by differing amounts as you strike changes the pitch a little: every strike bends the head inwards, and the head then rebounds and gradually loses energy as it returns to a stationary state – assuming you can wait that long when you’re drumming!

So, in a nutshell, synths may be underpinning the lows of the floor tom and/or kick drum even though you can’t pick them out as discrete sounds, and there are lots of other non-obvious uses for synth components in audio production.

Many plug-in synths are so powerful today that you can’t really think of them as “mere” plug-ins.  Rather, many are vast powerhouses of sounds of every description that can fool with sound in a gazillion ways and play back entire productions in stereo at their outputs, complete with the orchestra and choirs, all the reverb and delay you want on discrete elements, and total control over where it all goes in the stereo image and how loud each element is mixed.

Synths and sound design go hand in hand.  The technical jargon of everyday synthesis provides us with a robust language for discussing properties of sound itself and of specific sounds, and gives us a means for communicating ideas about tones and sounds with other people in the course of making music.

Whether plug-in or hardware, there is little practical difference between synthesizers these days.

If you are a die-hard analog synth user who has not tried plug-ins for a while, I must insist you try them out again before judging them.

Since the first virtual synths began appearing in DAWs in the late 1990s, virtual synthesizers have been getting ever more powerful.  The limitation most analog players used to complain about before making the transition to plug-in synths was the loss of physical knobs, switches and sliders to play with – and some preferred single-function controls wherever possible.  No problem, my 20th century friends.

Hardware controls are available on the MIDI controllers and/or MIDI keyboards hooked up to your DAW, so you can keep interacting physically with the controls for your synth even though the synth you are using is now most likely virtual.

This flexibility is at the heart of sound design, and why the synthesizer is the backbone of sound design.  Ever since the days of the BBC Radiophonic Workshop, the synth has been employed for weird and wonderful sounds as well as for the more mundane tasks of impersonating actual real-world instruments.

Ron Grainer’s fabulous theme music for the original 1960s TV series Doctor Who is iconic in the field of synthesis.

Arch-enemies of the Doctor were the Daleks, withered brains inside metal salt-shaker-shaped machinery, whose characters were voiced using sound effects created with a synth technique called ring modulation – a classic synthesis tool of limited use which, in the Dalek voice, reached a pinnacle of functional horror.  When Daleks rolled across the set, yelling “Exterminate!” in that escalating shriek, kids everywhere would hide behind the furniture.
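Ring modulation itself is disarmingly simple: multiply one signal by another, which leaves only the sum and difference frequencies of the two.  Here is a minimal sketch in Python, assuming a mono voice recording as a NumPy array; the 30 Hz carrier is illustrative, not a claim about the BBC’s actual settings:

```python
import numpy as np

def ring_modulate(voice, sample_rate, carrier_hz=30.0):
    """Multiply the voice by a sine carrier. The output contains the sum
    and difference frequencies of the two signals, not the originals."""
    t = np.arange(len(voice)) / sample_rate
    carrier = np.sin(2.0 * np.pi * carrier_hz * t)
    return voice * carrier
```

With a low carrier like this, the result is the metallic, throbbing quality that made the Dalek voices so unsettling.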

The wonderful music was, in modern parlance, “programmed”, but unlike today, this was done by patching discrete electronic components, many of them modular, and practising live moves with specific controls: setting up start and end positions for each control to be moved, and then recording these sections and pieces of the various synthetic sounds.

They would find ways to write down what needed doing so they could more or less successfully recreate the sound later – often a fruitless task, but the best they could do at the time.  This topic will be covered more fully in my next blog, “SYNTHESIZERS – A WAY TO DESCRIBE SOUND”.

The technical assistant in the white coat at the BBC’s Radiophonic Workshop in England at this time was Delia Derbyshire, an under-appreciated and often overlooked female pioneer of synthesis.  She would do this unglamorous work for long hours, capturing bizarre snippets of otherworldly sounds to analog tape and splicing the various recordings together, stitching a glorious quilt of synthetic audio that lives on today in modified form on broadcast networks throughout the globe.   Thanks, Delia, Ron and the BBC.  What a show you made!

Every year around tax time I find myself agitatedly impersonating the Daleks.  “EXTERMINATE!”  Try it, it’s very liberating!

At first, everyone tried to make synths sound like instruments that were already known, such as trombone or violin or flute.

It wasn’t just the musical instruments of the orchestra and band that early synths tried to mimic.  These early synths provided a welcome means of making all kinds of theatrical sound effects, particularly stormy weather sounds like thunder, wind, hail or rain.  This was possible in large part because of components acting as filters or envelope generators that could be adjusted in real time.  The same tools were soon found to be useful in adding realism to the mimicry of real instruments.

As for Doctor Who itself, the show was a very original sci-fi series which began in the 1960s in black and white with a shoestring special effects budget, but it survives today with legions of eager fans and currently stars the charismatic British actor Matt Smith in the titular role of the Tardis-equipped Time Lord.  The incredible and historic theme tune has been reworked with modern synthesizers, but its essence is recognizable and essentially intact.

The synths used for this early work were effectively sonic building blocks: mostly primitive electronic circuits that dealt with electronically generated sound in various ways, with the modules interconnected by patch cords in most cases, allowing different interconnection schemes depending on need.  If you wanted to change the sound significantly, it would often require re-patching the cables between the input and output sockets on the various analog electronic components.

Each switch or knob or slider on these early synths performed a solitary function – multi-function operations were impossible using a single control.  There were no “pages” or screen displays, as today, so information about what you were doing sonically was delivered in the geekiest ways.

You would trace patch cords to see what connections were made between circuits and at first there were no memories to store the changes you made.   They were simply made on the fly, and if you wanted to “restore” a cool sound, you would be forced to write out a recall sheet detailing all the settings and interconnections, and include written or drawn notes on what you had moved in real-time and when during your performances.

The odds of getting a sound back after you had made changes were slim to none most of the time.  But the sounds were so cool that this did not put musicians off exploring the possibilities of synthesis with gusto throughout the 1960s, with artists from The Beach Boys to The Beatles employing synths built by synth guru Bob Moog to add interesting sonic textures and effects to their music.  Various synth novelty hits came along, such as “Popcorn”, along with the ever-popular and much-recycled vocoder sounds heard on ELO (Electric Light Orchestra) recordings and countless others.

Wendy Carlos made great strides for women in electronic music in the 1970s with her own popular synthesizer recordings.

Dr. Robert Moog made perhaps the greatest synths of all, in the eyes of most synth fans.  His finest creation is probably the MiniMoog, a fantastic analog synth that provided many equally fantastic bass lines heard on countless hit records like “Jive Talkin'” (the Bee Gees), “Tom Sawyer” (Rush) and “Flash Light” (Parliament).

One of Moog’s greatest innovations was his patented “ladder” filter, which contributed greatly to the gloriously funky bass sounds users could coax from the MiniMoog.
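For the curious, here is a heavily simplified digital sketch of the ladder idea: four one-pole lowpass stages in series, with the output of the last stage fed back to the input and scaled by a resonance amount.  This illustrates the topology only; it is not Moog’s patented analog circuit, and the coefficient math is the crudest usable approximation:

```python
import numpy as np

def ladder_lowpass(x, sample_rate, cutoff_hz=200.0, resonance=2.0):
    """Four cascaded one-pole lowpass stages with global resonance
    feedback - the basic shape of a ladder-style filter."""
    g = 1.0 - np.exp(-2.0 * np.pi * cutoff_hz / sample_rate)  # per-stage coefficient
    s = [0.0, 0.0, 0.0, 0.0]          # one state value per stage
    y = np.zeros(len(x))
    for n in range(len(x)):
        u = x[n] - resonance * s[3]   # feedback from the final stage
        for i in range(4):
            s[i] += g * (u - s[i])    # one-pole lowpass per stage
            u = s[i]
        y[n] = s[3]
    return y
```

In this idealized model the resonance runs from 0 to about 4, with self-oscillation at the top of that range – exactly the squelchy behaviour Moog fans prize.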

In the late 1970’s the advent of the microprocessor (digital chips) changed everything.  Suddenly, all sorts of new tricks were possible at more or less the same cost to manufacturers as before.  Users celebrated by enthusiastically latching on to the new chip-enhanced products.

The first major player on the scene was Sequential Circuits’ Prophet-5 in 1978, which famously supplied the helicopter sound effects for the superb anti-war movie “Apocalypse Now”.  The earliest Prophet-5 models still retained the “one knob, one job” functionality, but gradually multi-function controls appeared.  These early Prophet-5s had to be left switched on for 20 minutes or so to warm up if you wanted them to work properly and play (more or less) in tune.

Over time, things got better, and multi-function buttons cut costs for manufacturers.  Soon there were relatively complex operating systems in use, with multiple menus and modes and seemingly endless pages of information displayed for users to scroll through, making changes to settings as desired – once they navigated the often labyrinthine synth systems.

MIDI became the norm, and soon keyboards morphed into modules containing the sounds but with no attached keyboard; they were controlled instead by external keyboards connected via MIDI cables.  These are standard 5-pin DIN cables, although not all of the pins are required by the MIDI spec.  An example is the Roland JP-8080 pictured at the top of this blog post, which would naturally be played via a keyboard or a sequencer, in either case hooked up with MIDI cables.
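Nowadays you can drive such a module from a few lines of code as easily as from a keyboard.  Here is a minimal sketch using Python’s mido library, assuming it is installed (along with a backend such as python-rtmidi) and that the module shows up as a MIDI output port; the program and note numbers are illustrative:

```python
import time
import mido

# Open the default MIDI output port; pass a port name to target a
# specific device such as a hardware sound module.
with mido.open_output() as port:
    port.send(mido.Message('program_change', program=38))      # choose a patch
    port.send(mido.Message('note_on', note=36, velocity=100))  # low C, struck hard
    time.sleep(0.5)                                            # hold for half a second
    port.send(mido.Message('note_off', note=36))               # release the note
```

The same three message types – program change, note on, note off – are what a keyboard sends down the 5-pin cable, which is why a sequencer can stand in for a player so transparently.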

MIDI ports

Fortunately, plug-ins are adept at these operational requirements.  They are built to work in a friendly, fast way, carrying the bulk of MIDI and analog techniques into a virtual environment and bringing synthesis to the desktop of anyone with the will to have a go.  There are simple free synths all over the internet, and free trials abound for the larger products.

Today there are no more tuning discrepancies, unless they are desired and deliberately modelled in software for authenticity.  Sonic adventuring is now simple to do and even more fun than it ever was.
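Modelling that kind of analog drift is itself trivial in software.  Here is a tiny sketch that nudges each note’s frequency by a random offset measured in cents; the drift range is an illustrative value, not taken from any particular vintage instrument:

```python
import random

def drifted_freq(freq_hz, max_drift_cents=8.0):
    """Return the note frequency nudged by a random detune, mimicking
    the tuning instability of vintage analog oscillators."""
    cents = random.uniform(-max_drift_cents, max_drift_cents)
    return freq_hz * 2.0 ** (cents / 1200.0)  # 100 cents per semitone
```

Call it once per note-on and the result is a pleasingly imperfect, old-school wobble in an otherwise rock-solid digital oscillator.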

You can even use a standalone Moog synth app on an iOS device like the iPad while you ride a bus or train.

We’ve come a long way, baby.  Synths rule the roost.  There are more hits made without guitars and with synths today than ever before.  Well over half the records in the chart on any given day will now be synth-heavy and probably have minimal guitar parts, if any.  In my opinion, that’s wonderful, but I’m not giving up my guitars any time soon!

Tomorrow, I will be discussing the nuts and bolts of synthesis in “SYNTHESIZERS – A WAY TO DESCRIBE SOUND”.   See you then, and until then, may you make happy squelchy noises to your heart’s content.

 
