Words by Emma Warren
Jake Jackson has been working solidly since March, including a month on five hours of music for a computer game that he can’t talk about, as well as recording some of the theme tunes for WandaVision and mixing the music for Loki.
There’s always a tantalisingly long list of things that this award-winning recording engineer, mix engineer, and producer for film, TV, and games can’t talk about. The things he can reveal – on top of recent credits on The Green Knight and The Last Letter From Your Lover – are recording the music for Andrew Dominik’s new Marilyn Monroe film Blonde; a new natural history show for the BBC; and a new Shaun The Sheep for Aardman. He’s been busy mixing and recording music for gaming too: he worked on all three of the 2021 Ivor Novello nominations for Best Original Video Game Score (recording half of Ghost of Tsushima and all of the music for Little Orpheus and Ori and the Will of the Wisps).
It goes without saying that he’s got a good ear. He was taken to classical concerts by his grandfather from a young age and was surrounded by music before that – his dad was in progressive rock band Van der Graaf Generator, often playing two saxophones at once in a style inspired by jazz musician Rahsaan Roland Kirk and making the most of octave effect boxes. Jake played in brass bands until the age of 18, did a music degree, and was DJing weekly before starting at AIR Studios as a runner in the late 1990s. “Music just floats through me,” he says. “It’s a love affair with this art medium and I just want to make the music sound as good as I possibly can.”
You mixed the music for Loki but weren’t involved in the recording. What are the challenges around working with material you didn’t record?
That was a fantastic experience. It’s the first time I’ve worked for the Marvel Cinematic Universe, although I recorded some of the theme tunes for WandaVision during the first lockdown. It’s always a bit tricky to work with material you didn’t record, just because it takes longer to understand what you’ve got in front of you.
The greatest joy was fitting that into the other world [composer] Natalie Holt created with tape loops, theremin, and Hardanger fiddle, combining that traditional Marvel world of orchestra and percussion with Natalie’s really cool interpretation of electronics. She had a very clear idea of the balance she wanted in the sound, but I tried to push it as far as I could to bring my take on it, to blend the two worlds together, which I think worked really well.
What else can you tell us about working with Natalie Holt?
It was the most collaboration I’d done during lockdown, in terms of time spent together. I took the [orchestral] recordings and her material, worked on it for a few days, then gave her mixes. She’d give me notes, then we’d message backwards and forwards. Then we’d spend a day or two per episode going through it together online, me playing, her tweaking. Even when she approved it, we listened through and tweaked it. We spent six months working on it.
What were the challenges?
It was a big break for her, and a big project for me. I knew how important the music was to the show, how cool it was, and how well it would be received. We talked a lot about that. There’s a big scene in Episode 5 to do with the ‘Ride of the Valkyries’ theme, and we had a lot of discussions about how much of that to use. It’s not her music, but it’s important to the fans. We spent a lot of time refining the mix, which I’m incredibly proud of. I get a lot of joy seeing how it’s received – it’s a measure of what I’ve done. The day it came out I was scrolling through Twitter, seeing people’s responses, and I felt really engrossed, really part of it.
The old phrase for at least part of your work was ‘balance engineer’. How important is balance to what you do?
Balance is my core value. If you can make music sound balanced you’re halfway to getting a great mix. During a recording session, there’s a balance between getting what you need and the time and budget you have. You have to balance the requirements of the piece of music you just recorded against the amount of music you still have to record and the amount of time left.
Is it also a job of translation?
There’s the translation from the composer, which is generally straightforward because we’re talking musician to musician. With producers or directors, sometimes they may not get involved in the music recording process as much. There is that discussion where someone says they don’t like what they’re hearing, and my job is to translate that – is it something big or something small? Their ear might be drawn to the balance of a certain instrument, or it might be the timbre of a certain instrument. If you can take a clarinet part and pass it to the flute and they love it, it’s the same piece of music, just a different timbre. Then you’ve got to translate it for the paying public: what they’re expecting, what they like, what’s commercial, what’s necessary.
You also have credits on A24’s recent The Green Knight – engineering, co-mixing, co-production. Can you talk us through the recording process?
We recorded it in January 2020 over four days, then spent two days mixing at my home studio. Then [composer] Daniel Hart took the music back to America and mixed it again with the rest of the score. It was a real collaboration. What was so exciting was the extraordinary sounds we got. We recorded a seven-strong female choir, concert harp, Celtic harp, percussion, cello, and a recorder ensemble.
What can you tell us about the technical challenges on this recording?
The recorder ensembles were the most interesting. We were using recorders I’d never seen before – Paetzolds. They’re massive, as big as a human being. They’re really low, and they’re quite quiet. Despite them being so big, you have to blow softly to make them sound pretty. Learning how to record an instrument without any prior knowledge of it is a challenge. I’ll talk to the musicians to learn from them, then add my take on how to make it filmic, and then mix it with the right amount of reverb. The musicians would have two or three recorders each, then they’d swap around, so it was interesting to work out how best to do it. It was a very collaborative session. Everyone’s input becomes important: musician, engineer, composer. I get a lot out of that because you’re learning at the same time as working.
Would that have been about mic choice, mic placement?
Yes, of course. We recorded with a couple of microphones – UM900s, some ribbons – so we had options at the recording stage and options in the mix. You have a finite amount of time and budget, so you have to hedge your bets. You want to make sure you’re not the one holding the session up because you haven’t had the foresight to make it work better.
How do you check you haven’t missed anything in a mix?
It’s a very important part of a mix to look away or do something completely different. Sometimes at the end of a mix, I’ll get my phone and read the headlines whilst I’m listening. A different part of my brain will hear things I’ve not heard whilst I’ve been staring at the screen for two hours, mixing away and listening intently. The moment I switch my brain off I hear the mistake or the thing I’ve never heard. You need to step away, physically or mentally.
You also work regularly with Spitfire Audio, right? How often do you work on mixes that have used Spitfire samples you recorded or produced?
I record a lot of their samples and I’m also involved in a lot of the R&D, the technical side, and discussions about mixes. When they start thinking about the concept of a library, I work with the team to make decisions about microphone choice, the room, and how we’re going to set it up. Pretty much every project that I’ve worked on with composers has Spitfire on it at some point. I like to think of samples as synthesisers, really. It’s capturing a performance like you would a synth sound, and it becomes a new instrument – sounds you’d never get an orchestra to play.