
#186: Why LUFS Don't Matter As Much As You Think with Ian Shepherd

Marc Matthews Season 5 Episode 11


In this episode of Inside The Mix, mastering engineer Ian Shepherd demystifies loudness metrics and debunks common mastering misconceptions while offering practical advice for producers looking to improve their masters without chasing arbitrary targets. 

Ian explains the role of LUFS in mastering, why normalization matters, and how focusing on musicality and dynamics leads to better results than simply hitting a loudness target.

What You'll Learn:
• LUFS measurements explained – momentary, short-term, and integrated loudness
• How many LUFS should my master be? Understanding the balance between dynamics and loudness
• The truth about LUFS for Spotify – why 83% of users never change loudness normalization settings
• What does LUFS stand for? And why it's just one piece of the mastering puzzle
• How normalization impacts your music across different streaming platforms
• The role of audio normalization in creating a consistent listening experience
• Why AI mastering struggles to match the emotional intent of human engineers
• Spotify's approach to loudness and what it means for your masters
• Internal dynamics – how balancing different sections of your song enhances clarity and impact
• The mastering feedback loop – why collaboration between engineers and artists is key

If you want your music to stand out in today’s Spotify-dominated landscape, don’t obsess over loudness numbers. Instead, focus on musicality, dynamics, and emotional impact. Test how your tracks sound at normalized streaming levels, and let the music, not the meters, drive your mastering decisions.

Links mentioned in this episode:

Support the show

Download your FREE Producer Growth Scorecard

Follow Marc Matthews' Socials:
Instagram | YouTube | Synth Music Mastering

Thanks for listening!!


Ian Shepherd:

The EQ, the amount of bass, the amount of mid-range, the amount of distortion, the amount of density, the amount of stereo width: all of these things have a much bigger influence on how loud we think it feels than the actual LUFS number, because the LUFS number has been changed, right? You might have two songs that were 8 dB apart to start with, but once they're both at the same level, all the other stuff comes into play. You're listening to the Inside the Mix podcast with your host, Marc Matthews.

Marc Matthews:

Welcome to Inside the Mix, your go-to podcast for music creation and production. Whether you're crafting your first track or refining your mixing skills, join me each week for expert interviews, practical tutorials and insights to help you level up your music and smash it in the music industry. Let's dive in. Hey folks, welcome to Inside the Mix, or, if you are an existing listener, welcome back. Today I am honoured to welcome a true expert in the world of mastering and everything in between, someone who has been at the forefront of the industry for decades, as I've got in my notes here, and has helped educate countless producers and engineers: Ian Shepherd. Ian, how are you? And thank you for joining me today.

Ian Shepherd:

I'm very well, thanks, Marc. Thanks for inviting me. Glad to be here.

Marc Matthews:

Yes, yes, thank you for joining me on this. I've been looking forward to this. I've got a lot of content to go through in here, some really, really interesting stuff. So for the audience listening who might not be familiar with Ian, I'm just going to read a bit from his bio so we can get a bit of background, so then we can just dig straight into the questions for today.

Marc Matthews:

So, Ian is a British mastering engineer and the MD of Mastering Media Limited. Over the course of his career he's worked on thousands of CDs, DVDs and Blu-rays for major record labels, TV stations and independents, including several number one singles and award-winning albums. He also runs the popular Production Advice website and is, of course, a fellow podcaster, the host of The Mastering Show podcast as well. And he's a fierce critic of the loudness wars. I love anything surrounding the loudness wars, always interesting. And on top of that he's co-developed the Loudness Penalty website with MeterPlugs. So there's a lot in there. Ian, you've been very busy, I would say, and there's a lot to keep you busy as well, most definitely.

Ian Shepherd:

Yeah, for sure.

Marc Matthews:

Yeah, so Ian's going to share his valuable insights about mastering, loudness and the evolving landscape of AI. There's an interesting bit, hopefully we get onto it today, about YouTube and its stable volume feature, which I wasn't aware of until I saw one of your pieces of content on Instagram, and I thought, I had no idea that existed. So that's very interesting. We're going to start off with LUFS. My question here is: do higher LUFS masters sound better, even when they're normalized? But I think it's important to start off with just defining LUFS, so maybe that's where we could start, Ian?

Ian Shepherd:

So LUFS is Loudness Units relative to Full Scale, like we have dB full scale. Basically you have a loudness unit, and that's relative to zero, the top of the meters, and it is our current best attempt to measure perceived loudness. So not just loudness, but perceived loudness. And it's actually surprisingly tricky. I mean, everybody can hear when stuff is louder or quieter. But the audio signal is carried as an electrical signal in a wire, and we can measure that really precisely. But if you track the waveform, which is basically what the peak level does, when you see the peak levels on a meter in a DAW, that very often doesn't bear much relation to how loud things are. Generally, if the peak levels are higher, the loudness is going to be higher, so it will sound louder. But you can have things where the peaks look huge and it doesn't sound that loud, and you can have stuff that doesn't look that impressive and it actually sounds quite loud. So that's why back in the day you had VU meters, the old needle meters, and then we moved on from there to RMS meters. So there's the Dorrough meter that some people are familiar with, and there are RMS features in quite a lot of DAWs. And more recently than that, you have loudness units. They're basically the same as RMS, but they have a filter applied to them to try and make them match what we hear more accurately, because our ears are more sensitive, especially in the mid-range. So they tweak the frequency response, and the idea is to represent how loud it feels, to be more accurate than RMS or VU. And it's pretty good. For example, for me, if I'm matching stuff by ear and then I slap a meter on it, very often it will be close, often within half a dB to a dB or so. And in particular, when you're trying to match two things that are exactly the same, that works.

Ian Shepherd:

However, they can be a little bit tricky, because there are three different types of loudness unit: the momentary LUFS, the short-term loudness and the integrated loudness. Momentary goes really, really fast and I don't personally find it that helpful for music, so I tend to ignore that one. Short-term is measured over a three-second window, so the meter moves slower, and I find that's actually the most helpful for me when I'm working, because it shows me the loudness at this particular moment. And, broadly speaking, it's similar to what an RMS meter or a VU meter shows. Actually, when I'm working, I tend to use a plug-in version of a VU meter.

Ian Shepherd:

I still really like them. I find they're really helpful because they're really sensitive around the middle of the range. You calibrate them to where you want your zero point to be, and I think they go 3 dB above that and maybe 20 or 30 dB below, depending on the exact model. So it's really easy to see when you're in the right ballpark, which is obviously an important thing to know when you're mastering especially. It's just a great visual indicator, and short-term loudness is kind of similar to what you see on there. So I think that's a good way of looking at it.

Ian Shepherd:

Integrated loudness is an overall number for an entire podcast episode or song or album or whatever. Basically, you start the audio playing, you stop it, and that gives you an overall value, and it's a good general rule of thumb. Typically higher numbers, as in less negative numbers, are louder: minus 8 is louder than minus 10 is louder than minus 12, right? So minus 8 is probably going to sound louder than something that's at minus 12. The problem with it is that it is an overall number, and music doesn't do the same thing all the way through. It changes through verses and choruses, songs build and ebb and flow, and you also have different musical genres. So a really common question I get is somebody saying, well, I set the loudness to minus 14, or whatever it was, and that's not the right number, by the way; we can come back to that.
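A quick way to see why these readings behave differently is to sketch them numerically. This is only a toy illustration using plain RMS over a sliding window; a real LUFS meter, per ITU-R BS.1770, also applies K-weighting and (for the integrated value) gating, so the numbers below are plain dB rather than true LUFS:

```python
import numpy as np

def rms_db(x):
    """RMS level in dB relative to full scale (RMS of 1.0 -> 0 dB)."""
    return 20 * np.log10(np.sqrt(np.mean(np.square(x))) + 1e-12)

def windowed_loudness(signal, rate, window_s, hop_s=0.1):
    """Meter-style readings over a sliding window (plain RMS, no K-weighting)."""
    win, hop = int(rate * window_s), int(rate * hop_s)
    return [rms_db(signal[i:i + win])
            for i in range(0, len(signal) - win + 1, hop)]

rate = 44100
t = np.arange(rate * 3) / rate
quiet_verse = 0.05 * np.sin(2 * np.pi * 440 * t)   # roughly -29 dB RMS
loud_chorus = 0.50 * np.sin(2 * np.pi * 440 * t)   # roughly -9 dB RMS
song = np.concatenate([quiet_verse, loud_chorus])  # 6 seconds total

short_term = windowed_loudness(song, rate, window_s=3.0)  # ~3 s window
integrated = rms_db(song)                                 # one overall number

print(f"short-term swings from {min(short_term):.1f} to {max(short_term):.1f} dB")
print(f"integrated: {integrated:.1f} dB")
```

On this quiet-verse/loud-chorus signal the short-term readings swing by roughly 20 dB while the single integrated figure settles somewhere in between, which is exactly why one overall number can mislead on dynamic material.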

Ian Shepherd:

You know I set the loudness to minus 14 and it sounds too loud.

Ian Shepherd:

Very often when that's happening, it's because it was intended to sound quiet, right? It makes no sense to make an acoustic guitar ballad the same level as a death metal tune, because they're not meant to sound the same. The measurement will tell you they're equal loudness, but that's artistically wrong. So it's really important to have your wits about you when you're looking at integrated numbers, and it's also really important, when people are talking about loudness, to make sure you understand what they're talking about. Because you see people say, oh, everything goes to minus 8 right now. I think that's too loud, for what it's worth, and we can talk about that later as well.

Ian Shepherd:

But do they mean the integrated loudness is at minus 8, or do they mean the short-term loudness at the loudest moments is at minus 8?

Ian Shepherd:

Because that, for me, is the most helpful way to assess the loudness: what is the loudest part of this song? How loud does it get? And if you think about a song with lots of variety, the overall level is going to be less than that, right? So if somebody says everything is at minus 8 and they mean the loudest bits, that's a bit loud in my opinion, but probably not disastrous. Whereas if they're saying the integrated loudness is minus 8, that means overall the song is minus 8, which means it probably gets well above minus 8 at the loudest moments. That could then be an issue, because the louder you go, the less room you have to work with in terms of the peaks, and the more limiting and all that kind of stuff you have to use. And that's where there's a balancing act to be done with the sound to get the best possible results.

Marc Matthews:

Yeah, so with that, you mentioned the three different types of LUFS meter readings there. You mentioned short-term, and you've got momentary as well. Like yourself, momentary is there, but it's not one that I pay attention to as much as, say, short-term and integrated. At what phase of the mastering process are you then focusing on the integrated LUFS? What I'm getting at here, I guess, is: when you're actually mastering a song and looking at the meters, are you mainly focused on the short-term at that point, and then, once it's done, you start looking at the integrated? Or are you flicking between the two in the process?

Ian Shepherd:

A little bit of both. I mean, the honest answer is I don't really care about either of them, because I don't choose my loudness by meters. I have a little catchphrase I came up with and shared on socials, which is that the LUFS should be the result of the mastering, not the target of the mastering. Right, so master it so it sounds good. The integrated value, don't get me wrong, is important, because it's what streaming services use, and we can talk about that in more detail. It has a big impact on what's going to happen to your music when it gets played back on Apple Music and YouTube and Spotify and Tidal and all the rest of them. So it is important to know at the end of the process, and to check that you're happy with the results you've got. But when I'm working, I tend not even to watch a LUFS meter. As I say, I've got my VU meter. For anybody who's interested, with most of them, out of the box, the calibration level is minus 18, which means a sine wave at one kilohertz with an RMS level of minus 18 dBFS reads zero. It's just a calibration thing. That's good for mixing.

Ian Shepherd:

I would say that's because of the way that I use it: if you have two things that are more or less kicking the same VU meter reading, they're going to be similar in loudness, right? That's why it's helpful. If the meter is pegged, it might be a bit too loud, and if the meter is way down, then either it's intended to be really quiet or it might be a bit quiet. When it's hovering around zero, you know the loudness is roughly in the right ballpark. So for mixing, that's super helpful, because it allows you to balance the elements and figure out what's going on. Then there's what happens when it comes to mastering.

Ian Shepherd:

I change that calibration and set it to minus 11, which means the loudest sections are kicking up to plus one, plus two, maybe plus three on the meter, which is up to around minus 8 in terms of RMS, but also probably LUFS, because, as I said, they're very similar and they both focus on the mid-range. So that means at my loudest moments I may be a couple of dB louder than minus 10, but overall it's hovering around that range, and then I balance everything else musically with that. Which means that if it's a quiet song, it'll come out with a lower integrated LUFS. If it's a loud song all the way through, the integrated LUFS might get up to minus 11, minus 10 for my masters. And if it's varied, then the loudest stuff could be up at minus 8, 9 or 10, but the overall number in general will be a little bit lower down. So that's how I work, and I don't even judge it by the numbers. What I would say is, if the meter is pegged, so flat out, just banging up against the stops, that's going to catch my attention and make me think: is this actually a little bit hot? And I take a step back, maybe take a break, compare it to some of the other songs.
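The calibration arithmetic described here is easy to sanity-check. A minimal sketch, ignoring the needle ballistics of a real VU meter and assuming only the convention that 0 VU is pinned to a chosen RMS level in dBFS:

```python
import math

def sine_rms_dbfs(peak_amplitude):
    """RMS level of a sine wave in dBFS (a full-scale sine peaks at 1.0)."""
    return 20 * math.log10(peak_amplitude / math.sqrt(2))

def vu_reading(rms_dbfs, calibration_dbfs=-18.0):
    """VU-style reading: 0 VU sits at the chosen calibration RMS level."""
    return rms_dbfs - calibration_dbfs

# Mixing calibration (-18): a 1 kHz sine with -18 dBFS RMS reads 0 VU.
print(vu_reading(-18.0))                          # 0.0
# Mastering calibration (-11): a section at -8 dBFS RMS kicks up to +3.
print(vu_reading(-8.0, calibration_dbfs=-11.0))   # 3.0
# For reference, a full-scale sine has an RMS of about -3 dBFS.
print(round(sine_rms_dbfs(1.0), 2))               # -3.01
```

Recalibrating from -18 to -11 just shifts the meter's reference point 7 dB hotter, which is why the same loud passage that would pin a mix-calibrated meter reads a comfortable +1 to +3 when mastering.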

Ian Shepherd:

Or it could also mean, because VU meters are very sensitive to bass, that I've overdone the low end.

Ian Shepherd:

For example, if the meter looks low but it sounds loud, then maybe the mids are a bit up. So with all of these meters, you have to learn how to use them; there's an art to it. But that's what I'm paying attention to when I'm actually working: how it sounds, while keeping an eye on the VU meter just to make sure nothing crazy is happening. Actually, I don't even care about the LUFS at the end, because at this point I've done it so often I know what the result is going to be, and I'm confident that the results work really well. If I have an artist or a client who's particularly concerned about this stuff, then I might do some tests, and sometimes I do tests just out of interest. But yeah, for anybody getting into this and trying to get their head around it, it can be really useful, particularly because of this whole thing about online streaming services.

Marc Matthews:

In summary there, really, you've got to be confident in your conviction. You mentioned that you're not really paying attention to the metering unless the record label or the artist has specifically requested you to do so. Like you say, you've done it enough times now that you're confident that what comes out the other end is what it should be, which is interesting.

Marc Matthews:

You mentioned the VU meter and setting that again for mastering, and that's something I'm going to take away, because it's not something I'd ever really considered doing, so I'm definitely going to dive into that a bit more. In the interest of time, I think it'd be good to move on to loudness normalization, as it segues on nicely to the next part, and what it means for producers and engineers. Streaming platforms will normalize music, and obviously with Spotify we can go in and turn that off. I do that myself so that I can actually hear the true level. Well, I say true representation; obviously it's a compressed audio file, but a representation of the audio itself. So I wonder, could you explain how normalization works and what producers, mix engineers and mastering engineers need to understand to ensure their masters translate well across different platforms?

Ian Shepherd:

Normalization exists basically because users, music fans, complain when there are extreme changes in loudness. There are loads of people complaining about the ads on YouTube, for example, and on TV the number one source of complaints is when people can't hear the dialogue or something is super loud. So people don't like it when the loudness changes too dramatically. It's slightly different if you go to the cinema, where you want the explosions to pin you to your seat; that's a different experience. So streaming services know that, and they introduced normalization. The really simple way to put it is that on all the mainstream platforms, the loudest stuff gets reduced in level. There's a lot of detail and nuance involved, they all do slightly different things and it works in slightly different ways, but the bottom line is that the most common number is minus 14, which is why everybody hears this number, minus 14.

Ian Shepherd:

So that is their distribution loudness. What they're saying is: if you master something louder than minus 14, we will turn it down so that it is only minus 14. And they're using the integrated loudness, which is why that number is important. That's what normalization is. I want to pick up on something: you mentioned that you turn it off on Spotify. I have no problem with that, but it's important that people know that if they do that, they are doing the exact opposite of what most people listening to the music do.
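The platform behaviour described here comes down to one line of arithmetic. A sketch assuming the commonly quoted -14 LUFS default and the turn-down-only behaviour mentioned above (platforms differ on details such as whether quiet tracks also get turned up):

```python
def playback_gain_db(integrated_lufs, target_lufs=-14.0):
    """Gain a normalizing platform applies: loud masters come down, quiet ones stay put."""
    return min(0.0, target_lufs - integrated_lufs)

print(playback_gain_db(-8.0))    # -6.0: a -8 LUFS master is pulled down 6 dB
print(playback_gain_db(-18.0))   # 0.0: a quiet master is left alone here
print(playback_gain_db(-14.0))   # 0.0: already at the distribution loudness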

Marc Matthews:

So the stats say 83% of people on Spotify never touch the preferences for loudness. Wow.

Ian Shepherd:

Okay. So for four out of five listeners, that setting is on by default; they don't know it's there, they don't care. This is why I say it's really important to test. For me, I master it so that it sounds good, and then, if I'm in any doubt, I will pull it into... well, I've now created an application with MeterPlugs called Loudness Penalty Studio to help me do this, but you can do it yourself in a DAW.

Ian Shepherd:

You measure the integrated loudness of your song, your master, and then you find your favourite reference track, something similar that you think sounds amazing everywhere else. Put both of them at minus 14 and play them next to each other, and see how they sound, because that's how most people will hear it. And I say most people because, like I say, it's on by default and most people don't turn it off in Spotify. It's on by default in Apple Music now. Not only is it on by default on YouTube, you can't disable it, so the loud stuff will always be turned down on YouTube. It's on by default in Tidal. It's not on for SoundCloud and Beatport, two platforms that are obviously important to people, and Bandcamp doesn't have it yet, but I wouldn't be surprised if these platforms add it in future. And with all three of those...
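The DIY test described above, putting your master and a reference both at -14 and comparing, reduces to a simple gain offset once you have the two integrated readings. A sketch with hypothetical measured values; the readings themselves are assumed to come from a proper LUFS meter or a tool like Loudness Penalty, and only the level match is done here:

```python
import numpy as np

def match_to_target(audio, measured_lufs, target_lufs=-14.0):
    """Scale audio so its integrated loudness lands on the target.
    measured_lufs must come from a real LUFS meter; only the gain is applied here."""
    gain_db = target_lufs - measured_lufs
    return audio * (10.0 ** (gain_db / 20.0))

# Hypothetical readings: your master measured at -9 LUFS, a reference at -12.
rng = np.random.default_rng(0)
master = rng.uniform(-0.9, 0.9, 44100)
reference = rng.uniform(-0.9, 0.9, 44100)

master_at_14 = match_to_target(master, measured_lufs=-9.0)         # -5 dB of gain
reference_at_14 = match_to_target(reference, measured_lufs=-12.0)  # -2 dB of gain

# Both now sit at the same integrated loudness, so any difference you hear
# in the A/B is tone, density and dynamics, not raw level.
print(round(float(np.max(np.abs(master_at_14)) / np.max(np.abs(master))), 3))  # 0.562
```

The point of the exercise is that once both tracks are level-matched this way, loudness itself stops being a variable in the comparison.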

Ian Shepherd:

Well, it depends what kind of stuff you're releasing and who the audience is. But just as a tangent on one of them: SoundCloud doesn't have it, and if you look at the top 10 there, the loudness is all over the shop. I mean, if you wanted proof that listeners don't care... I had one playlist, just kind of popular stuff, where the number one song was 8 dB louder than the thing that came after it, right?

Ian Shepherd:

So the one that came after, even though it was 8 dB lower, still managed to get to number two in the chart. So loudness can't be that big a factor in terms of popularity; this is my opinion. So yeah, that's what loudness normalization is: the loud stuff gets turned down. And I think it's really important, because I asked this in a Facebook group, a music producers and engineers group. I said to people: do you turn off loudness normalization or not? And three quarters of them said they turn it off, like you, whereas only one in five users on Spotify, the people out there in the world, the fans, do. So there's this weird thing going on where the people who care most about the music, which is us and the artists, are listening to it in a completely different way than the people who actually buy it.

Marc Matthews:

There you go. Yeah, everything you've mentioned there, as soon as you mentioned it, makes perfect sense. It also has me thinking, because I've said this on the podcast before, about how people consume music. You see people walking down the street with a mobile phone, or I've seen Deliveroo riders, other delivery services are available, with a JBL speaker on the back, playing out music. People are listening in different contexts, different mediums, different environments. So you're exactly right there; it makes perfect sense. And I didn't realise that statistic was so high. Like you say, I guess a lot of people don't know it exists, and no one really cares. It also makes me think, because I know you can change the EQ in Spotify. I've never done that, and I don't see why you would ever do it in the first place. But I don't know, have you got any statistics on that?

Ian Shepherd:

On the number of people actually using the EQ on a streaming platform? I don't know. I'm guessing it's small. I mean, maybe you'd want to do it to adjust for a crappy phone speaker or little earbuds or something.

Ian Shepherd:

You didn't ask me this, but one of the big things about mastering is translation, and I think there's a misconception about what translation actually means.

Ian Shepherd:

Lots of people think it means you're trying to make it sound the same everywhere, right? But that's impossible, because you'll never get a pair of those old Apple earbuds to sound the same as a car stereo or a set of Beats headphones or a PA system. Translation is about making things sound right, and sound good in comparison to everything else. If you listen on a pair of Apple earbuds, everything is going to sound tinny and rubbish, so the goal is to sound good in that context. And the same if you want it to sound good in comparison to everything else on a smartphone, or a speaker, or whatever it might be. So yeah, I just think that's a point worth making, and one of the big factors in that is the EQ.

Ian Shepherd:

I mean, your initial question was: do higher LUFS masters always sound louder? One obvious answer to that is no, because, as I say, it depends whether it's loud all the way through or whether it's very varied. Those will give different loudness readings and make you feel differently about the loudness. But the other thing is that when you match the loudness, which is what happens on all these streaming platforms, almost everything else becomes more important. I did a video on this, if people want to see it or hear it for themselves: the EQ, the amount of bass, the amount of mid-range, the amount of distortion, the amount of density, the amount of stereo width, all of these things have a much bigger influence on how loud we think it feels than the actual LUFS number, because the LUFS number has been changed, right? You might have two songs that were 8 dB apart to start with, but once they're both at the same level, all the other stuff comes into play. And, to give another example, that's why it's important to make this comparison.

Ian Shepherd:

People also make stuff that's too dynamic, so you have a huge contrast between the verse and the chorus. It sounds amazing when you listen in the studio with the speakers cranked up, but when you match the loudness and play it against something that's much more consistent, the risk is that the verse will just disappear into the background noise. That's what I call internal dynamics: the balance between different sections of the song, different instrumentation, different songs on the album. When I'm mastering, I'm looking for a centre of gravity, a kind of line that runs through it and makes everything feel consistent. And when you get all that stuff right, that's when it translates. You play it on a speaker that's got tons of bass and it'll sound super bassy; you play it on something that's tinny and awful and it'll sound super tinny.

Marc Matthews:

And getting the loudness optimised is a key bit of that equation... Interesting, you mentioned there what you described as internal dynamics. You're saying that some music isn't translating, whereby the quieter sections end up too quiet compared to the louder sections. In terms of those dynamics, why do you think it is that those particular masters are leaving the mastering studio? Is there a miscommunication going on somewhere with regards to the information that's been disseminated about translation? I'm interested to know your thoughts on that.

Ian Shepherd:

It's a good question. It kind of leads into the philosophy of mastering, right? What is the role of the mastering engineer? For me, I'm not trying to stamp my own sound onto anything. When I'm mastering, I try to have empathy with what the artist or the producer or the production team are trying to achieve. So I try to listen and think: okay, this is what they were going for, and I'm going to try to get them closer to that. Sometimes that means being very minimalist; sometimes it means being very hands-on. I personally will do whatever I think is necessary. There are mastering engineers out there who feel it should be very minimalist, and that there's a kind of line they won't cross.

Ian Shepherd:

Yeah. So for me, just to take an example of one of the, I think, most important things that I do: when I'm mastering a song, I start off with the loudest section, the fullest EQ and all the rest of it. I'll listen to that and get it so it's really working for me. Then I'll probably go back and listen to a quieter section and check that, and if the mix was absolutely spot on, chances are that section will also sound amazing. But if you get something in the situation you're asking about, where there is a big difference, you might think: okay, so now the verse is a bit quiet, so I'm tempted to push the level of the whole song up. But then when I go back to the loud section, that's going to hit all my dynamics processing, the limiting, compression, all that kind of stuff, much harder. That might also work and sound amazing, or it might be too much. So then you've got three choices. You set it by the loudest section and leave the verse a bit quiet. You set it by the verse and have the chorus too loud, smashing up into a limiter, which happens way more often than I would like. Or, and for me I don't have a problem with this, I put in some level automation. I choose the crossover points and the crossfades really, really carefully, to just rebalance it ever so slightly. So I might bring the verse up but leave the chorus so that it hits all of the dynamics processing really nicely. My goal is not to change the levels; it's to make it feel right. Now, there are mastering engineers who would say that's completely overstepping the mark, because I'm messing with the mix at that point, right? The mix has made a choice.

Ian Shepherd:

So the answer to your question could be respect for the artists and their decisions, which obviously is important, whether or not that's misguided. It could be that they're a bit minimalist and just don't feel that's the role of a mastering engineer. I think those are probably the two main reasons. It could be lack of experience, or just a different perspective, because at the end of the day, all of this comes down to taste and opinion. Something that is perfect for me... with mastering in general, you can spend ages working on something, and then you loudness-match it with the original and do a comparison, and the difference doesn't seem that huge.

Ian Shepherd:

And you think: well, I did all of that work, what happened? The answer is, if instead I had just cranked the source up into a limiter, then when you do the comparison, they wouldn't sound remotely close. The EQ balance would be wrong, there'd be distortion, there'd be pumping, it would be messed up. You put all of that work in in order to keep everything that was great about the mix and make it translate in the context of the master, the final loudness, all the other songs on the album and all the rest of it. So yeah, I think it's complicated.

Marc Matthews:

Yeah, most definitely. And I like what you said there: you're catering to the artist, it's the artist's vision, and that's what you're trying to draw out of it. But I do have a question. I'm going on a slight tangent here.

Marc Matthews:

I do this quite a lot on the podcast, but I've been listening to other podcasts, as I do, and other schools of thought, and I was listening to one the other day that was in regards to feedback, and I want to get your opinion on this. So a mix engineer or an artist sends you a song for mastering, and I think there are two schools of thought here. Some mastering engineers will provide that feedback loop and offer some feedback, potential advice, let's say, in terms of how they could change the mix to help with the mastering process. Alternatively, I find there's another school which says: that's the mix they've submitted, and that's the mix I'm going to master using my tools and techniques. What are your thoughts on the two, sort of, very crude camp descriptions I've created there?

Ian Shepherd:

I am a bit of both. I think it's really important as a mastering engineer, like I say, to have empathy and to have respect for all the work that has gone into the music before it reaches us. I mean, there are mastering engineers I know where the first thing they do, for the majority of their clients, is kick it back and say: no, you've got to do this, this, this, this and this. And their clients love them for it, right? They look forward to it. It's why they use those engineers. They happily make those changes, and then they end up with a master they're happy with.

Ian Shepherd:

That's not me, unless there's something that's clearly a technical fault. My favourite example is, you know, somebody's flipped the polarity on one channel of a stereo piano, which kind of sounds huge and wide when you listen to it in stereo, and then you hit mono and the piano just disappears. That, to me, is an issue that can't be fixed in the mastering. They need to correct it, and I'm going to make them aware of that. Or if there's blatant distortion, or, I don't know, any kind of technical fault. But if the mix is in good enough shape for me to be happy to master it, I will do my best to get the best out of it, assuming that everything that's happened is deliberate, right? So in that sense I'm respecting the mix.
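To make the polarity example concrete, here is a toy sketch (invented sample values, not real audio) of why a polarity-flipped channel cancels in the mono fold-down:

```python
# Toy stereo "piano": the right channel is a polarity-flipped copy of the left.
left = [0.5, -0.3, 0.8, -0.1]
right = [-s for s in left]

# In stereo, both channels still carry full energy, so it sounds wide...
stereo_energy = sum(l * l + r * r for l, r in zip(left, right))

# ...but the mono fold-down (L + R) / 2 cancels to silence.
mono = [(l + r) / 2 for l, r in zip(left, right)]
print(mono)  # [0.0, 0.0, 0.0, 0.0] -> the piano "disappears" in mono
```

The real-world check is simpler still: sum your mix to mono and listen for elements that vanish.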

Ian Shepherd:

Now, sometimes that can be quite hands-on. I might have some quite extreme EQs, or a bit of automation, or whatever it might be, messing with stereo width sometimes. And then I'll pass that back to them and say: here you go, this is my take on it. But sometimes there's something that I think could be better, and a very common one, actually, is that it's been...

Ian Shepherd:

The mix was super hot to begin with. You know, back in the 90s, one of the big things about mastering was making things louder, because people didn't have access to all of the tools that were in mastering studios. So a big thing that you could bring was increasing the density: maybe a little bit of, you know, saturation, just everything thicker and bigger and all the rest of it. These days everybody's got those tools, there's a million and one plugins, and for me, everything that comes in has had too much of that already. So often it's about trying to get more space and punch and impact and that kind of stuff back into it, and there's only so much you can do at the mastering stage in that situation. So if something I think is a bit overcooked, then I'll say: I do wonder whether I could get something even better if the mix had been, you know, just taken back a little bit, if we could ease off the final compression and limiting or maximization, saturation, whatever it is. If that's an experiment you'd like me to try, that would be fantastic. On the other hand, if you're really happy with this as it is, I'm completely comfortable with this master. So I do offer that feedback, but it's not a "this is wrong", it's a "I think I could get an even better result here, do you want to try it?"

Ian Shepherd:

And sometimes they say yes, and sometimes they say no. Either of those is fine with me. What I will say is that, with a 100% success rate, when they say yes, we want to try it, I end up with a master that we all like better, right? Because I have that original mix, I have that original vision in mind, I know where they're trying to go, so I'm going to maintain that, but I have more flexibility when I've got more room to work with, in terms of loudness in particular. So we end up with something that sounds even better than it would otherwise have done. So, as I say, it's kind of a mixture of those two approaches.

Marc Matthews:

Yeah, I like that approach as well. You're offering some insight, "we could try this", but it's left open: we could try it, or we could not, at the end of the day. I think that's a nice relationship to have with clients, when it comes to artists, record labels, mastering and whatnot.

Ian Shepherd:

You haven't asked this, but I will say: the one thing that completely mystifies me is when mastering engineers won't give any feedback. People have come to me and said: oh, I submitted this to XYZ big-name online mastering service, they sent this back, and I don't like it. And the first thing I would always say is: well, have you asked them? Have you told them that? Have you said, you know, that wasn't really what I was hoping for, can we...? And sometimes they say: oh, no, I couldn't possibly do that. And other times they say: yeah, and I just got this answer back: no, that's the master, take it or leave it. And that, to me, makes no sense at all.

Ian Shepherd:

For me, mastering is a collaboration, right? I'm working together with the client. You need a conversation. I mean, sometimes it's just: here are the files; okay, here are the masters; yeah, sounds great, thanks, fantastic. Actually, that almost makes me a little bit nervous, because I'm kind of wondering: how critically are they listening? I'm almost a little bit more comfortable if they come back and say: yeah, it's great, but can we just tweak X and Y? Because then I know that they've really listened, they're not just blindly trusting me. So when there's no conversation at all, I can't get my head around that.

Marc Matthews:

That's very odd, because, I mean, obviously I'm not privy to these conversations, but I wonder how that works in terms of the revision process. Is it a case that they just say there are no revisions? It just seems like a very, very odd cycle, to not have that feedback. But yeah, very weird. What you mentioned there about feedback feeds in nicely to the question surrounding AI-assisted mastering being a valid option, which we'll get to in a bit, because I think that's another part of where that potentially falls down, in terms of that feedback loop. But I just want to circle back, since I've taken us slightly out of where we were going originally, which was with regards to the negative 14 LUFS for streaming platforms and what you can share with our audience about that in particular. Can you explain why, and what producers, artists and engineers should consider with regards to negative 14 LUFS?

Ian Shepherd:

I think, as I mentioned before, the thing to do is to test what your music sounds like when you adjust it to minus 14, along with everything else, right? So: how does my song at minus 14 sound next to my favourite reference at minus 14? Because that's how most of the streaming services are going to play it back. I mean, Apple plays it at minus 16, you know, but for the majority it's minus 14. And again, in fact, the minus 14 isn't important at all; the point is matched loudness, right? Because if you master louder than minus 14, it will be reduced. Everything gets reduced to minus 14. So that's how, like I say, something like at least 80% of people are going to hear it. So that's the importance of it; other than that, it's not important at all. It's unfortunate.
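The arithmetic behind that test is simple; this hypothetical sketch (function names are mine, and it assumes you already have an integrated-loudness reading from a meter) shows the trim a service applies and the linear multiplier it corresponds to:

```python
def normalization_gain_db(measured_lufs, target_lufs=-14.0):
    # Trim, in dB, that brings a track to the platform's distribution loudness.
    return target_lufs - measured_lufs

def db_to_linear(gain_db):
    # Convert a dB trim into the multiplier actually applied to the samples.
    return 10 ** (gain_db / 20)

# A master measured at -9 LUFS integrated is turned down 5 dB...
gain = normalization_gain_db(-9.0)   # -5.0
scale = db_to_linear(gain)           # ~0.562, applied to every sample
```

Applying the same trim to your reference track puts both at matched loudness, which is the comparison being recommended here.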

Ian Shepherd:

There are some guidelines on Spotify's website which are well-intentioned, but I think they're a bit misleading, because they suggest, I think what they say is: if you don't want the loudness of your music to be changed, master it at minus 14. That's accurate, right? If you submit something at minus 14, it'll get played back at minus 14. But for me, aiming for a particular LUFS value doesn't make any sense, right? Because of the genre, you know: is it an acoustic ballad, or is it a death metal song? But also, how do I know what the final LUFS is going to be if I haven't even heard the song? So I actually suggest people don't have any kind of target in mind at all. I realise that's not terribly helpful for anybody trying to do this for themselves, though. So if you're getting started, my recommendation is to make the loudest moments around about minus 10 short-term, and no louder. And if you make that consistent from song to song, and then balance everything else so that it feels good, my experience is you're going to be in great shape. I mean, I've been recommending this now for 15, 20 years; I've had hundreds, thousands of people take the advice, and hundreds of people tell me that it's worked for them and it's really helpful.
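As a rough illustration of "loudest moments around minus 10 short-term": short-term loudness is measured over 3-second windows. This sketch uses plain, un-weighted RMS as a stand-in (real LUFS adds K-weighting per ITU-R BS.1770, so the numbers will differ), and the function names are mine:

```python
import math

def rms_db(samples):
    # Un-weighted RMS level in dB; a crude stand-in for a LUFS reading.
    mean_sq = sum(s * s for s in samples) / len(samples)
    return 10 * math.log10(mean_sq) if mean_sq > 0 else float("-inf")

def max_short_term_db(samples, rate, window_s=3.0, hop_s=1.0):
    # Scan 3-second windows and report the loudest one; the guideline says
    # this peak figure should sit around -10 and no higher.
    if not samples:
        return float("-inf")
    win, hop = int(rate * window_s), int(rate * hop_s)
    levels = [rms_db(samples[i:i + win])
              for i in range(0, max(len(samples) - win, 0) + 1, hop)]
    return max(levels)
```

A real meter does the same scan, just with the weighting filter and gating the standard specifies.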

Ian Shepherd:

So, yeah, if your goal is to go louder, then you just choose a louder number, right? But: consistent short-term loudness at the loud moments, and balance everything else musically. And another important part of the equation is: choose your mastering monitoring level, the gain that's going to your amps, and stick with it, right? So let's say you follow my guidelines. You pull your favourite reference track in, you play it back, you look at the loudest sections, you measure it, and you go: oh, okay, I'll adjust it down by 2 dB, that's now at minus 10.

Ian Shepherd:

Then adjust the gain on your monitoring so that that sounds really good and loud, but not fatiguing, not so loud it's making your ears ring, but also really exciting. You might have to tweak it a bit over the next couple of days, just to find the perfect setting. But once you find that perfect setting, where the loudest sections are at minus 10 and everything starts sounding great, mark it, you know, a bit of Tipp-Ex on the volume dial or the gain pot, whatever it is, and stick with that always.

Ian Shepherd:

So you're then combining the metering with what you're hearing, and there are lots of technical reasons for that. It's to do with the sensitivity of our ears: when the gain is in the right place, the frequency response of our hearing is at its flattest. But also, you just start to learn over time what stuff is intended to sound like. You'll get a clue from the waveform, but you just put something on and you go: oh, that's super loud, or it's super quiet. And, in terms of choosing the loudness, you just adjust the gain until it sits right in terms of the meters and your monitoring level, and you're going to be in great shape. They're two really simple steps, but they can absolutely transform the results you get when you're mastering. And, yeah, my guideline is the loudest bits should be minus 10, but people can take that with a pinch of salt.

Marc Matthews:

Fantastic advice. And what you mentioned there about the gain and the monitoring level: I was rummaging around and listening to various stuff the other day, and I heard somebody mention a level of 80 dB, because you mentioned that it gets to a particular level where everything sounds at its most even. Does that ring true with you at all? I think it was 80; it might have been 85.

Ian Shepherd:

Honestly, it's very personal, and when you're talking about stuff that's loud, the difference between 80 and 85 is a lot. So, yeah, I think that's the right ballpark. I've got an app on my phone; I mean, obviously the mic on the phone isn't the most accurate anyway, but just to get a ballpark, you can play some pink noise and measure it playing back, and I think mine is more like 75 or 76 on average, but the loudest bits get up close to 80, probably. I might have those numbers wrong. But, yeah, I think actually our ears are a better guide.

Ian Shepherd:

You know, pick something that feels right, try working with it for a few hours, and if you're getting to the end of it and you're starting to get tinnitus, you know, your ears are ringing a bit, just tweak it down ever so slightly. Basically, when it's too loud, it'll be uncomfortable; when it's too quiet, you'll just constantly want to push things higher and higher, and if you're doing that constantly, that's a clue to tweak it up a bit. Give yourself a few days to figure out where it is, and I think most people will get there.

Marc Matthews:

Yeah. While you were talking there about using your ears, in my head I was summarizing the questions and the line of topics so far, and what it's made me realize, and maybe it's an indictment of me, or maybe of music at the moment, is that targets are such a topic of conversation, and I cannot help but keep falling into that conversation in terms of targets. I don't know if it's because of the content I consume; well, it probably is, to be fair. But it just seems to be targets: we're talking about the negative 14 LUFS, and then I mentioned the 80 dB, and I'm thinking, why do I keep having these targets in my head? Maybe it's because I see them so much.

Ian Shepherd:

Well, yeah, people use the word. So I'm a member of the AES, the Audio Engineering Society, and I got involved, so, we drafted some guidelines for streaming services, not for music producers, but for streaming services, about playback levels. We actually recommended minus 16 instead of minus 14 to them. But this is all about the distribution loudness, right? And the interesting thing was, I was involved in the original, and then I came in towards the end of the update, and I just dropped a huge spanner in the works, because I said: can we please not use the word "target"? All the way through, it said target. Targets for the streaming services, for the distributors, it's a valid use of the word in that sense: if you want to not upset people with super loudness, have a target for the loudest stuff. But the problem is, music producers read that stuff and think it applies to them, right? And we also added a paragraph at the beginning, in fact it was already there, saying: these are guidelines for streaming services, not for content creators, not for musicians and mastering engineers and all the rest of it. But there were also, I think, something like 40 uses of the word "target", and that's why we came up with the phrase "distribution loudness". So instead of target, it says distribution loudness, to try and make it clear: that's the final volume it's going to be played at. You can make it however loud or quiet you want going in. I recommend you test it, to make sure you're happy with the results, because something we haven't mentioned yet is that some of the streaming platforms don't turn quiet songs up. YouTube is a big example. And, by the way, in terms of statistics, that 80% thing I was saying about Spotify: well, YouTube is four-fifths of the online music listening market. Four-fifths of the users are listening to music on YouTube video, not YouTube Music, YouTube video, and that's where everything is normalized. So that whole thing about Spotify, whether people turn it on or off or not, that's already a tiny little slice of the pie. But, yeah, the distribution loudness is for the streaming services. So it's interesting, and it's important to test, because, let's say, you master something, you want it to be super loud, and you master it at minus 16. YouTube is not going to turn that up.

Ian Shepherd:

They will turn stuff that's louder down to minus 14, but if they tried to turn quiet stuff up, it would cause clipping, or they'd have to use limiting. So they don't do that; anything that's lower, they leave low. So that means your quieter song won't sound as loud as you intend, because all the loud stuff is at minus 14, right? So if you want your music to sound loud, you do need to be at least at minus 14 or above. I guess that's another reason to pay attention to that number: if you want to be loud, it's got to be at least minus 14. But the reality is most people do their stuff louder than that anyway, so it's probably fine. Yeah, sorry, that was a little tangent of mine.
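The asymmetry described here can be sketched as a tiny rule (a hypothetical function modelling the behaviour as outlined, not YouTube's actual code):

```python
def playback_gain_db(track_lufs, target=-14.0, turns_quiet_up=False):
    # Loud tracks are pulled down to the target; quiet tracks are left
    # alone on YouTube-style platforms, to avoid clipping or extra limiting.
    gain = target - track_lufs
    if gain > 0 and not turns_quiet_up:
        return 0.0
    return gain

# A -16 LUFS master stays at -16 on YouTube, so it plays 2 dB under
# everything that was louder and got pulled down to -14.
```

So `playback_gain_db(-16.0)` returns 0.0, while `playback_gain_db(-9.0)` returns -5.0: loud masters converge at the target, quiet ones stay where they are.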

Marc Matthews:

No, no, no, thank you. It's that word, isn't it, "target"? And I fall into that trap too many times. Not that I work towards targets, but it just so happens it comes up in conversation, and it's about moving away from it.

Ian Shepherd:

I've got to take some responsibility, right, because I've been talking about this stuff for decades now, and when LUFS first came out, I was like: oh great, this is really going to help people. Because it's way better than RMS: it's not so sensitive to bass, it's, you know, closer to what we hear, and we can give these guidelines, and actually there are rules, so that's going to help people to understand it.

Ian Shepherd:

The downside of that, and of talking about numbers all the time, is that people do start to obsess about them. The irony of it is that when I actually work, I don't pay any attention to the numbers until right towards the end of the process, when it's like: okay, I'd better check, a bit like you do your, I don't know, your car test, right? For me as a mastering engineer, it's like: okay, let's just double-check. How does it sound at minus 14? Yeah, that's good, okay, move on. So, yeah, I absolutely agree we shouldn't have any targets. It's good to have guidelines, it's good to understand how this stuff affects what we do, but there's no need to aim for it.

Marc Matthews:

Yeah, 100%, I totally agree. Wise words. I think, in the interest of time here, it's important that we touch on AI-assisted mastering, which is an interesting topic of conversation. It's one I've had on the podcast a few times, and I find that there are general commonalities in the discussions surrounding it. So, in your experience with AI-assisted mastering, it is becoming more popular, we know this: where does it fall short? And do you see, and I'm fairly certain I'm going to know your answer on this, do you see it becoming a serious alternative to human mastering engineers?

Ian Shepherd:

I think the short answer is it falls short on being able to understand emotion and intent. You know, it has no empathy. I've used the word empathy so many times, in terms of mastering and understanding what the client is going for. Currently, the machines don't have that, and I'm personally skeptical about whether they ever will.

Ian Shepherd:

You know, they can do a convincing impression in terms of text, but there are loud songs that are meant to sound sad, right, and there are quiet songs that are meant to sound angry. Mastering, for me, is all about people. I guess that's my big thing about AI mastering: it's not mastering, right? Because mastering is a human being, a mastering engineer, having a conversation, building a rapport, collaborating with another person to get the best possible results out of the music. So, you know, their AI optimization, it's like the magic wand tool in Photoshop: it does the best it can.

Ian Shepherd:

But, I mean, I had a photograph I took in a studio the other day. I clicked the magic wand button and it turned me green. Normally it's a slight improvement, fine, but in that case it got it completely wrong; it did not understand what it was looking at, something about the colour balance, because there were coloured lights in the studio, you know. And with the AI tools, I think the other thing, even beyond that, is that they don't have any understanding, currently at least, and again, I'm skeptical about whether they ever will, of context. So I actually did a test for this. I did a talk at a university recently where they were asking specifically about AI, and I have a project where the client sent me the AI-mastered version, "mastered" in air quotes, that they already had, and they said: this sounds all right to me, but I'm pretty sure you can do better. And I ended up running it through a few things: I put it in Logic, I ran it through Ozone, and they'd already done it through one of the online services, I can't remember which. So I had four or five different versions, and the interesting thing was, the first thing I did, being me, was match the loudness, right? Because I knew some of them were going to be louder than others and some were going to be quieter, and that would be deceptive, because when you turn up the loudness, you hear more bass and treble, you think it sounds better, but it's just louder. So the first thing I did was match the loudness. And when you compare them that way, just going from master to master, there were differences between them, but they were kind of okay. I didn't think they were fantastic, but they weren't disastrous, for the most part. Then I played from song one to song two, because it was an album.

Ian Shepherd:

And artistically speaking, creatively speaking, the second song, from most of the AI tools, was about 3 dB too quiet, because it was intended to be a big, loud song. It was following a more gentle song, but the AI had just gone: oh, it needs to be this level, right, without any understanding of what had gone before or after, any context, any of that stuff. That's not to say they won't figure that stuff out in future, but I think it's a much bigger challenge. Because, at the end of the day, a lot of what we do in mastering does feel like it could be automatable. You know, you turn it up to a certain kind of level, you broadly match the balance, the EQ over the frequency range, you take a listen to the stereo image, all that kind of stuff.

Ian Shepherd:

But when you get into the: how hard should this song hit after the previous one? What was the emotional intent? Why did they make it this way? And also, the services don't, I was going to say: should it be this distorted? But AI doesn't care about distortion, or clicks, or thumps, or, you know, pianos that have got the polarity flipped, or verse-chorus balances, or any of that stuff. Things just go through. So all of that sounds quite negative.

Ian Shepherd:

I will say, you know, the great thing is they're really accessible, they're really affordable, and people are using them in very creative ways. So there are people submitting a song, listening to the master and going: that's not quite right; tweaking the mix, resubmitting it, listening to the new master: no, it's still not quite right. And they do multiple passes through, and each one, if they've got a subscription, doesn't cost them any extra money and happens really fast. I think all that stuff is really cool, and if people want to use it, of course they absolutely should, and if it helps them get results that they're happy with, then that's fantastic.

Ian Shepherd:

But one reservation I have about that, and it's an aspect of mastering we haven't touched on yet, is that one of the big challenges in mastering is working on your own stuff. For me, as a mastering engineer, somebody sends me something, I listen to it and go: oh yeah, I think that's what you're trying to achieve; here's how I think you could improve it. I'm a different person. I'm listening on different monitoring that is set up as best I can possibly get it, in a room that I've worked in for decades. That gives me a really important perspective, a separate perspective from the person working on it. If you've been working on a song for six months, you've got this story in your head about why the kick drum is a bit boomy, and why the vocal doesn't quite sit right there, and all the rest of it, and it's very hard to disentangle yourself from all of that, take a step back, and say: okay, here's what should happen.

Ian Shepherd:

Years ago, I did a thing with, I don't know whether you know them, Joe Gilder and Graham Cochrane. They had a site called Dueling Mixes, where every month they would both mix the same song and talk about the differences. And I did a little thing with them called Dueling Masters, where they both did their mix of the same song and I then mastered them. I think it was Graham's that had a big ring in the kick drum. So within the first 10 seconds of listening, my thought process was: that's not going to translate. If you play that on little speakers, or something that's got a bit of a bass resonance, there's a good chance that's going to crack up, you know. It's just over the top.

Ian Shepherd:

So I went straight in with a notch and just ducked it out, and Graham was actually shocked. He was like: I really like that, that's part of the character of the sound, and all the rest of it. If he was my client for real, we would then have a conversation, and I'm not saying I would force that on him. I'd just say: well, here are my reasons, here's my opinion, here's what I think I should do. But the point is that it would never have occurred to him to do that, and immediately I had that thought. So that's an advantage I have: just being a different person and having a different set of experience.

Ian Shepherd:

If you're using AI tools to try and get great results, it's still you, in your room, with all of its great qualities, and, you know, nobody has a perfect room. If you're not hearing quite enough bass, you're not going to judge things in quite the right way relative to everything else out there, and that's going to make it hard to get things to translate. So even when you think the AI has got a perfect result, it's still your opinion, in your room. If you're lucky, the room is great and your opinion is absolutely spot on and it's all good, but lots of people are unlucky, and that's another challenge. So, yeah, short version: they're great, useful tools, but they're not there yet, and I'm not sure they ever will be.

Marc Matthews:

I would have to agree, and I think what you highlighted there about the feedback loop is very important. Like you say, if you've got that unlimited subscription, you can keep submitting masters, refining your mix and submitting again, but ultimately you're not going to have that human interaction, that rapport, that relationship with the mastering engineer that you highlighted. And I also think about what you mentioned there about Graham Cochrane and the ring in the, was it the bass drum? I think it was the bass drum. He liked that and that's what he wanted, and that's a human decision, and that's not something you're going to get.

Marc Matthews:

If you're an artist and you wanted to keep that in, you wanted that ring, or you wanted that extra bass, the AI is not going to acknowledge that. I mean, at the moment; it may do with prompts further down the line, but I don't think it will. I might get caught out on this in five years' time and you'll say: you were totally wrong. But I don't think it will, and that's ultimately why it's not going to replace mastering engineers, isn't it?

Ian Shepherd:

You can't, even. I mean, it's the conversation, right? So for me, that was my instinct. Graham was shocked by it, he told me he was shocked, and, I don't remember, I probably eased back on it. And that's one of the, I mean, again, it's a general thing about mastering: how far do you go? You know, here's what the mix was; here's what I hear in my head.

Ian Shepherd:

And so, one of the plugins I developed is called Perception AB, and it's basically just automatic loudness matching and before-and-after A/B comparisons. Because one of the big things in mastering is, you do all this stuff, and you almost always make it louder. So when you're comparing with the original mix, it's really important to balance the loudness, so you don't just think it's better because it's louder. That's actually a really difficult skill. So Perception does that automatically for you: you just click a button and it's done.
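The core of loudness matching is just a gain offset; this is a minimal sketch of the principle (not Perception's actual algorithm), assuming you have loudness readings for both versions:

```python
def loudness_match_gain_db(master_lufs, mix_lufs):
    # Trim the master to the mix's loudness so the A/B comparison is about
    # tone and dynamics, not "louder sounds better".
    return mix_lufs - master_lufs

# Mix at -18 LUFS, master at -10: turn the master down 8 dB before A/B-ing.
trim = loudness_match_gain_db(-10.0, -18.0)   # -8.0
scale = 10 ** (trim / 20)                     # ~0.398 linear multiplier
```

The hard part a plugin automates is measuring both loudness values continuously and applying the trim transparently while you switch.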

Ian Shepherd:

Quite often when I do that, you know, especially on the first track on an album, I'm like: oh, this is fantastic, I'm going to do this, I'm going to do all this, oh, that's amazing. And then I put Perception on, listen to it loudness-matched, and I think: okay, I've overdone it, I've got carried away. It might have sounded great to me in the moment, but actually that's not what they were going for, right? So it's about finding a balance where I keep all the good stuff, hopefully get all the improvement and the benefits, without pushing things too far.

Ian Shepherd:

And, in terms of, if that thing with Graham had been a real project, if the client said to me: well, where did the ring in the kick drum go? I'd explain why, and we'd have a conversation, and we'd decide. Maybe I'd put it all back in, or maybe we'd split the difference, or maybe they'd then decide to go with what I was suggesting. That's a conversation you can't have with a machine, and you often don't even have that much control. I mean, I have a bit more time for things like Ozone, where it's not kind of done and dusted: it gives you a bunch of suggestions, a bunch of different processes, and you can preview them and say: well, I like what that's doing, I don't like that.

Ian Shepherd:

But then again, one of the big things about mastering engineers is that you're going to somebody for an opinion. If you think your mix is already perfect, you don't need a mastering engineer; you should just, you know, choose the level, find a limiter setting that works, and release it. When people come to me for mastering, it's because they value my opinion; they think there's something helpful I can bring to the table. I mean, there's another thing about AI: is there an AI mastering algorithm at the moment where you put the same song in twice and it listens and goes, yeah, that's perfect, I've already done that? I don't think there is. I think every time you put it through, it'll squash it a little bit more. And so, actually, is there even an opinion there? Because one of the benefits is supposed to be getting a bit of an opinion, right?

Ian Shepherd:

Rather than going to a person, you give it to the algorithm, and you say: what do you think this should sound like? And you listen to it, and if it sounds good to you, you go with it. But if the algorithm is going to keep doing stuff over and over again, I mean, the algorithm doesn't have an opinion, right? It has a set of rules that it's applying. So I don't see it being a problem for mastering engineers in future, and I think it's beneficial for people who can't afford a mastering engineer or don't want to go through that process. And if it encourages loads of people to improve their monitoring and start referencing things more and loudness-matching things more, that's all going to be good.

Marc Matthews:

Yeah, most definitely, I would totally agree.

Ian Shepherd:

But yeah, I think you need to have your eyes open going into it. It's not a case of, oh, this is just a perfect replacement for a mastering engineer and I've saved myself a bunch of money. I think that would be misguided.

Marc Matthews:

Ian, I realise the time here, and I think it's important, because I mentioned this right at the beginning, that we quickly touch on the new stable volume feature on YouTube. I appreciate we've moved on now from the AI mastering conversation, but I think it's an important one to quickly talk about, because I didn't know it was a thing, and I'd hazard a guess that a large number of the audience are probably in the same boat. So maybe you could just quickly describe what it is and what we should be aware of?

Ian Shepherd:

I think the first thing I should say, the good news, is that it doesn't apply to music, or it's not intended to apply to music, so it's not a major concern for you in terms of uploading your stuff to YouTube, right? The platform seems to be pretty good at going, oh, this is music, I'm going to turn this feature off, so that's good. Unfortunately it's not perfect, so there's stuff that definitely is music where it gets applied, and quite often, sadly, it's classical music, where this is probably the worst thing that could happen. So you need to keep your wits about you.

Ian Shepherd:

If you're uploading stuff and it sounds really odd, it's worth checking this. The idea of it is a good idea: it's to help people with material on YouTube that has a very wide dynamic range when they're listening in noisy environments. So if you're trying to watch a feature film, the loud sections are really loud and the dialogue is much quieter; if you're on a bus or on the tube, the subway, it's possible you'll constantly have to keep adjusting the volume control to hear the quiet stuff. So YouTube have added this feature called stable volume, and it basically lifts the quiet moments and turns down the loudest moments. So it is dynamic range control. It is basically a compressor. It's not a very sophisticated compressor, it's not a great-sounding compressor, but it does the job. And like I say, I think it's a good idea for certain things. Feature films could be an example, but also more DIY content on YouTube, where maybe the person making it is not a sound person, they're focused on the video. So for super-loud moments, or bits where it's hard to hear the dialogue and the audio quality is not that great, it's beneficial. That's why they're doing it.
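What Ian describes, lifting the quiet moments and turning down the loudest ones, is automatic gain riding. As a rough illustration only (this is not YouTube's actual algorithm; the function name, frame size, target level and ratio here are all invented for the sketch), a naive frame-by-frame version might look like this:

```python
import math

def stable_volume(samples, frame=1024, target_db=-20.0, ratio=0.5):
    """Naive 'stable volume' sketch: measure each frame's RMS level in dB
    and pull it partway toward a target, so quiet frames get lifted and
    loud frames get turned down (simple dynamic range compression)."""
    out = []
    for start in range(0, len(samples), frame):
        block = samples[start:start + frame]
        # RMS level of this frame (floor avoids log of zero on silence).
        rms = math.sqrt(sum(s * s for s in block) / len(block)) or 1e-10
        level_db = 20 * math.log10(rms)
        # Move the frame's level partway toward the target.
        gain_db = (target_db - level_db) * ratio
        gain = 10 ** (gain_db / 20)
        out.extend(s * gain for s in block)
    return out
```

A real implementation would smooth the gain between frames to avoid audible stepping, which is roughly where the "not a great-sounding compressor" complaint comes from: the sound depends entirely on how gracefully those gain changes are made.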

Ian Shepherd:

The problem that I want people to be aware of is there's a good chance it's being applied to videos that I've made, videos that people like you have made, anybody's videos where the audio is important but it isn't music, right? And I have done this: there's a video of mine, the one I mentioned earlier with the loudness comparisons, maybe we could put a link in the show notes, where people can hear what I'm saying about the LUFS values.

Ian Shepherd:

I made that video, and for weeks or months I didn't realise that people weren't hearing it the way I intended, because I was saying, listen to the differences here with these two songs, but the loudness was being changed by YouTube, by this stable volume feature, as people were listening. So if anybody here is watching a music production or audio production video on YouTube and thinking, well, this is really interesting, but I'm not 100% sure I can hear it, you need to jump into the settings and see if this feature is enabled, because they've just turned it on by default for anything that's not music. That's why I'm making people aware of it, because honestly, it ruins a good number of my videos. I'm talking about quite subtle differences and, as we've been saying all the way through, loudness has a huge influence on the way people hear things.

Ian Shepherd:

So if it's changing things, if it's adding extra compression, anybody listening isn't hearing things accurately, basically. So yeah, it's definitely something to watch out for. As I say, hopefully, if you're a musician, it's not going to be affecting the stuff that you're uploading, but it is worth watching out for. I'm hoping they'll change it. I'm hoping they'll give people, you know, somebody like me with a YouTube channel, the option to just say, no, I don't want this for people watching my channel.

Marc Matthews:

That would make sense.

Ian Shepherd:

Yeah, that's not how it works currently, unfortunately. So yeah, hopefully that answers the question.

Marc Matthews:

Yeah, it does. And now that you've described it again, I can remember why it stuck in my head: it was for that reason of the tutorial videos not being consumed the way they were intended to be, because this setting is on. You'd like to think they would change it so that, as a content creator, you're in control, and then if people want to enable it, that's their choice, but by default your content is uploaded the way you intended it to be, which would make sense.

Ian Shepherd:

Yeah, and there's a tiny but important detail there: it doesn't actually change the file, it's the playback that changes it, right? So when I upload stuff, it goes up and sits on the server as I intended. This is an in-browser thing, so if people want to disable it, there's a video on my website where people can see how.

Ian Shepherd:

It demonstrates it. But basically, if you're in a desktop browser, it's the cog icon at the bottom of the video window; just click that and you'll see an option to turn it off. If it's on mobile, you go to settings and then I think it's in additional or extra settings or something, so it's a bit more buried there. The good news is that once you've turned it off, it's off, so once people have done this, they don't have to worry about it going forwards, unless it accidentally gets switched back on somewhere. But yeah, the big thing is that most people don't know about it.

Marc Matthews:

That was going to be my question. I was going to ask you whether, once you disable it, it's not permanently disabled, but it stays off until you re-enable it. That's good to know. So, audience listening, be aware of this when you're watching tutorial videos or something along those lines.

Ian Shepherd:

Yeah, absolutely. And just again, a tiny detail: it's off in that particular browser, right? So if you're using Chrome and you turn it off in Chrome, it'll be off in Chrome, but it will still be on on your phone. It doesn't go with your YouTube account or your login details or any of that.

Ian Shepherd:

So wherever you want to listen, if you're bothered about it, you need to turn it off on each one: a different computer, or if you use Safari sometimes and Chrome other times, or Opera, whatever it is. That setting is individual to each playback system.

Marc Matthews:

That's interesting. I mean, maybe it's because it doesn't fall within my algorithm, but I would have thought there would have been a push on that from YouTube, in terms of letting people know it exists, or Google rather, Google disseminating that information. Maybe I just wasn't part of that party, I don't know.

Ian Shepherd:

Well, there is an information page. But I mean, this has been YouTube's way, Google's way: they tend to introduce new features without telling people and watch what happens. It's a tactic, right, it's a blind test, because they then look at the stats.

Ian Shepherd:

Because the concern is people will stop watching videos because of extreme changes in loudness. They want to maximise views, which makes sense, so they want to avoid that. So they introduce this feature, just run it, and watch the people who have it enabled: do they watch more of the videos? Are they less likely to stop watching? If they are, then it's a good thing, and then they'll start rolling it out for everybody else. So there are a couple of factors there. One is it's not like it wasn't there and suddenly it is; they'll introduce it for a small number of people, test it, and gradually spread it out.

Ian Shepherd:

The other thing to say is they do have an information page. If you google "stable volume", you will find a page where they tell you what it is, all the stuff that I've been telling you. It's not like it's a secret. But as you say, they don't necessarily want to give channel creators the chance to turn it off, because loudness can be a tactic to try and get certain results, jump scares, or making your stuff stick out more than others. So I guess that's why they don't want to give that option to us straight away.

Ian Shepherd:

Another option would be a box I could tick so a little message comes up saying, stable volume is enabled, would you like to disable it? Because people are saying to me, oh, put a message at the beginning of all your videos. I could do that, but A, I can't go back and do it on the stuff that's been up for years, and B, if they change it in future, I'm not going to want it there, because everybody says the first ten seconds of a video are the most important. If the first thing everybody sees is a stable volume lecture from me, they're not going to watch that video.

Marc Matthews:

Yeah, I would agree. It makes perfect sense what you said there, and the way you were describing it, in my head I'm like, yeah, even if they were to choose a small percentage of the YouTube consumer market, that is still a massive focus group for them to use to test these particular implementations of their tech. So it does make sense.

Ian Shepherd:

It's logical, but it's a kind of Mr Spock type issue: it might be logical, but does that translate to everybody? No, there are specific uses. I mean, ASMR videos are another one. The ASMR community is up in arms, right, because their whole thing is tiny little changes and little sounds and the contrast between them, all that kind of stuff.

Ian Shepherd:

It completely messes that up. And classical music as well. It seems like most of the videos where this is incorrectly applied, the examples I've seen, have been classical. Maybe it's just that they're the ones where it's most noticeable, but those are also the ones where dynamics are the most important, where it's all about the contrast between the loud and the quiet sections. So to have those messed with is maybe even more annoying than with pop and rock music.

Ian Shepherd:

Hopefully they will improve it in time. I can say that they are aware of the issue, it's on their radar, but obviously there are bigger factors at play and there are priorities within the organisation for where they're going to invest their time. We can only hope that they will listen to us eventually.

Marc Matthews:

Very interesting, though. I hadn't considered the ASMR community in that, and I can see why that would put their nose out of joint, for want of a better way of putting it. But yeah, Ian, it's been fantastic talking with you today. I realise we've gone over and I've taken quite a lot of your time, so it's been a pleasure having you on the podcast, talking everything mastering and everything in between, and I'm sure there are other bits and pieces we could have spoken about and gone on for a bit longer. But again, I just want to say a big thanks for joining me on the podcast today.

Ian Shepherd:

No, my pleasure, I've enjoyed it. It's good talking to you.

Marc Matthews:

Yeah, indeed. From this episode, at the very least, go on YouTube and check out that feature in particular. And if you ever hear me mention the word target again, feel free to bombard me with whatever it takes to stop me doing it, because I need to get that word out of my lexicon. Ian, it's been a pleasure and I'll leave you to enjoy the rest of your day. Cheers, buddy.
