New Mastering Contest

  • Thread starter: Yareek
SonicAlbert said:
What am I, chopped liver? :(

I think it's a good idea to have a cross section of "judges", not just ME's. The files will be up for all of us to listen to, right? Why not just set up the judging as a poll, and have people vote as they would in any poll?

I knew that I might offend someone with a list. No, Sonic Al, you're not chopped liver. I have just known these guys and their dedication to the board longer.

You're aces w/ me pal!
 
Hey fin, is that Geomana studio yours? Looks really nice!

Cool that you can do all that with some well-treated rooms and a mix of prosumer and nicer gear. Not too out of reach.
 
Yareek said:
Hey fin, is that Geomana studio yours? Looks really nice!

Cool that you can do all that with some well-treated rooms and a mix of prosumer and nicer gear. Not too out of reach.

Yes it is mine, Yareek. Thanks. :D
Yep, Guitar Center all the way! My partner at Geo is a GM there, so everything is factory-direct pricing for us. We buy it at the price that they buy it for.
Talk about not too far from reach! lol :p
Now Falcon on the other hand....... :cool:
 
masteringhouse said:
Other good choices (IMHO) would be Bruce, Farview, Rock, Ronan, and Mr. G.

Yeah, am I chopped liver too!
What the hell!






























KIDDING!

Sorry, I couldn't resist! :p
 
xfinsterx said:
Yes it is mine, Yareek. Thanks. :D
Yep, Guitar Center all the way! My partner at Geo is a GM there, so everything is factory-direct pricing for us. We buy it at the price that they buy it for.
Talk about not too far from reach! lol :p
Now Falcon on the other hand....... :cool:

Oh yeah I was just looking at that...good lord that Piano room...

So do you work at Falcon and Geomana then?

That's gotta be a blast.
 
Yareek said:
After all, getting real constructive feedback on a master is more valuable than a chance to win some leftover gear.

You get both!

EDIT: Of course, then he drops "Fairchild" and "Ribbon" on us ;)

I do have a ribbon. I do not have a Fairchild :o


OK, here are my draft categories; comments please:

MASTER IMPACT/LEVEL -- punchiness, proper use of dynamics, proper overall volume for the style of music.

LOW END -- how well the low-end is balanced in the master.

AIR/TOP END -- how well the high-end is balanced in the master.

IMAGING/DEPTH -- how well the soundstage of the master is laid out between the speakers, how well different elements of the track sit in their own space.

TRANSLATION -- how well the master plays on various consumer systems.

TECHNICAL MERIT -- absence of distortion and clipping, effectiveness of transitions (fade-in, fade-out), and "housekeeping" tasks like noise reduction, glitch removal, etc.

MASTER QUALITY -- subjective assessment. Is the track "radio" or "CD" ready? Does it meet the client's stated goals?



For those unfamiliar with the PMC system, each category is assigned a score of 1-10, as follows:

1 - Yikes!
2-3 - Poor
4-6 - Fair
7-8 - Good
9 - Excellent
10 - Massenburg :)

Each category is weighted equally, and the masterer with the highest total score wins!
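
To make the tally concrete, here's a quick Python sketch of the arithmetic. The category names come from the draft list above; everything else (function names, data layout, the example entries) is just mine for illustration, not part of any official scoring tool.

```python
# Quick sketch of the scoring tally described above -- seven categories,
# each scored 1-10, equal weight, total out of 70, highest total wins.
# Category names are from the draft list; everything else is illustrative.

CATEGORIES = [
    "Master Impact/Level", "Low End", "Air/Top End", "Imaging/Depth",
    "Translation", "Technical Merit", "Master Quality",
]

def total_score(scores):
    """Sum seven 1-10 category scores into a total out of 70."""
    assert set(scores) == set(CATEGORIES), "need one score per category"
    assert all(1 <= s <= 10 for s in scores.values()), "scores run 1-10"
    return sum(scores.values())

def rank_entries(entries):
    """Return (name, total) pairs, highest total first."""
    totals = [(name, total_score(scores)) for name, scores in entries.items()]
    return sorted(totals, key=lambda pair: pair[1], reverse=True)

# Made-up example: a solid 7-across-the-board entry beats a 6-across entry
# that only pulls ahead on Master Quality.
entries = {
    "Entry A": dict.fromkeys(CATEGORIES, 7),                            # 49
    "Entry B": {**dict.fromkeys(CATEGORIES, 6), "Master Quality": 10},  # 46
}
print(rank_entries(entries))   # [('Entry A', 49), ('Entry B', 46)]
```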

Really, everybody wins, because everybody gets comments and improves their skillz. :)



My comments on changes from Bear's PMC categories:

Low end and high end are unchanged.

Imaging and Depth are separate categories in mix judging, but I combined them since there is relatively less influence on these elements in mastering than in mixing (plus I needed the room!).

I also combined Level and Impact into a single category, with the idea that dynamics processing in mastering is working towards that combined goal.

Translation addresses a key goal of mastering--in Bob Katz' book, he points out the importance of getting the midrange right for playback on a bandwidth-limited system. I sometimes think in PMC judging that if the lows and highs are right, that must mean the midrange is too! But here I mean to focus on that point.

Technical Merit - I never bothered too much with noise or glitches and didn't fret obsessively about clips when judging PMCs, but here let's cross our i's and dot our t's please! Or something like that ;)

Quality is the subjective rating for the PMC; I've added a couple of suggestions, but this is basically a judge's discretion category.
 
Yareek said:
Oh yeah I was just looking at that...good lord that Piano room...

So do you work at Falcon and Geomana then?

That's gotta be a blast.

Yes, I work at both, and yes, it MOST CERTAINLY is a blast :D
 
ms,

Those are pretty good categories for judging. Could I ask some questions and make some observations and/or suggestions?

First, I'd like to see something in there along the lines of "artistic shaping". That's a lousy term, I know; I'm just using it until I think of something better. The idea here is that folks can meet all the technical points, but what separates a technically good master from a great-sounding one is the choice of what sonic direction to push the master in and what kind of sound to give it. For example, which part of the mix do they accentuate or clean up or compress or whatever in order to bring across a "tone" or "feeling"? One master can be perfectly "balanced" and still sound flat, whereas another might purposely "un-balance" the low end to give it more drive.

Also, there are real limits as to what mastering can do with soundstage imaging if they are not working from stems. This is more of a reflection on the quality of the mix given to the ME than on the job the ME did.

And again, everything there refers to the intrinsic quality of the master itself; there's nothing that addresses the quality of the job performed by the engineer given the tools they have to work with. Under just those categories, there's no way for the average member of this board with the average project studio setup to get decent marks. There should (IMHumbleO) be some heavily-weighted category that factors in "degree of difficulty".

Finally, I reserve the right to rank the entries - i.e. to judge - independent of the numbers that I put into those poll categories. Because of the arbitrary nature of category weighting, it is entirely possible that the entry that I score the highest is not the one I actually think sounds the best.

I know, you hate me don't you. :) Well, if you want me to dumb it down for this process, then you got the wrong guy.

G.
 
SouthSIDE Glen said:
Also, there are real limits as to what mastering can do with soundstage imaging if they are not working from stems. This is more of a reflection on the quality of the mix given to the ME than on the job the ME did.
G.

Glen, could you explain the "stems" portion? I've read on here somewhere, and quite some time ago, about this. Is this where you have a stereo mix, but also have individual tracks to use if necessary? Or am I way off on this?
Ed
 
Dogman said:
Glen, could you explain the "stems" portion? I've read on here somewhere, and quite some time ago, about this. Is this where you have a stereo mix, but also have individual tracks to use if necessary? Or am I way off on this?
Stems are like submixes or subgroups. Sometimes the mixing engineer will pass the ME a stereo mixdown, partial mixdowns in the form of stems, or both.

It's my understanding (correct me if I'm wrong) that this contest will be dealing only with mastering a stereo mixdown, which is fine. My point was that in such an instance, there is very little the ME can or should do to affect stereo imaging.

G.
 
SouthSIDE Glen said:
Stems are like submixes or subgroups. Sometimes the mixing engineer will pass the ME a stereo mixdown, partial mixdowns in the form of stems, or both.

It's my understanding (correct me if I'm wrong) that this contest will be dealing only with mastering a stereo mixdown, which is fine. My point was that in such an instance, there is very little the ME can or should do to affect stereo imaging.

G.
Ok, that makes sense. Probably exactly what I read, but had forgotten.
Thanks.
Ed
 
SouthSIDE Glen said:
Finally, I reserve the right to rank the entries - i.e. to judge - independent of the numbers that I put into those poll categories. Because of the arbitrary nature of category weighting, it is entirely possible that the entry that I score the highest is not the one I actually think sounds the best.

I know, you hate me don't you. :) Well, if you want me to dumb it down for this process, then you got the wrong guy.

G.

I certainly appreciate the comments, but let's work out the issues upfront so we can get some measure of consistency in judging, both between judges and contests.

That last comment suggests to me that you don't feel there is enough subjective weighting in the categories. We can weight the categories to address that. I agree with your comment on soundstaging; I reduced it from two categories in the PMC to a single category here. Perhaps you feel that should only have a half-weight? We could then assign the final Quality category a 1.5 weight. As that is strictly subjective, you should be able to use that category to achieve the overall ranking you feel appropriate.
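
If we did go that route, the change is small. Here's a rough sketch of how those weights would slot into the tally; the 0.5 and 1.5 values are just the ones floated above, nothing is settled.

```python
# Rough sketch of the half-weight Imaging / 1.5-weight Quality idea.
# The weight values are only the ones suggested above; nothing here is final.

WEIGHTS = {
    "Master Impact/Level": 1.0,
    "Low End": 1.0,
    "Air/Top End": 1.0,
    "Imaging/Depth": 0.5,      # less room for the ME to influence soundstage
    "Translation": 1.0,
    "Technical Merit": 1.0,
    "Master Quality": 1.5,     # extra headroom for the purely subjective call
}

def weighted_total(scores):
    """Multiply each 1-10 category score by its weight and sum."""
    return sum(scores[cat] * weight for cat, weight in WEIGHTS.items())
```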

You can also interpret categories more broadly than I have spelled out; those are just guidelines. For example, you could score artistic use of dynamics in both Impact/Level and Quality.

The "degree of difficulty" weighting is novel. The PMCs have the same factors at work, and yes the results unsurprisingly have the mixers with better gear closer to the top. But they are also more experienced. It seems there aren't too many idle rich hanging out here! But I don't know how to implement a gear ranking system, except maybe dollars spent on the gear used :confused:

How about this: we ask people to put themselves in a low or high bracket, based on their own assessment of their experience and gear, and award separate prizes? That way scoring can still be straight across the board, because I think people want to know where they stand.
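
Mechanically, the bracket idea is as simple as it sounds: everyone gets scored on the same scale, and the top total in each self-declared bracket takes that bracket's prize. A tiny sketch, with made-up labels and entries just for illustration:

```python
# Sketch of the two-bracket idea: one scoring scale for everybody,
# separate prizes per self-declared bracket. Labels and data are illustrative.

def winners_by_bracket(entries):
    """entries: list of (name, bracket, total). Returns the top entry per bracket."""
    best = {}
    for name, bracket, total in entries:
        if bracket not in best or total > best[bracket][1]:
            best[bracket] = (name, total)
    return best

entries = [("Entry A", "high", 61), ("Entry B", "low", 54), ("Entry C", "low", 57)]
print(winners_by_bracket(entries))
# {'high': ('Entry A', 61), 'low': ('Entry C', 57)}
```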
 
masteringhouse said:
I knew that I might offend someone with a list. No, Sonic Al, you're not chopped liver. I have just known these guys and their dedication to the board longer.

You're aces w/ me pal!

Hey no problem, I was just having some fun with you. ;)

I'd offer to be a judge but I don't have enough time right now to do the process "justice". I certainly look forward to listening to the entries though!
 
Well, something about weighting and consistency.

Why don't we let each of the judges weigh the categories as they see fit? They can provide a brief explanation of their rationale as well. Because the same judge is judging all the mixes, the judge is the control, not the ranking system. That way you can have three judges that each might give a little more or less weight to something, but they are consistent in doing so.
 
Yareek said:
Well, something about weighting and consistency.

Why don't we let each of the judges weigh the categories as they see fit? They can provide a brief explanation of their rationale as well. Because the same judge is judging all the mixes, the judge is the control, not the ranking system. That way you can have three judges that each might give a little more or less weight to something, but they are consistent in doing so.
Yeah, what I'm unsure about with the whole weighting thing is that weighting is somewhat context-dependent. For example, the weighting on "low freq balance" and "high freq balance" depends one heck of a lot on how well balanced the pre-master mix is. If the pre-master is already pretty well balanced, then those categories aren't all that important in mastering (unless the ME screws it up badly ;) ). Same thing with some of the other categories.

I'm just thinking out loud here...but...

Maybe something along the lines of no weighting factored into the categories; they are all worth equal value. And maybe instead of "degree of difficulty" and "creativity" there might be simply an "Intangibles" category. This category would be used by the judges to actually assign their personal vote of value above and beyond the other rigid technical categories. This value would be explained in the judge's text explanation/discussion/review of the candidate. Does that make any sense?

G.
 
Yareek said:
Well, something about weighting and consistency.

Why don't we let each of the judges weigh the categories as they see fit? They can provide a brief explanation of their rationale as well. Because the same judge is judging all the mixes, the judge is the control, not the ranking system. That way you can have three judges that each might give a little more or less weight to something, but they are consistent in doing so.

We could. I can design that spreadsheet easily. But I believe a simpler system is better than a complex one. The contestants also should know the system up front, and ideally we would have a system that didn't change from contest to contest.

The difference between the first-place mix and the tenth-place mix is often about 10 points (out of 70 total). That is also the number of points available in the single purely subjective category. Thus, I really believe there is enough flexibility in a fixed-weight system to get the final results in the desired order.
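
As a back-of-envelope check on that: if two entries tie on the six technical categories, the Master Quality score alone can separate them by up to 9 points (a 10 versus a 1), which is roughly that typical first-to-tenth spread. The 7-across technical scores below are purely illustrative.

```python
# Back-of-envelope check: with the six technical categories tied, the
# subjective Master Quality score alone can swing the totals by up to
# 9 points -- roughly the typical spread between first and tenth place.

technical = 6 * 7            # six categories, each scored 7 (illustrative)
entry_x = technical + 10     # top subjective mark    -> 52
entry_y = technical + 1      # bottom subjective mark -> 43
print(entry_x - entry_y)     # 9
```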
 
SouthSIDE Glen said:
there might be simply an "Intangibles" category. This category would be used by the judges to actually assign their personal vote of value above and beyond the other rigid technical categories. This value would be explained in the judge's text explanation/discussion/review of the candidate. Does that make any sense?

That's the last category . . . it's called "Master Quality" and is purely subjective
:o :o :o
 
mshilarious said:
The difference between the first-place mix and the tenth-place mix is often about 10 points (out of 70 total). That is also the number of points available in the single purely subjective category. Thus, I really believe there is enough flexibility in a fixed-weight system to get the final results in the desired order.
Ah, but the difference is in how the points are assigned, and that will make a difference in the total. In fact, if the nominal swing is 10 points, the subjective category is potentially large enough to make the difference between first place and last place.

The weighting system itself is just as subjective as the judging; who says that category A is more important than category B? And with which song?

Plus, I'd like to avoid the possibility - which is entirely real - that the entry to which I give the highest technical points is not the one that ultimately sounds the best. Some kind of subjective value is necessary to reflect how the judge actually thinks the entry sounds, not just whether the engineer did things properly by the book.

YMMMHOETC

G.
 
SouthSIDE Glen said:
Ah, but the difference is in how the points are assigned, and that will make a difference in the total. In fact, if the nominal swing is 10 points, the subjective category is potentially large enough to make the difference between first place and last place.

PMC Judges have been pretty good about using the range. I did some analysis on this thread:

https://homerecording.com/bbs/showthread.php?t=156225
 
mshilarious said:
PMC Judges have been pretty good about using the range. I did some analysis on this thread:

https://homerecording.com/bbs/showthread.php?t=156225
We're in agreement then:

mshilarious said:
I think the control for that has been that the factors are equally weighted.

and

mshilarious said:
yes I imagine the distinguishing characteristics will be, and should be, purely subjective.

Couldn't have said it better myself :)

If statistical and quantitative analysis were good enough to accurately "score" the quality of a recording, we'd all have been replaced by a computer program that could automatically create the perfect recording twenty years ago.

The more I think about it, the more I become convinced that my first instincts were correct and this whole "contest" thing is both misguiding and misguided. Nothing personal intended, it's just not for me. Count me out.

G.
 