RX7 is out, and one part of it that's new fascinates me.

It's called Music Rebalance, and here's what one site says about it.

"Also new is Music Rebalance, which uses an algorithm trained with machine learning to perform source separation by identifying vocals, bass, percussion and other instruments. A user can then work with individual elements of a stereo audio track, making it possible to adjust a mix without multitracks."

Whither mastering now? It would seem this removes the need for a good ear to identify these parts to tweak in a normal mastering process, since the algorithm does it for you.