I still think that the person changing has something to do with it. The sound isn't changing; the person twiddling the knob is. But we misidentify the source of the perceived change as something coming from outside, when the change is actually (only) happening in the mind.
A lot of people on Gearslutz swear by the sound of Harrison Mixbus. I downloaded a copy when they did one of their sales and I paid, I think, 29,- for it, but sadly it has just been taking up space on my hard drive…
My understanding is that Mixbus used to be special when it was the only game in town that could do inter-channel plugin communication to simulate a mixing desk in front of you, where the channels push and pull on each other. Inter-channel plugin communication from VST to VST is still rare, but it's happening from third parties now, so it doesn't need to be a DAW feature so much anymore. Airwindows Console 5 is, I think, their latest, and it really does a good job of adding some light saturation and some non-linear summing on busses, where what's happening on one channel can have an effect on all the others.
Brief primer on inter-channel communication: most console emulators simulate a single channel in isolation, so when you overload one channel, nothing happens to the adjacent ones. With inter-channel plugin communication, the overload of one bus can saturate the others as well, and so on.
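If it helps, here's a toy Python sketch of the idea (my own illustration, loosely inspired by the sin/asin encode/decode concept Airwindows has written about, and definitely not anyone's actual shipping math): each channel gets its own nonlinear "encode", but the matching "decode" runs on the sum, so a hot channel bends the curve that every other channel passes through.

    import numpy as np

    def console_bus(channels):
        # "Encode" each channel with a sine curve, sum, then "decode"
        # the bus with arcsin. For a single in-range channel this round
        # trip is transparent; with several channels the decode acts on
        # the SUM, so overloading one channel saturates the whole mix.
        encoded = [np.sin(np.clip(ch, -1.0, 1.0) * np.pi / 2) for ch in channels]
        bus = np.clip(np.sum(encoded, axis=0), -1.0, 1.0)  # bus overload lands here
        return np.arcsin(bus) * 2.0 / np.pi                # back to the +/-1.0 range

    # e.g. a quiet pad now rides the saturation caused by a hot kick:
    t = np.linspace(0.0, 1.0, 48000)
    pad = 0.2 * np.sin(2 * np.pi * 220 * t)
    kick = 0.9 * np.sin(2 * np.pi * 55 * t)
    mixed = console_bus([pad, kick])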
All DAWs are identical; it's all about the processing you apply. Effects and different instruments will sound different, different limiters do different things… some stuff has its own unique perks. But if you're messing with audio files only, there's zero difference. Bounce the same audio in each DAW, invert the polarity of one bounce, then play it at the same time as a bounce from a different DAW. No sound? The audio files are identical. Repeat this with every DAW to find "that" difference, which doesn't even exist.
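If you want to run that null test outside a DAW, here's a quick Python sketch (hypothetical filenames; assumes both bounces have identical length, sample rate, and channel count, and uses the third-party soundfile package):

    import numpy as np
    import soundfile as sf  # pip install soundfile

    # Hypothetical bounce filenames -- substitute your own renders.
    a, sr_a = sf.read("bounce_daw_one.wav")
    b, sr_b = sf.read("bounce_daw_two.wav")
    assert sr_a == sr_b and a.shape == b.shape, "bounces must line up exactly"

    # Subtracting is the same as flipping the polarity of one bounce
    # and summing: a residual peak of 0.0 means the files null.
    residual = a - b
    print("residual peak:", np.max(np.abs(residual)))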
Probably the most forward-looking example is iZotope, with the masking meter that works from channel to channel and the Visual Mixer on the master outs of a mix; they're not using inter-plugin communication for better analogue saturation at all. I have no idea what Slate does, but I think there is a lot of unexplored potential in this kind of thing.
Most people do the opposite, because they sound the same and it's easier to mix in Reaper than FL, but I think you know how to have fun in the studio, and that's what counts.
That perfectly describes the effect. No two listens are the same… even by the same person listening to the same track on the same equipment. Add to that someone asking "…do you hear the difference?" and the human brain will find something different.
It's the same reason humans can be tricked by magic… suggestion and misdirection.
Doh!!! I've done this so many times that I'm slowly (too slowly!) learning to push settings hard to clearly hear the difference before tweaking to taste.
Funny… but I've always done a version of this "push hard" when learning to perform a new piece… especially with another performer… I will intentionally exaggerate different aspects of the piece… above and below where they should be. It really helps to zero in on a good, clear dynamic range of color, emotion, tempo and, of course, volume.
It's also why my tagline and personal mantra/motto is…
Reality is one thing… but perception is everything…
I have a dilemma that's kind of related to this, which someone smarter than me might know about.
I just set up my studio-making stuff on my work PC because my living situation just got a little cramped and I'm having to consolidate temporarily, and things sound extremely different through ASIO when I'm going for low-latency instrument tracking. So when I mix to what I'm hearing, my renders all come out entirely different from how they actually sound, unless I flip back to non-ASIO ways of listening in Live. Even FL's ASIO produces the "real" sound, so the problem is obviously ASIO4ALL. But I still don't understand why this would happen.
I've never had my DAW sound different before, but the difference is pretty extreme. Any ideas? I've messed with buffer size and just about every other setting within ASIO4ALL to no avail.
Also, tracking with FL's ASIO (it works with Live, thanks bros) produces way too much latency to pull off a basic rhythm riff, so that's not even an option. DAW sounds, how do they work?
I'm just guessing, but I would assume it's a combination of the algorithm being used to gauge the quality of the sound being produced and the actual hardware components of the PC…
I'm just totally guessing because I really have no fucking clue…
I don't think so. For example, if you load a WAV file into FL Studio and duplicate it into Ableton, there is absolutely no difference in its tonal integrity. Where the night-and-day difference lies, I think, is in the methods of processing.
As for VST vs. hardware: abso-fucking-lutely. Sure, Arturia VSTs sound just like their hardware counterparts thanks to meticulous engineering, but like anything synthetic, they cannot replicate the feeling.
Could it be headroom? Maybe FL Studio's ASIO does its math in 32-bit float and ASIO4ALL only does 24-bit? Then anything over 0 dBFS clips, instead of just riding the floating-point exponent the way 32-bit float does.
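Here's a quick Python illustration of that headroom theory (just a picture of float vs. fixed-point clipping, not a claim about what either driver actually does internally):

    import numpy as np

    # A signal pushed 6 dB over full scale. A 32-bit float path keeps the
    # overshoot (the exponent absorbs it), so gaining down later recovers
    # the waveform; a fixed-point path has nothing above full scale, so it
    # hard-clips before the gain and stays distorted.
    x = 2.0 * np.sin(2 * np.pi * 440 * np.linspace(0.0, 1.0, 48000))

    float_path = 0.5 * x                      # clean after the fader comes down
    fixed_path = 0.5 * np.clip(x, -1.0, 1.0)  # already clipped: audibly different

    print("difference peak:", np.max(np.abs(float_path - fixed_path)))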
That's possible, since the only thing that does it is ASIO4ALL. I wonder if I can capture the weirdness if I record the master internally. To the bat cave.
Edit: just as I thought, this can't be recreated in renders or with internal resampling. Outputting the audio to something external might work.