Affinity:
I am aware of these apps and even recommend them to others. For myself, for workflow reasons, I have to stay with Adobe (grumbling and mumbling sometimes) for a while.
"more, more mentality":
I understand you, but I don't fully agree. For example, there is no "upgrade tax" on Windows / Linux (while the changes in macOS will make it difficult to use some older software). Is new hardware more expensive than it was years ago? No, on the contrary: it's actually cheaper as soon as you look away from Apple. Their prices have nothing to do with what the hardware actually costs - it's simply what they can get.
And it isn't the graphics industry (directly) that drives all those tech firms to produce faster and better processors, GPUs etc. each year: it's the games industry and the gamers who want more immersion and photo-real quality that's sometimes downright creepy. As I said before: without them we wouldn't have such tech at our hands, because for 2D it's simply not needed, and 3D is a market too small to justify that much investment in development. For me it's simply this: with my current hardware I can't produce what I want and - more importantly - can't compete.
Hardware and software got cheaper, now with a tendency to get very expensive at the high end of the market. Which is ok with me. To be honest, I don't really care whether a render takes 30 seconds or 10 minutes. I do care if that same render takes 2 hours, though. For a reasonable amount (see KurtF's example) we can get a reasonably professional rig, something with which small potatoes like me can still compete.
@Radian:
Nope, it isn't a standalone renderer; it's actually quite integrated into Blender, C4D and Modo, while a real standalone renderer would work with anything. It's not really being developed anymore at the moment, it's not production ready, and I don't know if it ever will be. Too much "can't do".
It started years ago and was very promising (still is, in a way), and its big advantage is simply this: it works with both AMD and Nvidia cards. I do hope they develop it much further, with the side effect that it could be integrated into Cheetah.
But even if it were production ready and Blender only: I HATE Blender. I can probably tolerate it for special things that I need a few times a year at most (like fluid sim), but working in it daily would make me want to change my plans to something not 3D related.
By the way, AMD denoising exists, but quality-wise it's far behind Nvidia OptiX, other Nvidia-related denoisers (Maxwell doesn't use OptiX but an Nvidia-only solution of their own) and even the Intel denoiser, which sometimes gives better results than OptiX.
Another by the way: I belong to that minority who doesn't think it's such a good idea to outsource everything to the expensive GPUs, so, all in all, I prefer CPU rendering over GPU rendering. The current trend is moving away from overpriced GPUs toward something like Apple Silicon, meaning there may simply not be a discrete GPU anymore in the near future.
For the moment Intel Embree is a compromise: it's not as fast as OptiX, but it shows that CPU rendering is far from dead.
For the moment I'd prefer an Nvidia card - the new 3070 is roughly as fast as the 2080 Ti for a fraction of the cost (around 550 bucks here) - but the coming AMD GPUs will be faster (which, outside of 3D, is of no interest to me). As I have to wait a bit for CPU availability anyway, I haven't fully decided what I'll do. Certainly no high-end GPU.