Part of it is my own expertise. I work as an EE and design workstations for a living. In the past I have done graphics and memory subsystems; currently I do CPU and power subsystems, so I have a better-than-average understanding of what impacts performance in a system. The short answer is always: it depends on the type of workload your apps have.
As for my recent investigation into a new graphics card: besides TC, I run a lot of photo editing and drawing software, and I really wanted to take advantage of my 10-bit displays. When you look at consumer graphics cards, they only support 10-bit output in full-screen games. This has been asked and confirmed multiple times in various NV and AMD forums. Even though there may be an "enable 10-bit" checkbox in the driver setup, the situations where it actually applies are limited. This is one of the differentiators between consumer and pro graphics.
When it comes to pro graphics like Quadro and FirePro, what people are mostly after is ISV certification. This means the drivers have been thoroughly tested and vetted by various software vendors and are pretty much guaranteed to work with those applications; makers of consumer-grade software don't go through this process.
When it comes to using the GPU for acceleration, both card lines (GTX and Quadro) support it. With my older GTX 1070, I can watch in Task Manager as TC and one of my other programs actually utilize the GPU. I looked at various benchmarks, and the RTX 2080 consistently outperforms the Quadro RTX 4000.
So I will circle back to 10-bit. Because of the things I do, a 10-bit display would be really nice, and I would have gladly bought a Quadro RTX 4000. The problem is, 10-bit display is not a transparent thing; the software has to be written for it, both for 10-bit output and for GPU acceleration. I contacted the vendors of the software I have, and none support it. One said they were working on it (GPU support and possibly 10-bit output), but for now, none do.
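For anyone wondering why 10-bit matters for photo work: the practical difference between 8 bpc and 10 bpc is the number of distinct tone levels per color channel, which is what you see as banding in smooth gradients. A quick back-of-the-envelope (just arithmetic, nothing vendor-specific):

```python
# Distinct tone levels per color channel at a given bit depth.
def levels(bits_per_channel: int) -> int:
    return 2 ** bits_per_channel

print(levels(8))   # 256 levels per channel (typical consumer output)
print(levels(10))  # 1024 levels per channel (what a 10-bit panel can show)
```

That 4x jump in tonal resolution is why banding in smooth skies or skin-tone gradients is far less visible on a true end-to-end 10-bit pipeline, and it only helps if the application actually requests a 10-bit format.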
None of my software benefits from ISV certification or supports 10 bpc output, and the programs that do support GPU acceleration work fine with GTX cards.
That is how I concluded that the faster RTX 2080 would likely be a better choice than a Quadro RTX 4000 today.
I just went through the exercise of looking at graphics cards. I was going to get a Quadro RTX 4000 to replace my GTX 1070. What I found was that there was no advantage to getting a Quadro card. You would be better off getting the RTX 2080, which is both faster and cheaper.
Where is your source material for this conclusion?