My point is - when you're running the machine from a 4K monitor sitting in front of it with a desktop resolution of 4K, it's fine.
But as soon as that monitor is disconnected, the display adapter 'falls back' to a lower resolution. That's why there's no performance hit when you RDP into it: you're seeing it at that lower resolution.
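If you want to sanity-check what resolution a given session is actually running at, here's a quick sketch (my own illustration, Windows-only, using the Win32 GetSystemMetrics call via ctypes) - run it in the session you're curious about, e.g. over RDP before and after unplugging the monitor:

```python
# Quick check (Windows only): print the desktop resolution of whatever
# session this script runs in.
import ctypes

user32 = ctypes.windll.user32
user32.SetProcessDPIAware()  # without this, display scaling can report virtualized values

width = user32.GetSystemMetrics(0)   # SM_CXSCREEN: primary display width in pixels
height = user32.GetSystemMetrics(1)  # SM_CYSCREEN: primary display height in pixels
print(f"Current desktop resolution: {width}x{height}")
```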
In my case, I had a 4K monitor plugged into it, which meant I was RDP'ing into it at the full 4K resolution.
So it had to render each 4K frame (60 per second) to send to me, regardless of what 'scale' option I set.
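To put rough numbers on that (my own back-of-the-envelope math, not anything from the RDP docs), the raw pixel throughput the host has to render and encode scales linearly with the desktop resolution it's serving:

```python
# Illustrative comparison of how many pixels per second the host has to
# produce at 60 fps for different desktop resolutions.
FPS = 60

resolutions = {
    "4K (3840x2160)": 3840 * 2160,
    "1440p (2560x1440)": 2560 * 1440,
    "1080p (1920x1080)": 1920 * 1080,
}

baseline = resolutions["1080p (1920x1080)"] * FPS
for name, pixels in resolutions.items():
    per_second = pixels * FPS
    print(f"{name}: {per_second / 1e6:.0f} Mpixels/s "
          f"({per_second / baseline:.1f}x the 1080p load)")
```

4K at 60 fps works out to roughly 500 Mpixels/s, about 4x the 1080p workload, which lines up with the usage difference I saw.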
You can see this for yourself if you have your monitor connected to it and turned on - at the full 4K desktop resolution - and then RDP into it while that console session is still active.
It became immediately obvious the moment I disconnected that HDMI cable: I happened to already be logged in via RDP, and I watched the RDP feed change resolution and the CPU/GPU usage plummet.
To test this, I reconnected the HDMI cable and connected via Parsec this time, because that app lets you actually change the rendering resolution on the server. I selected a lower desktop resolution for it, and voila - single-digit usage.