Correct, AMD has been fine to use with
Blue Iris for at least 5 years now. Intel was the undisputed best choice for Blue Iris from about 2010 to 2017 because AMD was simply not competing effectively. AMD released the Zen 1 architecture in 2017, which made them less of a terrible choice, but even then, Blue Iris could use Intel's Quick Sync Video for hardware accelerated decoding, which offered an efficiency gain that you could not match with an AMD CPU.
Blue Iris changed the game when it added support for sub streams in version 5.2.7 (May 1, 2020), almost exactly 5 years ago. Now, if you care at all about efficiency, you configure Blue Iris to use sub streams on all your cameras. This makes the video decoding workload so small that hardware accelerated decoding doesn't matter anymore, at least not enough to make up for its occasional malfunctions. For some users it would occasionally cause video corruption, instability, or both. These days it is advisable to use sub streams and leave hardware accelerated decoding turned off, and that is really what takes away Intel's advantage in Blue Iris.
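To put rough numbers on why sub streams help so much, here is a quick back-of-the-envelope sketch. The camera count, resolutions, and frame rates below are assumptions for illustration, not Blue Iris benchmarks; decode cost scales roughly with the number of pixels per second you have to decode.

```python
# Rough illustration of how much decode work sub streams save.
# All camera counts, resolutions, and frame rates are assumed example
# values, not measurements from Blue Iris.

def pixel_rate(width, height, fps, cameras):
    """Approximate pixels per second that must be decoded."""
    return width * height * fps * cameras

# 8 cameras decoding 4K (3840x2160) main streams at 15 FPS
main_only = pixel_rate(3840, 2160, 15, 8)

# Same 8 cameras, but only D1 (704x480) sub streams are decoded
# continuously; main streams are touched only when needed.
sub_only = pixel_rate(704, 480, 15, 8)

print(f"Main streams: {main_only / 1e6:,.0f} megapixels/sec")
print(f"Sub streams:  {sub_only / 1e6:,.0f} megapixels/sec")
print(f"Reduction:    ~{main_only / sub_only:.0f}x less decode work")
```

With those assumed numbers the continuous decode load drops by roughly 25x, which is why hardware accelerated decoding stops being a meaningful advantage.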
So honestly the 9900K is still a very capable performer for Blue Iris, and I would advise against upgrading if it isn't giving you any problems. You could take out the 1080 Ti because it is probably just wasting power unless you need it for AI acceleration. The Ryzen 9 9950X3D is comically overkill. In fact, if you find yourself running Blue Iris on an overpowered CPU like that, you should probably turn on the efficiency/eco modes in the BIOS, such as a lower TDP limit, to reduce power consumption.