I noticed that the Blue Iris web page recommends an Nvidia graphics adapter. My current system running Blue Iris is an i7-11700K with 64GB of DDR4-3200; the main program runs on an M.2 drive, with 4x WD Purple 10TB drives for storage. Is there any real reason to add an Nvidia display adapter? I have several cards lying around that I could use for the purpose: a 970, 1070, 2070, and a 2080 Ti. I'm just not sure whether adding one would be of any benefit, since the system is dedicated to BI and runs nothing else. It currently handles 8 cameras, with the possibility of adding up to 4 more in the future.