DeepStack Case Study: Performance from CPU to GPU version

I'd been happily running DeepStack via AITool until a couple of months ago, when upgrading BI apparently broke the AITool trigger/flagging into BI. So I'm now trying to figure out how to do in BI what I was doing in AITool, without the added app.

I had the DeepStack GPU version installed, as I had installed an Nvidia GTX 1050 card (640 CUDA cores, but only 2 GB).

Is this only for Deepstack running on Windows vs the Docker GPU version?

Yes, it is intended for the Windows version. Someone can probably figure out how to Docker it, but I know many have tried and given up.
 
Thanks. Will see if I can find a dummies guide to setting up Deepstack for BI!
 
hit me up when you find that app for older guys with attention deficit disord...
hey look, a squirrel
 
Okay, tackling the initial install. I have installed CUDA 11.7. For cuDNN, though, I'm stuck on a couple of things in the instructions:

3.1.3. Installing Zlib
Zlib is a data compression software library that is needed by cuDNN.
Procedure
  1. Download and extract the zlib package from ZLIB DLL. Users with a 32-bit machine should download the 32-bit ZLIB DLL.
    Note: If using Chrome, the file may not automatically download. If this happens, right-click the link and choose Save link as…. Then, paste the URL into a browser window.
  2. Add the directory path of zlibwapi.dll to the environment variable PATH.

What does "Add the directory path of zlibwapi.dll to the environment variable PATH" mean exactly?
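
My best guess is that it just means Windows should be able to find zlibwapi.dll by name through the PATH, so cuDNN can load it without a full path. If that's right, a quick way to check from any handy Python prompt would be something like the sketch below (just my assumption, not something from the official docs):

# Check whether zlibwapi.dll is findable via the directories listed in PATH.
# find_library() on Windows walks the PATH entries looking for the named DLL.
import ctypes.util

found = ctypes.util.find_library("zlibwapi")
print(found or "zlibwapi.dll is NOT on the PATH yet")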

Also, I seem to have an NVIDIA Corporation directory in both Program Files and Program Files (x86), and in Program Files I also have an NVIDIA GPU Computing Toolkit directory.

For installing cuDNN, the instructions state:

  1. Copy the following files from the unzipped package into the NVIDIA cuDNN directory.
    1. Copy bin\cudnn*.dll to C:\Program Files\NVIDIA\CUDNN\v8.x\bin.
    2. Copy include\cudnn*.h to C:\Program Files\NVIDIA\CUDNN\v8.x\include.
    3. Copy lib\cudnn*.lib to C:\Program Files\NVIDIA\CUDNN\v8.x\lib.

Would that imply the NVIDIA Corporation directory within Program Files, and that I leave the (x86) one alone?

Lastly, I don't have a cuDNN directory within any of my currently available directories; was it supposed to have been created when I installed CUDA?
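
If I'm reading that copy step right, it boils down to something like the rough sketch below; the source path is just wherever I end up extracting the cuDNN zip, and the destination is the one named in the instructions, so both paths are assumptions on my part:

# Rough sketch of the cuDNN copy step from the quoted instructions.
# Adjust src to wherever the cuDNN zip was extracted and dst to the actual target.
import glob
import os
import shutil

src = r"C:\temp\cudnn"                        # extracted cuDNN zip (assumed path)
dst = r"C:\Program Files\NVIDIA\CUDNN\v8.x"   # target named in the instructions

for sub, pattern in [("bin", "cudnn*.dll"), ("include", "cudnn*.h"), ("lib", "cudnn*.lib")]:
    os.makedirs(os.path.join(dst, sub), exist_ok=True)
    for f in glob.glob(os.path.join(src, sub, pattern)):
        shutil.copy2(f, os.path.join(dst, sub))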
 
Still stuck on the above steps... I found a video that seems to imply you just copy the bin files from the downloaded cuDNN zip into this directory here; is that correct?

[attached screenshot]

Do I need to install the zlib data compression software library for this, as noted in section 3.1 here? I can't seem to access my environment variables/PATH; it appears to be greyed out even though I have admin rights on the Windows user:

[attached screenshot]

Edit: Ah, I figured out the environment variables thing: hit Windows key + Pause/Break to get to the System page, then Advanced system settings, then click Environment Variables, and I can edit there. Just confirming whether I need to do this?

Edit again: I just decided to create an NVidiaTools\Zlib directory and add that path to the environment variables; presumably it doesn't hurt anything even if it isn't needed.
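
For anyone following along, the command-line equivalent of what I did in the Environment Variables dialog would be roughly the sketch below; C:\NVidiaTools\Zlib is just the folder I happened to create, it needs an elevated (admin) prompt, and new processes only pick up the change after a sign-out or reboot:

# Append the zlib folder to the machine-wide PATH by editing the same registry
# value the Environment Variables dialog edits. Run from an elevated Python.
import winreg

zlib_dir = r"C:\NVidiaTools\Zlib"   # wherever zlibwapi.dll actually lives
env_key = r"SYSTEM\CurrentControlSet\Control\Session Manager\Environment"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, env_key, 0,
                    winreg.KEY_READ | winreg.KEY_WRITE) as key:
    current, value_type = winreg.QueryValueEx(key, "Path")
    if zlib_dir.lower() not in current.lower():
        winreg.SetValueEx(key, "Path", 0, value_type, current + ";" + zlib_dir)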
 
You can use the attached script to install cuDNN and zlib. This script is from the CodeProject.AI project but will work for other CUDA AI installs as well.
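
Once it has run, a quick sanity check is to confirm the GPU and cuDNN are actually visible from the Python environment DeepStack uses; the snippet below assumes PyTorch is available in that environment (my assumption, not something the script sets up):

# Sanity check after installing CUDA/cuDNN: ask PyTorch whether it can see the
# GPU and which cuDNN version it loaded.
import torch

print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
    print("cuDNN enabled:", torch.backends.cudnn.enabled)
    print("cuDNN version:", torch.backends.cudnn.version())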

Beautiful, thank you!
 
Hmmm... I seem to be getting a huge number of "nothing found" alerts coming through to my phone. Shouldn't only confirmed alerts make it through?