Answer by Mike Hollinger (186) | May 03 at 09:02 AM
Hey! Neither IBM nor NVIDIA supports that particular software stack, and we've never tested that particular combination, nor will we ever test it.
The gaming-class GPUs differ in a variety of subtle ways from the supported Tesla GPUs that we know and love.
It might work, but you'll be on your own and should expect no further support or information from us, NVIDIA, or any of the other vendors involved in the stack.
If you're doing a demo like this, why not spin up a single-GPU VM in IBM Cloud with the Inference container? That stack -is- supported and is something we regularly demo.
A virtual server with the following should be sufficient:
- 1 GPU
- Any number of virtual CPUs
- 32 GB RAM
- 100 GB storage
- Ubuntu or RHEL (up to you)
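For reference, a spec like the one above can be ordered from the IBM Cloud CLI (classic infrastructure / SoftLayer plugin). This is only a sketch: the flavor name, datacenter, and OS code below are assumptions, and GPU-equipped flavors bundle fixed vCPU/RAM/disk sizes, so check what your account can actually order.

```shell
# Sketch only: provision a single-GPU classic virtual server.
# List the flavors, datacenters, and OS codes available to your
# account first:
#   ibmcloud sl vs options
#
# The AC-series flavor, datacenter, and OS code here are assumed
# examples -- substitute whichever single-GPU flavor meets the
# ~32 GB RAM / 100 GB storage guideline above.
ibmcloud sl vs create \
  --hostname powerai-demo \
  --domain example.com \
  --flavor AC1_8X60X100 \
  --datacenter dal13 \
  --os UBUNTU_16_64 \
  --san
```

Once the VM is up, you'd pull and run the Inference container on it as usual.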