Why is the Volatile GPU-Util always 0%?

I'm training a deep learning network with TensorFlow on CentOS 7. There are two graphics cards in my machine. During training, the Volatile GPU-Util for one of the cards is always 0%. Is something wrong? Does it mean that only one card is doing the computation while the other is just being used for its memory? Here is the information shown during training:

(screenshot of nvidia-smi output during training)
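
For reference, here is a minimal sketch (not my actual training script) of how device placement could be checked with TF 1.x, so I can see which GPU each op actually runs on:

```python
import tensorflow as tf

# Print which device (CPU/GPU) each op is placed on, to see whether both GPUs are used.
config = tf.ConfigProto(log_device_placement=True)
config.gpu_options.allow_growth = True  # don't grab all GPU memory up front

with tf.Session(config=config) as sess:
    a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    b = tf.constant([[1.0, 1.0], [1.0, 1.0]])
    c = tf.matmul(a, b)
    print(sess.run(c))  # placement of each op is logged to the console
```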