Nvidia System Management Interface

I. nvidia-smi — NVIDIA System Management Interface

Overview: nvidia-smi is the NVIDIA System Management Interface, a command-line tool for monitoring and managing GPU devices. It reports, among other things, GPU memory usage, GPU utilization, temperature, and power consumption.

1. Show Overall GPU Status

```shell
nvidia-smi
```

Most commonly used to quickly check whether GPUs are idle or busy.
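For scripted idle checks, the same question can be answered from `--query-gpu` output. A minimal sketch, using a hard-coded sample value so it runs without a GPU; in real use, `util` would come from the commented nvidia-smi call:

```shell
# Placeholder standing in for:
#   util=$(nvidia-smi --query-gpu=utilization.gpu --format=csv,noheader,nounits)
util=0

# Treat 0% utilization as idle.
if [ "$util" -eq 0 ]; then
  echo "idle"
else
  echo "busy"
fi
```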


2. Monitor in Real Time

Refresh every second:

```shell
nvidia-smi -l 1
```

3. List All GPUs

```shell
nvidia-smi -L
```
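The `-L` listing is easy to post-process. A sketch that counts devices from captured output; the GPU names and UUIDs below are placeholders, not real hardware:

```shell
# Captured `nvidia-smi -L` output, inlined here so the snippet runs without a GPU.
sample='GPU 0: NVIDIA A100-SXM4-40GB (UUID: GPU-aaaa)
GPU 1: NVIDIA A100-SXM4-40GB (UUID: GPU-bbbb)'

# Each device prints one line starting with "GPU", so counting lines counts GPUs.
printf '%s\n' "$sample" | grep -c '^GPU'
```

Here this prints 2, one count per listed device.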

4. Show Processes Using GPUs

pmon stands for process monitor.

```shell
nvidia-smi pmon
```

Refresh every 2 seconds:

```shell
nvidia-smi pmon -d 2
```

1) pmon Column Reference

| Column | Full Name | Meaning |
| --- | --- | --- |
| pid | Process ID | The Linux process ID using the GPU |
| type | Process type | GPU workload type: C = Compute (CUDA), G = Graphics, C+G = both |
| sm | Streaming Multiprocessor utilization | Percentage of GPU compute cores used by the process |
| mem | Memory controller utilization | Percentage of GPU memory bandwidth used by the process |
| enc | Encoder utilization | Usage of the NVENC video encoder |
| dec | Decoder utilization | Usage of the NVDEC video decoder |
| jpg | JPEG engine utilization | Usage of the hardware JPEG decoder/encoder |
| ofa | Optical Flow Accelerator utilization | Usage of the hardware optical-flow engine (video/vision tasks) |
| fb | Frame Buffer memory | GPU VRAM used by the process (MB) |
| ccpm | Confidential Compute Protected Memory | Protected GPU memory used by the process (MB); typically 0 unless Confidential Computing is enabled |
Key columns to watch:

- pid — which process is using the GPU.
- sm — whether GPU cores are actively computing.
- fb — VRAM usage in MB; how much memory is consumed.
- mem — memory-bandwidth utilization; indicates I/O pressure on GPU memory.

2) Example Output

```shell
[xli49@ghpc008 ~]$ nvidia-smi pmon -i 0 -s um
# gpu   pid  type    sm   mem   enc   dec   jpg   ofa    fb  ccpm  command
# Idx     #   C/G     %     %     %     %     %     %    MB    MB  name
    0     -     -     -     -     -     -     -     -     -     -  -
    0     -     -     -     -     -     -     -     -     -     -  -
[xli49@ghpc008 ~]$ nvidia-smi pmon -i 0
# gpu   pid  type    sm   mem   enc   dec   jpg   ofa  command
# Idx     #   C/G     %     %     %     %     %     %  name
    0     -     -     -     -     -     -     -     -  -
    0     -     -     -     -     -     -     -     -  -
```

All dashes (-) indicate GPU 0 is currently idle — no processes are running on it.
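When processes are running, the interesting columns can be pulled out with awk. A sketch over a hypothetical capture; pid 1234 and the python process are made up for illustration:

```shell
# Hypothetical `nvidia-smi pmon -s um` capture with one compute process running.
sample='# gpu   pid  type  sm  mem  enc  dec  jpg  ofa    fb  ccpm  command
# Idx     #   C/G   %    %    %    %    %    %    MB    MB  name
    0  1234     C  85   40    0    0    0    0  2048     0  python'

# Skip the two "#" header lines; print pid, sm (%), and fb (MB).
printf '%s\n' "$sample" | awk '!/^#/ {print $2, $4, $10}'
```

For the sample above this prints `1234 85 2048`: process 1234 at 85% SM utilization holding 2048 MB of VRAM.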


5. Custom Query of GPU Information

```shell
nvidia-smi --query-gpu=name,memory.used,utilization.gpu --format=csv
```

Commonly used for scripts, logging, and automated monitoring.
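Adding `noheader,nounits` to `--format` makes the CSV trivially parseable. A sketch that flags a busy GPU from one captured line; the device name, threshold, and numbers are illustrative:

```shell
# One hypothetical output line of:
#   nvidia-smi --query-gpu=name,memory.used,utilization.gpu --format=csv,noheader,nounits
sample='NVIDIA A100-SXM4-40GB, 1024, 35'

# Fields are comma+space separated: name, memory.used (MiB), utilization.gpu (%).
# Print an alert whenever utilization exceeds 30%.
printf '%s\n' "$sample" | awk -F', ' '$3 > 30 {print $1 " busy: " $3 "% util, " $2 " MiB used"}'
```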


6. Log GPU Status to a File

```shell
nvidia-smi -l 5 -f gpu.log
```

Records GPU status every 5 seconds, redirecting output to gpu.log. Note that -f overwrites the file when the command starts; the refreshes within a single run then accumulate in it.
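If you want timestamps or true append semantics across runs, a shell loop around `--query-gpu` is a common alternative. A sketch: `query_gpu` is a stand-in function so the snippet runs without a GPU; in real use, replace its body with the commented nvidia-smi call and redirect the loop to a file:

```shell
# Stand-in for: nvidia-smi --query-gpu=utilization.gpu,memory.used --format=csv,noheader,nounits
query_gpu() {
  echo '35, 1024'
}

# Log three timestamped samples.
# Real use: while true; do ...; sleep 5; done >> gpu.log
for i in 1 2 3; do
  printf '%s, %s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$(query_gpu)"
done
```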


💡 One-line Takeaway
Use nvidia-smi for a quick snapshot, nvidia-smi pmon to watch per-process GPU activity in real time, and --query-gpu for scriptable, structured output — focus on pid, sm, fb, and mem for the most actionable signals.
Source: https://lxy-alexander.github.io/blog/posts/tools/nvidia-system-management-interface/
Author: Alexander Lee
Published: 2026-02-11
License: CC BY-NC-SA 4.0