@tongyx361 Check this out: Data Parallel Multi GPU Inference