Clearing CUDA memory on Kaggle

Privalov Vladimir · Jan 9

Sometimes when running a PyTorch model on a GPU on Kaggle, we get the error “RuntimeError: CUDA out of memory. Tried to allocate …”

Clear the cached GPU memory with this command:

torch.cuda.empty_cache()
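
Note that empty_cache() can only release cached blocks that are no longer referenced by any live tensor, so it usually helps to delete the large objects and run Python's garbage collector first. A minimal sketch, using a hypothetical tensor as a stand-in for your model or batches:

import gc
import torch

# a hypothetical large tensor standing in for your model / batches
big_tensor = torch.zeros(1024, 1024, device="cuda")

del big_tensor            # drop the Python reference first
gc.collect()              # let Python reclaim the now-unreferenced objects
torch.cuda.empty_cache()  # release the cached blocks back to the GPU driver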

Check CUDA memory usage

!pip install GPUtil

from GPUtil import showUtilization as gpu_usage

gpu_usage()  # prints current GPU load and memory utilization

Output

| ID | GPU | MEM |
------------------
|  0 |  0% |  0% |
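
As an alternative that needs no extra package, PyTorch's own counters can report memory usage on the current device; a small sketch:

import torch

allocated = torch.cuda.memory_allocated() / 1024**2  # MB currently held by live tensors
reserved = torch.cuda.memory_reserved() / 1024**2    # MB kept by PyTorch's caching allocator
print(f"allocated: {allocated:.1f} MB, reserved: {reserved:.1f} MB")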
  • Kaggle
  • Gpu
  • Cuda
  • Deep Learning
  • Pytorch
