Clearing CUDA memory on Kaggle

Privalov Vladimir · Jan 9

Sometimes when running a PyTorch model on a GPU on Kaggle, you get the error “RuntimeError: CUDA out of memory. Tried to allocate …”

Clear the cached GPU memory with:

torch.cuda.empty_cache()
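
Note that empty_cache() only returns cached blocks that are no longer referenced by any tensor. If the out-of-memory error comes from tensors that are still alive, you usually have to drop the Python references first. A minimal sketch (the names model and batch are placeholders for your own objects):

import gc
import torch

# Drop references to large objects so their GPU memory becomes reclaimable
# ("model" and "batch" are hypothetical names from your own notebook).
del model, batch

# Let Python's garbage collector release the tensors,
# then hand the freed blocks back to the CUDA driver.
gc.collect()
torch.cuda.empty_cache()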

Check CUDA memory usage with the GPUtil package:

!pip install GPUtil
from GPUtil import showUtilization as gpu_usage
gpu_usage()

Output

| ID | GPU | MEM |
------------------
| 0 | 0% | 0% |
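
If you prefer to stay inside PyTorch, a similar check can be done with its built-in memory counters. A small sketch as an alternative to GPUtil:

import torch

def print_cuda_memory():
    # Memory currently held by live tensors
    allocated = torch.cuda.memory_allocated() / 1024**2
    # Memory reserved by PyTorch's caching allocator (includes the cache)
    reserved = torch.cuda.memory_reserved() / 1024**2
    print(f"allocated: {allocated:.1f} MiB, reserved: {reserved:.1f} MiB")

print_cuda_memory()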
  • Kaggle
  • GPU
  • CUDA
  • Deep Learning
  • PyTorch
