Continuously monitoring GPU memory usage with nvidia-smi


You're training on a dataset and CUDA errors out because it runs out of memory.
Where on earth is all that GPU memory going?!
Hammer nvidia-smi over and over by hand?

No, there's a better way.

# Refresh the nvidia-smi output every 0.5 seconds
watch -n 0.5 nvidia-smi

# Same as above, plus highlight what changed between updates
watch -d -n 0.5 nvidia-smi

Result

[Screenshot: nvidia-smi output being refreshed by watch]
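
If you also want a log you can scroll through or grep later, nvidia-smi can poll and print CSV on its own. A minimal sketch using its standard --query-gpu / --format=csv / -l options (the exact field names are worth confirming with nvidia-smi --help-query-gpu for your driver version):

# Append GPU memory usage to a log file once per second (Ctrl-C to stop)
nvidia-smi --query-gpu=timestamp,memory.used,memory.total --format=csv -l 1 >> gpu_mem.log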

That's all. See you!
