Jupyter + Matplotlib > Suppressing scientific exponential notation (e.g. 1e7) on the horizontal axis > ax1.get_xaxis().get_major_formatter().set_scientific(False)

Posted at 2017-07-31
Environment
GeForce GTX 1070 (8GB)
ASRock Z170M Pro4S [Intel Z170chipset]
Ubuntu 16.04 LTS desktop amd64
TensorFlow v1.1.0
cuDNN v5.1 for Linux
CUDA v8.0
Python 3.5.2
IPython 6.0.0 -- An enhanced Interactive Python.
gcc (Ubuntu 5.4.0-6ubuntu1~16.04.4) 5.4.0 20160609
GNU bash, version 4.3.48(1)-release (x86_64-pc-linux-gnu)

After increasing the number of TensorFlow training steps, the horizontal axis of the graph started being displayed in scientific exponential notation such as 1e7.

(Figure: loss vs. step plot, with the x-axis tick labels in scientific notation such as 1e7)

The way to restore the plain number format was found here:
https://stackoverflow.com/questions/14711655/how-to-prevent-numbers-being-changed-to-exponential-form-in-python-matplotlib-fi

ax1.get_xaxis().get_major_formatter().set_scientific(False)
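
A minimal standalone check looks like this (a sketch; the sample data here is made up, only the formatter call is the one from this article):

import numpy as np
import matplotlib.pyplot as plt

fig, ax1 = plt.subplots()
x = np.linspace(0, 3e7, 100)   # large values trigger 1e7-style tick labels
ax1.plot(x, np.exp(-x / 1e7))
ax1.get_xaxis().get_major_formatter().set_scientific(False)  # plain numbers on the x axis
plt.show()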

Jupyter code.

check_result_170722.ipynb
%matplotlib inline

# learning [Exr,Exi,Eyr,Eyi,Ezr,Ezi] from ADDA
# Jul. 31, 2017

import numpy as np
import matplotlib.pyplot as plt


def moving_average(a, n=3):
    # from
    # https://stackoverflow.com/questions/14313510/how-to-calculate-moving-average-using-numpy
    ret = np.cumsum(a, dtype=float)
    ret[n:] = ret[n:] - ret[:-n]
    return ret[n - 1:] / n


def add_lineplot(filepath, ax1, alabel):
    data1 = np.loadtxt(filepath, delimiter=',')
    input1 = data1[:,0]
    output1 = data1[:,1]

    # moving average
    NUM_AVERAGE = 500
    output1 = moving_average(output1, n=NUM_AVERAGE)
    for loop in range(NUM_AVERAGE - 1):
        output1 = np.append(output1, 1e-7)  # dummy

    ax1.plot(input1, output1, label=alabel)
    
fig = plt.figure(figsize=(10,10),dpi=200)
ax1 = fig.add_subplot(2,1,1)
ax1.grid(True)
ax1.set_xlabel('step')
ax1.set_ylabel('loss')
ax1.set_yscale('log')
# ax1.set_xlim([0, 3000000])
ax1.set_xlim([0e6, 3e7])  # Jul. 31, 2017
ax1.set_ylim([1e-5, 1.0])

# --- learning rate=0.001, batch_size=4
FILE_PATH1 = 'RES_170728_t0716//log_learn.170727_t2308'
# --- learning rate=0.0001, batch_size=4
# FILE_PATH1 = 'RES_170728_t2119/log_learn.170728_t0720'
# FILE_PATH1 = 'RES_170728_t2119/log_learn.170728_t0751'
# FILE_PATH1 = 'RES_170728_t2119/log_learn.170728_t0821'
# FILE_PATH1 = 'RES_170728_t2119/log_learn.170728_t0850'
# FILE_PATH1 = 'RES_170728_t2119/log_learn.170728_t0921'
# FILE_PATH1 = 'RES_170728_t2119/log_learn.170728_t0951'
# FILE_PATH2 = 'RES_170728_t2119/log_learn.170728_t1050'
# --- batch_size=2
FILE_PATH2 = 'RES_170729_t0744//log_learn.170728_t2311'
# --- batch_size=20
# FILE_PATH3 = 'RES_170728_t2311/log_learn.170728_t2208'

# for [10x30x100] 2017/07/29 18:45
FILE_PATH3 = 'log_learn.170731_t2045'

# for learning rate=0.001, 30x100x100
add_lineplot(FILE_PATH2, ax1, alabel="batch size=2")
add_lineplot(FILE_PATH1, ax1, alabel="batch size=4")
add_lineplot(FILE_PATH3, ax1, alabel="batch size=2")

ax1.legend()

ax1.get_xaxis().get_major_formatter().set_scientific(False)

(Figure: the same plot with the x-axis in plain numbers after applying set_scientific(False))

By around step=3e7, the loss will hopefully reach 1e-6.
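
An equivalent one-liner (assuming the x axis still uses matplotlib's default ScalarFormatter) is:

ax1.ticklabel_format(style='plain', axis='x')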
