"ValueError: Graph disconnected" caused by mishandling the Input layer

Posted at 2021-01-25

I was implementing a neural network that predicts continuous values from multidimensional numeric data. While I was using pooling and reshape to get the data into the right shape, I hit a ValueError...

Error message

ValueError: Graph disconnected: cannot obtain value for tensor Tensor("input_state:0", shape=(?, 10, 22), dtype=float32) at layer "attention_vec". The following previous layers were accessed without issue: []

I found a question about a similar error.

It explained that, with Keras's Model class, the tensors you pass as inputs have to be the ones created by an Input layer.

In my code I had defined

l_input = Input(~~~)

but the three lines below it also assigned their results to l_input = ..., so the Input layer's tensor was overwritten. That was the cause.
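
As a minimal sketch of the pattern (not my original code; the tensorflow.keras import path, the toy layer stack, and the shape (10, 22) taken from the error message are assumptions for illustration): once l_input is reassigned to a later layer's output, the tensor passed to Model(inputs=...) is no longer the Input layer's tensor, so Keras cannot trace the outputs back to a declared input and reports Graph disconnected.

from tensorflow.keras.layers import Input, GlobalAveragePooling1D, Dense
from tensorflow.keras.models import Model

# broken pattern: the Input tensor is lost because l_input gets reassigned
l_input = Input(shape=(10, 22), name='input_state')
l_input = GlobalAveragePooling1D()(l_input)     # overwrites the Input tensor
out = Dense(1)(l_input)
# Model(inputs=[l_input], outputs=[out])
# -> ValueError: Graph disconnected: cannot obtain value for tensor ... "input_state:0" ...

# fixed pattern: keep the Input tensor in its own variable
l_input = Input(shape=(10, 22), name='input_state')
l = GlobalAveragePooling1D()(l_input)           # intermediate results use other names
out = Dense(1)(l)
model = Model(inputs=[l_input], outputs=[out])  # builds without error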

Code after the fix
    def attn_block(self, inputs):
        # NOTE: assumed sketch of a standard soft-attention block, not necessarily
        # the original implementation: learn softmax weights over the feature axis
        # and multiply them back onto the inputs
        a = Permute((2, 1))(inputs)
        a = Dense(int(inputs.shape[1]), activation='softmax')(a)
        a_probs = Permute((2, 1), name='attention_vec')(a)
        output_attention_mul = Multiply()([inputs, a_probs])
        return output_attention_mul

    def build_network(self):
        nb_dense_1 = self.dim_state * 10
        nb_dense_3 = self.dim_action * 10
        nb_dense_2 = int(np.sqrt(nb_dense_1 *
                                 nb_dense_3))

        # the fix: define the model input once with an Input layer and keep it in l_input
        l_input = Input(shape=(self.time_steps, self.dim_state,),
                        name='input_state')
        # intermediate results go into other variables so the Input tensor is never overwritten
        l = GlobalAveragePooling1D()(l_input)
        l = Reshape((22, 1))(l)
        x = self.attn_block(l)
        l_cnn_1 = Conv1D(filters=16,kernel_size=2,
                          activation='tanh',
                          name='hidden_1')(x)
        l_cnn_2 = Conv1D(filters=16,kernel_size=2,#16,(2,2),#padding='same',
                          activation='tanh',
                          name='hidden_2')(l_cnn_1)
        l_dense_3 = Dense(nb_dense_3,
                          activation='tanh',
                          name='hidden_3')(l_cnn_2)
        l_dense_3 = Flatten()(l_dense_3)
        l_mu = Dense(self.dim_action,
                     activation='tanh',
                     name='mu')(l_dense_3)
        l_log_var = Dense(self.dim_action,
                          activation='tanh',
                          name='log_var')(l_dense_3)

        # inputs must be the tensor created by the Input layer above
        self.model = Model(inputs=[l_input],
                           outputs=[l_mu, l_log_var])
        self.model.summary()
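
To double-check that the graph is now connected, a quick forward pass can be run. This is only a sketch: agent is a hypothetical instance of the class that defines build_network above, and the shapes follow the error message (time_steps=10, dim_state=22).

import numpy as np

agent.build_network()   # agent: hypothetical instance of the class above

# a dummy batch shaped like input_state: (batch, time_steps=10, dim_state=22)
dummy_state = np.zeros((1, 10, 22), dtype=np.float32)

# both heads come back without a Graph disconnected error
mu, log_var = agent.model.predict(dummy_state)
print(mu.shape, log_var.shape)   # (1, dim_action) each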